WorldWideScience

Sample records for volume source based

  1. Source-based morphometry reveals distinct patterns of aberrant brain volume in delusional infestation.

    Science.gov (United States)

    Wolf, Robert Ch; Huber, Markus; Lepping, Peter; Sambataro, Fabio; Depping, Malte S; Karner, Martin; Freudenmann, Roland W

    2014-01-03

    Little is known about the neural correlates of delusional infestation (DI), the delusional belief of being infested with pathogens. So far, evidence comes mainly from case reports and case series. We investigated brain morphology in 16 DI patients and 16 healthy controls using structural magnetic resonance imaging and a multivariate data analysis technique, i.e. source-based morphometry (SBM). In addition, we explored differences in brain structure in patient subgroups based on disease aetiology. SBM revealed two patterns of significantly aberrant grey matter volume in both "primary" DI (DI as a primary delusional disorder) and "organic" DI (DI due to a medical condition). In contrast, aberrant white matter volume was only confirmed for the "organic" DI patient subgroup. These results suggest prefrontal, temporal, parietal, insular, thalamic and striatal dysfunction underlying DI. Moreover, the data suggest that aetiologically distinct presentations of DI share similar patterns of abnormal grey matter volume, whereas aberrant white matter volume appears to be restricted to organic cases. © 2013.
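    As a rough illustration of the SBM idea (a hypothetical sketch, not the authors' pipeline): spatial independent component analysis decomposes a subjects-by-voxels matrix of grey matter maps into spatial patterns and subject loadings, and group differences are then tested on the loadings. All data, dimensions, and the injected effect below are made up.

    ```python
    import numpy as np
    from scipy import stats
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)

    # Hypothetical data: 32 subjects x 5000 voxels of grey matter volume maps
    # (16 patients followed by 16 controls), standing in for real segmented MRI.
    X = rng.normal(size=(32, 5000))
    X[:16, :500] += 0.8  # inject a group difference into one spatial pattern

    ica = FastICA(n_components=5, max_iter=500, random_state=0)
    loadings = ica.fit_transform(X)      # subject-wise loading coefficients (32 x 5)
    patterns = ica.components_           # spatial source patterns (5 x 5000)

    # Test each component's loadings for a patient-vs-control difference
    for k in range(loadings.shape[1]):
        t, p = stats.ttest_ind(loadings[:16, k], loadings[16:, k])
        print(f"component {k}: t={t:.2f}, p={p:.4f}")
    ```

    In an actual SBM analysis the significant components' spatial maps (here, rows of `patterns`) would be inspected to localize the aberrant volume.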

  2. Finite Volume Based Computer Program for Ground Source Heat Pump System

    Energy Technology Data Exchange (ETDEWEB)

    Menart, James A. [Wright State University

    2013-02-22

    This report is a compilation of the work that has been done on the grant DE-EE0002805 entitled "Finite Volume Based Computer Program for Ground Source Heat Pump Systems." The goal of this project was to develop a detailed computer simulation tool for GSHP (ground source heat pump) heating and cooling systems. Two such tools were developed as part of this DOE (Department of Energy) grant; the first is a two-dimensional computer program called GEO2D and the second is a three-dimensional computer program called GEO3D. Both of these simulation tools provide an extensive array of results to the user. A unique aspect of both these simulation tools is the complete temperature profile information calculated and presented. Complete temperature profiles throughout the ground, casing, tube wall, and fluid are provided as a function of time. The fluid temperatures from and to the heat pump, as a function of time, are also provided. In addition to temperature information, detailed heat rate information at several locations as a function of time is determined. Heat rates between the heat pump and the building indoor environment, between the working fluid and the heat pump, and between the working fluid and the ground are computed. The heat rates between the ground and the working fluid are calculated as a function of time and position along the ground loop. The heating and cooling loads of the building being fitted with a GSHP are determined with the computer program developed by DOE called ENERGYPLUS. Lastly, COP (coefficient of performance) results as a function of time are provided. Both the two-dimensional and three-dimensional computer programs developed as part of this work are based upon a detailed finite volume solution of the energy equation for the ground and ground loop. Real heat pump characteristics are entered into the program and used to model the heat pump performance. Thus these computer tools simulate the coupled performance of the ground loop and the heat pump.
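    The core of such a tool, reduced to a minimal sketch: an explicit finite-volume solution of the radial heat equation in the ground around a single borehole, with a prescribed heat flux at the bore wall. This is not GEO2D/GEO3D; all material properties, geometry, and the wall heat flux are illustrative assumptions.

    ```python
    import numpy as np

    # Illustrative soil properties and borehole geometry (assumed values)
    k, rho, cp = 2.0, 1800.0, 1500.0          # conductivity (W/m/K), density, heat capacity
    alpha = k / (rho * cp)                     # thermal diffusivity (m^2/s)
    r = np.linspace(0.055, 5.0, 200)           # cell faces from the bore wall outward (m)
    rc = 0.5 * (r[:-1] + r[1:])                # cell centres
    T = np.full(rc.size, 285.0)                # initial ground temperature (K)
    q_wall = -50.0                             # outward heat flux at the bore wall per metre
                                               # of depth (W/m); negative = heat extracted

    vol = np.pi * (r[1:]**2 - r[:-1]**2)       # cell volumes per metre of depth (m^3/m)
    dt = 0.4 * (r[1] - r[0])**2 / alpha        # stable explicit time step (s)
    for _ in range(5000):                      # ~19 days of simulated extraction
        flux = np.zeros(r.size)                # heat flow in +r direction at each face (W/m)
        flux[1:-1] = -k * 2*np.pi*r[1:-1] * np.diff(T) / np.diff(rc)
        flux[0] = q_wall                       # bore-wall boundary; outer face stays 0 (far field)
        # Energy balance per cell: rho*cp*V*dT/dt = flux_in - flux_out
        T += dt * (flux[:-1] - flux[1:]) / (rho * cp * vol)
    ```

    The full programs additionally couple this ground solution to the loop fluid, the casing, and a real heat pump performance map; the sketch only shows the finite-volume discretization itself.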

  3. Recovery Act: Finite Volume Based Computer Program for Ground Source Heat Pump Systems

    Energy Technology Data Exchange (ETDEWEB)

    James A Menart, Professor

    2013-02-22

    This report is a compilation of the work that has been done on the grant DE-EE0002805 entitled Finite Volume Based Computer Program for Ground Source Heat Pump Systems. The goal of this project was to develop a detailed computer simulation tool for GSHP (ground source heat pump) heating and cooling systems. Two such tools were developed as part of this DOE (Department of Energy) grant; the first is a two-dimensional computer program called GEO2D and the second is a three-dimensional computer program called GEO3D. Both of these simulation tools provide an extensive array of results to the user. A unique aspect of both these simulation tools is the complete temperature profile information calculated and presented. Complete temperature profiles throughout the ground, casing, tube wall, and fluid are provided as a function of time. The fluid temperatures from and to the heat pump, as a function of time, are also provided. In addition to temperature information, detailed heat rate information at several locations as a function of time is determined. Heat rates between the heat pump and the building indoor environment, between the working fluid and the heat pump, and between the working fluid and the ground are computed. The heat rates between the ground and the working fluid are calculated as a function of time and position along the ground loop. The heating and cooling loads of the building being fitted with a GSHP are determined with the computer program developed by DOE called ENERGYPLUS. Lastly, COP (coefficient of performance) results as a function of time are provided. Both the two-dimensional and three-dimensional computer programs developed as part of this work are based upon a detailed finite volume solution of the energy equation for the ground and ground loop. Real heat pump characteristics are entered into the program and used to model the heat pump performance. Thus these computer tools simulate the coupled performance of the ground loop and the heat pump.

  4. Volume and Surface-Enhanced Volume Negative Ion Sources

    CERN Document Server

    Stockli, M P

    2013-01-01

    H- volume sources and, especially, caesiated H- volume sources are important ion sources for generating high-intensity proton beams, which in turn generate large quantities of other particles. This chapter discusses the physics and technology of the volume production and the caesium-enhanced (surface) production of H- ions. Starting with Bacal's discovery of H- volume production, the chapter briefly recounts the development of some H- sources, which capitalized on this process to significantly increase the production of H- beams. Another significant increase was achieved in the 1990s by adding caesiated surfaces to supplement the volume-produced ions with surface-produced ions, as illustrated with other H- sources. Finally, the focus turns to some of the experience gained when such a source was successfully ramped up in H- output and in duty factor to support the generation of 1 MW proton beams for the Spallation Neutron Source.

  5. Simulation study of a magnetocardiogram based on a virtual heart model: effect of a cardiac equivalent source and a volume conductor

    Institute of Scientific and Technical Information of China (English)

    Shou Guo-Fa; Xia Ling; Ma Ping; Tang Fa-Kuan; Dai Ling

    2011-01-01

    In this paper, we present a magnetocardiogram (MCG) simulation study using the boundary element method (BEM), based on a virtual heart model and a realistic human volume conductor model. The contributions of different cardiac equivalent source models and volume conductor models to the MCG are comprehensively investigated. The single dipole source model, the multiple dipole source model and the equivalent double layer (EDL) source model are analysed and compared as cardiac equivalent source models. Meanwhile, the effect of the volume conductor model on the MCG combined with these cardiac equivalent sources is investigated. The simulation results demonstrate that cardiac electrophysiological information will be partly missed when only the single dipole source is used, while the EDL source is a good option for MCG simulation and the effect of the volume conductor is smallest for the EDL source. Therefore, the EDL source is suitable for the study of MCG forward and inverse problems, and more attention should be paid to it in future MCG studies.
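    The simplest equivalent source / volume conductor combination discussed above, a single current dipole in an infinite homogeneous conductor, can be sketched as follows. This is a hypothetical illustration (all values invented); a realistic MCG simulation needs the BEM torso model the paper uses.

    ```python
    import numpy as np

    MU0 = 4e-7 * np.pi  # vacuum permeability (T*m/A)

    def dipole_field(q, r0, pts):
        """Magnetic field at points pts from a current dipole q (A*m) at r0,
        in an infinite homogeneous conductor: B = mu0/(4*pi) * q x d / |d|^3."""
        d = pts - r0
        dist = np.linalg.norm(d, axis=-1, keepdims=True)
        return MU0 / (4 * np.pi) * np.cross(q, d) / dist**3

    # Hypothetical dipole and a plane of sensors 5 cm above it
    q = np.array([1e-5, 0.0, 0.0])
    r0 = np.array([0.0, 0.0, 0.0])
    xs = np.linspace(-0.1, 0.1, 21)
    grid = np.array([[x, y, 0.05] for x in xs for y in xs])
    Bz = dipole_field(q, r0, grid)[:, 2]   # normal component, as measured in MCG
    ```

    With the dipole along x, the normal-component map is antisymmetric in y, the familiar two-lobed MCG pattern; richer source and conductor models modify this field, which is what the paper quantifies.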

  6. Effects of volume conductor and source configuration on simulated magnetogastrograms

    Energy Technology Data Exchange (ETDEWEB)

    Komuro, Rie; Qiao Wenlian; Pullan, Andrew J; Cheng, Leo K, E-mail: l.cheng@auckland.ac.n [Auckland Bioengineering Institute, University of Auckland, Auckland (New Zealand)

    2010-11-21

    Recordings of the magnetic fields (MFs) arising from gastric electrical activity (GEA) have been shown to be able to distinguish between normal and certain abnormal GEA. Mathematical models provide a powerful tool for revealing the relationship between the underlying GEA and the resultant magnetogastrograms (MGGs). However, the relative contributions of different volume conductor and dipole source models to the resultant MFs remain uncertain. In this study, four volume conductor models (free space, sphere, half space and an anatomically realistic torso) and two dipole source configurations (containing 320 moving dipole sources and a single equivalent moving dipole source) were used to simulate the external MFs. The effects of different volume conductor models and dipole source configurations on the MF simulations were examined. The half space model provided the best approximation of the MFs produced by the torso model in the direction normal to the coronal plane. This was despite the fact that the half space model does not produce secondary sources, which have been shown to contribute up to 50% of the total MFs when an anatomically realistic torso model is used. We conclude that a realistic representation of the volume conductor and a detailed dipole source model are likely to be necessary when using a model-based approach for interpreting MGGs.

  7. A highly detailed FEM volume conductor model based on the ICBM152 average head template for EEG source imaging and TCS targeting.

    Science.gov (United States)

    Haufe, Stefan; Huang, Yu; Parra, Lucas C

    2015-08-01

    In electroencephalographic (EEG) source imaging as well as in transcranial current stimulation (TCS), it is common to model the head using either three-shell boundary element (BEM) or more accurate finite element (FEM) volume conductor models. Since building FEMs is computationally demanding and labor intensive, they are often extensively reused as templates even for subjects with mismatching anatomies. BEMs can in principle be used to efficiently build individual volume conductor models; however, the limiting factor for such individualization is the high acquisition cost of structural magnetic resonance images. Here, we build a highly detailed (0.5 mm³ resolution, 6 tissue type segmentation, 231 electrodes) FEM based on the ICBM152 template, a nonlinear average of 152 adult human heads, which we call ICBM-NY. We show that, through more realistic electrical modeling, our model is as accurate as individual BEMs. Moreover, through using an unbiased population average, our model is also more accurate than FEMs built from mismatching individual anatomies. Our model is made available in Matlab format.

  8. OFF, Open source Finite volume Fluid dynamics code: A free, high-order solver based on parallel, modular, object-oriented Fortran API

    Science.gov (United States)

    Zaghi, S.

    2014-07-01

    OFF, an open source (free software) code for performing fluid dynamics simulations, is presented. The aim of OFF is to solve, numerically, the unsteady (and steady) compressible Navier-Stokes equations of fluid dynamics by means of finite volume techniques: the research background is mainly focused on high-order (WENO) schemes for multi-fluid, multi-phase flows over complex geometries. To this purpose a highly modular, object-oriented application program interface (API) has been developed. In particular, the concepts of data encapsulation and inheritance available within the Fortran language (from standard 2003) have been exploited in order to represent each fluid dynamics "entity" (e.g. the conservative variables of a finite volume, its geometry, etc.) by a single object, so that a large variety of computational libraries can be easily (and efficiently) developed upon these objects. The main features of OFF can be summarized as follows. Programming language: OFF is written in standard (compliant) Fortran 2003; its design is highly modular in order to enhance simplicity of use and maintenance without compromising efficiency. Parallel frameworks supported: the development of OFF has also been targeted at maximizing computational efficiency; the code is designed to run on shared-memory multi-core workstations and on distributed-memory clusters of shared-memory nodes (supercomputers); the code's parallelization is based on the Open Multiprocessing (OpenMP) and Message Passing Interface (MPI) paradigms. Usability, maintenance and enhancement: to improve the usability, maintenance and enhancement of the code, the documentation has also been carefully taken into account; the documentation is built upon comprehensive comments placed directly in the source files (no external documentation files are needed); these comments are parsed by the doxygen free software, producing high-quality html and latex documentation pages; the distributed versioning system referred to as git

  9. Long-range and wide field of view optical coherence tomography for in vivo 3D imaging of large volume object based on akinetic programmable swept source.

    Science.gov (United States)

    Song, Shaozhen; Xu, Jingjiang; Wang, Ruikang K

    2016-11-01

    Current optical coherence tomography (OCT) imaging suffers from short ranging distance and narrow imaging field of view (FOV). There is growing interest in searching for solutions to these limitations in order to further expand in vivo OCT applications. This paper describes a solution in which we utilize an akinetic swept source for OCT implementation to enable ~10 cm ranging distance, combined with the use of a wide-angle camera lens in the sample arm to provide a FOV of ~20 × 20 cm². The akinetic swept source operates at 1300 nm central wavelength with a bandwidth of 100 nm. We propose an adaptive calibration procedure for the programmable akinetic light source so that the sensitivity of the OCT system over the ~10 cm ranging distance is substantially improved for imaging of large volume samples. We demonstrate the proposed swept source OCT system for in vivo imaging of entire human hands and faces with an unprecedented FOV (up to 400 cm²). The capability of large-volume OCT imaging with ultra-long ranging and ultra-wide FOV is expected to bring new opportunities for in vivo biomedical applications.

  10. Transportation Cluster Volume 3 [Small Power Sources].

    Science.gov (United States)

    Pennsylvania State Dept. of Justice, Harrisburg. Bureau of Correction.

    The document is one of seven volumes of instructional materials developed around a cluster of Transportation Industries. Primarily technical in focus, they are designed to be used in a cluster-concept program and to integrate with a regular General Education Development (G.E.D.) program so that students may attain an employable skill level and a…

  11. A wavelength tunable photon source with sealed inner volume

    DEFF Research Database (Denmark)

    2014-01-01

    There is presented a method of providing a wavelength tunable photon source (200), comprising bonding a first element (101) with a first mirror (106), a second element (102) with a second mirror (108) and a third element (103) with a photon emitter together in a structure enclosing an inner volume (214) being a sealed volume, and forming a bonding interface (212) which is gas-tight, so that the first mirror (106) is placed in the inner volume (214) and may move within the inner volume (214). The method provides a relatively simple way of obtaining a tunable photon source...

  12. Ecr Driven Multicusp Volume h- Source

    Science.gov (United States)

    Bacal, M.; Ivanov, A. A., Jr.; Rouille, C.; Bechu, S.; Pelletier, J.

    2004-11-01

    The plasma of the source is created by seven elementary multi-dipolar ECR sources (2.45 GHz). Each of these sources consists of a permanent magnet mounted at the extremity of a coaxial feedthrough. This plasma is confined by a cylindrical multicusp configuration, which traps the hot electrons at the outer boundary of the system, thus favoring the creation of the optimum plasma conditions necessary for H^- generation (in particular, an electron temperature of the order of 0.6-1.0 eV). We studied the density and temperature of the plasma and of the hydrogen negative ions in the center of the source, as well as the extracted currents, in the pressure range 1 to 4 mTorr, with a microwave power of 1 kW. The negative ion density measured at 4.5 to 9.5 cm from the ECR sources increases with pressure, while the extracted current (measured at 19.5 to 24.5 cm from the sources) attains a maximum at 1.5 mTorr. Moving the multi-dipolar sources closer to the extractor increases the extracted negative ion current. Tantalum evaporation also increases the extracted negative ion current. The effect of a collar in front of the plasma electrode will also be described. The support of EEC (Contract HPRI-CT-2001-50021) is gratefully acknowledged.

  13. Dictionary Based Segmentation in Volumes

    DEFF Research Database (Denmark)

    Emerson, Monica Jane; Jespersen, Kristine Munk; Jørgensen, Peter Stanley

    2015-01-01

    We present a method for supervised volumetric segmentation based on a dictionary of small cubes composed of pairs of intensity and label cubes. Intensity cubes are small image volumes where each voxel contains an image intensity. Label cubes are volumes with voxelwise probabilities for a given label. The segmentation process is done by matching a cube from the volume, of the same size as the dictionary intensity cubes, to the most similar intensity dictionary cube, and from the associated label cube we get voxel-wise label probabilities. Probabilities from overlapping cubes are averaged, and hereby we obtain a robust label probability encoding. The dictionary is computed from labeled volumetric image data based on weighted clustering. We experimentally demonstrate our method using two data sets from material science - a phantom data set of a solid oxide fuel cell simulation for detecting...
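    The matching-and-averaging step described above can be sketched in a few lines. This is a toy illustration with random stand-in data; in the actual method the dictionary atoms come from weighted clustering of labelled training volumes.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy dictionary: paired intensity atoms and label-probability atoms (flattened cubes)
    cube = 3                                    # cube side length (voxels)
    n_atoms = 50
    intensity_atoms = rng.random((n_atoms, cube**3))
    label_atoms = rng.random((n_atoms, cube**3))    # P(label) per voxel of each atom

    def segment(volume):
        """Voxel-wise label probabilities: match each overlapping cube to its
        nearest intensity atom, accumulate the paired label cube, and average
        the probabilities where cubes overlap."""
        prob = np.zeros(volume.shape)
        count = np.zeros(volume.shape)
        d = volume.shape[0]                     # assumes a cubic volume for brevity
        for i in range(d - cube + 1):
            for j in range(d - cube + 1):
                for k in range(d - cube + 1):
                    patch = volume[i:i+cube, j:j+cube, k:k+cube].ravel()
                    best = np.argmin(((intensity_atoms - patch)**2).sum(axis=1))
                    prob[i:i+cube, j:j+cube, k:k+cube] += label_atoms[best].reshape(cube, cube, cube)
                    count[i:i+cube, j:j+cube, k:k+cube] += 1
        return prob / count

    probabilities = segment(rng.random((8, 8, 8)))
    ```

    The averaging over overlapping cubes is what gives the robust probability encoding the abstract mentions; a single non-overlapping tiling would be much noisier.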

  14. Octree-based Volume Sculpting

    DEFF Research Database (Denmark)

    Bærentzen, Jakob Andreas

    1998-01-01

    A volume sculpting system is presented. The system provides tools for interactive editing of a voxel raster that is stored in an octree data structure. Two different modes of sculpting are supported: sculpting by adding and subtracting solids, and sculpting with tools that are based on a spray can metaphor. The possibility of extending the method to support multiresolution sculpting is discussed.

  15. Tank waste source term inventory validation. Volume 1. Letter report

    Energy Technology Data Exchange (ETDEWEB)

    Brevick, C.H.; Gaddis, L.A.; Johnson, E.D.

    1995-04-28

    The sample data for a selection of 11 radionuclides and 24 chemical analytes were extracted from six separate sample data sets, arranged in a tabular format, and plotted on scatter plots for all of the 149 single-shell tanks, the 24 double-shell tanks and the four aging waste tanks. The solid and liquid sample data were placed in separate tables and plots. The sample data and plots were compiled from the following data sets: characterization raw sample data, recent core samples, the D. Braun database, the Wastren (Van Vleet) database, and the TRAC and HTCE inventories. This document is Volume I of the Letter Report entitled Tank Waste Source Term Inventory Validation.

  16. Dry lake beds as sources of dust in Australia during the Late Quaternary: A volumetric approach based on lake bed and deflated dune volumes

    Science.gov (United States)

    Farebrother, Will; Hesse, Paul P.; Chang, Hsing-Chung; Jones, Claudia

    2017-04-01

    Dust affects Earth's climate, ecology and economies across a broad range of scales, both temporally and spatially, and is an integral part of the earth's climate system. Previous studies have highlighted the importance of inland lake beds to dust emissions both locally and globally. This study aims to explore the relative volumetric importance of ephemeral lakes that emit dust to the Australian southeastern dust path over the last glacial cycle. SRTM DEMs and GIS analyses of long-term (up to 80 ka) lake-bed deflation volumes and deposition of sand-sized sediment onto downwind source bordering dunes were used to derive estimates of transported dust mass. A strong power relationship was found between lake area and the mass of deflated lake bed sediments. Total dust masses for the largest 53 lakes in southeastern Australia were derived using the relationship between lake area and dust mass and used to determine an upper value for total dust mass deflated from lake beds in southeastern Australia. Ephemeral lake-derived dust was found to represent at most 13% of the dust derived from southeastern Australia deposited in the southern Pacific over the last 80 ka or 22% over the last 40 ka. Lake Eyre (the largest lake) has contributed at most 3% of the Australian southeast dust plume. These results imply that there are significant additional sources of dust in Australia over these timescales, such as floodplains or dunefields, and that modelling must allow for diverse climatic and geomorphic controls on dust production.
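    The quoted power relationship between lake area and deflated dust mass, M = a·A^b, is the kind of fit that can be recovered by linear least squares in log-log space. The numbers below are invented stand-ins for the SRTM-derived measurements, generated from an assumed power law with scatter.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical lake areas (km^2) and deflated dust masses (arbitrary mass units),
    # generated from M = a * A**b with a = 0.02, b = 1.3 plus lognormal scatter.
    area = np.array([12.0, 35.0, 80.0, 150.0, 400.0, 900.0, 9700.0])
    mass = 0.02 * area**1.3 * np.exp(rng.normal(0.0, 0.1, area.size))

    # Power-law fit: log M = b * log A + log a, solved by ordinary least squares
    b, log_a = np.polyfit(np.log(area), np.log(mass), 1)
    a = np.exp(log_a)
    total_mass = (a * area**b).sum()   # modelled total deflated mass for these lakes
    ```

    Summing the fitted relationship over the catalogue of lakes, as in the last line, is how a relationship calibrated on a subset can be extrapolated to an upper bound on the total lake-bed dust contribution.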

  17. Radiofrequency and 2.45 GHz electron cyclotron resonance H- volume production ion sources

    Science.gov (United States)

    Tarvainen, O.; Peng, S. X.

    2016-10-01

    The volume production of negative hydrogen ions (H-) in plasma ion sources is based on dissociative electron attachment (DEA) to rovibrationally excited hydrogen molecules (H2), a two-step process requiring both hot electrons, for ionization and vibrational excitation of H2, and cold electrons, for H- formation through DEA. Traditionally, H- ion sources relying on volume production have been tandem-type arc discharge sources equipped with biased filament cathodes sustaining the plasma by thermionic electron emission and with a magnetic filter separating the main discharge from the H- formation volume. The main motivation to develop ion sources based on radiofrequency (RF) or electron cyclotron resonance (ECR) plasma discharges is to eliminate the apparent limitation of the cathode lifetime. In this paper we summarize the principles of H- volume production dictating the ion source design and highlight the differences between arc discharge and RF/ECR ion sources from both physics and technology points of view. Furthermore, we introduce the state-of-the-art RF and ECR H- volume production ion sources and review the challenges and future prospects of these still-developing technologies.

  18. New concept of ECR driven multicusp volume H- source

    Science.gov (United States)

    Bacal, M.; Ivanov, A. A., Jr.; Rouillé, C.; Arnal, Y.; Béchu, S.; Pelletier, J.

    2003-10-01

    We propose a new concept for a large volume H- source. The plasma necessary to generate the negative hydrogen ions is created by several elementary multi-dipolar ECR sources. Each of these sources consists of a permanent magnet mounted at the extremity of a coaxial feedthrough. This plasma is confined in a periodic magnetic field configuration known as a multicusp configuration. It traps the hot electrons at the outer boundary of the system, thus favoring the creation of the optimum plasma conditions necessary for H- generation (in particular, an electron temperature of the order of 0.6-0.8 eV). To test this idea we installed seven ECR sources (2.45 GHz) on the top flange of the volume H- source Camembert III with its cylindrical multicusp configuration. The pressure was varied from 1 to 4 mTorr, and the total power of the microwave generator was varied between 500 W and 1 kW. We studied the density and temperature of the plasma and of the hydrogen negative ions. We obtained encouraging results confirming the generation of the optimum plasma conditions in this system. We also compared the obtained results with the previous generation scheme, using filaments. Acknowledgement: the support of EEC (Contract HPRI-CT-2001-50021) is gratefully acknowledged.

  19. Control volume based hydrocephalus research

    Science.gov (United States)

    Cohen, Benjamin; Voorhees, Abram; Wei, Timothy

    2008-11-01

    Hydrocephalus is a disease involving excess amounts of cerebrospinal fluid (CSF) in the brain. Recent research has shown correlations to the pulsatility of blood flow through the brain. However, the problem has to date proven too complex for much more than statistical analysis and understanding. This talk will highlight progress on developing a fundamental control volume approach to studying hydrocephalus. The specific goals are to select physiologically relevant control volume(s) and to develop conservation equations along with the experimental capabilities to accurately quantify terms in those equations. To this end, an in vitro phantom is used as a simplified model of the human brain. The phantom's design consists of a rigid container filled with a compressible gel. The gel has a hollow spherical cavity representing a ventricle and a cylindrical passage representing the aqueducts. A computer-controlled piston pump supplies pulsatile volume fluctuations into and out of the flow phantom. MRI is used to measure fluid velocity and volume change as functions of time. Independent pressure measurements and flow rate measurements are used to calibrate the MRI data. These data are used as a framework for future work with live patients.
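    The control-volume bookkeeping at the heart of this approach is, in its simplest form, conservation of mass for the ventricle cavity: dV/dt = Q_in − Q_out. A minimal sketch with an assumed sinusoidal net piston flow (all values invented) shows how the cavity volume responds to pulsatile forcing:

    ```python
    import numpy as np

    # Assumed phantom parameters: baseline ventricle volume (m^3),
    # piston flow amplitude (m^3/s), and pulsation frequency (Hz)
    V0, Q0, f = 5.0e-6, 1.0e-7, 1.0
    t = np.linspace(0.0, 2.0, 2001)            # two pulse cycles
    dt = t[1] - t[0]

    net_Q = Q0 * np.sin(2*np.pi*f*t)           # Q_in(t) - Q_out(t) from the piston pump
    V = V0 + np.cumsum(net_Q) * dt             # conservation of mass: dV/dt = Q_in - Q_out
    ```

    In the experiment the terms on the right-hand side come from MRI velocity data and independent flow rate measurements rather than a prescribed waveform, which is exactly what makes the control-volume formulation testable.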

  20. Online blind source separation using incremental nonnegative matrix factorization with volume constraint.

    Science.gov (United States)

    Zhou, Guoxu; Yang, Zuyuan; Xie, Shengli; Yang, Jun-Mei

    2011-04-01

    Online blind source separation (BSS) is proposed to overcome the high computational cost problem, which limits the practical applications of traditional batch BSS algorithms. However, the existing online BSS methods are mainly used to separate independent or uncorrelated sources. Recently, nonnegative matrix factorization (NMF) has shown great potential to separate correlated sources, where some constraints are often imposed to overcome the non-uniqueness of the factorization. In this paper, an incremental NMF with volume constraint is derived and utilized for solving online BSS. The volume constraint on the mixing matrix enhances the identifiability of the sources, while the incremental learning mode reduces the computational cost. The proposed method takes advantage of the natural-gradient-based multiplicative update rule, and it performs especially well in the recovery of dependent sources. Simulations in BSS for dual-energy X-ray images, online encrypted speech signals, and highly correlated face images show the validity of the proposed method.
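    For orientation, the plain batch multiplicative-update NMF that such methods build on looks as follows. The paper's actual contributions, the volume penalty on the mixing matrix and the incremental (sample-by-sample) learning mode, are deliberately omitted from this sketch; all data are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic nonnegative BSS problem: 5 observed mixtures of 3 sources
    S_true = rng.random((3, 200))              # nonnegative sources
    A_true = rng.random((5, 3))                # nonnegative mixing matrix
    X = A_true @ S_true                        # observations

    # Standard Lee-Seung multiplicative updates for min ||X - A S||_F^2
    A = rng.random((5, 3)) + 0.1
    S = rng.random((3, 200)) + 0.1
    eps = 1e-9                                 # guards against division by zero
    for _ in range(500):
        S *= (A.T @ X) / (A.T @ A @ S + eps)
        A *= (X @ S.T) / (A @ S @ S.T + eps)

    residual = np.linalg.norm(X - A @ S) / np.linalg.norm(X)
    ```

    The multiplicative form keeps A and S nonnegative automatically; the volume constraint would add a penalty term on A to these updates to make the factorization identifiable for correlated sources.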

  1. Source fields reconstruction with 3D mapping by means of the virtual acoustic volume concept

    Science.gov (United States)

    Forget, S.; Totaro, N.; Guyader, J. L.; Schaeffer, M.

    2016-10-01

    This paper presents the theoretical framework of the virtual acoustic volume concept and two related inverse Patch Transfer Functions (iPTF) identification methods (called u-iPTF and m-iPTF depending on the chosen boundary conditions for the virtual volume). They are based on the application of Green's identity to an arbitrary closed virtual volume defined around the source. The reconstruction of sound source fields combines discrete acoustic measurements performed at accessible positions around the source with the modal behavior of the chosen virtual acoustic volume. The mode shapes of the virtual volume can be computed by a finite element solver to handle the geometrical complexity of the source. As a result, it is possible to identify all the acoustic source fields at the real surface of an irregularly shaped structure, irrespective of its acoustic environment. The m-iPTF method is introduced for the first time in this paper. In contrast to the previously published u-iPTF method, the m-iPTF method needs only acoustic pressure measurements and avoids particle velocity measurements. This paper focuses on its validation, both with numerical computations and by experiments on a baffled oil pan.

  2. Geometric Deformations Based on 3D Volume Morphing

    Institute of Scientific and Technical Information of China (English)

    JIN Xiaogang; WAN Huagen; PENG Qunsheng

    2001-01-01

    This paper presents a new geometric deformation method based on 3D volume morphing, using a new concept called the directional polar coordinate. The user specifies the source control object and the destination control object, which act as the embedded spaces. The source and the destination control objects determine a 3D volume morphing which maps the space enclosed in the source control object to that of the destination control object. By embedding the object to be deformed into the source control object, the 3D volume morphing determines the deformed object automatically, without the tedious moving of control points. Experiments show that this deformation model is efficient and intuitive, and that it can achieve some deformation effects which are difficult to achieve with traditional methods.

  3. Dictionary Based Segmentation in Volumes

    DEFF Research Database (Denmark)

    Emerson, Monica Jane; Jespersen, Kristine Munk; Jørgensen, Peter Stanley

    Method for supervised segmentation of volumetric data. The method is trained from manual annotations, and these annotations make the method very flexible, which we demonstrate in our experiments. Our method infers label information locally by matching the pattern in a neighborhood around a voxel to a dictionary, and hereby accounts for the volume texture.

  4. Optimization-based mesh correction with volume and convexity constraints

    Science.gov (United States)

    D'Elia, Marta; Ridzal, Denis; Peterson, Kara J.; Bochev, Pavel; Shashkov, Mikhail

    2016-05-01

    We consider the problem of finding a mesh such that 1) it is the closest, with respect to a suitable metric, to a given source mesh having the same connectivity, and 2) the volumes of its cells match a set of prescribed positive values that are not necessarily equal to the cell volumes in the source mesh. This volume correction problem arises in important simulation contexts, such as satisfying a discrete geometric conservation law and solving transport equations by incremental remapping or similar semi-Lagrangian transport schemes. In this paper we formulate volume correction as a constrained optimization problem in which the distance to the source mesh defines an optimization objective, while the prescribed cell volumes, mesh validity and/or cell convexity specify the constraints. We solve this problem numerically using a sequential quadratic programming (SQP) method whose performance scales with the mesh size. To achieve scalable performance we develop a specialized multigrid-based preconditioner for optimality systems that arise in the application of the SQP method to the volume correction problem. Numerical examples illustrate the importance of volume correction, and showcase the accuracy, robustness and scalability of our approach.
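    A one-dimensional analogue of the volume-correction problem can be posed directly to an off-the-shelf SQP solver, here SciPy's SLSQP, as a small-scale stand-in for the paper's multigrid-preconditioned solver. Cells are intervals and their "volumes" are interval lengths; all numbers are illustrative.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Source mesh: 10 equal cells on [0, 1]; prescribed volumes differ from the
    # source cell volumes but still sum to the same total.
    x_src = np.linspace(0.0, 1.0, 11)
    target = np.full(10, 0.1)
    target[:5] += 0.01
    target[5:] -= 0.01                         # prescribed cell volumes (sum unchanged)

    def objective(x):
        """Squared l2 distance to the source mesh -- the quantity to minimize."""
        return ((x - x_src) ** 2).sum()

    # One equality constraint per cell: its length must equal the target volume.
    constraints = [{"type": "eq", "fun": lambda x, i=i: x[i + 1] - x[i] - target[i]}
                   for i in range(10)]

    res = minimize(objective, x_src, method="SLSQP", constraints=constraints)
    x_new = res.x                              # corrected mesh with prescribed volumes
    ```

    The equality constraints leave one degree of freedom (a rigid translation), and the objective resolves it by keeping the corrected mesh as close as possible to the source mesh, which mirrors the paper's formulation in miniature.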

  5. Aperture and Receiver Technology. Delivery Order 0002: Bandwidth Invariant Spatial Processing. Volume 2. Digital Signal Processor (DSP) Based Implementation of Direction of Arrival (DOA) for Wideband Sources

    Science.gov (United States)

    2007-05-01

    ...ARM) family of general-purpose 32-bit microprocessors. The ARM architecture is based on Reduced Instruction Set Computer (RISC) principles. It is ... Memories: mAgic SSRAM, ARM FLASH and ARM SRAM • Stereo Audio CODECs (4 in + 4 out) • Serial I/O: 1 USB 2.0 Full (12 Mbps), 2 RS232/LVTTL

  6. Faceted Taxonomy-Based Sources

    Science.gov (United States)

    Tzitzikas, Yannis

    The objective of this chapter is to explain the underlying mathematical structure of faceted taxonomy-based sources and to provide some common notions and notations that are used in some parts of the book. Subsequently, and on the basis of the introduced formalism, this chapter describes the interaction between a user and an information source that supports dynamic taxonomies and faceted search.
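The interaction the chapter describes (restricting the object set by a facet term, then recomputing which terms remain selectable) can be illustrated with a minimal sketch. The objects, facets, and helper names below are hypothetical, not the chapter's formalism:

```python
# Hypothetical mini-source: objects indexed by one term per facet.
objects = {
    "doc1": {"Location": "Crete",  "Topic": "Hotels"},
    "doc2": {"Location": "Crete",  "Topic": "Museums"},
    "doc3": {"Location": "Athens", "Topic": "Hotels"},
}

def zoom_in(objs, facet, term):
    """Restrict the extension to objects carrying `term` in `facet`."""
    return {o: d for o, d in objs.items() if d.get(facet) == term}

def remaining_terms(objs, facet):
    """Terms of `facet` with a non-empty extension under the current focus
    (the 'dynamic taxonomy' behavior: empty choices disappear)."""
    return sorted({d[facet] for d in objs.values()})

focus = zoom_in(objects, "Location", "Crete")
print(sorted(focus))                    # ['doc1', 'doc2']
print(remaining_terms(focus, "Topic"))  # ['Hotels', 'Museums']
```

Each user selection narrows the focus, and the remaining terms of every other facet are recomputed against that focus, which is the essence of faceted search over a taxonomy-based source.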

  7. Volume-scalable high-brightness three-dimensional visible light source

    Science.gov (United States)

    Subramania, Ganapathi; Fischer, Arthur J; Wang, George T; Li, Qiming

    2014-02-18

    A volume-scalable, high-brightness, electrically driven visible light source comprises a three-dimensional photonic crystal (3DPC) comprising one or more direct bandgap semiconductors. The improved light emission performance of the invention is achieved based on the enhancement of radiative emission of light emitters placed inside a 3DPC due to the strong modification of the photonic density-of-states engendered by the 3DPC.

  8. Development of production methods of volume source by the resinous solution which has hardening

    CERN Document Server

    Motoki, R

    2002-01-01

    Volume sources are used as standard sources for radioactivity measurements, with a Ge semiconductor detector, of environmental samples such as water and soil that require a large volume. The commercial volume source used in measurement of water samples is made of agar-agar, and that used in measurement of soil samples is made of alumina powder. When the plastic receptacles of these two kinds of volume sources are damaged, the leaking contents cause contamination. Moreover, if the hermetic seal of a volume source made of agar-agar fails, the volume decrease due to evaporation of moisture introduces an error into the radioactivity measurement. Therefore, we developed two types of methods using unsaturated polyester resin, vinylester resin, their hardening agents, and acrylic resin. The first type disperses the hydrochloric acid solution containing the radioisotopes uniformly in each resin and hardens the resin. The second disperses the alumina powder that has absorbed the radioisotopes in each resin an...

  9. Assessment of control technology for stationary sources. Volume II: control technology data tables. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Minicucci, D.; Herther, M.; Babb, L.; Kuby, W.

    1980-02-01

    This report, the Control Technology Data Tables, is the second volume of the three-volume final report for the contract. It presents in tabular format, qualitative descriptions of control options for the various sources and quantitative information on control technology cost, efficiency, reliability, energy consumption, other environmental impacts and application status. Also included is a code list which classifies the stationary sources examined by industry, process, and emission source.

  10. Beam formation in CERN's cesiated surfaces and volume H- ion sources

    Science.gov (United States)

    Mochalskyy, Serhiy; Lettry, Jacques; Minea, Tiberiu

    2016-08-01

    At CERN, a high performance negative ion (NI) source is required for the 160 MeV H- linear accelerator named Linac4. The source should deliver 80 mA H- ion beams within an emittance of 0.25 mm·mrad. For this purpose two ion sources were developed: IS01 is based on the NI volume production and IS02 provides additional NI by surface production via H interaction on a cesiated Molybdenum plasma electrode. The development of negative ion sources for Linac4 is accompanied by modelling activities. ONIX code has been modified and adapted to investigate the transport of NI and electrons in the extraction region of the CERN negative ion sources. The simulated results from modeling of IS01 and IS02 extraction regions, which were obtained in 2012 during source commissioning, are presented and benchmarked with experimental measurements obtained after 2013. The formation of the plasma meniscus and the screening of the extraction field by the source plasma are discussed. The NI production is compared between two types of sources, the first one based on volume production only and the second one encompassing NI cesiated surface production. For the IS02 source, different states of conditioning were simulated by changing the NI emission flux from the plasma electrode and Cs+ density in the bulk plasma region. The numerical results show that in low work function regime, with high NI surface emission rate of 3000 A m-2 and Cs-density of nCs+ = 3.8 × 1016 m-3, the total extracted NI current could reach ~80 mA. At the less favorable Cs-coverage, when the surface NI emission rate becomes significantly lower, namely 300 A m-2 with nCs+ = 3.3 × 1015 m-3, the total extracted NI current only reaches ~20 mA. A good agreement between simulation and experimental results is observed in terms of extracted NI current for both extraction systems, including the case of reversed extraction potential that corresponds to positive (H+) ion extraction.

  11. Advanced Photon Source research: Volume 1, Number 1, April 1998

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-04-01

    The following articles are included in this publication: (1) The Advanced Photon Source: A Brief Overview; (2) MAD Analysis of FHIT at the Structural Biology Center; (3) Advances in High-Energy-Resolution X-ray Scattering at Beamline 3-ID; (4) X-ray Imaging and Microspectroscopy of the Mycorrhizal Fungus-Plant Symbiosis; (5) Measurement and Control of Particle-beam Trajectories in the Advanced Photon Storage Ring; (6) Beam Acceleration and Storage at the Advanced Photon Source; and (7) Experimental Facilities Operations and Current Status.

  12. The 5-Year Outlook on Science and Technology 1981. Source Materials Volume 2.

    Science.gov (United States)

    National Science Foundation, Washington, DC.

    This is the second of two volumes of source documents commissioned by the National Science Foundation in preparing the second 5-Year Outlook on Science and Technology for transmission to the Congress. This volume consists of the views of individuals selected by the Committee on Science, Engineering and Public Policy of the American Association for…

  13. ECR-Driven Multicusp Volume H- Ion Source

    Science.gov (United States)

    Bacal, M.; Ivanov, A. A.; Rouillé, C.; Svarnas, P.; Béchu, S.; Pelletier, J.

    2005-04-01

    We studied the negative ion current extracted from the plasma created by seven elementary ECR sources, operating at 2.45 GHz, placed in the magnetic multipole chamber "Camembert III". We varied the pressure from 1 to 4 mTorr, with a maximum power of 1 kW and studied the plasma created in this system by measuring the various plasma parameters, including the density and temperature of the negative hydrogen ions. We found that the electron temperature is optimal for negative hydrogen ion production at 9.5 cm from the ECR sources. The tantalum-covered wall surface pollution reduces the extracted negative ion current and enhances the electron current. Tantalum evaporation has a positive effect. The use of a grid and of a collar in front of the plasma electrode did not lead to any enhancement of the extracted negative ion current.

  14. Volume of Home and Community Based Services and...

    Data.gov (United States)

    U.S. Department of Health & Human Services — Volume of Home- and Community-Based Services and Time to Nursing-Home Placement The purpose of this study was to determine whether the volume of Home and Community...

  15. Novel designs for undulator based positron sources

    OpenAIRE

    Jenkins, Mike; Bailey, Ian

    2015-01-01

    Proposed high energy electron-positron linear colliders require a high flux of positrons. To achieve this, a number of new positron source designs have been proposed. One of these is an undulator-based positron source, which is the baseline positron source design for the International Linear Collider. The undulator-based positron source for the International Linear Collider uses a helical undulator to produce an intense photon beam that generates positrons through the pair-production mechanism. As...

  16. Open-source software for demand forecasting of clinical laboratory test volumes using time-series analysis

    Directory of Open Access Journals (Sweden)

    Emad A Mohammed

    2017-01-01

    Background: Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. Method: In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. Results: This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. Conclusion: This tool will allow anyone with historic test volume data to model future demand.

  17. Open-source Software for Demand Forecasting of Clinical Laboratory Test Volumes Using Time-series Analysis.

    Science.gov (United States)

    Mohammed, Emad A; Naugler, Christopher

    2017-01-01

    Demand forecasting is the area of predictive analytics devoted to predicting future volumes of services or consumables. Fair understanding and estimation of how demand will vary facilitates the optimal utilization of resources. In a medical laboratory, accurate forecasting of future demand, that is, test volumes, can increase efficiency and facilitate long-term laboratory planning. Importantly, in an era of utilization management initiatives, accurately predicted volumes compared to the realized test volumes can form a precise way to evaluate utilization management initiatives. Laboratory test volumes are often highly amenable to forecasting by time-series models; however, the statistical software needed to do this is generally either expensive or highly technical. In this paper, we describe an open-source web-based software tool for time-series forecasting and explain how to use it as a demand forecasting tool in clinical laboratories to estimate test volumes. This tool has three different models, that is, Holt-Winters multiplicative, Holt-Winters additive, and simple linear regression. Moreover, these models are ranked and the best one is highlighted. This tool will allow anyone with historic test volume data to model future demand.
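As a sketch of one of the tool's three models, additive Holt-Winters can be implemented in a few lines. The smoothing constants and the synthetic test-volume series below are illustrative choices, not values from the paper:

```python
def holt_winters_additive(y, m, alpha=0.3, beta=0.1, gamma=0.2, horizon=4):
    """Additive Holt-Winters forecast: y is the historic series, m the season length."""
    level = sum(y[:m]) / m                              # mean of first season
    trend = (sum(y[m:2 * m]) - sum(y[:m])) / (m * m)    # season-to-season slope
    season = [y[i] - level for i in range(m)]           # initial seasonal offsets
    for t in range(m, len(y)):
        last_level = level
        level = alpha * (y[t] - season[t % m]) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        season[t % m] = gamma * (y[t] - level) + (1 - gamma) * season[t % m]
    # Extrapolate level + trend and reuse the last fitted seasonal offsets.
    return [level + (h + 1) * trend + season[(len(y) + h) % m]
            for h in range(horizon)]

# Synthetic monthly test volumes: upward trend plus a quarterly cycle.
history = [100 + 2 * t + [10, -5, 0, -5][t % 4] for t in range(24)]
forecast = holt_winters_additive(history, m=4)
print([round(f, 1) for f in forecast])
```

Comparing such forecasts against realized volumes is exactly the utilization-management evaluation the abstract describes: a sustained gap between predicted and observed demand flags a change worth investigating.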

  18. A Repetitive Nanosecond Pulse Source for Generation of Large Volume Streamer Discharge

    Institute of Scientific and Technical Information of China (English)

    TAO Fengbo; ZHANG Qiaogen; GAO Bo; WANG Hu; LI Zhou

    2008-01-01

    Using a unipolar pulse with a rise time and pulse duration on the order of microseconds as the primary pulse, a nanosecond pulse with a repetition frequency of several kilohertz is generated by a spark gap switch. By varying both the inter-pulse duration and the pulse frequency, the voltage recovery rate of the spark gap switch is investigated under different working conditions such as the gas pressure, the gas composition, and the bias voltage. The results reveal that either an increase in gas pressure or the addition of SF6 to the air can increase the voltage recovery rate. The effect of gas composition on the voltage recovery rate is discussed based on the transfer and distribution of the residual space charges. The repetitive nanosecond pulse source is also applied to the generation of large volume streamer discharge, and the discharge currents are measured to investigate the effect of pulse repetition rate on the large volume streamer discharge.

  19. Generalized source Finite Volume Method for radiative transfer equation in participating media

    Science.gov (United States)

    Zhang, Biao; Xu, Chuan-Long; Wang, Shi-Min

    2017-03-01

    Temperature monitoring is very important in a combustion system. In recent years, non-intrusive temperature reconstruction has been explored intensively on the basis of calculating arbitrary directional radiative intensities. In this paper, a new method named the Generalized Source Finite Volume Method (GSFVM) was proposed. It was based on the radiative transfer equation and the Finite Volume Method (FVM). This method can be used to calculate arbitrary directional radiative intensities and is proven to be accurate and efficient. To verify the performance of this method, six test cases of 1D, 2D, and 3D radiative transfer problems were investigated. The numerical results show that the efficiency of this method is close to that of the radial basis function interpolation method, but its accuracy and stability are higher than those of the interpolation method. The accuracy of the GSFVM is similar to that of the Backward Monte Carlo (BMC) algorithm, while the time required by the GSFVM is much shorter than that of the BMC algorithm. Therefore, the GSFVM can be used in temperature reconstruction and to improve the accuracy of the FVM.
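The finite-volume treatment of the radiative transfer equation can be illustrated on a single ray with constant properties, where the 1D equation dI/ds = κ(I_b − I) has a known exact solution. This is a generic FVM sketch with made-up constants, not the paper's generalized-source scheme:

```python
import math

# 1D radiative transfer along a ray: dI/ds = kappa * (I_b - I).
# Upwind finite-volume march over N cells; kappa and I_b are made-up constants.
kappa, I_b, I_in = 2.0, 1.0, 0.2   # absorption coeff., blackbody intensity, inlet
L, N = 1.0, 400
ds = L / N

I = I_in
for _ in range(N):
    # Implicit (unconditionally stable) per-cell update:
    # (I_new - I_old) / ds = kappa * (I_b - I_new)
    I = (I + ds * kappa * I_b) / (1.0 + ds * kappa)

# Exact solution for constant kappa and I_b.
exact = I_b + (I_in - I_b) * math.exp(-kappa * L)
print(round(I, 4), round(exact, 4))
```

The intensity relaxes from the inlet value toward the local blackbody intensity at rate κ; sweeping many such rays over all directions is what yields the arbitrary directional intensities used for temperature reconstruction.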

  20. Vector velocity volume flow estimation: Sources of error and corrections applied for arteriovenous fistulas

    DEFF Research Database (Denmark)

    Jensen, Jonas; Olesen, Jacob Bjerring; Stuart, Matthias Bo

    2016-01-01

    A method for vector velocity volume flow estimation is presented, along with an investigation of its sources of error and correction of actual volume flow measurements. Volume flow errors are quantified theoretically by numerical modeling, through flow phantom measurements, and studied in vivo. This paper investigates errors from estimating volumetric flow using a commercial ultrasound scanner and the common assumptions made in the literature. The theoretical model shows, e.g. that volume flow is underestimated by 15% when the scan plane is off-axis with the vessel center by 28% of the vessel radius. Estimating volume flow with an elliptical, rather than circular, vessel area and correcting the ultrasound beam for being off-axis gave a significant (p = 0.008) reduction in error from 31.2% to 24.3%. The error is relative to the Ultrasound Dilution Technique, which is considered the gold standard for volume flow estimation for dialysis patients.

  1. Control volume based hydrocephalus research; analysis of human data

    Science.gov (United States)

    Cohen, Benjamin; Wei, Timothy; Voorhees, Abram; Madsen, Joseph; Anor, Tomer

    2010-11-01

    Hydrocephalus is a neuropathophysiological disorder primarily diagnosed by increased cerebrospinal fluid volume and pressure within the brain. To date, utilization of clinical measurements has been limited to understanding the relative amplitude and timing of flow, volume and pressure waveforms; these are qualitative approaches without a clear framework for meaningful quantitative comparison. Pressure-volume models and electric circuit analogs enforce volume conservation principles in terms of pressure. Control volume analysis, through the integral mass and momentum conservation equations, ensures that pressure and volume are accounted for using first-principles fluid physics. This approach is able to directly incorporate the diverse measurements obtained by clinicians into a simple, direct and robust mechanics-based framework. Clinical data obtained for analysis are discussed along with the data processing techniques used to extract terms in the conservation equation. Control volume analysis provides a non-invasive, physics-based approach to extracting pressure information from magnetic resonance velocity data that cannot be measured directly by pressure instrumentation.
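The control-volume balance invoked above can be written out. In its standard textbook form (not quoted from the paper), the integral mass-conservation statement for a control volume CV bounded by control surface CS is:

```latex
\underbrace{\frac{d}{dt}\int_{CV}\rho\,dV}_{\text{storage}}
\;+\;
\underbrace{\int_{CS}\rho\,(\mathbf{u}\cdot\hat{\mathbf{n}})\,dA}_{\text{net outflow}}
\;=\;0
```

For incompressible CSF (constant density) this reduces to \(dV/dt = Q_{\mathrm{in}} - Q_{\mathrm{out}}\), which is what allows MR velocity waveforms measured on the control surface to be integrated into volume change and, through the companion momentum equation, into pressure information.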

  2. Assessment of control technology for stationary sources. Volume I: technical discussion. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Minicucci, D.; Herther, M.; Babb, L.; Kuby, W.

    1980-02-01

    The purpose of this project was to develop a reference document for use by the Air Resources Board, local air pollution control districts, and the U.S. Environmental Protection Agency that describes technological options available for the control of emissions from stationary sources located in California. Control technologies were examined for 10 industry groups and six air pollutants. Volume I, Technical Discussion, includes an overall introduction to the project, descriptions of its major elements, background information for each industry group addressed, and the project bibliography. In Volume II, Control Technology Data Tables, qualitative descriptions of control options for the various sources and quantitative information on control technology cost, efficiency, reliability, energy consumption, other environmental impacts, and application status are presented in tabular format. Also included is a code list that classifies the stationary sources examined by industry, process and emission source.

  3. Light-assisted drying (LAD) of small volume biologics: a comparison of two IR light sources

    Science.gov (United States)

    Young, Madison A.; Van Vorst, Matthew; Elliott, Gloria D.; Trammell, Susan R.

    2016-03-01

    Protein therapeutics have been developed to treat diseases ranging from arthritis and psoriasis to cancer. A challenge in the development of protein-based drugs is maintaining the protein in the folded state during processing and storage. We are developing a novel processing method, light-assisted drying (LAD), to dehydrate proteins suspended in a sugar (trehalose) solution for storage at supra-zero temperatures. Our technique selectively heats the water in small volume samples using near-IR light to speed dehydration, which prevents sugar crystallization that can damage embedded proteins. In this study, we compare the end moisture content (EMC) as a function of processing time of samples dried with two different light sources, Nd:YAG (1064 nm) and Thulium fiber (1850 nm) lasers. EMC is the ratio of water to dry weight in a sample, and the lower the EMC the higher the possible storage temperature. LAD with the 1064 and 1850 nm lasers yielded 78% and 65% lower EMC, respectively, than standard air-drying. After 40 minutes of LAD with 1064 and 1850 nm sources, EMCs of 0.27 ± 0.27 and 0.15 ± 0.05 g H2O/g dry weight were reached, which are near the desired value of 0.10 g H2O/g dry weight that enables storage in a glassy state without refrigeration. LAD is a promising new technique for the preparation of biologics for anhydrous preservation.

  4. Accuracy and Sources of Error for an Angle Independent Volume Flow Estimator

    DEFF Research Database (Denmark)

    Jensen, Jonas; Olesen, Jacob Bjerring; Hansen, Peter Møller

    2014-01-01

    This paper investigates sources of error for a vector velocity volume flow estimator. Quantification of the estimator's accuracy is performed theoretically and investigated in vivo. Womersley's model for pulsatile flow is used to simulate velocity profiles and calculate volume flow errors in ... % underestimated volume flow according to the simulation. Volume flow estimates were corrected for the beam being off-axis, but this was not able to significantly decrease the error relative to measurements with the reference method. A BK Medical UltraView 800 ultrasound scanner with a 9 MHz linear array transducer is used to obtain Vector Flow Imaging sequences of a superficial part of the fistulas. Cross-sectional diameters of each fistula are measured on B-mode images by rotating the scan plane 90 degrees. The major axis...

  5. Salmonella source attribution based on microbial subtyping

    DEFF Research Database (Denmark)

    Barco, Lisa; Barrucci, Federica; Olsen, John Elmerdahl

    2013-01-01

    Source attribution of cases of food-borne disease represents a valuable tool for identifying and prioritizing effective food-safety interventions. Microbial subtyping is one of the most common methods to infer potential sources of human food-borne infections. So far, Salmonella microbial subtyping source attribution models have been implemented by using serotyping and phage-typing data. Molecular-based methods may prove to be similarly valuable in the future, as already demonstrated for other food-borne pathogens like Campylobacter. This review assesses the state of the art concerning Salmonella ... in the context of their potential applicability for Salmonella source attribution studies.

  6. Studies on plasma production in a large volume system using multiple compact ECR plasma sources

    Science.gov (United States)

    Tarey, R. D.; Ganguli, A.; Sahu, D.; Narayanan, R.; Arora, N.

    2017-01-01

    This paper presents a scheme for large volume plasma production using multiple highly portable compact ECR plasma sources (CEPS) (Ganguli et al 2016 Plasma Source Sci. Technol. 25 025026). The large volume plasma system (LVPS) described in the paper is a scalable, cylindrical vessel of diameter  ≈1 m, consisting of source and spacer sections with multiple CEPS mounted symmetrically on the periphery of the source sections. Scaling is achieved by altering the number of source sections/the number of sources in a source section or changing the number of spacer sections for adjusting the spacing between the source sections. A series of plasma characterization experiments using argon gas were conducted on the LVPS under different configurations of CEPS, source and spacer sections, for an operating pressure in the range 0.5-20 mTorr, and a microwave power level in the range 400-500 W per source. Using Langmuir probes (LP), it was possible to show that the plasma density (~1  -  2  ×  1011 cm-3) remains fairly uniform inside the system and decreases marginally close to the chamber wall, and this uniformity increases with an increase in the number of sources. It was seen that a warm electron population (60-80 eV) is always present and is about 0.1% of the bulk plasma density. The mechanism of plasma production is discussed in light of the results obtained for a single CEPS (Ganguli et al 2016 Plasma Source Sci. Technol. 25 025026).

  7. Vector velocity volume flow estimation: Sources of error and corrections applied for arteriovenous fistulas.

    Science.gov (United States)

    Jensen, Jonas; Olesen, Jacob Bjerring; Stuart, Matthias Bo; Hansen, Peter Møller; Nielsen, Michael Bachmann; Jensen, Jørgen Arendt

    2016-08-01

    A method for vector velocity volume flow estimation is presented, along with an investigation of its sources of error and correction of actual volume flow measurements. Volume flow errors are quantified theoretically by numerical modeling, through flow phantom measurements, and studied in vivo. This paper investigates errors from estimating volumetric flow using a commercial ultrasound scanner and the common assumptions made in the literature. The theoretical model shows, e.g. that volume flow is underestimated by 15% when the scan plane is off-axis with the vessel center by 28% of the vessel radius. The error sources were also studied in vivo under realistic clinical conditions, and the theoretical results were applied for correcting the volume flow errors. Twenty dialysis patients with arteriovenous fistulas were scanned to obtain vector flow maps of the fistulas. When fitting an ellipse to cross-sectional scans of the fistulas, the major axis was on average 10.2 mm, which is 8.6% larger than the minor axis. The ultrasound beam was on average 1.5 mm from the vessel center, corresponding to 28% of the semi-major axis in an average fistula. Estimating volume flow with an elliptical, rather than circular, vessel area and correcting the ultrasound beam for being off-axis gave a significant (p = 0.008) reduction in error from 31.2% to 24.3%. The error is relative to the Ultrasound Dilution Technique, which is considered the gold standard for volume flow estimation for dialysis patients. The study shows the importance of correcting for volume flow errors, which are often made in clinical practice.
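The elliptical-area correction in the abstract is easy to quantify. A minimal sketch with made-up velocity and axis values (the axis ratio roughly mirrors the average fistula eccentricity reported above):

```python
import math

# Illustrative numbers, not from the paper: mean velocity and fitted semi-axes.
v_mean = 0.5             # m/s, average velocity over the lumen
a, b = 5.1e-3, 4.7e-3    # semi-major / semi-minor axes in meters

Q_ellipse = v_mean * math.pi * a * b   # flow with the fitted elliptical area
Q_circle = v_mean * math.pi * b * b    # flow assuming a circle of the minor axis

bias_pct = 100 * (Q_circle - Q_ellipse) / Q_ellipse
print(round(bias_pct, 1))
```

Since the flow scales with the cross-sectional area, assuming a circle of the minor axis biases it by the factor b/a; with these numbers the circular assumption underestimates flow by about 7.8%, the same order as the in-vivo errors the paper corrects for.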

  8. An Open-Source Based ITS Platform

    DEFF Research Database (Denmark)

    Andersen, Ove; Krogh, Benjamin Bjerre; Torp, Kristian

    2013-01-01

    In this paper, a complete platform used to compute travel times from GPS data is described. Two approaches to computing travel time are proposed: one based on points and one based on trips. Overall, both approaches give reasonable results compared to existing manually estimated travel times. However, the trip-based approach requires more GPS data, and of a higher quality, than the point-based approach. The platform has been completely implemented using open-source software. The main conclusion is that a large quantity of GPS data can be managed with a limited budget and that GPS data is a good source...

  9. Web based brain volume calculation for magnetic resonance images.

    Science.gov (United States)

    Karsch, Kevin; Grinstead, Brian; He, Qing; Duan, Ye

    2008-01-01

    Brain volume calculations are crucial in modern medical research, especially in the study of neurodevelopmental disorders. In this paper, we present an algorithm for calculating two classifications of brain volume, total brain volume (TBV) and intracranial volume (ICV). Our algorithm takes MRI data as input, performs several preprocessing and intermediate steps, and then returns each of the two calculated volumes. To simplify this process and make our algorithm publicly accessible to anyone, we have created a web-based interface that allows users to upload their own MRI data and calculate the TBV and ICV for the given data. This interface provides a simple and efficient method for calculating these two classifications of brain volume, and it also removes the need for the user to download or install any applications.
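Once the preprocessing and segmentation steps the abstract mentions have produced binary masks, the volume calculation itself reduces to voxel counting scaled by the acquisition voxel size. A minimal sketch with toy masks (the paper's actual pipeline is not reproduced here):

```python
import numpy as np

# Toy binary masks standing in for segmentation output.
voxel_dims_mm = (1.0, 1.0, 1.2)                 # acquisition voxel size in mm
icv_mask = np.zeros((10, 10, 10), dtype=bool)
icv_mask[2:8, 2:8, 2:8] = True                  # intracranial region (ICV)
tbv_mask = np.zeros_like(icv_mask)
tbv_mask[3:7, 3:7, 3:7] = True                  # brain tissue only (TBV)

voxel_volume_ml = np.prod(voxel_dims_mm) / 1000.0   # mm^3 -> mL
icv_ml = icv_mask.sum() * voxel_volume_ml
tbv_ml = tbv_mask.sum() * voxel_volume_ml
print(icv_ml, tbv_ml)   # TBV is a subset of ICV, so tbv_ml < icv_ml
```

The hard part, of course, is producing accurate masks from raw MRI; the web interface described above hides both the segmentation and this final bookkeeping from the user.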

  10. Characteristics of the negative ion beam extracted from an LBL multicusp volume source

    Energy Technology Data Exchange (ETDEWEB)

    Debiak, T.W.; Solensten, L.; Sredniawski, J.J.; Ng, Y.C.; Heuer, R. (Grumman Corporation, Bethpage, New York 11714 (US))

    1990-01-01

    This work encompasses a study of the beam position, profile, and emittance of a Lawrence Berkeley Lab (LBL) multicusp volume source. The study includes a comparison of different extraction geometries with single- and multiple-hole apertures. Our work is currently based on single-gap extraction and acceleration. These experiments are the first of a planned series of studies with various extractor geometries. The beam profile full width at half-maximum ranged from 5.7 to 10.2 mm at a position 69 mm from the emission aperture. Measurements of profile and position in the vertical direction indicate that the beam is significantly bent in the direction expected due to the field of the electron separation magnet. Phase space contour plots in the horizontal plane have been obtained for circular extraction apertures with a diameter of 1.0 and 2.0 mm, and a multiple-hole aperture with an overall diameter of 2.46 mm. Emittances were calculated to be as low as 0.0010 π cm mrad for the 1-mm aperture and 0.0014 π cm mrad for the 2-mm aperture. Emittances are not reported for the multiple-hole aperture due to the shape of the phase space contours; however, analysis of the data is in progress to provide a meaningful comparison of the single-hole and multiple-hole beam characteristics.

  11. The enhanced volume source boundary point method for the calculation of acoustic radiation problem

    Institute of Scientific and Technical Information of China (English)

    WANG Xiufeng; CHEN Xinzhao; WANG Youcheng

    2003-01-01

    The Volume Source Boundary Point Method (VSBPM) is greatly improved in order to speed up its solution of the acoustic radiation problem caused by a vibrating body. The fundamental solution provided by the Helmholtz equation is enforced in a weighted residual sense over a tetrahedron located on the normal line of the boundary node, replacing the coefficient matrices of the system equation. Enhanced volume source boundary point analysis of various examples, and of the sound field of a vibrating rectangular box in a semi-anechoic chamber, reveals that the EVSBPM is more than 10 times faster than the VSBPM while matching it in calculating precision and stability, in adaptation to the geometric shape of the vibrating body, and in its ability to overcome the non-uniqueness problem.

  12. Characterization of volume type ion source for $p$, $H_2^+$ and $H_3^+$ beams

    CERN Document Server

    Joshi, N; Meusel, O; Ratzinger, U

    2016-01-01

    Recently, there has been an increasing need for $H_{2}^+$ and $H_{3}^+$ ion sources. One example is ion therapy facilities, where $C^{4+}$ and $H_{3}^+$ ion beams along the linac are of great interest. Another example is a $H_{2}^+$ test beam for linacs finally operated with intense deuteron beams. At Frankfurt, a simple proton ion source is needed to test a new kind of beam injection into a magnetic storage ring. This article describes a volume type ion source which can deliver up to 3.05 mA beam current at 10 keV in stable dc operation. It is a hot-filament-driven ion source which can provide high fractions of $p$, $H_{2}^+$ or $H_{3}^+$, depending on the operation settings.

  13. HTGR accident initiation and progression analysis status report. Volume V. AIPA fission product source terms

    Energy Technology Data Exchange (ETDEWEB)

    Alberstein, D.; Apperson, C.E. Jr.; Hanson, D.L.; Myers, B.F.; Pfeiffer, W.W.

    1976-02-01

    The primary objective of the Accident Initiation and Progression Analysis (AIPA) Program is to provide guidance for high-temperature gas-cooled reactor (HTGR) safety research and development. Among the parameters considered in estimating the uncertainties in site boundary doses are uncertainties in fission product source terms generated under normal operating conditions, i.e., fuel body inventories, circulating coolant activity, total plateout activity in the primary circuit, and plateout distributions. The volume presented documents the analyses of these source term uncertainties. The results are used for the detailed consequence evaluations, and they provide the basis for evaluation of fission products important for HTGR maintenance and shielding.

  14. Advanced energy sources and conversion techniques. Proceedings of a seminar. Volume 1. [35 papers

    Energy Technology Data Exchange (ETDEWEB)

    None

    1958-11-01

    The Seminar was organized as a series of tutorial presentations and round table discussions on a technical level to implement the following: (a) to identify and explore present and projected needs for energy sources and conversion techniques for military applications; (b) to exchange information on current and planned efforts in these fields; (c) to examine the effect of anticipated scientific and technological advances on these efforts; and (d) to present suggested programs aimed at satisfying the military needs for energy sources and conversion techniques. Volume I contains all of the unclassified papers presented at the Seminar. (W.D.M.)

  15. Monolithic fuel cell based power source for burst power generation

    Science.gov (United States)

    Fee, D. C.; Blackburn, P. E.; Busch, D. E.; Dees, D. W.; Dusek, J.; Easler, T. E.; Ellingson, W. A.; Flandermeyer, B. K.; Fousek, R. J.; Heiberger, J. J.

    A unique fuel cell coupled with a low power nuclear reactor presents an attractive approach for SDI burst power requirements. The monolithic fuel cell looks attractive for space applications and represents a quantum jump in fuel cell technology. Such a breakthrough in design is the enabling technology for lightweight, low volume power sources for space based pulse power systems. The monolith is unique among fuel cells in being an all solid state device. The capability for miniaturization, inherent in solid state devices, gives the low volume required for space missions. In addition, the solid oxide fuel cell technology employed in the monolith has high temperature reject heat and can be operated in either closed or open cycles. Both these features are attractive for integration into a burst power system.

  16. Nanomaterial-based x-ray sources

    Science.gov (United States)

    Cole, Matthew T.; Parmee, R. J.; Milne, William I.

    2016-02-01

    Following the recent global excitement and investment in the emerging, and rapidly growing, classes of one and two-dimensional nanomaterials, we here present a perspective on one of the viable applications of such materials: field electron emission based x-ray sources. These devices, which have a notable history in medicine, security, industry and research, to date have almost exclusively incorporated thermionic electron sources. Since the middle of the last century, field emission based cathodes were demonstrated, but it is only recently that they have become practicable. We outline some of the technological achievements of the past two decades, and describe a number of the seminal contributions. We explore the foremost market hurdles hindering their roll-out and broader industrial adoption and summarise the recent progress in miniaturised, pulsed and multi-source devices.

  17. The Chandra Local Volume Survey: The X-ray Point Source Catalog of NGC 300

    CERN Document Server

    Binder, Breanna; Eracleous, Michael; Gaetz, Terrance J; Plucinsky, Paul P; Skillman, Evan D; Dalcanton, Julianne J; Anderson, Scott F; Weisz, Daniel R; Kong, Albert K H

    2012-01-01

    We present the source catalog of a new Chandra ACIS-I observation of NGC 300 obtained as part of the Chandra Local Volume Survey. Our 63 ks exposure covers ~88% of the D25 isophote (R~6.3 kpc) and yields a catalog of 95 X-ray point sources detected at high significance to a limiting unabsorbed 0.35-8 keV luminosity of ~10^36 erg s^-1. Sources were cross-correlated with a previous XMM-Newton catalog, and we find 75 "X-ray transient candidate" sources that were detected by one observatory, but not the other. We derive an X-ray scale length of 1.7+/-0.2 kpc and a recent star formation rate of 0.12 Msun yr^-1, in excellent agreement with optical observations. Deep, multi-color imaging from the Hubble Space Telescope, covering ~32% of our Chandra field, was used to search for optical counterparts to the X-ray sources, and we have developed a new source classification scheme to determine which sources are likely X-ray binaries, supernova remnants, and background AGN candidates. Finally, we present the X-ray luminos...

  18. A Peltier-based variable temperature source

    Science.gov (United States)

    Molki, Arman; Roof Baba, Abdul

    2014-11-01

    In this paper we propose a simple and cost-effective variable temperature source based on the Peltier effect using a commercially purchased thermoelectric cooler. The proposed setup can be used to quickly establish relatively accurate dry temperature reference points, which are necessary for many temperature applications such as thermocouple calibration.

  19. Research on Canal System Operation Based on Controlled Volume Method

    Directory of Open Access Journals (Sweden)

    Zhiliang Ding

    2009-10-01

    Full Text Available An operating simulation model based on the storage volume control method for a multireach canal system in series was established. To address the deficiency of the existing controlled volume algorithm, an improved algorithm was proposed: the controlled volume algorithm of whole canal pools. The simulation results indicate that the storage volume and water level of each canal pool can be accurately controlled after the improved algorithm is adopted. However, for some typical discharge-demand-change operating conditions of the canal, retaining the controlled volume algorithm of whole canal pools causes unnecessary regulation and consequently increases the number of disturbed canal reaches. Therefore, the idea of a controlled volume operation method of continuous canal pools was proposed, and its algorithm was designed. Simulation of a practical project indicates that the new controlled volume algorithm can markedly reduce the number of regulated check gates and disturbed canal pools for these typical discharge-demand-change operating conditions, thus improving the control efficiency of the canal system. The controlled volume method of operation is especially suitable for large-scale water delivery canal systems with complex operation requirements.
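
    The storage-volume bookkeeping that underlies such a controlled-volume simulation can be sketched as a simple explicit mass balance for pools in series, where each check gate's discharge becomes the next pool's inflow. This is an illustrative sketch only; the function name and numbers are hypothetical and do not reproduce the paper's algorithm.

    ```python
    def step_pools(volumes, q_head, gate_flows, dt):
        """One explicit storage update for canal pools in series.
        volumes: current storage of each pool (m^3)
        q_head: inflow to the first pool (m^3/s)
        gate_flows[i]: discharge through the check gate closing pool i (m^3/s)
        Controlled-volume operation adjusts gate_flows so each pool's
        storage tracks its target value."""
        inflow = q_head
        new_volumes = []
        for v, q_out in zip(volumes, gate_flows):
            new_volumes.append(v + (inflow - q_out) * dt)
            inflow = q_out  # the downstream pool receives this gate's discharge
        return new_volumes

    # Steady state: every gate passes the head inflow, so storages hold.
    print(step_pools([1000.0, 800.0], 5.0, [5.0, 5.0], 60.0))  # [1000.0, 800.0]
    # Throttling the downstream gate stores the difference in pool 2.
    print(step_pools([1000.0, 800.0], 5.0, [5.0, 3.0], 60.0))  # [1000.0, 920.0]
    ```

    Repeating this step while a controller trims `gate_flows` toward target storages is the essence of volume-controlled operation.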

  20. High-pitch dual-source CT coronary angiography with low volumes of contrast medium

    Energy Technology Data Exchange (ETDEWEB)

    Lembcke, Alexander; Hein, Patrick A.; Knobloch, Gesine; Durmus, Tahir; Hamm, Bernd [Charite - University Medicine Berlin, Department of Radiology, Berlin (Germany); Schwenke, Carsten [SCO:SSiS - Schwenke Consulting, Berlin (Germany); Huppertz, Alexander [Charite - University Medicine Berlin, Department of Radiology, Berlin (Germany); ISI - Imaging Science Institute Charite, Berlin (Germany)

    2014-01-15

    To assess the effect of lower volumes of contrast medium (CM) on image quality in high-pitch dual-source computed tomography coronary angiography (CTCA). One-hundred consecutive patients (body weight 65-85 kg, stable heart rate ≤65 bpm, cardiac index ≥2.5 L/min/m{sup 2}) referred for CTCA were prospectively enrolled. Patients were randomly assigned to one of five groups of different CM volumes (G{sub 30}, 30 mL; G{sub 40}, 40 mL; G{sub 50}, 50 mL; G{sub 60}, 60 mL; G{sub 70}, 70 mL; flow rate 5 mL/s each, iodine content 370 mg/mL). Attenuation within the proximal and distal coronary artery segments was analysed. Mean attenuation for men and women ranged from 345.0 and 399.1 HU in G{sub 30} to 478.2 and 571.8 HU in G{sub 70}. Mean attenuation values were higher in groups with higher CM volumes (P < 0.0001) and higher in women than in men (P < 0.0001). The proportions of segments with attenuation of at least 300 HU in G{sub 30}, G{sub 40}, G{sub 50}, G{sub 60} and G{sub 70} were 89 %, 95 %, 98 %, 98 % and 99 %. CM volume of 30 mL in women and 40 mL in men proved to be sufficient to guarantee attenuation of at least 300 HU. In selected patients high-pitch dual-source CTCA can be performed with CM volumes of 40 mL in men or 30 mL in women. (orig.)

  1. THE CHANDRA LOCAL VOLUME SURVEY: THE X-RAY POINT-SOURCE CATALOG OF NGC 300

    Energy Technology Data Exchange (ETDEWEB)

    Binder, B.; Williams, B. F.; Dalcanton, J. J.; Anderson, S. F.; Weisz, D. R. [Department of Astronomy, University of Washington, Box 351580, Seattle, WA 98195 (United States); Eracleous, M. [Department of Astronomy and Astrophysics, Pennsylvania State University, 525 Davey Laboratory, University Park, PA 16802 (United States); Gaetz, T. J.; Plucinsky, P. P. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street Cambridge, MA 02138 (United States); Skillman, E. D. [Astronomy Department, University of Minnesota, 116 Church St. SE, Minneapolis, MN 55455 (United States); Kong, A. K. H. [Institute of Astronomy and Department of Physics, National Tsing Hua University, Hsinchu 30013, Taiwan (China)

    2012-10-10

    We present the source catalog of a new Chandra ACIS-I observation of NGC 300 obtained as part of the Chandra Local Volume Survey. Our 63 ks exposure covers ~88% of the D25 isophote (R ≈ 6.3 kpc) and yields a catalog of 95 X-ray point sources detected at high significance to a limiting unabsorbed 0.35-8 keV luminosity of ~10^36 erg s^-1. Sources were cross-correlated with a previous XMM-Newton catalog, and we find 75 'X-ray transient candidate' sources that were detected by one observatory, but not the other. We derive an X-ray scale length of 1.7 ± 0.2 kpc and a recent star formation rate of 0.12 M_Sun yr^-1, in excellent agreement with optical observations. Deep, multi-color imaging from the Hubble Space Telescope, covering ~32% of our Chandra field, was used to search for optical counterparts to the X-ray sources, and we have developed a new source classification scheme to determine which sources are likely X-ray binaries, supernova remnants, and background active galactic nucleus candidates. Finally, we present the X-ray luminosity functions (XLFs) at different X-ray energies, and we find the total NGC 300 X-ray point-source population to be consistent with other late-type galaxies hosting young stellar populations (≲50 Myr). We find that the XLF of sources associated with older stellar populations has a steeper slope than the XLF of X-ray sources coinciding with young stellar populations, consistent with theoretical predictions.

  2. MEMS-based IR-sources

    Science.gov (United States)

    Weise, Sebastian; Steinbach, Bastian; Biermann, Steffen

    2016-03-01

    The series JSIR350 sources are MEMS-based infrared emitters. These IR sources are characterized by a high radiation output, making them excellent for NDIR gas analysis and ideally suited for use with our pyroelectric or thermopile detectors. The MEMS chips used in Micro-Hybrid's infrared emitters consist of nano-amorphous carbon (NAC) and are produced in the USA. All Micro-Hybrid emitters are designed and specified to operate at up to 850°C. The improvements we have made in the source's packaging enable us to provide IR sources with the best performance on the market. This new technology enables us to seal the housings of infrared radiation sources with soldered infrared filters or windows, rendering the parts impenetrable to gases. Micro-Hybrid provides various ways of adapting our MEMS-based infrared emitter JSIR350 to customer specifications, such as specific burn-in parameters/characteristics, different industrial standard housings, and customized caps, reflectors or pin-outs.

  3. Technology transfer package on seismic base isolation - Volume III

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-02-14

    This Technology Transfer Package provides some detailed information for the U.S. Department of Energy (DOE) and its contractors about seismic base isolation. Intended users of this three-volume package are DOE Design and Safety Engineers as well as DOE Facility Managers who are responsible for reducing the effects of natural phenomena hazards (NPH), specifically earthquakes, on their facilities. The package was developed as part of DOE's efforts to study and implement techniques for protecting lives and property from the effects of natural phenomena and to support the International Decade for Natural Disaster Reduction. Volume III contains supporting materials not included in Volumes I and II.

  4. Plasma-Based Ion Beam Sources

    Energy Technology Data Exchange (ETDEWEB)

    Loeb, H. W.

    2005-07-01

    Ion beam sources cover a broad spectrum of scientific and technical applications delivering ion currents between less than 1 mA and about 100 A at acceleration voltages between 100 V and 100 kV. The ions are mostly generated by electron collisions in a gas discharge and then extracted from the discharge plasma, focused and post-accelerated by single- or multi-aperture electrode systems. Some important applications require the neutralization of the exhausted beam either by charge exchange or by admixture of electrons. In the first part of the paper, the theory of ionization by electron impact, the energy and carrier balances in the plasma, and the extraction and focusing mechanisms will be outlined. The principles of the preferred gas discharges and of the ion beam sources based on them are discussed; i.e. of the Penning, bombardment, arc, duoplasmatron, radio frequency, and microwave types. In the second part of the paper, the special requirements of the different applications are described together with the related source hardware. One distinguishes: 1. Single-aperture ion sources producing protons, heavy ions, isotope ions, etc. for particle accelerators, ion microprobes, mass spectrometers, isotope separators, etc.; quality determinative quantities are brightness, emittance, energy width, etc. 2. Broad-beam multi-aperture injector sources for fusion machines with positive or negative deuterium ions; very high beam densities, small portions of molecular ions, flat beam profiles with small divergence angles, etc. are required. 3. Broad-beam multi-aperture ion thrusters for space propulsion operated with singly charged xenon ions; high efficiencies, reliable operation, and long lifetimes are most important. Spin-offs are applied in industry for material processing. Referring to these applications, the following sources will be described in some detail: 1. Cold cathode and filament driven sources, capillary arc and plasmatron types, microwave and ECR-sources. 2

  5. Intense Pulsed Neutron Source: Progress report 1991--1996. 15. Anniversary edition -- Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-05-01

    The 15th Anniversary Edition of the IPNS Progress Report is being published in recognition of the Intense Pulsed Neutron Source's first 15 years of successful operation as a user facility. To emphasize the importance of this milestone, the authors have made the design and organization of the report significantly different from previous IPNS Progress Reports. This report consists of two volumes. For Volume 1, authors were asked to prepare articles that highlighted recent scientific accomplishments at IPNS, from 1991 to present; to focus on and illustrate the scientific advances achieved through the unique capabilities of neutron studies performed by IPNS users; to report on specific activities or results from an instrument; or to focus on a body of work encompassing different neutron-scattering techniques. Articles were also included on the accelerator system, instrumentation, computing, target, and moderators. A list of published and 'in press' articles in journals, books, and conference proceedings, resulting from work done at IPNS since 1991, was compiled. This list is arranged alphabetically according to first author. Publication references in the articles are listed by last name of first author and year of publication. The IPNS experimental reports received since 1991 are compiled in Volume 2. Experimental reports referenced in the articles are listed by last name of first author, instrument designation, and experiment number.

  6. Intense Pulsed Neutron Source: Progress report 1991--1996. 15. Anniversary edition -- Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Marzec, B. [ed.

    1996-05-01

    The 15th Anniversary Edition of the IPNS Progress Report is being published in recognition of the Intense Pulsed Neutron Source's first 15 years of successful operation as a user facility. To emphasize the importance of this milestone, the authors have made the design and organization of the report significantly different from previous IPNS Progress Reports. This report consists of two volumes. For Volume 1, authors were asked to prepare articles that highlighted recent scientific accomplishments at IPNS, from 1991 to present; to focus on and illustrate the scientific advances achieved through the unique capabilities of neutron studies performed by IPNS users; to report on specific activities or results from an instrument; or to focus on a body of work encompassing different neutron-scattering techniques. Articles were also included on the accelerator system, instrumentation, computing, target, and moderators. A list of published and 'in press' articles in journals, books, and conference proceedings, resulting from work done at IPNS since 1991, was compiled. This list is arranged alphabetically according to first author. Publication references in the articles are listed by last name of first author and year of publication. The IPNS experimental reports received since 1991 are compiled in Volume 2. Experimental reports referenced in the articles are listed by last name of first author, instrument designation, and experiment number.

  7. Cyclotron-based neutron source for BNCT

    Science.gov (United States)

    Mitsumoto, T.; Yajima, S.; Tsutsui, H.; Ogasawara, T.; Fujita, K.; Tanaka, H.; Sakurai, Y.; Maruhashi, A.

    2013-04-01

    Kyoto University Research Reactor Institute (KURRI) and Sumitomo Heavy Industries, Ltd. (SHI) have developed a cyclotron-based neutron source for Boron Neutron Capture Therapy (BNCT). It was installed at KURRI in Osaka prefecture. The neutron source consists of a proton cyclotron named HM-30, a beam transport system and an irradiation & treatment system. In the cyclotron, H- ions are accelerated and extracted as 30 MeV proton beams of 1 mA. The proton beam is transported to the neutron production target, a beryllium plate. Emitted neutrons are moderated by lead, iron, aluminum and calcium fluoride. The aperture diameter of the neutron collimator is in the range from 100 mm to 250 mm. The peak neutron flux in the water phantom is 1.8×10^9 neutrons/cm^2/s at 20 mm from the surface at 1 mA proton beam current. The neutron source has been stably operated for 3 years with a 30 kW proton beam. Various pre-clinical tests, including animal tests, have been done using the cyclotron-based neutron source with 10B-p-borono-phenylalanine. Clinical trials on malignant brain tumors will be started this year.

  8. Cyclotron-based neutron source for BNCT

    Energy Technology Data Exchange (ETDEWEB)

    Mitsumoto, T.; Yajima, S.; Tsutsui, H.; Ogasawara, T.; Fujita, K. [Sumitomo Heavy Industries, Ltd (Japan); Tanaka, H.; Sakurai, Y.; Maruhashi, A. [Kyoto University Research Reactor Institute (Japan)

    2013-04-19

    Kyoto University Research Reactor Institute (KURRI) and Sumitomo Heavy Industries, Ltd. (SHI) have developed a cyclotron-based neutron source for Boron Neutron Capture Therapy (BNCT). It was installed at KURRI in Osaka prefecture. The neutron source consists of a proton cyclotron named HM-30, a beam transport system and an irradiation and treatment system. In the cyclotron, H- ions are accelerated and extracted as 30 MeV proton beams of 1 mA. The proton beam is transported to the neutron production target, a beryllium plate. Emitted neutrons are moderated by lead, iron, aluminum and calcium fluoride. The aperture diameter of the neutron collimator is in the range from 100 mm to 250 mm. The peak neutron flux in the water phantom is 1.8×10^9 neutrons/cm^2/s at 20 mm from the surface at 1 mA proton beam current. The neutron source has been stably operated for 3 years with a 30 kW proton beam. Various pre-clinical tests, including animal tests, have been done using the cyclotron-based neutron source with ^10B-p-borono-phenylalanine. Clinical trials on malignant brain tumors will be started this year.

  9. The Einstein Observatory catalog of IPC x ray sources. Volume 1E: Documentation

    Science.gov (United States)

    Harris, D. E.; Forman, W.; Gioia, I. M.; Hale, J. A.; Harnden, F. R., Jr.; Jones, C.; Karakashian, T.; Maccacaro, T.; Mcsweeney, J. D.; Primini, F. A.

    1993-01-01

    The Einstein Observatory (HEAO-2, launched November 13, 1978) achieved radically improved sensitivity over previous x-ray missions through the use of focusing optics, which simultaneously afforded greatly reduced background and produced true images. During its 2.5-yr mission, the Einstein X-Ray Telescope was pointed toward some 5,000 celestial targets, most of which were detected, and discovered several thousand additional 'serendipitous' sources in the observed fields. This catalog contains contour diagrams and source data, obtained with the imaging proportional counter in the 0.16 to 3.5 keV energy band, and describes methods for recovering upper limits for any sky position within the observed images. The main catalog consists of six volumes (numbered 2 through 7) of right ascension ordered pages, each containing data for one observation. Along with the primary documentation describing how the catalog was constructed, volume 1 contains a complete source list, results for merged fields, a reference system to published papers, and data useful for calculating upper limits and fluxes.

  10. SET OPERATOR-BASED METHOD OF DENOISING MEDICAL VOLUME DATA

    Institute of Scientific and Technical Information of China (English)

    程兵; 郑南宁; 袁泽剑

    2002-01-01

    Objective To investigate impulsive noise suppression of medical volume data. Methods The volume data is represented as level sets and a special set operator is defined and applied to filtering it. The small connected components, which are likely to be produced by impulsive noise, are eliminated after the filtering process. A fast algorithm that uses a heap data structure is also designed. Results Compared with traditional linear filters such as a Gaussian filter, this method preserves the fine structure features of the medical volume data while removing noise, and the fast algorithm developed by us reduces memory consumption and improves computing efficiency. The experimental results given illustrate the efficiency of the method and the fast algorithm. Conclusion The set operator-based method shows outstanding denoising properties in our experiment, especially for impulsive noise. The method has a wide variety of applications in the areas of volume visualization and high dimensional data processing.
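
    The core idea of the method above, removing small connected components that impulsive noise produces in a binary volume, can be illustrated with a plain breadth-first search over 6-connected voxels. Note this sketch only shows the component-removal concept; the paper's actual method operates on level sets with a heap-based fast algorithm, and the function name here is hypothetical.

    ```python
    from collections import deque

    def remove_small_components(voxels, min_size):
        """Drop 6-connected components smaller than min_size voxels.
        voxels: set of (x, y, z) integer coordinates of 'on' voxels.
        Small isolated components are the typical footprint of impulsive noise."""
        remaining = set(voxels)
        kept = set()
        while remaining:
            seed = next(iter(remaining))
            remaining.discard(seed)
            queue, comp = deque([seed]), {seed}
            while queue:  # flood-fill one component
                x, y, z = queue.popleft()
                for dx, dy, dz in ((1,0,0), (-1,0,0), (0,1,0),
                                   (0,-1,0), (0,0,1), (0,0,-1)):
                    n = (x + dx, y + dy, z + dz)
                    if n in remaining:
                        remaining.discard(n)
                        comp.add(n)
                        queue.append(n)
            if len(comp) >= min_size:
                kept |= comp
        return kept

    # A 3-voxel run survives; the lone noise voxel is removed.
    blob = {(0, 0, 0), (1, 0, 0), (2, 0, 0), (9, 9, 9)}
    print(remove_small_components(blob, 2))  # the three connected voxels only
    ```

    Unlike a Gaussian filter, this operation leaves the surviving structures bit-for-bit intact, which is why it preserves fine features while removing impulsive noise.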

  11. Open Source GIS based integrated watershed management

    Science.gov (United States)

    Byrne, J. M.; Lindsay, J.; Berg, A. A.

    2013-12-01

    Optimal land and water management to address future and current resource stresses and allocation challenges requires the development of state-of-the-art geomatics and hydrological modelling tools. Future hydrological modelling tools should be high-resolution and process-based, with real-time capability to assess changing resource issues critical to short-, medium- and long-term environmental management. The objective here is to merge two renowned, well-published resource modeling programs to create an open-source toolbox for integrated land and water management applications. This work will facilitate a much increased efficiency in land and water resource security, management and planning. Following an 'open-source' philosophy, the tools will be computer platform independent with source code freely available, maximizing knowledge transfer and the global value of the proposed research. The envisioned set of water resource management tools will be housed within 'Whitebox Geospatial Analysis Tools'. Whitebox is an open-source geographical information system (GIS) developed by Dr. John Lindsay at the University of Guelph. The emphasis of the Whitebox project has been to develop a user-friendly interface for advanced spatial analysis in environmental applications. The plugin architecture of the software is ideal for the tight integration of spatially distributed models and spatial analysis algorithms such as those contained within the GENESYS suite. Open-source development extends knowledge and technology transfer to a broad range of end-users and builds Canadian capability to address complex resource management problems with better tools and expertise for managers in Canada and around the world. GENESYS (Generate Earth Systems Science input) is an innovative, efficient, high-resolution hydro- and agro-meteorological model for complex terrain watersheds developed under the direction of Dr. James Byrne. GENESYS is an outstanding research and applications tool to address

  12. Negative hydrogen ion production in multicusp volume source with a pulsed discharge (abstract)

    Science.gov (United States)

    Bacal, M.; Belchenko, Yu. I.

    1996-03-01

    The pulsed operation of a negative ion volume source has been investigated, both with a magnetic filter present and without it, under conditions of full-scale acceleration of the extracted negative hydrogen ion beam. We report the observation of three afterglow negative ion peaks. As with the negative ion current during the discharge pulse, each of the afterglow peaks can be optimized by varying the pressure, the plasma electrode bias and the extraction voltage. Under optimum conditions, the negative ion current during the discharge pulse exceeds the afterglow peaks.

  13. On Issues of Precision for Hardware-based Volume Visualization

    Energy Technology Data Exchange (ETDEWEB)

    LaMar, E C

    2003-04-11

    This paper discusses issues with the limited precision of hardware-based volume visualization. We will describe the compositing OVER operator and how fixed-point arithmetic affects it. We propose two techniques to improve the precision of fixed-point compositing and the accuracy of hardware-based volume visualization. The first technique is to perform dithering of color and alpha values. The second technique we call exponent-factoring, and captures significantly more numeric resolution than dithering, but can only produce monochromatic images.
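
    The fixed-point precision problem and the dithering remedy can be sketched for a single color channel. The sketch below is illustrative only: it assumes 8-bit premultiplied values where 255 represents 1.0, and the function name is hypothetical, not taken from the paper.

    ```python
    import random

    def over_fixed(src_c, src_a, dst_c, dither=False):
        """Composite one premultiplied color channel with the OVER operator
        in 8-bit fixed point: out = src + (1 - a_src) * dst.
        All inputs are integers in [0, 255]; 255 represents 1.0."""
        prod = (255 - src_a) * dst_c          # exact 16-bit intermediate
        if dither:
            # Randomize the remainder lost by integer division so the
            # truncation bias becomes zero-mean noise instead of a
            # systematic darkening that accumulates across many slices.
            prod += random.randint(0, 254)
        return min(255, src_c + prod // 255)

    print(over_fixed(100, 255, 50))   # opaque source hides destination: 100
    print(over_fixed(0, 0, 200))      # transparent source passes destination: 200
    ```

    Dithering trades a deterministic bias for noise; the exponent-factoring technique the paper proposes instead recovers genuine extra numeric resolution.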

  14. Capillary plasma jet: A low volume plasma source for life science applications

    Energy Technology Data Exchange (ETDEWEB)

    Topala, I., E-mail: ionut.topala@uaic.ro, E-mail: tmnagat@ipc.shizuoka.ac.jp [Alexandru Ioan Cuza University of Iasi, Faculty of Physics, Iasi Plasma Advanced Research Center (IPARC), Bd. Carol I No. 11, Iasi 700506 (Romania); Nagatsu, M., E-mail: ionut.topala@uaic.ro, E-mail: tmnagat@ipc.shizuoka.ac.jp [Graduate School of Science and Technology, Shizuoka University, 3-5-1 Johoku, Naka-ku, Hamamatsu 432-8561 (Japan)

    2015-02-02

    In this letter, we present results from multispectroscopic analysis of protein films after exposure to a particular plasma source, the capillary plasma jet. This plasma source is able to generate very small pulsed plasma volumes, in the kilohertz range, with characteristic dimensions smaller than 1 mm. This leads to specific microscale generation and transport of all plasma species. Plasma diagnosis was realized using general electrical and optical methods. Depending on power level and exposure duration, this miniature plasma jet can induce controllable modifications to soft matter targets. Detailed discussions on protein film oxidation and chemical etching are supported by results from absorption, X-ray photoelectron spectroscopy, and microscopy techniques. Further exploitation of the principles presented here may consolidate research interests involving plasmas in biotechnologies and plasma medicine, especially in patterning technologies, modified biomolecule arrays, and local chemical functionalization.

  15. Measuring Modularity in Open Source Code Bases

    Directory of Open Access Journals (Sweden)

    Roberto Milev

    2009-03-01

    Full Text Available Modularity of an open source software code base has been associated with growth of the software development community, the incentives for voluntary code contribution, and a reduction in the number of users who take code without contributing back to the community. As a theoretical construct, modularity links OSS to other domains of research, including organization theory, the economics of industry structure, and new product development. However, measuring the modularity of an OSS design has proven difficult, especially for large and complex systems. In this article, we describe some preliminary results of recent research at Carleton University that examines the evolving modularity of large-scale software systems. We describe a measurement method and a new modularity metric for comparing code bases of different size, introduce an open source toolkit that implements this method and metric, and provide an analysis of the evolution of the Apache Tomcat application server as an illustrative example of the insights gained from this approach. Although these results are preliminary, they open the door to further cross-discipline research that quantitatively links the concerns of business managers, entrepreneurs, policy-makers, and open source software developers.

  16. Source extension based on ε-entropy

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jian; YU Sheng-sheng; ZHOU Jing-li; ZHENG Xin-wei

    2005-01-01

    It is known from entropy theory that an image is a source correlated with certain probabilistic characteristics. The entropy rate of the source and the ε-entropy (rate-distortion function theory) are the information content that identifies the characteristics of video images, and hence are essentially related to video image compression. They are fundamental theories of great significance to image compression, though impossible to turn directly into a compression method. Based on entropy theory and image compression theory, and by applying the rate-distortion mathematical model and Lagrange multipliers to some theoretical problems in the H.264 standard, this paper presents a new algorithmic model of coding rate-distortion. This model was introduced into a complete test of the capability of the test model JM61e (JVT Test Model). The results show that the speed of coding increases without significant reduction of the rate-distortion performance of the coder.
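
    The Lagrange-multiplier formulation referred to above reduces, in practice, to picking the coding option that minimizes the cost J = D + λR. The sketch below illustrates that decision rule with hypothetical mode names and cost numbers; it is not the paper's model or the JM implementation.

    ```python
    def best_mode(candidates, lam):
        """Return the candidate minimizing the Lagrangian cost J = D + lam * R.
        candidates: list of (name, distortion, rate_bits)."""
        return min(candidates, key=lambda m: m[1] + lam * m[2])

    # Hypothetical per-macroblock costs: (mode, SSD distortion, bits).
    modes = [("intra16", 900.0, 40), ("intra4", 400.0, 120), ("skip", 2500.0, 1)]

    # A small lambda weights distortion heavily; a large one weights rate.
    print(best_mode(modes, 1.0)[0])   # intra4  (J = 520 beats 940 and 2501)
    print(best_mode(modes, 50.0)[0])  # skip    (J = 2550 beats 2900 and 6400)
    ```

    Sweeping λ traces out the operational rate-distortion curve, which is how rate-distortion theory connects to a concrete encoder control loop.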

  17. The Chandra Local Volume Survey: The X-ray Point Source Population of NGC 404

    CERN Document Server

    Binder, B; Eracleous, M; Gaetz, T J; Kong, A K H; Skillman, E D; Weisz, D R

    2012-01-01

    We present a comprehensive X-ray point source catalog of NGC 404 obtained as part of the Chandra Local Volume Survey. A new, 97 ks Chandra ACIS-S observation of NGC 404 was combined with archival observations for a total exposure of ~123 ks. Our survey yields 74 highly significant X-ray point sources and is sensitive to a limiting unabsorbed luminosity of ~6x10^35 erg s^-1 in the 0.35-8 keV band. To constrain the nature of each X-ray source, cross-correlations with multi-wavelength data were generated. We searched overlapping HST observations for optical counterparts to our X-ray detections, but find only two X-ray sources with candidate optical counterparts. We find 21 likely low mass X-ray binaries (LMXBs), although this number is a lower limit due to the difficulties in separating LMXBs from background AGN. The X-ray luminosity functions (XLFs) in both the soft and hard energy bands are presented. The XLFs in the soft band (0.5-2 keV) and the hard band (2-8 keV) have a limiting luminosity at the 90% comple...

  18. H- ion production in electron cyclotron resonance driven multicusp volume source

    Science.gov (United States)

    Ivanov, A. A.; Rouillé, C.; Bacal, M.; Arnal, Y.; Béchu, S.; Pelletier, J.

    2004-05-01

    We have used the existing magnetic multicusp configuration of the large volume H- source Camembert III to confine the plasma created by seven elementary multidipolar electron cyclotron resonance (ECR) sources, operating at 2.45 GHz. We varied the pressure from 1 to 4 mTorr, while the total power of the microwave generator was varied between 500 W and 1 kW. We studied the plasma created by this system and measured the various plasma parameters, including the density and temperature of the negative hydrogen ions which are compared to the data obtained in a chamber with elementary ECR sources without multicusp magnetic confinement. The electron temperature is lower than that obtained with similar elementary sources in the absence of the magnetic multicusp field. We found that at pressures in the range from 2 to 4 mTorr and microwave power of up to 1 kW, the electron temperature is optimal for H- ion production (0.6-0.8 eV). This could indicate that the multicusp configuration effectively traps the fast electrons produced by the ECR discharge.

  19. A High-Temperature, "Volume-Type" ECR Ion Source for RIB Generation

    Energy Technology Data Exchange (ETDEWEB)

    Alton, G.D.; Liu, Y.; Reed, C.A.; Williams, C.; Zhang, T.

    1999-03-29

A high-temperature, low-charge-state, "volume-type" source has been designed for use in the nuclear physics and nuclear astrophysics research radioactive ion beam (RIB) programs at the Holifield Radioactive Ion Beam Facility (HRIBF). The source utilizes electromagnetic coils to generate a large and uniformly distributed central magnetic field with magnitude (875 G) chosen to be in electron-cyclotron-resonance (ECR) with single-frequency (2.45 GHz) microwave radiation. Features of the source include a variable mirror ratio at ion extraction, as required for optimizing low-charge-state ion beam generation; a right-hand, circularly-polarized RF injection system to overcome the relatively low cutoff density (n_c ~ 7.4x10^10/cm^3) associated with the use of 2.45 GHz microwave radiation; and a high-temperature, Ir- or Re-coated Ta plasma chamber to reduce the residence times of radioactive species that are adsorbed on the walls of the chamber. No provisions are made for radial plasma confinement, because the permanent magnets routinely used for this purpose are sensitive to degradation by the huge fluxes of neutrons present during target irradiation. Aspects of the design features of the source are described in this report.
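The 875 G / 2.45 GHz pairing quoted above follows directly from the electron cyclotron resonance condition f_ce = eB/(2*pi*m_e). A quick numerical check (a sketch, not from the report):

```python
import math

E_CHARGE = 1.602176634e-19    # elementary charge, C
M_ELECTRON = 9.1093837015e-31  # electron mass, kg

def ecr_field_tesla(freq_hz: float) -> float:
    """Magnetic field satisfying the ECR condition
    f_ce = e * B / (2 * pi * m_e) for a given microwave frequency."""
    return 2.0 * math.pi * freq_hz * M_ELECTRON / E_CHARGE

B = ecr_field_tesla(2.45e9)   # 2.45 GHz microwaves
print(round(B * 1e4))         # field in gauss -> 875
```

The result, about 0.0875 T, matches the 875 G figure in the abstract.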

  20. The extraction of negative carbon ions from a volume cusp ion source

    Science.gov (United States)

    Melanson, Stephane; Dehnel, Morgan; Potkins, Dave; McDonald, Hamish; Hollinger, Craig; Theroux, Joseph; Martin, Jeff; Stewart, Thomas; Jackle, Philip; Philpott, Chris; Jones, Tobin; Kalvas, Taneli; Tarvainen, Olli

    2017-08-01

Acetylene and carbon dioxide gases are used in a filament-powered volume-cusp ion source to produce negative carbon ions for the purpose of carbon implantation for gettering applications. The beam was extracted at an energy of 25 keV and its composition was analyzed with a spectrometer system consisting of a 90° dipole magnet and a pair of slits. It is found that acetylene produces mostly C2- ions (up to 92 µA), while carbon dioxide produces mostly O- with only trace amounts of C-. Maximum C2- current was achieved with 400 W of arc power, and the beam current and composition were found to be highly dependent on the pressure in the source. The beam properties as a function of source settings are analyzed, and plasma properties are measured with a Langmuir probe. Finally, we describe testing of a new RF H- ion source, found to produce more than 6 mA of CW H- beam.

  1. Advisory Committee on human radiation experiments. Final report, Supplemental Volume 2. Sources and documentation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-01-01

This volume and its appendixes supplement the Advisory Committee's final report by reporting how we went about looking for information concerning human radiation experiments and intentional releases, a description of what we found and where we found it, and a finding aid for the information that we collected. This volume begins with an overview of federal records, including general descriptions of the types of records that have been useful and how the federal government handles these records. This is followed by an agency-by-agency account of the discovery process and descriptions of the records reviewed, together with instructions on how to obtain further information from those agencies. There is also a description of other sources of information that have been important, including institutional records, print resources, and nonprint media and interviews. The third part contains brief accounts of ACHRE's two major contemporary survey projects (these are described in greater detail in the final report and another supplemental volume) and other research activities. The final section describes how the ACHRE information collections were managed and the records that ACHRE created in the course of its work; this constitutes a general finding aid for the materials deposited with the National Archives. The appendices provide brief references to federal records reviewed, descriptions of the accessions that comprise the ACHRE Research Document Collection, and descriptions of the documents selected for individual treatment. Also included are an account of the documentation available for ACHRE meetings, brief abstracts of the almost 4,000 experiments individually described by ACHRE staff, a full bibliography of secondary sources used, and other information.

  2. Alternative modeling methods for plasma-based Rf ion sources

    Energy Technology Data Exchange (ETDEWEB)

    Veitzer, Seth A., E-mail: veitzer@txcorp.com; Kundrapu, Madhusudhan, E-mail: madhusnk@txcorp.com; Stoltz, Peter H., E-mail: phstoltz@txcorp.com; Beckwith, Kristian R. C., E-mail: beckwith@txcorp.com [Tech-X Corporation, Boulder, Colorado 80303 (United States)

    2016-02-15

Rf-driven ion sources for accelerators and many industrial applications benefit from detailed numerical modeling and simulation of plasma characteristics. For instance, modeling of the Spallation Neutron Source (SNS) internal antenna H{sup −} source has indicated that a large plasma velocity is induced near bends in the antenna where structural failures are often observed. This could lead to improved designs and ion source performance based on simulation and modeling. However, there are significant separations of time and spatial scales inherent to Rf-driven plasma ion sources, which makes it difficult to model ion sources with explicit, kinetic Particle-In-Cell (PIC) simulation codes. In particular, if both electron and ion motions are to be explicitly modeled, then the simulation time step must be very small, and total simulation times must be large enough to capture the evolution of the plasma ions, as well as extending over many Rf periods. Additional physics processes such as plasma chemistry and surface effects such as secondary electron emission increase the computational requirements in such a way that even fully parallel explicit PIC models cannot be used. One alternative method is to develop fluid-based codes coupled with electromagnetics in order to model ion sources. Time-domain fluid models can simulate plasma evolution, plasma chemistry, and surface physics models with reasonable computational resources by not explicitly resolving electron motions, which thereby leads to an increase in the time step. This is achieved by solving fluid motions coupled with electromagnetics using reduced-physics models, such as single-temperature magnetohydrodynamics (MHD), extended, gas dynamic, and Hall MHD, and two-fluid MHD models. We show recent results on modeling the internal antenna H{sup −} ion source for the SNS at Oak Ridge National Laboratory using the fluid plasma modeling code USim. We demonstrate plasma temperature equilibration in two-temperature MHD models.

  3. Alternative modeling methods for plasma-based Rf ion sources

    Science.gov (United States)

    Veitzer, Seth A.; Kundrapu, Madhusudhan; Stoltz, Peter H.; Beckwith, Kristian R. C.

    2016-02-01

Rf-driven ion sources for accelerators and many industrial applications benefit from detailed numerical modeling and simulation of plasma characteristics. For instance, modeling of the Spallation Neutron Source (SNS) internal antenna H- source has indicated that a large plasma velocity is induced near bends in the antenna where structural failures are often observed. This could lead to improved designs and ion source performance based on simulation and modeling. However, there are significant separations of time and spatial scales inherent to Rf-driven plasma ion sources, which makes it difficult to model ion sources with explicit, kinetic Particle-In-Cell (PIC) simulation codes. In particular, if both electron and ion motions are to be explicitly modeled, then the simulation time step must be very small, and total simulation times must be large enough to capture the evolution of the plasma ions, as well as extending over many Rf periods. Additional physics processes such as plasma chemistry and surface effects such as secondary electron emission increase the computational requirements in such a way that even fully parallel explicit PIC models cannot be used. One alternative method is to develop fluid-based codes coupled with electromagnetics in order to model ion sources. Time-domain fluid models can simulate plasma evolution, plasma chemistry, and surface physics models with reasonable computational resources by not explicitly resolving electron motions, which thereby leads to an increase in the time step. This is achieved by solving fluid motions coupled with electromagnetics using reduced-physics models, such as single-temperature magnetohydrodynamics (MHD), extended, gas dynamic, and Hall MHD, and two-fluid MHD models. We show recent results on modeling the internal antenna H- ion source for the SNS at Oak Ridge National Laboratory using the fluid plasma modeling code USim. We demonstrate plasma temperature equilibration in two-temperature MHD models.
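The two-temperature equilibration mentioned above can be illustrated with a toy relaxation model (a sketch only: it assumes equal electron and ion heat capacities and a constant exchange rate `nu`; the names are illustrative, not USim's API):

```python
def equilibrate(te: float, ti: float, nu: float, dt: float, steps: int):
    """Forward-Euler relaxation of electron and ion temperatures toward
    a common value, dTe/dt = -nu*(Te - Ti), dTi/dt = +nu*(Te - Ti),
    which conserves total thermal energy for equal heat capacities."""
    for _ in range(steps):
        d = nu * (te - ti) * dt
        te, ti = te - d, ti + d
    return te, ti

te, ti = equilibrate(te=10.0, ti=1.0, nu=0.5, dt=0.01, steps=2000)
# both temperatures relax toward the mean, (10 + 1) / 2 = 5.5
```

The temperature difference decays as exp(-2*nu*t), so after t = 20 (in these arbitrary units) the two species are equilibrated to within numerical noise.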

  4. SYNBAPS. Volume 1. Data Base Sources and Data Preparation

    Science.gov (United States)

    1979-12-01

Processing Laboratory. A comprehensive description of the Calspan methodology of chart digitization is given in Solosko (1976). Appendix A is a brief... FORM LINES: Dashed lines resembling contours, but representing no actual elevations, that have been sketched from visual observation or from

  5. An open source workflow for 3D printouts of scientific data volumes

    Science.gov (United States)

    Loewe, P.; Klump, J. F.; Wickert, J.; Ludwig, M.; Frigeri, A.

    2013-12-01

As the amount of scientific data continues to grow, researchers need new tools to help them visualize complex data. Immersive data-visualisations are helpful, yet fail to provide the tactile feedback and sensory feedback on spatial orientation that tangible objects provide. This gap in sensory feedback from virtual objects has led to the development of tangible representations of geospatial information to solve real world problems. Examples are animated globes [1], interactive environments like tangible GIS [2], and on-demand 3D prints. The production of a tangible representation of a scientific data set is one step in a line of scientific thinking, leading from the physical world into scientific reasoning and back: The process starts with a physical observation, or from a data stream generated by an environmental sensor. This data stream is turned into a geo-referenced data set. This data set is turned into a volume representation, which is converted into command sequences for the printing device, leading to the creation of a 3D printout. As a last, but crucial, step, this new object has to be documented, linked to the associated metadata, and curated in long-term repositories to preserve its scientific meaning and context. The workflow to produce tangible 3D data-prints from science data at the German Research Centre for Geosciences (GFZ) was implemented as software based on the Free and Open Source Geoinformatics tools GRASS GIS and Paraview. The workflow was successfully validated in various application scenarios at GFZ using a RapMan printer to create 3D specimens of elevation models, geological underground models, ice-penetrating radar soundings for planetology, and space-time stacks for Tsunami model quality assessment. While these first pilot applications have demonstrated the feasibility of the overall approach [3], current research focuses on the provision of the workflow as Software as a Service (SAAS), thematic generalisation of information content and
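The "volume representation to printer command sequences" step can be sketched in miniature. The hypothetical `grid_to_stl` helper below (an invented illustration, not part of the GFZ workflow, which uses GRASS GIS and Paraview) turns a small elevation grid into an ASCII STL mesh of the kind a 3D-printing toolchain consumes:

```python
def grid_to_stl(heights, cell: float = 1.0, name: str = "surface") -> str:
    """Turn a 2D elevation grid into an ASCII STL triangle mesh
    (two triangles per grid cell; facet normals are left at zero,
    which most slicers recompute anyway)."""
    rows, cols = len(heights), len(heights[0])
    out = [f"solid {name}"]

    def tri(a, b, c):
        out.append("  facet normal 0 0 0")
        out.append("    outer loop")
        for x, y, z in (a, b, c):
            out.append(f"      vertex {x * cell} {y * cell} {z}")
        out.append("    endloop")
        out.append("  endfacet")

    for i in range(rows - 1):
        for j in range(cols - 1):
            p = lambda r, c_: (c_, r, heights[r][c_])
            tri(p(i, j), p(i, j + 1), p(i + 1, j))
            tri(p(i + 1, j), p(i, j + 1), p(i + 1, j + 1))
    out.append(f"endsolid {name}")
    return "\n".join(out)

stl = grid_to_stl([[0.0, 1.0], [1.0, 2.0]])  # one cell -> two facets
```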

  6. Rotating-Disk-Based Hybridized Electromagnetic-Triboelectric Nanogenerator for Sustainably Powering Wireless Traffic Volume Sensors.

    Science.gov (United States)

    Zhang, Binbin; Chen, Jun; Jin, Long; Deng, Weili; Zhang, Lei; Zhang, Haitao; Zhu, Minhao; Yang, Weiqing; Wang, Zhong Lin

    2016-06-28

Wireless traffic volume detectors play a critical role in measuring traffic flow in real time for current Intelligent Traffic Systems. However, for a battery-operated electronic device, regularly replacing batteries remains a great challenge, especially given remote locations and wide distribution. Here, we report a self-powered active wireless traffic volume sensor that uses a rotating-disk-based hybridized nanogenerator, combining a triboelectric nanogenerator and an electromagnetic generator, as the sustainable power source. Operated at a rotating rate of 1000 rpm, the device delivered an output power of 17.5 mW, corresponding to a volume power density of 55.7 W/m(3) (Pd = P/V, see Supporting Information for detailed calculation) at a loading resistance of 700 Ω. The hybridized nanogenerator was demonstrated to effectively harvest energy from the wind generated by a moving vehicle through a tunnel, and the delivered power is capable of triggering a counter via a wireless transmitter for real-time monitoring of the traffic volume in the tunnel. This study further expands the applications of triboelectric nanogenerators for high-performance ambient mechanical energy harvesting and as sustainable power sources for driving wireless traffic volume sensors.
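As a rough arithmetic check of the quoted figures (a sketch, not from the paper): the definition Pd = P/V ties the 17.5 mW output and the 55.7 W/m^3 volume power density to an implied device volume of about 3.1x10^-4 m^3, i.e. roughly 310 cm^3:

```python
def volume_power_density(power_w: float, volume_m3: float) -> float:
    """Volume power density Pd = P / V, as defined in the abstract."""
    return power_w / volume_m3

# Back out the device volume implied by the reported figures:
implied_volume = 17.5e-3 / 55.7          # ~3.14e-4 m^3
print(round(volume_power_density(17.5e-3, implied_volume), 1))  # 55.7
```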

  7. Perception-based transparency optimization for direct volume rendering.

    Science.gov (United States)

    Chan, Ming-Yuen; Wu, Yingcai; Mak, Wai-Ho; Chen, Wei; Qu, Huamin

    2009-01-01

    The semi-transparent nature of direct volume rendered images is useful to depict layered structures in a volume. However, obtaining a semi-transparent result with the layers clearly revealed is difficult and may involve tedious adjustment on opacity and other rendering parameters. Furthermore, the visual quality of layers also depends on various perceptual factors. In this paper, we propose an auto-correction method for enhancing the perceived quality of the semi-transparent layers in direct volume rendered images. We introduce a suite of new measures based on psychological principles to evaluate the perceptual quality of transparent structures in the rendered images. By optimizing rendering parameters within an adaptive and intuitive user interaction process, the quality of the images is enhanced such that specific user requirements can be met. Experimental results on various datasets demonstrate the effectiveness and robustness of our method.
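The layered semi-transparency being optimized above arises from standard front-to-back alpha compositing along each viewing ray; a minimal single-channel sketch of that accumulation rule (illustrative only, not the authors' renderer):

```python
def composite_front_to_back(samples):
    """Front-to-back compositing of (color, opacity) samples along a ray:
    C += (1 - A) * a * c ;  A += (1 - A) * a.
    Later (deeper) samples contribute only through the remaining
    transparency (1 - A), which is why opacity tuning changes how
    clearly inner layers are revealed."""
    color, alpha = 0.0, 0.0
    for c, a in samples:
        color += (1.0 - alpha) * a * c
        alpha += (1.0 - alpha) * a
    return color, alpha

# a bright semi-transparent layer in front of a dark one
print(composite_front_to_back([(1.0, 0.5), (0.0, 0.5)]))  # (0.5, 0.75)
```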

  8. A consensus-based dynamics for market volumes

    Science.gov (United States)

    Sabatelli, Lorenzo; Richmond, Peter

    2004-12-01

    We develop a model of trading orders based on opinion dynamics. The agents may be thought as the share holders of a major mutual fund rather than as direct traders. The balance between their buy and sell orders determines the size of the fund order (volume) and has an impact on prices and indexes. We assume agents interact simultaneously to each other through a Sznajd-like interaction. Their degree of connection is determined by the probability of changing opinion independently of what their neighbours are doing. We assume that such a probability may change randomly, after each transaction, of an amount proportional to the relative difference between the volatility then measured and a benchmark that we assume to be an exponential moving average of the past volume values. We show how this simple model is compatible with some of the main statistical features observed for the asset volumes in financial markets.
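A minimal sketch of such an opinion-dynamics model (illustrative only: the update rule here is a simplified neighbor-copy variant rather than the paper's exact Sznajd interaction, and the adaptive, volatility-dependent flip probability is omitted):

```python
import random

def step(opinions, p_independent, rng):
    """One sweep on a ring of agents: each agent either changes opinion
    independently (probability p_independent) or adopts the opinion of a
    random neighbor."""
    n = len(opinions)
    new = opinions[:]
    for i in range(n):
        if rng.random() < p_independent:
            new[i] = rng.choice([-1, 1])
        else:
            new[i] = opinions[(i + rng.choice([-1, 1])) % n]
    return new

def order_volume(opinions):
    """Net order size: imbalance between buy (+1) and sell (-1) orders."""
    return abs(sum(opinions))

rng = random.Random(42)
ops = [rng.choice([-1, 1]) for _ in range(200)]
for _ in range(50):
    ops = step(ops, p_independent=0.1, rng=rng)
assert 0 <= order_volume(ops) <= 200
```

Lowering `p_independent` strengthens herding, and the resulting order volumes become more bursty, which is the qualitative mechanism the model uses to reproduce fat-tailed volume statistics.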

  9. Source Code Generator Based on Dynamic Frames

    Directory of Open Access Journals (Sweden)

    Danijel Radošević

    2011-06-01

Full Text Available This paper presents the model of a source code generator based on dynamic frames. The model is named the SCT model because of its three basic components: Specification (S), which describes the application characteristics; Configuration (C), which describes the rules for building applications; and Templates (T), which refer to application building blocks. The process of code generation dynamically creates XML frames containing all building elements (S, C and T) until the final code is produced. This approach is compared to the existing XVCL frame-based model for source code generation. The SCT model is described by both XML syntax and the appropriate graphical elements. The SCT model is aimed at building complete applications, not just skeletons. The main advantages of the presented model are its textual and graphic description, a fully configurable generator, and the reduced overhead of the generated source code. The presented SCT model is shown on the development of a web application example in order to demonstrate its features and justify our design choices.
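The Specification/Configuration/Templates split can be illustrated in miniature (a hypothetical sketch in Python rather than the SCT model's XML frames; all names below are invented):

```python
# Specification (S): what the application contains.
spec = {"class_name": "Invoice", "fields": ["number", "total"]}

# Templates (T): the building blocks, with placeholders.
templates = {
    "class": "class {class_name}:\n{body}",
    "field": "    {name} = None",
}

def generate(spec, templates):
    """Configuration (C) is hard-coded here as one rule:
    instantiate the field template once per declared field,
    then splice the result into the class template."""
    body = "\n".join(templates["field"].format(name=f) for f in spec["fields"])
    return templates["class"].format(class_name=spec["class_name"], body=body)

print(generate(spec, templates))
```

In the real SCT model the frames are created dynamically and nested, but the flow is the same: specification values are pushed through configuration rules into template slots until complete source code emerges.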

  10. THE CHANDRA LOCAL VOLUME SURVEY: THE X-RAY POINT-SOURCE POPULATION OF NGC 404

    Energy Technology Data Exchange (ETDEWEB)

    Binder, B.; Williams, B. F.; Weisz, D. R. [University of Washington, Department of Astronomy, Box 351580, Seattle, WA 98195 (United States); Eracleous, M. [Department of Astronomy and Astrophysics and Center for Gravitational Wave Physics, The Pennsylvania State University, 525 Davey Lab, University Park, PA 16802 (United States); Gaetz, T. J. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street Cambridge, MA 02138 (United States); Kong, A. K. H. [Institute of Astronomy and Department of Physics, National Tsing Hua University, Hsinchu 30013, Taiwan (China); Skillman, E. D. [University of Minnesota, Astronomy Department, 116 Church St. SE, Minneapolis, MN 55455 (United States)

    2013-02-15

We present a comprehensive X-ray point-source catalog of NGC 404 obtained as part of the Chandra Local Volume Survey. A new 97 ks Chandra ACIS-S observation of NGC 404 was combined with archival observations for a total exposure of {approx}123 ks. Our survey yields 74 highly significant X-ray point sources and is sensitive to a limiting unabsorbed luminosity of {approx}6 x 10{sup 35} erg s{sup -1} in the 0.35-8 keV band. To constrain the nature of each X-ray source, cross-correlations with multi-wavelength data were generated. We searched overlapping Hubble Space Telescope observations for optical counterparts to our X-ray detections, but find only two X-ray sources with candidate optical counterparts. We find 21 likely low-mass X-ray binaries (LMXBs), although this number is a lower limit due to the difficulties in separating LMXBs from background active galactic nuclei. The X-ray luminosity functions (XLFs) in both the soft and hard energy bands are presented. The XLFs in the soft band (0.5-2 keV) and the hard band (2-8 keV) have a limiting luminosity at the 90% completeness limit of 10{sup 35} erg s{sup -1} and 10{sup 36} erg s{sup -1}, respectively, significantly lower than previous X-ray studies of NGC 404. We find the XLFs to be consistent with those of other X-ray populations dominated by LMXBs. However, the number of luminous (>10{sup 37} erg s{sup -1}) X-ray sources per unit stellar mass in NGC 404 is lower than is observed for other galaxies. The relative lack of luminous XRBs may be due to a population of LMXBs with main-sequence companions formed during an epoch of elevated star formation {approx}0.5 Gyr ago.
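The cumulative XLFs discussed above count the sources brighter than each luminosity threshold; a minimal sketch with toy numbers (not the NGC 404 data):

```python
def cumulative_xlf(luminosities, grid):
    """Cumulative X-ray luminosity function N(>L): for each threshold in
    `grid`, the number of catalog sources with luminosity above it."""
    return [sum(1 for lum in luminosities if lum > g) for g in grid]

# toy source luminosities in erg/s
lums = [2e35, 5e35, 1e36, 3e36, 2e37]
print(cumulative_xlf(lums, [1e35, 1e36, 1e37]))  # [5, 2, 1]
```

In practice such counts are corrected for incompleteness near the survey's limiting luminosity before comparing slopes between populations.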

  11. Plasma-based EUV light source

    Science.gov (United States)

    Shumlak, Uri; Golingo, Raymond; Nelson, Brian A.

    2010-11-02

    Various mechanisms are provided relating to plasma-based light source that may be used for lithography as well as other applications. For example, a device is disclosed for producing extreme ultraviolet (EUV) light based on a sheared plasma flow. The device can produce a plasma pinch that can last several orders of magnitude longer than what is typically sustained in a Z-pinch, thus enabling the device to provide more power output than what has been hitherto predicted in theory or attained in practice. Such power output may be used in a lithography system for manufacturing integrated circuits, enabling the use of EUV wavelengths on the order of about 13.5 nm. Lastly, the process of manufacturing such a plasma pinch is discussed, where the process includes providing a sheared flow of plasma in order to stabilize it for long periods of time.

  12. Synchrotron based spallation neutron source concepts

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Y.

    1998-07-01

    During the past 20 years, rapid-cycling synchrotrons (RCS) have been used very productively to generate short-pulse thermal neutron beams for neutron scattering research by materials science communities in Japan (KENS), the UK (ISIS) and the US (IPNS). The most powerful source in existence, ISIS in the UK, delivers a 160-kW proton beam to a neutron-generating target. Several recently proposed facilities require proton beams in the MW range to produce intense short-pulse neutron beams. In some proposals, a linear accelerator provides the beam power and an accumulator ring compresses the pulse length to the required {approx} 1 {micro}s. In others, RCS technology provides the bulk of the beam power and compresses the pulse length. Some synchrotron-based proposals achieve the desired beam power by combining two or more synchrotrons of the same energy, and others propose a combination of lower and higher energy synchrotrons. This paper presents the rationale for using RCS technology, and a discussion of the advantages and disadvantages of synchrotron-based spallation sources.

  13. Environmental Toxicology: A Guide to Information Sources. Volume 7 in the "Man and the Environment Information Guide" Series.

    Science.gov (United States)

    Rudd, Robert L.

    This annotated bibliography on environmental toxicology brings together a diverse set of information sources from the physical, social, and natural sciences. These sources include periodical literature, government documents, scientific journals, and teaching materials. The volume is divided into sixteen sections organized into four parts: (1)…

  14. ECR-driven multicusp H^- volume source operated in pulsed or cw mode

    Science.gov (United States)

    Svarnas, Panayiotis

    2005-10-01

An electron cyclotron resonance (ECR) driven multicusp H^- volume hybrid source [1, 2] is operated in continuous (cw) or pulsed microwave (2.45 GHz) mode at up to 3 kW. The hydrogen plasma is produced between 1 and 7 mTorr by seven elementary ECR sources housed in the magnetic multipole chamber ``Camembert III'' [3]. This ECR configuration could be applied to both accelerator and fusion ion sources. Negative ion and electron extracted currents and plasma characteristics are studied in both modes with electrical measurements, an electrostatic probe, and photodetachment. The plasma electrode bias plays a major role in determining the extracted currents: the H^- current is maximized for a bias voltage close to the plasma potential. An optimum pressure of 4-5 mTorr yields enhanced H^- density in the center of the chamber under the cw regime. Finally, the post-discharge formation of H^- in the pulsed mode is observed. [1] A.A. Ivanov Jr., C. Rouille, M. Bacal, Y. Arnal, S. Bechu, J. Pelletier, Rev. Sci. Instrum. 75(5), 1750 (2004) [2] M. Bacal, A.A. Ivanov Jr., C. Rouille, P. Svarnas, S. Bechu, J. Pelletier, AIP Conf. Proc. No 763 (Kiev, Ukraine) (2004) [3] C. Courteille, A.M. Bruneteau, M. Bacal, Rev. Sci. Instrum. 66(3), 2533 (1995)

  15. Acoustic beam steering by light refraction: illustration with directivity patterns of a tilted volume photoacoustic source.

    Science.gov (United States)

    Raetz, Samuel; Dehoux, Thomas; Perton, Mathieu; Audoin, Bertrand

    2013-12-01

    The symmetry of a thermoelastic source resulting from laser absorption can be broken when the direction of light propagation in an elastic half-space is inclined relatively to the surface. This leads to an asymmetry of the directivity patterns of both compressional and shear acoustic waves. In contrast to classical surface acoustic sources, the tunable volume source allows one to take advantage of the mode conversion at the surface to control the directivity of specific modes. Physical interpretations of the evolution of the directivity patterns with the increasing light angle of incidence and of the relations between the preferential directions of compressional- and shear-wave emission are proposed. In order to compare calculated directivity patterns with measurements of normal displacement amplitudes performed on plates, a procedure is proposed to transform the directivity patterns into pseudo-directivity patterns representative of the experimental conditions. The comparison of the theoretical with measured pseudo-directivity patterns demonstrates the ability to enhance bulk-wave amplitudes and to steer specific bulk acoustic modes by adequately tuning light refraction.

  16. Numerical study of cesium effects on negative ion production in volume sources

    Energy Technology Data Exchange (ETDEWEB)

    Fukumasa, Osamu; Niitani, Eiji [Yamaguchi Univ., Ube (Japan). Faculty of Engineering

    1997-02-01

Effects of cesium vapor injection on H{sup -} production in a tandem negative ion source are studied numerically as a function of plasma parameters. The model calculation is done by solving a set of particle balance equations in a steady-state hydrogen discharge plasma. Here, results focusing on the gas pressure and electron temperature dependences of H{sup -} volume production are presented and discussed. By including H{sup -} surface production processes caused by both H atoms and positive hydrogen ions, the enhancement of H{sup -} production and the pressure dependence of H{sup -} production observed experimentally are well reproduced by the model. To enhance H{sup -} production, however, so-called electron cooling is not very effective if plasma parameters are initially optimized with the use of a magnetic filter. (author)

  17. Evaluation of Anterior Chamber Volume in Cataract Patients with Swept-Source Optical Coherence Tomography.

    Science.gov (United States)

    He, Wenwen; Zhu, Xiangjia; Wolff, Don; Zhao, Zhennan; Sun, Xinghuai; Lu, Yi

    2016-01-01

Purpose. To evaluate the anterior chamber volume (ACV) in cataract patients with Swept-Source Optical Coherence Tomography (SS-OCT) and its influencing factors. Methods. The anterior chamber volume of 92 cataract patients was evaluated with SS-OCT in this cross-sectional study. Univariate analyses and multiple linear regression were used to investigate gender, age, operated eye, posterior vitreous detachment, lens opacity grading, and axial length (AXL) as variables capable of influencing the ACV. Results. The average ACV was 139.80 ± 38.21 mm(3) (range 59.41 to 254.09 mm(3)). The average ACV was significantly larger in male patients than in female patients (P = 0.001). ACV was negatively correlated with age and LOCS III cortical (C) grading of the lens (Pearson's correlation analysis, r = -0.443). ACV also increased with AXL (Pearson's correlation analysis, r = 0.552; F = 10.252). Conclusions. ACV varied significantly among different subjects. Influencing factors that contributed to a reduced ACV were female gender, increased age, higher LOCS III C grade, and shorter AXL.
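The correlations reported above are ordinary Pearson product-moment coefficients; a self-contained sketch of the computation (toy data, not the study's measurements):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient:
    r = cov(x, y) / (sd(x) * sd(y)), bounded in [-1, 1]."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# perfectly linear toy data gives r = 1; a decreasing trend gives r < 0
print(round(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]), 6))  # 1.0
```

A negative r (as for ACV versus age, r = -0.443) simply means the fitted trend slopes downward.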

  18. Technology transfer package on seismic base isolation - Volume II

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-02-14

    This Technology Transfer Package provides some detailed information for the U.S. Department of Energy (DOE) and its contractors about seismic base isolation. Intended users of this three-volume package are DOE Design and Safety Engineers as well as DOE Facility Managers who are responsible for reducing the effects of natural phenomena hazards (NPH), specifically earthquakes, on their facilities. The package was developed as part of DOE's efforts to study and implement techniques for protecting lives and property from the effects of natural phenomena and to support the International Decade for Natural Disaster Reduction. Volume II contains the proceedings for the Short Course on Seismic Base Isolation held in Berkeley, California, August 10-14, 1992.

  19. Technology transfer package on seismic base isolation - Volume I

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-02-14

    This Technology Transfer Package provides some detailed information for the U.S. Department of Energy (DOE) and its contractors about seismic base isolation. Intended users of this three-volume package are DOE Design and Safety Engineers as well as DOE Facility Managers who are responsible for reducing the effects of natural phenomena hazards (NPH), specifically earthquakes, on their facilities. The package was developed as part of DOE's efforts to study and implement techniques for protecting lives and property from the effects of natural phenomena and to support the International Decade for Natural Disaster Reduction. Volume I contains the proceedings of the Workshop on Seismic Base Isolation for Department of Energy Facilities held in Marina Del Rey, California, May 13-15, 1992.

  20. Environmental effects of energy production and utilization in the U. S. Volume I. Sources, trends, and costs of control

    Energy Technology Data Exchange (ETDEWEB)

    Newkirk, H.W. (comp.)

    1976-05-01

    Volume I deals with sources (what the emissions are and where they come from), trends (quantities of emissions and their dispersion with time), and costs of control (what it takes in time, energy, and money to meet minimum standards). Volume II concerns itself with the public health effects of energy production and utilization. Volume III summarizes the various techniques for controlling emissions, technological as well as economic, social, and political. (For abstracts of Vols. II and III, see ERDA Energy Research Abstracts, Vol. 2, Absts. 5764 and 5670, respectively) Each volume is divided into sections dealing with the atmosphere, water, land, and social activities--each division indicating a particular sphere of man's environment affected by energy production and use. The sources of information that were used in this study included textbooks, journal articles, technical reports, memoranda, letters, and personal communications. These are cited in the text at the end of each subsection and on the applicable tables and figures.

  1. Trellis-based source and channel coding

    NARCIS (Netherlands)

    Van der Vleuten, R.J.

    1994-01-01

    This thesis concerns the efficient transmission of digital data, such as digitized sounds or images, from a source to its destination. To make the best use of the limited capacity of the source-destination channel, a source coder is used to delete the less significant information. To correct the occ

  2. Development of a plume-in-grid model for industrial point and volume sources: application to power plant and refinery sources in the Paris region

    Science.gov (United States)

    Kim, Y.; Seigneur, C.; Duclaux, O.

    2014-04-01

    Plume-in-grid (PinG) models incorporating a host Eulerian model and a subgrid-scale model (usually a Gaussian plume or puff model) have been used for the simulations of stack emissions (e.g., fossil fuel-fired power plants and cement plants) for gaseous and particulate species such as nitrogen oxides (NOx), sulfur dioxide (SO2), particulate matter (PM) and mercury (Hg). Here, we describe the extension of a PinG model to study the impact of an oil refinery where volatile organic compound (VOC) emissions can be important. The model is based on a reactive PinG model for ozone (O3), which incorporates a three-dimensional (3-D) Eulerian model and a Gaussian puff model. The model is extended to treat PM, with treatments of aerosol chemistry, particle size distribution, and the formation of secondary aerosols, which are consistent in both the 3-D Eulerian host model and the Gaussian puff model. Furthermore, the PinG model is extended to include the treatment of volume sources to simulate fugitive VOC emissions. The new PinG model is evaluated over Greater Paris during July 2009. Model performance is satisfactory for O3, PM2.5 and most PM2.5 components. Two industrial sources, a coal-fired power plant and an oil refinery, are simulated with the PinG model. The characteristics of the sources (stack height and diameter, exhaust temperature and velocity) govern the surface concentrations of primary pollutants (NOx, SO2 and VOC). O3 concentrations are impacted differently near the power plant than near the refinery, because of the presence of VOC emissions at the latter. The formation of sulfate is influenced by both the dispersion of SO2 and the oxidant concentration; however, the former tends to dominate in the simulations presented here. The impact of PinG modeling on the formation of secondary organic aerosol (SOA) is small and results mostly from the effect of different oxidant concentrations on biogenic SOA formation. The investigation of the criteria for injecting
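The subgrid-scale component of such a PinG model is typically a Gaussian plume or puff solution; a minimal steady-state, ground-reflected Gaussian plume sketch (illustrative parameter values, not the model's actual dispersion coefficients):

```python
import math

def plume_conc(q, u, sigma_y, sigma_z, y, z, h):
    """Ground-reflected Gaussian plume concentration from a continuous
    point source: q = emission rate (g/s), u = wind speed (m/s),
    sigma_y/sigma_z = lateral/vertical dispersion (m), h = effective
    stack height (m). The image-source term reflects mass at z = 0."""
    lateral = math.exp(-y * y / (2.0 * sigma_y ** 2))
    vertical = (math.exp(-((z - h) ** 2) / (2.0 * sigma_z ** 2))
                + math.exp(-((z + h) ** 2) / (2.0 * sigma_z ** 2)))
    return q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# centerline, ground-level concentration downwind of a 50 m stack
c = plume_conc(q=100.0, u=5.0, sigma_y=60.0, sigma_z=30.0,
               y=0.0, z=0.0, h=50.0)
```

In a PinG simulation this analytic solution carries each puff until dilution criteria are met, after which its mass is injected into the host Eulerian grid.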

  3. Ion source based on the cathodic arc

    Science.gov (United States)

    Sanders, David M.; Falabella, Steven

    1994-01-01

    A cylindrically symmetric arc source to produce a ring of ions which leave the surface of the arc target radially and are reflected by electrostatic fields present in the source to a point of use, such as a part to be coated. An array of electrically isolated rings positioned in the source serves the dual purpose of minimizing bouncing of macroparticles and providing electrical insulation to maximize the electric field gradients within the source. The source also includes a series of baffles which function as a filtering or trapping mechanism for any macroparticles.

  4. Precise segmentation of multiple organs in CT volumes using learning-based approach and information theory.

    Science.gov (United States)

    Lu, Chao; Zheng, Yefeng; Birkbeck, Neil; Zhang, Jingdan; Kohlberger, Timo; Tietjen, Christian; Boettger, Thomas; Duncan, James S; Zhou, S Kevin

    2012-01-01

    In this paper, we present a novel method by incorporating information theory into the learning-based approach for automatic and accurate pelvic organ segmentation (including the prostate, bladder and rectum). We target 3D CT volumes that are generated using different scanning protocols (e.g., contrast and non-contrast, with and without implant in the prostate, various resolutions and positions), and the volumes come from largely diverse sources (e.g., diseased in different organs). Three key ingredients are combined to solve this challenging segmentation problem. First, marginal space learning (MSL) is applied to efficiently and effectively localize the multiple organs in the largely diverse CT volumes. Second, learning-based techniques using steerable features are applied for robust boundary detection, which enables the handling of highly heterogeneous texture patterns. Third, a novel information theoretic scheme is incorporated into the boundary inference process. The incorporation of the Jensen-Shannon divergence further drives the mesh to the best fit of the image, thus improving the segmentation performance. The proposed approach is tested on a challenging dataset containing 188 volumes from diverse sources. Our approach not only produces excellent segmentation accuracy, but also runs about eighty times faster than previous state-of-the-art solutions. The proposed method can be applied to CT images to provide visual guidance to physicians during computer-aided diagnosis, treatment planning and image-guided radiotherapy to treat cancers in the pelvic region.

  5. [Development of ultrasound-based monitor of relative blood volume].

    Science.gov (United States)

    Jiang, Shunzhong; Hu, Xiao; Liang, Zhongwei; Fan, Jianghong; Xia, Wubing; Zhou, Hongbo; Yi, Wei

    2013-12-01

    Assessing dry weight accurately is crucial in providing effective and safe haemodialysis. Errors in dry weight assessment may bring about a series of dialysis complications. This study introduces an online detection technique for relative blood volume (RBV) based on ultrasound, which exploits the correlation between changes in blood density and sound speed. By measuring the variation in sound velocity, this method was employed to calculate RBV, and then to evaluate the dry weight of patients on dialysis. A TDC-GP2 time measurement chip and an MSP430 single-chip microcontroller (SCM) were used in the system to measure the ultrasonic travel time. In the clinical trials, RBV values ranged between 71.3% and 108.1%, showing results consistent with the Fresenius 4008S blood volume monitor (BVM). This detection method has several advantages: it is real-time, convenient, reproducible, and non-invasive.

  6. Evaluation of Anterior Chamber Volume in Cataract Patients with Swept-Source Optical Coherence Tomography

    Directory of Open Access Journals (Sweden)

    Wenwen He

    2016-01-01

    Purpose. To evaluate the anterior chamber volume (ACV) in cataract patients with Swept-Source Optical Coherence Tomography (SS-OCT) and its influencing factors. Methods. The anterior chamber volume of 92 cataract patients was evaluated with SS-OCT in this cross-sectional study. Univariate analyses and multiple linear regression were used to investigate gender, age, operated eye, posterior vitreous detachment, lens opacity grading, and axial length (AXL) as variables capable of influencing the ACV. Results. The average ACV was 139.80 ± 38.21 mm3 (range 59.41 to 254.09 mm3). The average ACV was significantly larger in male patients than in female patients (P=0.001). ACV was negatively correlated with age and LOCS III cortical (C) grading of the lens (Pearson’s correlation analysis, r=-0.443, P<0.001, and Spearman’s correlation analysis, ρ=-0.450, P<0.001). ACV also increased with AXL (Pearson’s correlation analysis, r=0.552, P<0.001). Multiple linear regression showed that, with all of the covariates entered into the model, gender (P=0.002), age (P=0.015), LOCS III C grade (P=0.043), and AXL (P=0.001) were still associated with ACV (F=10.252, P<0.001, R2=0.498). Conclusion. With SS-OCT, we found that, in healthy cataract patients, ACV varied significantly among different subjects. Influencing factors that contributed to reduced ACV were female gender, increased age, higher LOCS III C grade, and shorter AXL.

  7. Enhancement of H{sup -}/D{sup -} volume production in a double plasma type negative ion source

    Energy Technology Data Exchange (ETDEWEB)

    Fukumasa, Osamu; Nishimura, Hideki; Sakiyama, Satoshi [Yamaguchi Univ., Ube (Japan). Faculty of Engineering

    1997-02-01

    H{sup -}/D{sup -} production in a pure volume source has been studied. In our double plasma type negative ion source, both energy and density of fast electrons are well controlled. With the use of this source, the enhancement of H{sup -}/D{sup -} production has been observed. Namely, under the same discharge power, the extracted H{sup -}/D{sup -} current in the double plasma operation is higher than that in the single plasma operation. At the same time, measurements of plasma parameters have been made in the source and the extractor regions for these two cases. (author)

  8. ADAPTIVE CONTENT BASED TEXTUAL INFORMATION SOURCE PRIORITIZATION

    Directory of Open Access Journals (Sweden)

    Nikhil Mitra

    2014-10-01

    The World Wide Web offers a wealth of textual information sources ready to be utilized for several applications. In fact, given the rapidly evolving nature of online data, there is a real risk of information overload unless we continue to develop and refine techniques to meaningfully segregate these information sources. Specifically, there is a dearth of content-oriented and intelligent techniques which can learn from past search experiences and also adapt to a user’s specific requirements during her current search. In this paper, we tackle the core issue of prioritizing textual information sources on the basis of the relevance of their content to the central theme that a user is currently exploring. We propose a new Source Prioritization Algorithm that adopts an iterative learning approach to assess the proclivity of given information sources towards a set of user-defined seed words in order to prioritize them. The final priorities obtained serve as initial priorities for the next search request. This serves a dual purpose. Firstly, the system learns incrementally from several users’ cumulative search experiences and re-adjusts the source priorities to reflect the acquired knowledge. Secondly, the refreshed source priorities are utilized to direct a user’s current search towards more relevant sources while also adapting to the new set of keywords acquired from that user. Experimental results show that the proposed algorithm progressively improves the system’s ability to discern between different sources, even in the presence of several random sources. Further, it is able to scale well to identify the augmented information source when a new, enriched information source is generated by combining existing ones.

  9. Evaluation of severe accident risks: Methodology for the containment, source term, consequence, and risk integration analyses; Volume 1, Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Gorham, E.D.; Breeding, R.J.; Brown, T.D.; Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Helton, J.C. [Arizona State Univ., Tempe, AZ (United States); Murfin, W.B. [Technadyne Engineering Consultants, Inc., Albuquerque, NM (United States); Hora, S.C. [Hawaii Univ., Hilo, HI (United States)

    1993-12-01

    NUREG-1150 examines the risk to the public from five nuclear power plants. The NUREG-1150 plant studies are Level III probabilistic risk assessments (PRAs) and, as such, they consist of four analysis components: accident frequency analysis, accident progression analysis, source term analysis, and consequence analysis. This volume summarizes the methods utilized in performing the last three components and the assembly of these analyses into an overall risk assessment. The NUREG-1150 analysis approach is based on the following ideas: (1) general and relatively fast-running models for the individual analysis components, (2) well-defined interfaces between the individual analysis components, (3) use of Monte Carlo techniques together with an efficient sampling procedure to propagate uncertainties, (4) use of expert panels to develop distributions for important phenomenological issues, and (5) automation of the overall analysis. Many features of the new analysis procedures were adopted to facilitate a comprehensive treatment of uncertainty in the complete risk analysis. Uncertainties in the accident frequency, accident progression and source term analyses were included in the overall uncertainty assessment. The uncertainties in the consequence analysis were not included in this assessment. A large effort was devoted to the development of procedures for obtaining expert opinion and the execution of these procedures to quantify parameters and phenomena for which there is large uncertainty and divergent opinions in the reactor safety community.
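The "efficient sampling procedure" paired with Monte Carlo propagation here refers to stratified schemes such as Latin hypercube sampling. The following is a minimal sketch of Latin hypercube sampling on the unit hypercube, for illustration only; it is not the NUREG-1150 implementation, and the toy model at the end is made up.

```python
import random

def latin_hypercube(n_samples, n_vars, rng=None):
    """Latin hypercube sample on the unit hypercube: each variable's range
    is split into n_samples equal strata, and every stratum is sampled
    exactly once, giving better coverage than plain random sampling."""
    rng = rng or random.Random(42)
    samples = [[0.0] * n_vars for _ in range(n_samples)]
    for j in range(n_vars):
        strata = list(range(n_samples))
        rng.shuffle(strata)  # pair strata randomly across variables
        for i in range(n_samples):
            samples[i][j] = (strata[i] + rng.random()) / n_samples
    return samples

# Propagate input uncertainty through a toy model y = x0 * x1
points = latin_hypercube(100, 2)
ys = [x[0] * x[1] for x in points]
mean_y = sum(ys) / len(ys)  # close to 0.25 for independent U(0,1) inputs
```

In an uncertainty analysis of this kind, each sampled vector is mapped through the analysis chain and the resulting output distribution summarizes the propagated uncertainty.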

  10. Efficient Error Calculation for Multiresolution Texture-Based Volume Visualization

    Energy Technology Data Exchange (ETDEWEB)

    LaMar, E; Hamann, B; Joy, K I

    2001-10-16

    Multiresolution texture-based volume visualization is an excellent technique to enable interactive rendering of massive data sets. Interactive manipulation of a transfer function is necessary for proper exploration of a data set. However, multiresolution techniques require assessing the accuracy of the resulting images, and re-computing the error after each change in a transfer function is very expensive. They extend their existing multiresolution volume visualization method by introducing a method for accelerating error calculations for multiresolution volume approximations. Computing the error for an approximation requires adding individual error terms. One error value must be computed for each original voxel and its corresponding approximating voxel. For byte data, i.e., data sets where integer function values between 0 and 255 are given, they observe that the set of error pairs can be quite large, yet the set of unique error pairs is small. Instead of evaluating the error function for each original voxel, they construct a table of the unique combinations and the number of their occurrences. To evaluate the error, they add the products of the error function for each unique error pair and the frequency of each error pair. This approach dramatically reduces the amount of computation time involved and allows them to re-compute the error associated with a new transfer function quickly.
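The unique-pair tabulation described above can be sketched in a few lines of Python. This is a schematic illustration, not the authors' code; `error_fn` stands in for whatever per-pair error metric the current transfer function induces.

```python
from collections import Counter

def approximation_error(original, approximation, error_fn):
    """Sum error_fn over all (original, approximating) voxel pairs by
    tabulating unique value pairs and their frequencies, so error_fn is
    evaluated once per unique pair instead of once per voxel."""
    table = Counter(zip(original, approximation))  # unique pairs -> counts
    return sum(error_fn(o, a) * freq for (o, a), freq in table.items())

# For byte data there are at most 256*256 unique pairs, however large the
# volume, and only the table (not the volume) is revisited when the
# transfer function -- and hence error_fn -- changes.
orig = [10, 10, 10, 200, 200, 37]
approx = [12, 12, 12, 198, 198, 37]
sq_err = approximation_error(orig, approx, lambda o, a: (o - a) ** 2)  # 3*4 + 2*4 + 0 = 20
```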

  11. Artificial Neural Network-Based System for PET Volume Segmentation

    Directory of Open Access Journals (Sweden)

    Mhd Saeed Sharif

    2010-01-01

    Tumour detection, classification, and quantification in positron emission tomography (PET) imaging at an early stage of disease are important issues for clinical diagnosis, assessment of response to treatment, and radiotherapy planning. Many techniques have been proposed for segmenting medical imaging data; however, some of the approaches have poor performance, large inaccuracy, and require substantial computation time for analysing large medical volumes. Artificial intelligence (AI) approaches can provide improved accuracy and save a substantial amount of time. Artificial neural networks (ANNs), as one of the best AI techniques, have the capability to precisely classify and quantify lesions and to model the clinical evaluation for a specific problem. This paper presents a novel application of ANNs in the wavelet domain for PET volume segmentation. ANN performance evaluation using different training algorithms in both spatial and wavelet domains with different numbers of neurons in the hidden layer is also presented. The best number of neurons in the hidden layer is determined from the experimental results, which also identify the Levenberg-Marquardt backpropagation training algorithm as the best training approach for the proposed application. The proposed intelligent system's results are compared with those obtained using conventional techniques, including thresholding and clustering based approaches. Experimental and Monte Carlo simulated PET phantom data sets and clinical PET volumes of non-small cell lung cancer patients were utilised to validate the proposed algorithm, which has demonstrated promising results.

  12. Artificial Neural Network-Based System for PET Volume Segmentation.

    Science.gov (United States)

    Sharif, Mhd Saeed; Abbod, Maysam; Amira, Abbes; Zaidi, Habib

    2010-01-01

    Tumour detection, classification, and quantification in positron emission tomography (PET) imaging at an early stage of disease are important issues for clinical diagnosis, assessment of response to treatment, and radiotherapy planning. Many techniques have been proposed for segmenting medical imaging data; however, some of the approaches have poor performance, large inaccuracy, and require substantial computation time for analysing large medical volumes. Artificial intelligence (AI) approaches can provide improved accuracy and save a substantial amount of time. Artificial neural networks (ANNs), as one of the best AI techniques, have the capability to precisely classify and quantify lesions and to model the clinical evaluation for a specific problem. This paper presents a novel application of ANNs in the wavelet domain for PET volume segmentation. ANN performance evaluation using different training algorithms in both spatial and wavelet domains with different numbers of neurons in the hidden layer is also presented. The best number of neurons in the hidden layer is determined from the experimental results, which also identify the Levenberg-Marquardt backpropagation training algorithm as the best training approach for the proposed application. The proposed intelligent system's results are compared with those obtained using conventional techniques, including thresholding and clustering based approaches. Experimental and Monte Carlo simulated PET phantom data sets and clinical PET volumes of non-small cell lung cancer patients were utilised to validate the proposed algorithm, which has demonstrated promising results.

  13. Ethernet-based Mass Volume Train Security Detection Network

    Directory of Open Access Journals (Sweden)

    D. Q. He

    2013-07-01

    Because the transmission rate of the existing train communication network is low, and because large volumes of status and fault-diagnosis data, event-log data, and passenger information are stored in equipment distributed across different vehicles, it is difficult to realize fault diagnosis and intelligent maintenance efficiently and in a timely manner. Based on train-level and vehicle-level Ethernet networks, this paper focuses on the network construction technology and real-time performance of a mass-volume onboard security detection network. The research results will improve the control and network functions of the train.

  14. Elliptical Splats Based Isosurface Visualization for Volume Data

    Institute of Scientific and Technical Information of China (English)

    QIN Hong-xing; SHI Feng; GUO Lü; YANG Jie

    2008-01-01

    Elliptical splats are used to represent and render the isosurface of volume data. The method consists of two steps. The first step is to extract points on the isosurface by looking up the case table. In the second step, properties of splats are computed based on local geometry. Rendering is achieved using surface splatting algorithm. The obtained results show that the extraction time of isosurfaces can be reduced by a factor of three. So our approach is more appropriate for interactive visualization of large medical data than the classical marching cubes (MC) technique.

  15. Neutron Sources for Standard-Based Testing

    Energy Technology Data Exchange (ETDEWEB)

    Radev, Radoslav [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McLean, Thomas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-11-10

    The DHS TC Standards and the consensus ANSI Standards use 252Cf as the neutron source for performance testing because its energy spectrum is similar to the 235U and 239Pu fission sources used in nuclear weapons. An emission rate of 20,000 ± 20% neutrons per second is used for testing of the radiological requirements both in the ANSI standards and the TCS. Determination of the accurate neutron emission rate of the test source is important for maintaining consistency and agreement between testing results obtained at different testing facilities. Several characteristics in the manufacture and the decay of the source need to be understood and accounted for in order to make an accurate measurement of the performance of the neutron detection instrument. Additionally, neutron response characteristics of the particular instrument need to be known and taken into account as well as neutron scattering in the testing environment.

  16. Simple Signal Source Based on Micro Controller

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    Using a micro controller, a DAC, and multi-period synthesis, we can build a very simple signal source with precise frequency, amplitude and waveform. Wave parameters can be programmed in advance. The circuit can satisfy some special requirements.

  17. Photonic Crystal Fiber Based Entangled Photon Sources

    Science.gov (United States)

    2014-03-01

    new entanglement source is to make sure the source can provide an efficient and scalable quantum information processor. They are usually generated...multiple scattering on the telecom wavelength photon-pair. Our findings show that quantum correlation of polarization-entangled photon-pairs is... Subject terms: Fiber, Quantum communication, Keyed Communication in Quantum Noise (KCQ)

  18. Moving sound source localization based on triangulation method

    Science.gov (United States)

    Miao, Feng; Yang, Diange; Wen, Junjie; Lian, Xiaomin

    2016-12-01

    This study develops a sound source localization method that extends traditional triangulation to moving sources. First, the possible sound source locating plane is scanned. Second, for each hypothetical source location in this plane, the Doppler effect is removed through the integration of sound pressure. Taking advantage of the de-Dopplerized signals, the moving time difference of arrival (MTDOA) is calculated, and the sound source is located based on triangulation. Third, the estimated sound source location is compared to the original hypothetical location and the deviations are recorded. Because the real sound source location leads to zero deviation, the sound source can be finally located by minimizing the deviation matrix. Simulations have shown the superiority of the MTDOA method over traditional triangulation in the case of moving sound sources. The MTDOA method can locate moving sound sources with as high a resolution as DAMAS beamforming, as shown in the experiments, thus offering a new method for locating moving sound sources.
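The deviation-minimization step can be illustrated for the simpler stationary case (the MTDOA method additionally removes the Doppler shift before forming time differences). The microphone layout, source position, and grid resolution below are arbitrary choices for the sketch, not values from the paper.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s

def tdoas(source, mics):
    """Time differences of arrival relative to the first microphone."""
    d = [math.dist(source, m) for m in mics]
    return [(di - d[0]) / SPEED_OF_SOUND for di in d[1:]]

def locate(measured, mics, grid_step=0.05, extent=5.0):
    """Scan the locating plane and return the grid point whose predicted
    TDOAs deviate least from the measured ones (deviation minimization)."""
    best, best_dev = None, float("inf")
    steps = round(extent / grid_step)
    for ix in range(steps + 1):
        for iy in range(steps + 1):
            p = (ix * grid_step, iy * grid_step)
            dev = sum((t - m) ** 2 for t, m in zip(tdoas(p, mics), measured))
            if dev < best_dev:
                best, best_dev = p, dev
    return best

mics = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0), (4.0, 4.0)]
true_source = (1.5, 2.25)
estimate = locate(tdoas(true_source, mics), mics)  # recovers the source on the grid
```

Because the true source produces zero deviation, refining the grid around the current minimum converges on the source position.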

  19. A new algorithm for EEG source reconstruction based on LORETA by contracting the source region

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    A new method is presented for EEG source reconstruction based on multichannel surface EEG recordings. Starting from the low-resolution tomography obtained by the low resolution electromagnetic tomography algorithm (LORETA), this method acquires a high-resolution source tomography by contracting the source region. In contrast to the focal underdetermined system solver (FOCUSS), this method can obtain more accurate results under certain circumstances.

  20. Radiative Transport Based Flame Volume Reconstruction from Videos.

    Science.gov (United States)

    Shen, Liang; Zhu, Dengming; Nadeem, Saad; Wang, Zhaoqi; Kaufman, Arie E

    2017-06-06

    We introduce a novel approach for flame volume reconstruction from videos using inexpensive charge-coupled device (CCD) consumer cameras. The approach includes an economical data capture technique using inexpensive CCD cameras. Leveraging the smear feature of the CCD chip, we present a technique for synchronizing CCD cameras while capturing flame videos from different views. Our reconstruction is based on the radiative transport equation which enables complex phenomena such as emission, extinction, and scattering to be used in the rendering process. Both the color intensity and temperature reconstructions are implemented using the CUDA parallel computing framework, which provides real-time performance and allows visualization of reconstruction results after every iteration. We present the results of our approach using real captured data and physically-based simulated data. Finally, we also compare our approach against the other state-of-the-art flame volume reconstruction methods and demonstrate the efficacy and efficiency of our approach in four different applications: (1) rendering of reconstructed flames in virtual environments, (2) rendering of reconstructed flames in augmented reality, (3) flame stylization, and (4) reconstruction of other semitransparent phenomena.

  1. Singles transmission in volume-imaging PET with a 137Cs source.

    Science.gov (United States)

    Karp, J S; Muehllehner, G; Qu, H; Yan, X H

    1995-05-01

    The feasibility of a new method of attenuation correction in PET has been investigated, using a single-photon emitter for the transmission scan. The transmission scan is predicted to be more than a factor of ten faster with the singles method than with the standard coincidence method, for comparable statistics. Thus, a transmission scan can be completed in 1-2 min, rather than 10-20 min, as is common practice with the coincidence method. In addition, a potential advantage of using the single-photon source 137Cs, which has an energy of 662 keV, is that postinjection transmission studies can be performed using energy discrimination to separate the transmission from the emission data at 511 keV. In order to compensate for the energy difference of the attenuation coefficients at 662 keV compared to 511 keV, the transmission images are segmented into two compartments, tissue and lung, and known values (for 511 keV) of attenuation are inserted into these compartments. This technique also compensates for the higher amount of scatter present with the singles method, since it is not possible to use a position gate (based on collinearity of the source and two detector positions) as is commonly done with a positron-emitting source. We have demonstrated, with experimental phantom studies, that the singles transmission method combined with segmentation gives results equivalent both qualitatively and quantitatively to the coincidence method, but requires significantly less time.
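The two-compartment substitution can be sketched as follows. The thresholds and attenuation coefficients are approximate illustrative values (roughly the narrow-beam µ of soft tissue and inflated lung at 511 keV), not the authors' calibration.

```python
# Segment a reconstructed 662 keV attenuation map into tissue and lung
# compartments and substitute known 511 keV attenuation coefficients.
MU_TISSUE_511 = 0.096  # cm^-1, approximate value for soft tissue at 511 keV
MU_LUNG_511 = 0.031    # cm^-1, approximate value for inflated lung at 511 keV

def segment_to_511(mu_map_662, lung_tissue_threshold=0.05):
    """Replace each measured 662 keV coefficient with the 511 keV value of
    the compartment it falls in; near-zero voxels are treated as air."""
    out = []
    for mu in mu_map_662:
        if mu < 0.01:
            out.append(0.0)          # air / background
        elif mu < lung_tissue_threshold:
            out.append(MU_LUNG_511)  # lung compartment
        else:
            out.append(MU_TISSUE_511)  # soft-tissue compartment
    return out

mu_511 = segment_to_511([0.0, 0.026, 0.082])  # air, lung-like, tissue-like voxels
```

Segmenting and substituting known values also suppresses the extra scatter-induced bias in the measured singles coefficients, as the abstract notes.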

  2. Dose response explorer: an integrated open-source tool for exploring and modelling radiotherapy dose-volume outcome relationships

    Energy Technology Data Exchange (ETDEWEB)

    Naqa, I El [Washington University, Saint Louis, MO (United States); Suneja, G [Brown Medical School, Providence, RI (United States); Lindsay, P E [Washington University, St. Louis, MO (United States); Hope, A J [Washington University, Saint Louis, MO (United States); Alaly, J R [Washington University, Saint Louis, MO (United States); Vicic, M [Washington University, Saint Louis, MO (United States); Bradley, J D [Washington University, Saint Louis, MO (United States); Apte, A [Washington University, Saint Louis, MO (United States); Deasy, J O [Washington University, Saint Louis, MO (United States)

    2006-11-21

    Radiotherapy treatment outcome models are a complicated function of treatment, clinical and biological factors. Our objective is to provide clinicians and scientists with an accurate, flexible and user-friendly software tool to explore radiotherapy outcomes data and build statistical tumour control or normal tissue complication models. The software tool, called the dose response explorer system (DREES), is based on Matlab, and uses a named-field structure array data type. DREES/Matlab in combination with another open-source tool (CERR) provides an environment for analysing treatment outcomes. DREES provides many radiotherapy outcome modelling features, including (1) fitting of analytical normal tissue complication probability (NTCP) and tumour control probability (TCP) models, (2) combined modelling of multiple dose-volume variables (e.g., mean dose, max dose, etc) and clinical factors (age, gender, stage, etc) using multi-term regression modelling, (3) manual or automated selection of logistic or actuarial model variables using bootstrap statistical resampling, (4) estimation of uncertainty in model parameters, (5) performance assessment of univariate and multivariate analyses using Spearman's rank correlation and chi-square statistics, boxplots, nomograms, Kaplan-Meier survival plots, and receiver operating characteristic curves, and (6) graphical capabilities to visualize NTCP or TCP prediction versus selected variable models using various plots. DREES provides clinical researchers with a tool customized for radiotherapy outcome modelling. DREES is freely distributed. We expect to continue developing DREES based on user feedback.

  3. Dose response explorer: an integrated open-source tool for exploring and modelling radiotherapy dose volume outcome relationships

    Science.gov (United States)

    El Naqa, I.; Suneja, G.; Lindsay, P. E.; Hope, A. J.; Alaly, J. R.; Vicic, M.; Bradley, J. D.; Apte, A.; Deasy, J. O.

    2006-11-01

    Radiotherapy treatment outcome models are a complicated function of treatment, clinical and biological factors. Our objective is to provide clinicians and scientists with an accurate, flexible and user-friendly software tool to explore radiotherapy outcomes data and build statistical tumour control or normal tissue complication models. The software tool, called the dose response explorer system (DREES), is based on Matlab, and uses a named-field structure array data type. DREES/Matlab in combination with another open-source tool (CERR) provides an environment for analysing treatment outcomes. DREES provides many radiotherapy outcome modelling features, including (1) fitting of analytical normal tissue complication probability (NTCP) and tumour control probability (TCP) models, (2) combined modelling of multiple dose-volume variables (e.g., mean dose, max dose, etc) and clinical factors (age, gender, stage, etc) using multi-term regression modelling, (3) manual or automated selection of logistic or actuarial model variables using bootstrap statistical resampling, (4) estimation of uncertainty in model parameters, (5) performance assessment of univariate and multivariate analyses using Spearman's rank correlation and chi-square statistics, boxplots, nomograms, Kaplan-Meier survival plots, and receiver operating characteristic curves, and (6) graphical capabilities to visualize NTCP or TCP prediction versus selected variable models using various plots. DREES provides clinical researchers with a tool customized for radiotherapy outcome modelling. DREES is freely distributed. We expect to continue developing DREES based on user feedback.
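As an example of the kind of analytical NTCP model such a tool fits, the widely used Lyman probit model expresses complication probability as a function of (generalized) equivalent uniform dose. The parameter values below are illustrative, not fitted results from DREES.

```python
import math

def lkb_ntcp(eud, td50, m):
    """Lyman probit NTCP model: complication probability at (generalized)
    equivalent uniform dose eud, where td50 is the dose producing a 50%
    complication rate and m sets the slope of the dose-response curve."""
    t = (eud - td50) / (m * td50)
    # Standard normal CDF evaluated at t
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Illustrative (not fitted) parameters: TD50 = 65 Gy, m = 0.14
ntcp_at_td50 = lkb_ntcp(eud=65.0, td50=65.0, m=0.14)  # 0.5 by construction
```

Fitting such a model amounts to choosing td50 and m (e.g., by maximum likelihood over observed complication outcomes), which is one of the model-fitting tasks the abstract lists.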

  4. sources

    Directory of Open Access Journals (Sweden)

    Shu-Yin Chiang

    2002-01-01

    In this paper, we study simplified models of an ATM (Asynchronous Transfer Mode) multiplexer network with Bernoulli random traffic sources. Based on the model, the performance measures are analyzed under different output service schemes.

  5. TRIPPy: Python-based Trailed Source Photometry

    Science.gov (United States)

    Fraser, Wesley C.; Alexandersen, Mike; Schwamb, Megan E.; Marsset, Michael E.; Pike, Rosemary E.; Kavelaars, JJ; Bannister, Michele T.; Benecchi, Susan; Delsanti, Audrey

    2016-05-01

    TRIPPy (TRailed Image Photometry in Python) uses a pill-shaped aperture, a rectangle described by three parameters (trail length, angle, and radius) to improve photometry of moving sources over that done with circular apertures. It can generate accurate model and trailed point-spread functions from stationary background sources in sidereally tracked images. Appropriate aperture correction provides accurate, unbiased flux measurement. TRIPPy requires numpy, scipy, matplotlib, Astropy (ascl:1304.002), and stsci.numdisplay; emcee (ascl:1303.002) and SExtractor (ascl:1010.064) are optional.
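The pill-shaped aperture (a rectangle of the trail's length capped by semicircles of the aperture radius) reduces to a point-to-segment distance test. The following is a schematic re-implementation of that geometry for illustration; it is not TRIPPy's actual code or API, and it ignores the sub-pixel weighting a real photometry package would apply.

```python
import math

def in_pill(x, y, x0, y0, x1, y1, r):
    """True if pixel center (x, y) lies within radius r of the trail
    segment from (x0, y0) to (x1, y1) -- i.e., inside the pill aperture."""
    dx, dy = x1 - x0, y1 - y0
    seg2 = dx * dx + dy * dy
    # Parameter of the closest point on the segment, clamped to [0, 1]
    t = 0.0 if seg2 == 0 else max(0.0, min(1.0, ((x - x0) * dx + (y - y0) * dy) / seg2))
    cx, cy = x0 + t * dx, y0 + t * dy
    return (x - cx) ** 2 + (y - cy) ** 2 <= r * r

def pill_flux(image, x0, y0, x1, y1, r):
    """Sum the pixel values whose centers fall inside the pill aperture."""
    return sum(val
               for j, row in enumerate(image)
               for i, val in enumerate(row)
               if in_pill(i, j, x0, y0, x1, y1, r))

# On a flat unit image the flux approximates the pill area 2*r*L + pi*r^2
flat = [[1.0] * 100 for _ in range(100)]
flux = pill_flux(flat, 30.0, 50.0, 70.0, 50.0, 10.0)
```

The trail length and angle fix the segment endpoints; the radius is chosen large enough to enclose the point-spread function, after which an aperture correction accounts for the flux outside.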

  6. Extrasynaptic exocytosis and its mechanisms: a source of molecules mediating volume transmission in the nervous system

    Directory of Open Access Journals (Sweden)

    Citlali Trueta

    2012-09-01

    We review the evidence of exocytosis from extrasynaptic sites in the soma, dendrites and axonal varicosities of central and peripheral neurons of vertebrates and invertebrates, and how it may contribute to signaling in the nervous system. The finding of secretory vesicles in extrasynaptic sites of neurons, the presence of transmitters in the extracellular space outside synaptic clefts, and the mismatch between exocytosis sites and the location of receptors for these molecules in neurons and glial cells, have long suggested that in addition to synaptic communication, transmitters are released and act extrasynaptically. The catalog of these molecules includes low molecular weight transmitters such as monoamines, acetylcholine, glutamate, GABA, ATP, and a list of peptides including substance P, BDNF, and oxytocin. By comparing the mechanisms of extrasynaptic exocytosis of different molecules in various neuron types we show that extrasynaptic exocytosis is a widespread mechanism for communication in the nervous system that uses certain common mechanisms, which are different from those of synaptic exocytosis but similar to those of exocytosis from excitable endocrine cells. Somatic exocytosis, which has been measured directly in different neuron types, starts after high-frequency electrical activity or long experimental depolarizations and may continue for several minutes after the end of stimulation. Activation of L-type calcium channels, calcium release from intracellular stores and vesicle transport couples excitation and exocytosis from small clear or large dense core vesicles in release sites lacking postsynaptic counterparts. The presence of synaptic and extrasynaptic exocytosis endows individual neurons with a wide variety of time- and space-dependent communication possibilities.
Extrasynaptic exocytosis may be the major source of signaling molecules producing volume transmission and by doing so may be part of a long duration signaling mode in

  7. FORMALIZING PRODUCT COST DISTORTION: The Impact of Volume-Related Allocation Bases on Cost Information

    Directory of Open Access Journals (Sweden)

    Johnny Jermias

    2003-09-01

    Full Text Available The purpose of this study is to formally analyze product cost distortions resulting from the process of allocating costs to products under Activity-Based Costing (ABC) and conventional product costing systems. The model developed in this paper rigorously shows the impact of treating costs that are not volume related as if they were. The model demonstrates that the source of product cost distortion is the difference between the proportion of the driver used by each product in ABC and the proportion of the allocation base used by the same product in the conventional costing systems. The difference arises because conventional costing systems ignore the existence of batch-related and product-related costs. The model predicts a positive association between volume and size diversity and product cost distortions. When interaction between volume and size diversity exists, the distortion is either mitigated or exacerbated. The magnitude of the distortion is jointly determined by the size of the differences and the size of the total indirect costs.
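The distortion mechanism described above can be made concrete with a small numerical sketch. All product names, cost figures, and driver counts below are invented for illustration; the point is only that the distortion equals the gap between the ABC driver proportion and the volume-based allocation proportion, scaled by the total indirect cost pool.

```python
# Hypothetical example: two products, one indirect cost pool allocated either
# by a volume-related base (machine hours) or by an ABC driver (setups).

total_indirect = 100_000.0

# Volume-based allocation base (machine hours) per product.
volume_base = {"A": 600.0, "B": 400.0}
# ABC cost driver usage (number of setups) per product.
abc_driver = {"A": 30.0, "B": 70.0}

base_total = sum(volume_base.values())
driver_total = sum(abc_driver.values())

def distortion(product):
    """Conventional (volume-based) cost minus ABC cost for one product."""
    conventional = total_indirect * volume_base[product] / base_total
    abc = total_indirect * abc_driver[product] / driver_total
    return conventional - abc

for p in ("A", "B"):
    print(p, distortion(p))
```

Note that the distortions sum to zero across products: what the conventional system over-costs on one product it under-costs on another.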

  8. FY02 Engineering Technology Reports Volume 1: Technology Base

    Energy Technology Data Exchange (ETDEWEB)

    Minichino, C; Meeker, D

    2003-01-28

    Engineering has touched on every challenge, every accomplishment, and every endeavor of Lawrence Livermore National Laboratory during its fifty-year history. In this time of transition to new leadership, Engineering continues to be central to the mission of the Laboratory, returning to the tradition and core values of E. O. Lawrence: science-based engineering--turning scientific concepts into reality. This volume of Engineering Technical Reports summarizes progress on the projects funded for technology-base efforts. Technology-base projects effect the natural transition to reduction-to-practice of scientific or engineering methods that are well understood and established. They represent discipline-oriented, core competency activities that are multi-programmatic in application, nature, and scope. Objectives of technology-base funding include: (1) the development and enhancement of tools and processes to provide Engineering support capability, such as code maintenance and improved fabrication methods; (2) the support of Engineering science and technology infrastructure, such as the installation or integration of a new capability; (3) support for technical and administrative leadership through our technology Centers; and (4) the initial scoping and exploration of selected technology areas with high strategic potential, such as assessment of university, laboratory, and industrial partnerships. Five Centers focus and guide longer-term investments within Engineering. The Centers attract and retain top staff, develop and maintain critical core technologies, and enable programs. Through their technology-base projects, they oversee the application of known engineering approaches and techniques to scientific and technical problems.

  9. Seneca Falls: Achieving Woman's Rights. Teaching with Primary Sources Series, Volume 12.

    Science.gov (United States)

    Hodges, Elaine Prater

    This volume provides documentation on the origin of the women's rights movement placing the documents in a context that aims to show the rationale that blocked women from achieving full equality. The volume contains 127 fully annotated documents presented in chronological order (with a few exceptions) beginning in 1632 with colonial laws regarding…

  10. Engineering Technology Reports, Volume 2: Technology Base FY01

    Energy Technology Data Exchange (ETDEWEB)

    Minichino, C; Meeker, D

    2002-07-01

    Engineering has touched on every challenge, every accomplishment, and every endeavor of Lawrence Livermore National Laboratory during its fifty-year history. In this time of transition to new leadership, Engineering continues to be central to the mission of the Laboratory, returning to the tradition and core values of E.O. Lawrence: science-based engineering--turning scientific concepts into reality. This volume of Engineering Technical Reports summarizes progress on the projects funded for technology-base efforts. Technology-base projects effect the natural transition to reduction-to-practice of scientific or engineering methods that are well understood and established. They represent discipline-oriented, core competency activities that are multi-programmatic in application, nature, and scope. Objectives of technology-base funding include: (1) the development and enhancement of tools and processes to provide Engineering support capability, such as code maintenance and improved fabrication methods; (2) the support of Engineering science and technology infrastructure, such as the installation or integration of a new capability; (3) support for technical and administrative leadership through our technology Centers; (4) the initial scoping and exploration of selected technology areas with high strategic potential, such as assessment of university, laboratory, and industrial partnerships.

  11. Measuring glioma volumes: A comparison of linear measurement based formulae with the manual image segmentation technique

    Directory of Open Access Journals (Sweden)

    Sanjeev A Sreenivasan

    2016-01-01

    Conclusions: Manual region of interest-based image segmentation is the standard technique for measuring glioma volumes. For routine clinical use, the simple formula v = abc/2 (or the formula for the volume of an ellipsoid) could be used as alternatives.
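The two linear-measurement formulae mentioned in the conclusion are simple to compare directly. A minimal sketch with hypothetical tumour diameters (a, b, c are the three orthogonal maximal diameters; the paper's patient data are not reproduced here):

```python
import math

def volume_abc_over_2(a, b, c):
    """Simplified estimate v = abc/2 (cm^3 if a, b, c are in cm)."""
    return a * b * c / 2.0

def volume_ellipsoid(a, b, c):
    """Ellipsoid volume (pi/6)*abc, with a, b, c the three diameters."""
    return math.pi * a * b * c / 6.0

# Hypothetical diameters in cm:
a, b, c = 4.0, 3.0, 2.5
print(volume_abc_over_2(a, b, c))  # 15.0
print(volume_ellipsoid(a, b, c))   # ~15.7
```

The two estimates differ only by the constant factor (1/2 vs. pi/6 ≈ 0.524), which is why abc/2 serves as a convenient bedside approximation of the ellipsoid volume.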

  12. Room Volume Estimation Based on Ambiguity of Short-Term Interaural Phase Differences Using Humanoid Robot Head

    Directory of Open Access Journals (Sweden)

    Ryuichi Shimoyama

    2016-07-01

    Full Text Available Humans can recognize approximate room size using only binaural audition. However, sound reverberation is not negligible in most environments. The reverberation causes temporal fluctuations in the short-term interaural phase differences (IPDs of sound pressure. This study proposes a novel method for a binaural humanoid robot head to estimate room volume. The method is based on the statistical properties of the short-term IPDs of sound pressure. The humanoid robot turns its head toward a sound source, recognizes the sound source, and then estimates the ego-centric distance by its stereovision. By interpolating the relations between room volume, average standard deviation, and ego-centric distance experimentally obtained for various rooms in a prepared database, the room volume was estimated by the binaural audition of the robot from the average standard deviation of the short-term IPDs at the estimated distance.
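The database-interpolation step described above can be sketched as follows. All numbers are invented: the real mapping from IPD spread to room volume is empirical and distance-dependent, but at one fixed ego-centric distance it reduces to interpolating volume from the measured average standard deviation of the short-term IPDs.

```python
# Hypothetical lookup table at one fixed source distance:
# (avg IPD std in radians, room volume in m^3), sorted by std.
# Larger rooms -> more reverberation -> larger IPD spread.
database = [(0.10, 30.0), (0.25, 80.0), (0.40, 150.0), (0.60, 300.0)]

def estimate_volume(ipd_std):
    """Piecewise-linear interpolation of room volume from the IPD std,
    clamped to the database range."""
    pts = database
    if ipd_std <= pts[0][0]:
        return pts[0][1]
    if ipd_std >= pts[-1][0]:
        return pts[-1][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= ipd_std <= x1:
            t = (ipd_std - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)

print(estimate_volume(0.325))  # -> ~115, midway between 80 and 150
```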

  13. Home-based sourcing of tobacco among adolescents.

    Science.gov (United States)

    Rainio, Susanna U; Rimpelä, Arja H

    2009-04-01

    To study home-based sources of tobacco and associated family factors among Finnish adolescents. Nationwide surveys (1999, 2003, 2007) of 14-16-year-old daily (n=2355), occasional (n=708), and experimental (n=2763) smokers. The main outcome measure was home-based sourcing of tobacco (parents, siblings, taking from home) during the past month. Logistic regression was used for statistical analysis. Home-based sources were used by 44% of daily, 11% of occasional, and 9% of experimental smokers; other social sources by 93%, 65%, and 51%; and commercial sources by 70%, 28%, and 10% respectively. Among daily smokers, home sources meant siblings (24%), parents (19%), and taking from home (19%). Parental smoking and absence of a home-smoking ban increased home-based sourcing. The odds ratio (OR) for obtaining tobacco from any home-based source was 6.96 (95% CI: 3.75-12.91) and from parents 7.44 (2.68-20.65) when both parents smoked versus nonsmoking parents. In the absence of a home-smoking ban, corresponding ORs were 2.21 (1.28-3.81) and 21.33 (2.84-60.30) versus those reporting having a ban. Obtaining tobacco from parents was more common in single-parent/reconstituted families than in families with two biological parents. Parents should be provided with guidance about the consequences of home-based sourcing in the persistence of children's smoking habit.
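The odds ratios and confidence intervals reported above come from logistic regression on the survey data. For a single exposure, the same quantities can be computed directly from a 2×2 table; the sketch below uses invented counts and the standard Woolf log-based interval, not the paper's exact model.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and approximate 95% CI (Woolf method) for a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: home-based sourcing by parental smoking status.
print(odds_ratio_ci(60, 40, 30, 70))
```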

  14. Quantitative prediction of respiratory tidal volume based on the external torso volume change: a potential volumetric surrogate

    Energy Technology Data Exchange (ETDEWEB)

    Li Guang; Arora, Naveen C; Xie Huchen; Ning, Holly; Citrin, Deborah; Kaushal, Aradhana; Zach, Leor; Camphausen, Kevin; Miller, Robert W [Radiation Oncology Branch, National Cancer Institute, National Institutes of Health, Bethesda, MD 20892 (United States); Lu Wei; Low, Daniel [Department of Radiation Oncology, Washington University School of Medicine, St Louis, MO 63110 (United States)], E-mail: ligeorge@mail.nih.gov

    2009-04-07

    An external respiratory surrogate that not only highly correlates with but also quantitatively predicts internal tidal volume should be useful in guiding four-dimensional computed tomography (4DCT), as well as 4D radiation therapy (4DRT). A volumetric surrogate should have advantages over external fiducial point(s) for monitoring respiration-induced motion of the torso, which deforms in synchronization with a patient-specific breathing pattern. This study establishes a linear relationship between the external torso volume change (TVC) and lung air volume change (AVC) by validating a proposed volume conservation hypothesis (TVC = AVC) throughout the respiratory cycle using 4DCT and spirometry. Fourteen patients' torso 4DCT images and corresponding spirometric tidal volumes were acquired to examine this hypothesis. The 4DCT images were acquired using dual surrogates in cine mode and amplitude-based binning in 12 respiratory stages, minimizing residual motion artifacts. Torso and lung volumes were calculated using threshold-based segmentation algorithms and volume changes were calculated relative to the full-exhalation stage. The TVC and AVC, as functions of respiratory stages, were compared, showing a high correlation (r = 0.992 ± 0.005, p < 0.0001) as well as a linear relationship (slope = 1.027 ± 0.061, R² = 0.980) without phase shift. The AVC was also compared to the spirometric tidal volumes, showing a similar linearity (slope = 1.030 ± 0.092, R² = 0.947). In contrast, the thoracic and abdominal heights measured from 4DCT showed relatively low correlation (0.28 ± 0.44 and 0.82 ± 0.30, respectively) and location-dependent phase shifts. This novel approach establishes the foundation for developing an external volumetric respiratory surrogate.
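The linearity test described above amounts to a least-squares slope plus a Pearson correlation between the two volume-change series. A minimal sketch with hypothetical volume changes (the study's patient data are not reproduced):

```python
# Hypothetical TVC and AVC values (mL) across respiratory stages,
# both measured relative to the full-exhalation stage.

def fit_through_origin(x, y):
    """Least-squares slope for y = slope * x (no intercept), matching the
    expectation that zero torso volume change implies zero air volume change."""
    return sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    return sxy / (sxx * syy) ** 0.5

tvc = [0, 100, 200, 300, 400, 500]
avc = [0, 105, 198, 310, 402, 515]
print(fit_through_origin(tvc, avc), pearson_r(tvc, avc))
```

A slope near 1 with r near 1 is what the volume conservation hypothesis (TVC = AVC) predicts.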

  15. Van de Graaff based positron source production

    Science.gov (United States)

    Lund, Kasey Roy

    The anti-matter counterpart of the electron, the positron, can be used for a myriad of scientific research projects, including materials research, energy storage, and deep-space flight propulsion. Currently there is a demand for large numbers of positrons to aid in these research projects. There are different methods of producing and harvesting positrons, but all require radioactive sources or large facilities. Positron beams produced by relatively small accelerators are attractive because they are easily shut down, and small accelerators are readily available. A 4 MV Van de Graaff accelerator was used to induce the nuclear reaction ¹²C(d,n)¹³N in order to produce an intense beam of positrons. ¹³N is an isotope of nitrogen that decays with a 10 minute half-life into ¹³C, a positron, and an electron neutrino. This radioactive gas is frozen onto a cryogenic freezer, from which it is channeled to form an antimatter beam. The beam is then guided using axial magnetic fields into a superconducting magnet with a field strength up to 7 Tesla, where it will be stored in a newly designed Micro-Penning-Malmberg trap. Experiments with several source geometries found that a maximum antimatter beam with a positron flux of greater than 0.55×10⁶ e⁺ s⁻¹ was achieved. This beam was produced using a solid rare-gas moderator composed of krypton. Due to geometric restrictions on this setup, only 0.1-1.0% of the antimatter was frozen to the desired locations. Simulations and preliminary experiments suggest that a new geometry, currently under testing, will produce a beam of 10⁷ e⁺ s⁻¹ or more.

  16. Hiding the Source Based on Limited Flooding for Sensor Networks

    Directory of Open Access Journals (Sweden)

    Juan Chen

    2015-11-01

    Full Text Available Wireless sensor networks are widely used to monitor valuable objects such as rare animals or armies. Once an object is detected, the source, i.e., the sensor nearest to the object, generates and periodically sends a packet about the object to the base station. Since attackers can capture the object by localizing the source, many protocols have been proposed to protect source location. Instead of transmitting the packet to the base station directly, typical source location protection protocols first transmit packets randomly for a few hops to a phantom location, and then forward the packets to the base station. The problem with these protocols is that the generated phantom locations are usually not only near the true source but also close to each other. As a result, attackers can easily trace a route back to the source from the phantom locations. To address the above problem, we propose a new protocol for source location protection based on limited flooding, named SLP. Compared with existing protocols, SLP can generate phantom locations that are not only far away from the source, but also widely distributed. It improves source location security significantly with low communication cost. We further propose a protocol, namely SLP-E, to protect source location against more powerful attackers with wider fields of vision. The performance of our SLP and SLP-E are validated by both theoretical analysis and simulation results.

  17. Sampling-based motion planning with reachable volumes: Theoretical foundations

    KAUST Repository

    McMahon, Troy

    2014-05-01

    © 2014 IEEE. We introduce a new concept, reachable volumes, that denotes the set of points that the end effector of a chain or linkage can reach. We show that the reachable volume of a chain is equivalent to the Minkowski sum of the reachable volumes of its links, and give an efficient method for computing reachable volumes. We present a method for generating configurations using reachable volumes that is applicable to various types of robots including open and closed chain robots, tree-like robots, and complex robots including both loops and branches. We also describe how to apply constraints (both on end effectors and internal joints) using reachable volumes. Unlike previous methods, reachable volumes work for spherical and prismatic joints as well as planar joints. Visualizations of reachable volumes can allow an operator to see what positions the robot can reach and can guide robot design. We present visualizations of reachable volumes for representative robots including closed chains and graspers as well as for examples with joint and end effector constraints.
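For the special case of a planar chain whose joints rotate freely, the reachable set has a closed form (an annulus), which illustrates the Minkowski-sum property concretely: each link anchored at the origin reaches a circle of radius L_i, and summing those circles gives the radii below. This is a simplified sketch of a standard workspace result, not the paper's general algorithm for spherical and prismatic joints.

```python
def reachable_annulus(lengths):
    """Outer and inner radius of the reachable set of a planar chain whose
    joints can rotate freely: the Minkowski sum of per-link circles is an
    annulus with outer radius sum(L_i) and inner radius
    max(0, longest link minus the sum of the others)."""
    outer = sum(lengths)
    longest = max(lengths)
    inner = max(0.0, longest - (outer - longest))
    return inner, outer

print(reachable_annulus([3.0, 1.0, 1.0]))  # (1.0, 5.0): origin unreachable
print(reachable_annulus([2.0, 2.0]))       # (0.0, 4.0): folds back to origin
```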

  18. Sourcing Team Behavior in Project-Based MNE's

    DEFF Research Database (Denmark)

    Hansen, Anders Peder Lysholm

    2014-01-01

    This paper presents and discusses a multiple case study of three cross-functional category teams responsible for sourcing critical components within multi-national, project-based enterprises. The study focused on behaviour and management of the sourcing teams and found that the sourcing process across the three cases was characterized by conflict between departments represented in the category teams. This resulted in unfortunate sourcing team behaviour and unaligned performance management, which in turn had a number of adverse effects. Further research on how to create a holistic and balanced team perspective in the sourcing teams is suggested.

  19. Dynamic Garment Simulation based on Hybrid Bounding Volume Hierarchy

    Directory of Open Access Journals (Sweden)

    Zhu Dongyong

    2016-12-01

    Full Text Available In order to solve the computing speed and efficiency problems of existing dynamic clothing simulation, this paper presents a dynamic garment simulation based on a hybrid bounding volume hierarchy. It first uses MCASG graph theory to perform the primary segmentation of a given three-dimensional human body model, and then applies K-means clustering for the secondary segmentation to collect the human body's upper arms, lower arms, upper legs, lower legs, trunk, hip and, for women, the chest as the elementary units of dynamic clothing simulation. According to the different shapes of these elementary units, it chooses the closest and most efficient hybrid bounding box to enclose each unit, such as a cylinder bounding box or an elliptic cylinder bounding box. During the construction of these bounding boxes, the least squares method and slices of the human body are used to obtain the related parameters. This approach makes it possible to use the least number of bounding boxes to create collision detection regions that closely follow the shape of the human body. A spring-mass model based on a triangular mesh of the clothing model is finally constructed for dynamic simulation. The simulation result shows the feasibility and superiority of the method described.

  20. Patrick Air Force Base integrated resource assessment. Volume 2, Baseline detail

    Energy Technology Data Exchange (ETDEWEB)

    Wahlstrom, R.R.; King, D.A.; Parker, S.A.; Sandusky, W.F.

    1993-08-01

    The US Air Force has tasked the Pacific Northwest Laboratory (PNL), in support of the US Department of Energy (DOE) Federal Energy Management Program (FEMP), to assess energy use at Patrick Air Force Base (AFB). The information obtained from this assessment will be used in identifying energy resource opportunities to reduce overall energy consumption on the base. The primary focus of this report is to assess the current baseline energy consumption at Patrick AFB. It is a companion report to Volume 1, the Executive Summary, and Volume 3, the Resource Assessment. This assessment requires that information be obtained and characterized for buildings, utilities, energy sources, energy uses, and load profile information to be used to improve the characterization of energy use on the base. The characteristics of electricity, natural gas, and No. 2 fuel oil are analyzed for on-base facilities and housing. The assessment examines basic regional information used to determine energy-use intensity (EUI) values for Patrick AFB facilities by building, fuel type, and energy end use. It also provides a summary of electricity consumption from Florida Power and Light Company (FPL) metered data for 1985-1991. Load profile information obtained from FPL data is presented for the north and south substations for the four seasons of the year, including weekdays and weekends.
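The energy-use intensity (EUI) values mentioned above are annual consumption normalized by floor area. A minimal sketch with invented buildings and figures (the report's actual metered data are not reproduced):

```python
# Hypothetical per-building annual consumption, all converted to kBtu.
buildings = {
    "bldg_101": {"area_ft2": 25_000, "electricity_kbtu": 1_500_000, "gas_kbtu": 500_000},
    "bldg_202": {"area_ft2": 10_000, "electricity_kbtu": 900_000, "gas_kbtu": 100_000},
}

def eui(b):
    """Total site EUI in kBtu per square foot per year."""
    total = b["electricity_kbtu"] + b["gas_kbtu"]
    return total / b["area_ft2"]

for name, b in buildings.items():
    print(name, eui(b))
```

Breaking the same ratio out by fuel type and end use, as the assessment does, simply means computing it per consumption category instead of per total.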

  1. SOURCE EXPLORER: Towards Web Browser Based Tools for Astronomical Source Visualization and Analysis

    Science.gov (United States)

    Young, M. D.; Hayashi, S.; Gopu, A.

    2014-05-01

    As a new generation of large format, high-resolution imagers come online (ODI, DECAM, LSST, etc.) we are faced with the daunting prospect of astronomical images containing upwards of hundreds of thousands of identifiable sources. Visualizing and interacting with such large datasets using traditional astronomical tools appears to be unfeasible, and a new approach is required. We present here a method for the display and analysis of arbitrarily large source datasets using dynamically scaling levels of detail, enabling scientists to rapidly move from large-scale spatial overviews down to the level of individual sources and everything in-between. Based on the recognized standards of HTML5+JavaScript, we enable observers and archival users to interact with their images and sources from any modern computer without having to install specialized software. We demonstrate the ability to produce large-scale source lists from the images themselves, as well as overlaying data from publicly available source catalogs (2MASS, GALEX, SDSS, etc.) or user-provided source lists. A high-availability cluster of computational nodes allows us to produce these source maps on demand, customized based on user input. User-generated source lists and maps are persistent across sessions and are available for further plotting, analysis, refinement, and culling.

  2. Quantitation of mandibular symphysis volume as a source of bone grafting.

    Science.gov (United States)

    Verdugo, Fernando; Simonian, Krikor; Smith McDonald, Roberto; Nowzari, Hessam

    2010-06-01

    Autogenous intramembranous bone graft present several advantages such as minimal resorption and high concentration of bone morphogenetic proteins. A method for measuring the amount of bone that can be harvested from the symphysis area has not been reported in real patients. The aim of the present study was to intrasurgically quantitate the volume of the symphysis bone graft that can be safely harvested in live patients and compare it with AutoCAD (version 16.0, Autodesk, Inc., San Rafael, CA, USA) tomographic calculations. AutoCAD software program quantitated symphysis bone graft in 40 patients using computerized tomographies. Direct intrasurgical measurements were recorded thereafter and compared with AutoCAD data. The bone volume was measured at the recipient sites of a subgroup of 10 patients, 6 months post sinus augmentation. The volume of bone graft measured by AutoCAD averaged 1.4 mL (SD 0.6 mL, range: 0.5-2.7 mL). The volume of bone graft measured intrasurgically averaged 2.3 mL (SD 0.4 mL, range 1.7-2.8 mL). The statistical difference between the two measurement methods was significant. The bone volume measured at the recipient sites 6 months post sinus augmentation averaged 1.9 mL (SD 0.3 mL, range 1.3-2.6 mL) with a mean loss of 0.4 mL. AutoCAD did not overestimate the volume of bone that can be safely harvested from the mandibular symphysis. The use of the design software program may improve surgical treatment planning prior to sinus augmentation.

  3. Electron Source based on Superconducting RF

    Science.gov (United States)

    Xin, Tianmu

    High-bunch-charge photoemission electron-sources operating in a Continuous Wave (CW) mode can provide the high peak current as well as the high average current which are required for many advanced applications of accelerator facilities, for example, electron coolers for hadron beams, electron-ion colliders, and Free-Electron Lasers (FELs). Superconducting Radio Frequency (SRF) has many advantages over other electron-injector technologies, especially when it is working in CW mode, as it offers a higher repetition rate. A 112 MHz SRF electron photo-injector (gun) was developed at Brookhaven National Laboratory (BNL) to produce high-brightness and high-bunch-charge bunches for electron cooling experiments. The gun utilizes a Quarter-Wave Resonator (QWR) geometry for a compact structure and improved electron beam dynamics. The detailed RF design of the cavity, fundamental coupler and cathode stalk are presented in this work. A GPU-accelerated code was written to improve the speed of simulation of multipacting, an important hurdle the SRF structure has to overcome in various locations. The injector utilizes high Quantum Efficiency (QE) multi-alkali photocathodes (K2CsSb) for generating electrons. The cathode fabrication system and procedure are also included in the thesis. Beam dynamics simulation of the injector was done with the code ASTRA. To find the optimized parameters of the cavities and beam optics, the author wrote a genetic algorithm Python script to search for the best solution in this high-dimensional parameter space. The gun was successfully commissioned and produced world record bunch charge and average current in an SRF photo-injector.

  4. Smart material-based radiation sources

    Science.gov (United States)

    Kovaleski, Scott

    2014-10-01

    From sensors to power harvesters, the unique properties of smart materials have been exploited in numerous ways to enable new applications and reduce the size of many useful devices. Smart materials are defined as materials whose properties can be changed in a controlled and often reversible fashion by use of external stimuli, such as electric and magnetic fields, temperature, or humidity. Smart materials have been used to make acceleration sensors that are ubiquitous in mobile phones, to make highly accurate frequency standards, to make unprecedentedly small actuators and motors, to seal and reduce friction of rotating shafts, and to generate power by conversion of either kinetic or thermal energy to electrical energy. The number of useful devices enabled by smart materials is large and continues to grow. Smart materials can also be used to generate plasmas and accelerate particles at small scales. The materials discussed in this talk are from non-centrosymmetric crystalline classes including piezoelectric, pyroelectric, and ferroelectric materials, which produce large electric fields in response to external stimuli such as applied electric fields or thermal energy. First, the use of ferroelectric, pyroelectric and piezoelectric materials for plasma generation and particle acceleration will be reviewed. The talk will then focus on the use of piezoelectric materials at the University of Missouri to construct plasma sources and electrostatic accelerators for applications including space propulsion, x-ray imaging, and neutron production. The basic concepts of piezoelectric transformers, which are analogous to conventional magnetic transformers, will be discussed, along with results from experiments over the last decade to produce micro-thrusters for space propulsion and particle accelerators for x-ray and neutron production. Support from ONR, AFOSR, and LANL.

  5. Feasibility study of PDT light sources based on lasing action in strongly scattering media

    Science.gov (United States)

    Lilge, Lothar D.; Pang, Gendi; Jonkman, James; Wilson, Brian C.

    1997-05-01

    Lasing action in strongly scattering media containing a fluorescent dye and pumped by a pulsed high peak power laser can be used to produce light sources which may be suitable for surface or intracavity light delivery in photodynamic therapy, eliminating the need for a dye laser to obtain selectable treatment wavelengths. The present study focuses on evaluating, in cylindrical fiber tip sources for interstitial light delivery, the effects of fluorophore concentration and scattering particle density on lasing peak power, emission wavelength and maximum deliverable, clinically useful fluence-rate and radiant exposure. The sources tested are comprised of Rhodamine 640 perchloride incorporated into a TiO2-based scattering matrix in either ethylene glycol or methanol. The cylindrical fiber tips, 10 mm long and 2 mm in diameter, were pumped via a 320 µm diameter multimode optical fiber, achieving line narrowing to approximately 7 nm FWHM at approximately 617 nm, using pulse energies of 1.7 mJ, delivered in 10 ns from a Q-switched, frequency-doubled Nd:YAG laser. The results showed the dependence on the total gain length in the pump volume and on reabsorption effects in the remaining volume of the fiber tip. Sources capable of delivering sufficient radiant exposure for clinical use were achieved. While these sources are promising, their clinical use requires pump lasers delivering MW pulses at high repetition rates to achieve acceptable total irradiation times.

  6. The New York Head-A precise standardized volume conductor model for EEG source localization and tES targeting.

    Science.gov (United States)

    Huang, Yu; Parra, Lucas C; Haufe, Stefan

    2016-10-15

    In source localization of electroencephalograpic (EEG) signals, as well as in targeted transcranial electric current stimulation (tES), a volume conductor model is required to describe the flow of electric currents in the head. Boundary element models (BEM) can be readily computed to represent major tissue compartments, but cannot encode detailed anatomical information within compartments. Finite element models (FEM) can capture more tissue types and intricate anatomical structures, but with the higher precision also comes the need for semi-automated segmentation, and a higher computational cost. In either case, adjusting to the individual human anatomy requires costly magnetic resonance imaging (MRI), and thus head modeling is often based on the anatomy of an 'arbitrary' individual (e.g. Colin27). Additionally, existing reference models for the human head often do not include the cerebro-spinal fluid (CSF), and their field of view excludes portions of the head and neck, two factors that demonstrably affect current-flow patterns. Here we present a highly detailed FEM, which we call ICBM-NY, or "New York Head". It is based on the ICBM152 anatomical template (a non-linear average of the MRI of 152 adult human brains) defined in MNI coordinates, for which we extended the field of view to the neck and performed a detailed segmentation of six tissue types (scalp, skull, CSF, gray matter, white matter, air cavities) at 0.5 mm³ resolution. The model was solved for 231 electrode locations. To evaluate its performance, additional FEMs and BEMs were constructed for four individual subjects. Each of the four individual FEMs (regarded as the 'ground truth') is compared to its BEM counterpart, the ICBM-NY, a BEM of the ICBM anatomy, an 'individualized' BEM of the ICBM anatomy warped to the individual head surface, and FEMs of the other individuals. Performance is measured in terms of EEG source localization and tES targeting errors. Results show that the ICBM-NY outperforms

  7. Coarsening in high volume fraction nickel-base alloys

    Science.gov (United States)

    Mackay, R. A.; Nathal, M. V.

    1990-01-01

    The coarsening behavior of the gamma-prime precipitate has been examined in high volume fraction nickel-base alloys aged at elevated temperatures for times of up to 5000 h. Although the cube rate law was observed during coarsening, none of the presently available coarsening theories showed complete agreement with the experimental particle size distributions (PSDs). These discrepancies were thought to be due to elastic coherency strains which were not considered by the available models. Increasing the Mo content significantly influenced the PSDs and decreased the coarsening rate of the gamma-prime cubes, as a result of increasing the magnitude of the lattice mismatch. After extended aging times, the gamma-prime cubes underwent massive coalescence into plates at a rate which was much faster than the cuboidal coarsening rate. Once the gamma-prime plates were formed, further coarsening was not observed, and this stabilization of the microstructure was attributed to the development of dislocation networks at the gamma-gamma-prime interfaces.
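The cube rate law observed above can be written r³ − r₀³ = Kt (the LSW-type relation between mean precipitate size and aging time). A small sketch with hypothetical values for the initial size and rate constant, illustrating how the rate constant sets the predicted mean gamma-prime size after a given aging time:

```python
def mean_size(r0_nm, K_nm3_per_h, t_h):
    """Mean particle radius (nm) after coarsening per r^3 = r0^3 + K*t,
    with r0 in nm, K in nm^3/h, and t in hours."""
    return (r0_nm ** 3 + K_nm3_per_h * t_h) ** (1.0 / 3.0)

# Hypothetical: 100 nm initial size, K = 1000 nm^3/h, aged 5000 h.
print(mean_size(100.0, 1000.0, 5000.0))
```

Lowering K, as the higher-Mo alloys do through increased lattice mismatch, flattens this growth curve.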

  8. Patch nearfield acoustic holography based on the equivalent source method

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    On the basis of nearfield acoustic holography (NAH) based on the equivalent source method (ESM), patch NAH based on the ESM is proposed. The method overcomes the shortcoming of conventional NAH that the hologram surface should be larger than the source surface. It does not need to discretize the whole source, and its measurement does not need to cover the whole source. The measurement may be performed over the region of interest, and the reconstruction can be done in that region directly. The method is flexible in applications, stable in computation, and very easy to implement. It has good potential applications in engineering. The numerical simulations show the invalidity of the conventional NAH based on the ESM and prove the validity of the proposed method for reconstructing a partial source, as well as of the regularization for reducing the error effect of the pressure measured on the hologram surface.
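The ESM reconstruction described above amounts to a regularized inverse problem: pressures p measured on the hologram patch relate to equivalent source strengths q through a transfer matrix, p = Gq, and q is recovered by regularized least squares. The sketch below is a toy real-valued version with an invented transfer matrix and Tikhonov regularization standing in for the abstract's regularization step; a real ESM implementation uses complex free-field Green's functions for G.

```python
# Toy system: 3 measurement points on the patch, 2 equivalent sources.
G = [[1.0, 0.5],
     [0.3, 1.2],
     [0.8, 0.1]]          # invented transfer matrix
q_true = [2.0, -1.0]      # "true" equivalent source strengths
p = [sum(G[i][j] * q_true[j] for j in range(2)) for i in range(3)]

lam = 1e-6  # Tikhonov parameter: trades measurement-error rejection vs. bias

# Normal equations A q = b with A = G^T G + lam*I, b = G^T p.
A = [[sum(G[i][r] * G[i][c] for i in range(3)) + (lam if r == c else 0.0)
      for c in range(2)] for r in range(2)]
b = [sum(G[i][r] * p[i] for i in range(3)) for r in range(2)]

# Direct 2x2 solve by Cramer's rule.
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
q_est = [(b[0] * A[1][1] - b[1] * A[0][1]) / det,
         (A[0][0] * b[1] - A[1][0] * b[0]) / det]

print(q_est)  # close to q_true
```

Once q is known, the field can be evaluated anywhere in the region of interest, which is what makes the patch formulation possible.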

  10. Quantitation of mandibular ramus volume as a source of bone grafting.

    Science.gov (United States)

    Verdugo, Fernando; Simonian, Krikor; Smith McDonald, Roberto; Nowzari, Hessam

    2009-10-01

    When alveolar atrophy impairs dental implant placement, ridge augmentation using a mandibular ramus graft may be considered. In live patients, however, an accurate calculation of the amount of bone that can be safely harvested from the ramus has not been reported. The use of a software program to perform these calculations can aid in preventing surgical complications. The aim of the present study was to quantify intra-surgically the volume of ramus bone graft that can be safely harvested in live patients, and to compare it to presurgical computerized tomographic calculations. The AutoCAD software program quantified ramus bone graft in 40 consecutive patients from computerized tomographies. Direct intra-surgical measurements were recorded thereafter and compared to the software data (n = 10). In these 10 patients, the bone volume was also measured at the recipient sites 6 months post-sinus augmentation. The mandibular second and third molar areas provided the thickest cortical graft, averaging 2.8 ± 0.6 mm. The thinnest bone was immediately posterior to the third molar (1.9 ± 0.3 mm). The volume of ramus bone graft measured by AutoCAD averaged 0.8 mL (standard deviation [SD] 0.2 mL, range: 0.4-1.2 mL). The volume of bone graft measured intra-surgically averaged 2.5 mL (SD 0.4 mL, range: 1.8-3.0 mL). The difference between the two measurement methods was statistically significant. The AutoCAD software program did not overestimate the volume of bone that can be safely harvested from the mandibular ramus.

  11. Source—to—Source Conversion Based on Formal Definition

    Institute of Scientific and Technical Information of China (English)

    张幸儿; 李建新; et al.

    1991-01-01

    This paper proposes the idea of source-to-source conversion between two heterogeneous high-level programming languages.The conversion is based on formal definition and oriented to multi-pairs of languages.The issues in conversion from PASCAL to C are also discussed.

  12. Gross tumor volume (GTV) and clinical target volume (CTV) for radiation therapy of benign skull base tumours; Volume tumoral macroscopique (GTV) et volume-cible anatomoclinique (CTV) dans la radiotherapie des tumeurs benignes de la base du crane

    Energy Technology Data Exchange (ETDEWEB)

    Maire, J.P. [Centre Hospitalier Universitaire de Bordeaux, Hopital Saint Andre, Service d' Oncologie Radiotherapie, 33 - Bordeaux (France); Liguoro, D.; San Galli, F. [Centre Hospitalier Universitaire de Bordeaux, Hopital Saint Andre, Service de Neurochirurgie A, 33 - Bordeaux (France)

    2001-10-01

    Skull base tumours represent about 35 to 40% of all intracranial tumours. There are now many reports in the literature confirming that about 80 to 90% of such tumours are controlled with fractionated radiotherapy. Stereotactic and 3-dimensional treatment planning techniques increase local control and central nervous system tolerance. Definition of the gross tumor volume (GTV) is generally easy with currently available medical imaging systems and computers for 3-dimensional dosimetry. The definition of the clinical target volume (CTV) is more difficult to appreciate: it is defined as the GTV plus a margin, which depends on the histology and prior therapeutic history of the tumour. It is important to take into account the visible tumour and its possible extension pathways (adjacent bone, foramina at the base of the skull) and/or an anatomic region (sella turcica plus adjacent cavernous sinus). It is necessary to evaluate these volumes with CT scan and MRI to appreciate tumour extension in a 3-dimensional approach, in order to reduce the risk of marginal recurrences. The aim of this paper is to discuss volume definition as a function of the site and type of tumour to be irradiated. (authors)

  13. Light sources for high-volume manufacturing EUV lithography: technology, performance, and power scaling

    Science.gov (United States)

    Fomenkov, Igor; Brandt, David; Ershov, Alex; Schafgans, Alexander; Tao, Yezheng; Vaschenko, Georgiy; Rokitski, Slava; Kats, Michael; Vargas, Michael; Purvis, Michael; Rafac, Rob; La Fontaine, Bruno; De Dea, Silvia; LaForge, Andrew; Stewart, Jayson; Chang, Steven; Graham, Matthew; Riggs, Daniel; Taylor, Ted; Abraham, Mathew; Brown, Daniel

    2017-06-01

    Extreme ultraviolet (EUV) lithography is expected to succeed 193-nm immersion multi-patterning technology for sub-10-nm critical layer patterning. In order to be successful, EUV lithography has to demonstrate that it can satisfy the industry requirements in the following critical areas: power, dose stability, etendue, spectral content, and lifetime. Currently, development of second-generation laser-produced plasma (LPP) light sources for ASML's NXE:3300B EUV scanner is complete, and the first units are installed and operational at chipmaker customers. We describe different aspects and performance characteristics of the sources, dose stability results, power scaling, and availability data for EUV sources, and also report new development results.

  14. Single channel blind source separation based on ICA feature extraction

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    A new technique is proposed to solve the blind source separation (BSS) problem given only a single-channel observation. The basis functions and the density of the coefficients of the source signals learned by ICA are used as prior knowledge. Based on the learned prior information, the learning rules of single-channel BSS are presented by maximizing the joint log likelihood of the mixed sources to obtain the source signals from the single observation, in which the posterior density of the given measurements is maximized. The experimental results exhibit a successful separation performance for mixtures of speech and music signals.

  15. AlInGaN-Based Superlattice Terahertz Source Project

    Data.gov (United States)

    National Aeronautics and Space Administration — WaveBand Corporation in collaboration with Virginia Commonwealth University proposes to design and fabricate a new sub-millimeter source based on an InAlGaN...

  16. Arc Length Based Grid Distribution For Surface and Volume Grids

    Science.gov (United States)

    Mastin, C. Wayne

    1996-01-01

    Techniques are presented for distributing grid points on parametric surfaces and in volumes according to a specified distribution of arc length. Interpolation techniques are introduced which permit a given distribution of grid points on the edges of a three-dimensional grid block to be propagated through the surface and volume grids. Examples demonstrate how these methods can be used to improve the quality of grids generated by transfinite interpolation.
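    The one-dimensional core of the technique above can be illustrated with a minimal sketch, assuming a curve given as a sampled polyline (the paper's parametric surfaces and transfinite propagation are not reproduced here): grid points are placed so that a specified arc-length distribution, uniform in this sketch, is matched by piecewise-linear interpolation.

    ```python
    import math

    def redistribute_by_arc_length(points, n_out):
        """Redistribute grid points along a polyline so the output points
        follow a uniform arc-length distribution (any monotone target
        distribution could be substituted for `targets` below)."""
        # Cumulative arc length along the input polyline.
        s = [0.0]
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            s.append(s[-1] + math.hypot(x1 - x0, y1 - y0))
        total = s[-1]

        # Target arc-length values: uniform spacing, endpoints included.
        targets = [total * i / (n_out - 1) for i in range(n_out)]

        out = []
        j = 0
        for t in targets:
            # Advance to the segment containing arc length t, then
            # interpolate linearly within that segment.
            while j < len(s) - 2 and s[j + 1] < t:
                j += 1
            seg = s[j + 1] - s[j]
            w = 0.0 if seg == 0.0 else (t - s[j]) / seg
            x = points[j][0] + w * (points[j + 1][0] - points[j][0])
            y = points[j][1] + w * (points[j + 1][1] - points[j][1])
            out.append((x, y))
        return out
    ```

    Applied to an unevenly sampled straight segment, the output points come back equally spaced, which is exactly the edge-distribution property the interpolation techniques then propagate into surface and volume grids.
    
    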

  17. Arc length based grid distribution for surface and volume grids

    Energy Technology Data Exchange (ETDEWEB)

    Mastin, C.W. [NASA Langley Research Center, Hampton, VA (United States)

    1996-12-31

    Techniques are presented for distributing grid points on parametric surfaces and in volumes according to a specified distribution of arc length. Interpolation techniques are introduced which permit a given distribution of grid points on the edges of a three-dimensional grid block to be propagated through the surface and volume grids. Examples demonstrate how these methods can be used to improve the quality of grids generated by transfinite interpolation.

  18. Fast estimation of lacustrine groundwater discharge volumes based on stable water isotopes

    Science.gov (United States)

    Lewandowski, Jörg; Gercken, Jasper; Premke, Katrin; Meinikmann, Karin

    2017-04-01

    Lake eutrophication is still a severe problem in many parts of the world, commonly due to anthropogenic sources of nutrients such as fertilizer, manure or sewage. Improved quantification of nutrient inputs to lakes is required to address this problem. One possible input path for nutrients is lacustrine groundwater discharge (LGD). However, LGD has often been disregarded in water and nutrient budgets of lakes, although some studies reveal an extraordinary importance of LGD for phosphorus inputs. The aim of the present study is to identify lakes that receive large LGD volumes compared to other input paths. Such lakes are more prone to high groundwater-borne nutrient inputs than lakes with small LGD volumes. The simple and fast approach used in the present study is based on the fact that evaporation of surface water causes an enrichment of heavier isotopes in lake and river water, while precipitation and groundwater are lighter and have similar isotopic signatures. The isotopic signature of lake water depends on a) the isotopic signature of its inputs and b) the lake's residence time (the longer the residence time, the greater the enrichment in heavier isotopes). In the present study we used the citizen science project "Tatort Gewässer" to let people collect lake water samples all over Germany. Based on additional information we identified lakes without aboveground inflows or with inflows that are small compared to the lake volume. Based on the isotopic signatures of these lakes and additional background information such as the mean depth, we could identify lakes in which groundwater is an important component of the water balance. The results will be used as a basis for intensive research on groundwater-driven lake eutrophication.

  19. Liquid volume monitoring based on ultrasonic sensor and Arduino microcontroller

    Science.gov (United States)

    Husni, M.; Siahaan, D. O.; Ciptaningtyas, H. T.; Studiawan, H.; Aliarham, Y. P.

    2016-04-01

    Incidents of oil leakage and theft from oil tankers often happen. To prevent them, the liquid volume inside the tank needs to be monitored continuously. The aim of the study is to calculate the liquid volume inside an oil tank under any road condition and send the volume and location data to the user. This research uses ultrasonic sensors (to monitor the fluid height), Bluetooth modules (to send data from the sensors to the Arduino microcontroller), an Arduino microcontroller (to calculate the liquid volume), and a GPS/GPRS/GSM shield module (to obtain the location of the vehicle and send the data to the server). The experimental results show that the accuracy of monitoring the liquid volume inside the tanker is 99.33% when the vehicle is on a flat road and 84% when the vehicle is on a road with an elevation angle. Thus, this system can be used to monitor the tanker position and the liquid volume at any road position continuously via a web application to prevent illegal theft.
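    The volume computation can be sketched as follows; this is a simplified model, not the authors' firmware: a flat-bottomed tank of known base area and height, with top-mounted ultrasonic sensors measuring the distance to the liquid surface, and tilt compensated to first order by averaging the heights from several sensors.

    ```python
    def liquid_volume(distances_cm, tank_height_cm, base_area_cm2):
        """Estimate liquid volume from top-mounted ultrasonic readings
        (distance from each sensor down to the liquid surface).

        With several sensors spread over the tank roof, averaging the
        measured liquid heights compensates, to first order, for a
        tilted surface when the vehicle sits on an inclined road.
        Returns the volume in litres.
        """
        heights = [tank_height_cm - d for d in distances_cm]
        mean_height = sum(heights) / len(heights)
        volume_cm3 = base_area_cm2 * mean_height
        return volume_cm3 / 1000.0  # 1 litre = 1000 cm^3
    ```

    The 84% accuracy reported on inclined roads suggests the first-order averaging breaks down at larger elevation angles, where the surface no longer intersects all sensor lines of sight symmetrically.
    
    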

  20. International Source Book: Nuclear Fuel Cycle Research and Development, Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Harmon, K. M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lakey, L. T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    1983-07-01

    This document starts with an overview that summarizes nuclear power policies and waste management activities for nations with significant commercial nuclear fuel cycle activities either under way or planned. A more detailed program summary is then included for each country or international agency conducting nuclear fuel cycle and waste management research and development. This first volume includes the overview and the program summaries of those countries listed alphabetically from Argentina to Italy.

  1. Voltage Sag Source Location Based on Instantaneous Energy Detection

    DEFF Research Database (Denmark)

    Chen, Zhe; Kong, Wei; Dong, Xinzhou

    2008-01-01

    Voltage sag is a major power quality problem, which can disrupt the operation of voltage-sensitive equipment. This paper presents a method for voltage sag source detection based on the instantaneous energy of variation components. Simulations have been performed to provide a thorough analysis for systems with distributed generation units. The studies show that the presented method can effectively detect the location of a voltage sag source.
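    The underlying idea can be sketched with a disturbance-energy computation; this is a generic sketch, not the paper's variation-component extraction, and the sign convention (negative trend suggesting a downstream sag source) is an assumption of this illustration.

    ```python
    def disturbance_energy(v, i, v_ss, i_ss, dt):
        """Integrate the disturbance power (instantaneous power minus its
        pre-event steady-state value) at a monitoring point, returning the
        cumulative energy trace. In disturbance-energy-style methods, the
        sign of the accumulated energy during the sag is used to judge
        whether the sag source lies upstream or downstream of the monitor.
        """
        energy = 0.0
        trace = []
        for vk, ik, vs, is_ in zip(v, i, v_ss, i_ss):
            energy += (vk * ik - vs * is_) * dt
            trace.append(energy)
        return trace
    ```
    
    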

  2. A compact time-of-flight SANS instrument optimised for measurements of small sample volumes at the European Spallation Source

    Science.gov (United States)

    Kynde, Søren; Hewitt Klenø, Kaspar; Nagy, Gergely; Mortensen, Kell; Lefmann, Kim; Kohlbrecher, Joachim; Arleth, Lise

    2014-11-01

    The high flux at European Spallation Source (ESS) will allow for performing experiments with relatively small beam-sizes while maintaining a high intensity of the incoming beam. The pulsed nature of the source makes the facility optimal for time-of-flight small-angle neutron scattering (ToF-SANS). We find that a relatively compact SANS instrument becomes the optimal choice in order to obtain the widest possible q-range in a single setting and the best possible exploitation of the neutrons in each pulse and hence obtaining the highest possible flux at the sample position. The instrument proposed in the present article is optimised for performing fast measurements of small scattering volumes, typically down to 2×2×2 mm3, while covering a broad q-range from about 0.005 1/Å to 0.5 1/Å in a single instrument setting. This q-range corresponds to that available at a typical good BioSAXS instrument and is relevant for a wide set of biomacromolecular samples. A central advantage of covering the whole q-range in a single setting is that each sample has to be loaded only once. This makes it convenient to use the fully automated high-throughput flow-through sample changers commonly applied at modern synchrotron BioSAXS-facilities. The central drawback of choosing a very compact instrument is that the resolution in terms of δλ / λ obtained with the short wavelength neutrons becomes worse than what is usually the standard at state-of-the-art SANS instruments. Our McStas based simulations of the instrument performance for a set of characteristic biomacromolecular samples show that the resulting smearing effects still have relatively minor effects on the obtained data and can be compensated for in the data analysis. However, in cases where a better resolution is required in combination with the large simultaneous q-range characteristic of the instrument, we show that this can be obtained by inserting a set of choppers.

  3. A compact time-of-flight SANS instrument optimised for measurements of small sample volumes at the European Spallation Source

    Energy Technology Data Exchange (ETDEWEB)

    Kynde, Søren, E-mail: kynde@nbi.ku.dk [Niels Bohr Institute, University of Copenhagen (Denmark); Hewitt Klenø, Kaspar [Niels Bohr Institute, University of Copenhagen (Denmark); Nagy, Gergely [SINQ, Paul Scherrer Institute (Switzerland); Mortensen, Kell; Lefmann, Kim [Niels Bohr Institute, University of Copenhagen (Denmark); Kohlbrecher, Joachim, E-mail: Joachim.kohlbrecher@psi.ch [SINQ, Paul Scherrer Institute (Switzerland); Arleth, Lise, E-mail: arleth@nbi.ku.dk [Niels Bohr Institute, University of Copenhagen (Denmark)

    2014-11-11

    The high flux at European Spallation Source (ESS) will allow for performing experiments with relatively small beam-sizes while maintaining a high intensity of the incoming beam. The pulsed nature of the source makes the facility optimal for time-of-flight small-angle neutron scattering (ToF-SANS). We find that a relatively compact SANS instrument becomes the optimal choice in order to obtain the widest possible q-range in a single setting and the best possible exploitation of the neutrons in each pulse and hence obtaining the highest possible flux at the sample position. The instrument proposed in the present article is optimised for performing fast measurements of small scattering volumes, typically down to 2×2×2 mm{sup 3}, while covering a broad q-range from about 0.005 1/Å to 0.5 1/Å in a single instrument setting. This q-range corresponds to that available at a typical good BioSAXS instrument and is relevant for a wide set of biomacromolecular samples. A central advantage of covering the whole q-range in a single setting is that each sample has to be loaded only once. This makes it convenient to use the fully automated high-throughput flow-through sample changers commonly applied at modern synchrotron BioSAXS-facilities. The central drawback of choosing a very compact instrument is that the resolution in terms of δλ/λ obtained with the short wavelength neutrons becomes worse than what is usually the standard at state-of-the-art SANS instruments. Our McStas based simulations of the instrument performance for a set of characteristic biomacromolecular samples show that the resulting smearing effects still have relatively minor effects on the obtained data and can be compensated for in the data analysis. However, in cases where a better resolution is required in combination with the large simultaneous q-range characteristic of the instrument, we show that this can be obtained by inserting a set of choppers.

  4. Paul Scherrer Institute Scientific Report 1998. Volume VII: Swiss Light Source

    Energy Technology Data Exchange (ETDEWEB)

    Weyer, Heinz Josef; Bugmann, Marlen; Neuhaus, Sibylle [eds.

    1999-09-01

    The Swiss Light Source (SLS) is a medium-energy light source that also provides light of high brilliance in the hard X-ray regime. It is being constructed at PSI and is scheduled to be operational in 2001. A series of new features adopted for the design and operation of this machine is described in this annual report for 1998.

  5. Study of Automatic Fiber Placement Manipulator’s Robotic Kinematics Manipulability Based on Volume Element

    Directory of Open Access Journals (Sweden)

    Ge Xinfeng

    2013-02-01

    A method based on the volume element is proposed to measure the kinematic manipulability of a robotic manipulator. The operation space of a serial redundant automatic fiber placement manipulator is studied, leading to the conclusion that the larger the volume of the manipulator's operation space, the better its manipulability; the volume element, based on the kinematics of the redundant manipulator, is therefore proposed as an operational performance index. The operation space of an n-DOF serial manipulator is an n-dimensional Riemannian manifold, whose volume is calculated using the moving coordinate system and the exterior product from differential geometry. The resulting operation space volume is compared with the volume obtained in the literature via the inner product determinant, showing that the volume element is feasible as a kinematic operational performance index.

  6. An accelerator-based epithermal photoneutron source for BNCT

    Energy Technology Data Exchange (ETDEWEB)

    Nigg, D.W.; Mitchell, H.E.; Harker, Y.D.; Yoon, W.Y. [and others]

    1995-11-01

    Therapeutically-useful epithermal-neutron beams for BNCT are currently generated by nuclear reactors. Various accelerator-based neutron sources for BNCT have been proposed and some low intensity prototypes of such sources, generally featuring the use of proton beams and beryllium or lithium targets have been constructed. This paper describes an alternate approach to the realization of a clinically useful accelerator-based source of epithermal neutrons for BNCT that reconciles the often conflicting objectives of target cooling, neutron beam intensity, and neutron beam spectral purity via a two stage photoneutron production process.

  7. Cerium-Based, Intermetallic-Strengthened Aluminum Casting Alloy: High-Volume Co-product Development

    Science.gov (United States)

    Sims, Zachary C.; Weiss, D.; McCall, S. K.; McGuire, M. A.; Ott, R. T.; Geer, Tom; Rios, Orlando; Turchi, P. A. E.

    2016-07-01

    Several rare earth elements are considered by-products to rare earth mining efforts. By using one of these by-product elements in a high-volume application such as aluminum casting alloys, the supply of more valuable rare earths can be globally stabilized. Stabilizing the global rare earth market will decrease the long-term criticality of other rare earth elements. The low demand for Ce, the most abundant rare earth, contributes to the instability of rare earth extraction. In this article, we discuss a series of intermetallic-strengthened Al alloys that exhibit the potential for new high-volume use of Ce. The castability, structure, and mechanical properties of binary, ternary, and quaternary Al-Ce based alloys are discussed. We have determined Al-Ce based alloys to be highly castable across a broad range of compositions. Nanoscale intermetallics dominate the microstructure and are the theorized source of the high ductility. In addition, room-temperature physical properties appear to be competitive with existing aluminum alloys with extended high-temperature stability of the nanostructured intermetallic.

  8. A dynamical system of deposit and loan volumes based on the Lotka-Volterra model

    Science.gov (United States)

    Sumarti, N.; Nurfitriyana, R.; Nurwenda, W.

    2014-02-01

    In this research, we proposed a dynamical system of the deposit and loan volumes of a bank using a predator-prey paradigm, where loan volume is the predator and deposit volume the prey. The existence of loans depends on the existence of deposits, because the bank allocates the loan volume from a portion of the deposit volume. Three dynamical systems are constructed: a simple model, a model with a Michaelis-Menten response, and a model with a reserve requirement. The stability of the equilibria of each system is analysed via its linearised system.
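    The simple model can be sketched as a classical Lotka-Volterra system with deposits as prey and loans as predator; the functional form below is the textbook one and the coefficients are illustrative, not the values fitted in the paper.

    ```python
    def simulate(deposit0, loan0, r=0.05, a=0.002, b=0.001, m=0.03,
                 dt=0.01, steps=1000):
        """Explicit-Euler integration of a Lotka-Volterra style system
        where deposits D act as prey and loans L as predator:

            dD/dt = r*D - a*D*L    (deposits grow, are drawn down by lending)
            dL/dt = b*D*L - m*L    (loans are funded from deposits, amortise)

        All coefficients here are placeholders for illustration.
        Returns the state (D, L) after `steps` Euler steps of size dt.
        """
        D, L = deposit0, loan0
        for _ in range(steps):
            dD = r * D - a * D * L
            dL = b * D * L - m * L
            D += dt * dD
            L += dt * dL
        return D, L
    ```

    The nontrivial equilibrium of this form sits at D* = m/b, L* = r/a, which is the natural starting point for the linearised stability analysis the abstract mentions.
    
    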

  9. Segmenting Multi-Source images using hidden Markov fields with copula-based multivariate statistical distributions.

    Science.gov (United States)

    Lapuyade-Lahorgue, Jerome; Xue, Jing-Hao; Ruan, Su

    2017-03-21

    Nowadays, multi-source image acquisition attracts an increasing interest in many fields such as multi-modal medical image segmentation. Such acquisition aims at considering complementary information to perform image segmentation since the same scene has been observed by various types of images. However, strong dependency often exists between multi-source images. This dependency should be taken into account when we try to extract joint information for precisely making a decision. In order to statistically model this dependency between multiple sources, we propose a novel multi-source fusion method based on the Gaussian copula. The proposed fusion model is integrated in a statistical framework with the hidden Markov field inference in order to delineate a target volume from multi-source images. Estimation of parameters of the models and segmentation of the images are jointly performed by an iterative algorithm based on Gibbs sampling. Experiments are performed on multi-sequence MRI to segment tumors. The results show that the proposed method based on the Gaussian copula is effective to accomplish multi-source image segmentation.

  10. Estimating carbon stocks based on forest volume-age relationship

    Science.gov (United States)

    Hangnan, Y.; Lee, W.; Son, Y.; Kwak, D.; Nam, K.; Moonil, K.; Taesung, K.

    2012-12-01

    This research attempted to estimate the potential change of forest carbon stocks between 2010 and 2110 in South Korea, using the forest cover map and National Forest Inventory (NFI) data. Allometric functions (logistic regression models) of the volume-age relationship were developed to estimate carbon stock change over the coming 100 years for Pinus densiflora, Pinus koraiensis, Pinus rigida, Larix kaempferi, and Quercus spp. The current forest volume was estimated with the developed regression model and the 4th forest cover map. The future volume was predicted with the developed volume-age models by adding n years to the current age. As a result, we found that the total forest volume would increase from 126.89 m^3/ha to 246.61 m^3/ha and the carbon stocks would increase from 90.55 Mg C ha^(-1) to 174.62 Mg C ha^(-1) over 100 years if the current forest remains unchanged. The carbon stocks would increase by approximately 0.84 Mg C ha^(-1) yr^(-1), a high value compared with the -0.10 to 0.28 Mg C ha^(-1) yr^(-1) reported for other northern countries (Canada, Russia, China) in previous studies. This can be attributed to the fact that mixed forest and bamboo forest were not considered in this study. Moreover, the estimate is influenced by the fact that the change of carbon stocks was estimated without considering mortality, thinning, or changes in tree species.
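    The projection step described above can be sketched as follows; the logistic curve shape matches the abstract's volume-age models, but every parameter value and conversion factor below is an illustrative assumption, not a fitted NFI coefficient.

    ```python
    import math

    def stand_volume(age, v_max=250.0, k=0.08, a0=40.0):
        """Logistic volume-age curve V(a) = v_max / (1 + exp(-k*(a - a0))).
        v_max, k and a0 are placeholder values for illustration.
        Returns stem volume in m^3/ha."""
        return v_max / (1.0 + math.exp(-k * (age - a0)))

    def carbon_stock(volume_m3_ha, wood_density=0.5, carbon_fraction=0.5,
                     expansion_factor=1.4):
        """Convert stem volume to carbon stock (Mg C/ha) via a biomass
        expansion: volume * wood density * expansion factor * carbon
        fraction. The conversion factors are assumed, not species-fitted."""
        return volume_m3_ha * wood_density * expansion_factor * carbon_fraction

    def carbon_change(age_now, horizon=100):
        """Projected carbon stock change obtained, as in the abstract,
        by evaluating the volume-age curve at (current age + n years)."""
        return (carbon_stock(stand_volume(age_now + horizon))
                - carbon_stock(stand_volume(age_now)))
    ```
    
    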

  11. Control volume based hydrocephalus research; a phantom study

    Science.gov (United States)

    Cohen, Benjamin; Voorhees, Abram; Madsen, Joseph; Wei, Timothy

    2009-11-01

    Hydrocephalus is a complex spectrum of neurophysiological disorders involving perturbation of the intracranial contents; primarily increased intraventricular cerebrospinal fluid (CSF) volume and intracranial pressure are observed. CSF dynamics are highly coupled to the cerebral blood flows and pressures as well as the mechanical properties of the brain. Hydrocephalus, as such, is a very complex biological problem. We propose integral control volume analysis as a method of tracking these important interactions using mass and momentum conservation principles. As a first step in applying this methodology in humans, an in vitro phantom is used as a simplified model of the intracranial space. The phantom's design consists of a rigid container filled with a compressible gel. Within the gel a hollow spherical cavity represents the ventricular system and a cylindrical passage represents the spinal canal. A computer controlled piston pump supplies sinusoidal volume fluctuations into and out of the flow phantom. MRI is used to measure fluid velocity and volume change as functions of time. Independent pressure measurements and momentum flow rate measurements are used to calibrate the MRI data. These data are used as a framework for future work with live patients and normal individuals. Flow and pressure measurements on the flow phantom will be presented through the control volume framework.

  12. On the efficiency of stochastic volume sources for the determination of light meson masses

    CERN Document Server

    Endress, E; Wittig, H

    2011-01-01

    We investigate the efficiency of single timeslice stochastic sources for the calculation of light meson masses on the lattice as one varies the quark mass. Simulations are carried out with Nf = 2 flavours of non-perturbatively O(a) improved Wilson fermions for pion masses in the range of 450 - 760 MeV. Results for pseudoscalar and vector meson two-point correlation functions computed using stochastic as well as point sources are presented and compared. At fixed computational cost the stochastic approach reduces the variance considerably in the pseudoscalar channel for all simulated quark masses. The vector channel is more affected by the intrinsic stochastic noise. In order to obtain stable estimates of the statistical errors and a more pronounced plateau for the effective vector meson mass, a relatively large number of stochastic sources must be used.

  13. International Source Book: Nuclear Fuel Cycle Research and Development Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Harmon, K. M.; Lakey, L. T.

    1982-11-01

    This document starts with an overview that summarizes nuclear power policies and waste management activities for nations with significant commercial nuclear fuel cycle activities either under way or planned. A more detailed program summary is then included for each country or international agency conducting nuclear fuel cycle and waste management research and development. This second volume includes the program summaries of those countries listed alphabetically from Japan to Yugoslavia. Information on international agencies and associations, particularly the IAEA, NEA, and CEC, is provided also.

  14. Verification of Conjugate Heat Transfer Models in a Closed Volume with Radiative Heat Source

    Directory of Open Access Journals (Sweden)

    Maksimov Vyacheslav I.

    2016-01-01

    The results of verification of a mathematical model of convective-conductive heat transfer in a closed volume with thermally conductive enclosing structures are presented. Experiments were carried out to determine the floor temperature of premises under the working conditions of radiant heating systems. Comparison of the modelled temperature fields with the experiments showed good agreement. It is concluded that the mathematical model of conjugate heat transfer in an air cavity with heat-conducting and heat-retaining walls corresponds to the real process of temperature field formation in premises with gas infrared heater systems.

  15. Radiation Detection Equipment (RDE) Comparative Evaluation Test Program. Volume 1. Point Source Measurements

    Science.gov (United States)

    1994-08-01

    [Abstract not recoverable: the source record contains only OCR residue from the report documentation page and detector measurement tables (e.g., Ludlum Model 15 detector; Am(U) measurement data for shielded and unmoderated sources).]

  16. Efficient volume preserving approach for skeleton-based implicit surfaces

    Institute of Scientific and Technical Information of China (English)

    史红兵; 童若锋; 董金祥

    2003-01-01

    This paper presents an efficient way to preserve the volume of implicit surfaces generated by skeletons. Recursive subdivision is used to efficiently calculate the volume. The criterion for subdivision is obtained by using the property of density functions and treating different types of skeletons respectively to get accurate minimum and maximum distances from a cube to a skeleton. Compared with the criterion generated by other ways such as using traditional Interval Analysis, Affine Arithmetic, or Lipschitz condition, our approach is much better both in speed and accuracy.
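    The recursive-subdivision volume calculation can be sketched for the simplest case, a point skeleton whose implicit surface is a sphere of radius R; the exact minimum/maximum distance criterion matches the abstract's idea, while the half-credit rule for ambiguous leaf cells is an assumption of this sketch.

    ```python
    def box_min_max_dist(lo, hi, c):
        """Exact minimum and maximum distances from the axis-aligned box
        [lo, hi] to the point skeleton c."""
        dmin2 = dmax2 = 0.0
        for k in range(3):
            nearest = min(max(c[k], lo[k]), hi[k])
            farthest = lo[k] if abs(c[k] - lo[k]) > abs(c[k] - hi[k]) else hi[k]
            dmin2 += (c[k] - nearest) ** 2
            dmax2 += (c[k] - farthest) ** 2
        return dmin2 ** 0.5, dmax2 ** 0.5

    def volume(lo, hi, c, R, depth):
        """Recursively estimate the volume enclosed by the implicit
        surface |p - c| = R inside the box [lo, hi]."""
        dmin, dmax = box_min_max_dist(lo, hi, c)
        box = 1.0
        for k in range(3):
            box *= hi[k] - lo[k]
        if dmax <= R:          # box entirely inside the surface
            return box
        if dmin >= R:          # box entirely outside
            return 0.0
        if depth == 0:         # ambiguous leaf: credit half the cell
            return box / 2.0
        mid = [(lo[k] + hi[k]) / 2.0 for k in range(3)]
        total = 0.0
        for octant in range(8):  # subdivide into 8 sub-cubes
            nlo = [lo[k] if not (octant >> k) & 1 else mid[k] for k in range(3)]
            nhi = [mid[k] if not (octant >> k) & 1 else hi[k] for k in range(3)]
            total += volume(nlo, nhi, c, R, depth - 1)
        return total
    ```

    For a blended multi-skeleton surface, the same recursion applies once the min/max distances per skeleton type yield bounds on the density function, which is exactly where the paper's per-skeleton criteria come in.
    
    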

  17. Control volume based modelling of compressible flow in reciprocating machines

    DEFF Research Database (Denmark)

    Andersen, Stig Kildegård; Thomsen, Per Grove; Carlsen, Henrik

    2004-01-01

    conservation laws for mass, energy, and momentum applied to a staggered mesh consisting of two overlapping strings of control volumes. Loss mechanisms can be included directly in the governing equations of models by including them as terms in the conservation laws. Heat transfer, flow friction, and multidimensional effects must be calculated using empirical correlations; correlations for steady state flow can be used as an approximation. A transformation that assumes ideal gas is presented for transforming equations for masses and energies in control volumes into the corresponding pressures and temperatures...

  18. Virtual-Impedance-Based Control for Voltage-Source and Current-Source Converters

    DEFF Research Database (Denmark)

    Wang, Xiongfei; Li, YunWei; Blaabjerg, Frede

    2015-01-01

    The virtual impedance concept is increasingly used for the control of power electronic systems. Generally, the virtual impedance loop can either be embedded as an additional degree of freedom for active stabilization and disturbance rejection, or be employed as a command reference generator...... for the converters to provide ancillary services. This paper presents an overview of the virtual-impedance-based control strategies for voltage-source and current-source converters. The control output impedance shaping attained by the virtual impedances is generalized first using the impedance-based models...

  19. Development of Outcome-based, Multipollutant Mobile Source Indicators

    Science.gov (United States)

    Pachon, Jorge E.; Balachandran, Sivaraman; Hu, Yongtao; Mulholland, James A.; Darrow, Lyndsey A.; Sarnat, Jeremy A.; Tolbert, Paige E.; Russell, Armistead G.

    2013-01-01

    Multipollutant indicators of mobile source impacts are developed from readily available CO, NOx, and elemental carbon (EC) data for use in air quality and epidemiologic analysis. Two types of outcome-based integrated mobile source indicators (IMSI) are assessed. The first is derived from analysis of emissions of EC, CO and NOx such that pollutant concentrations are mixed and weighted based on emission ratios for both gasoline and diesel vehicles. The emission-based indicators (IMSIEB) capture the impact of mobile sources on air quality estimated from receptor models and their uncertainty is comparable to measurement and source apportionment uncertainties. The IMSIEB have larger correlation between two different receptor sites impacted by traffic than single pollutants, suggesting they are better indicators of the local impact of mobile sources. A sensitivity analysis of fractions of pollutants in a two-pollutant mixture and the inclusion in an epidemiologic model is conducted to develop a second set of indicators based on health outcomes. The health-based indicators (IMSIHB) are weighted combinations of CO, NOx and EC pairs that have the lowest p-value in their association with cardiovascular disease emergency department visits, possibly due to their better spatial representativeness. These outcome-based, multipollutant indicators can provide support for the setting of multipollutant air quality standards and other air quality management activities. PMID:22616285
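    As a rough illustration of an emission-weighted multipollutant indicator, the sketch below normalizes each pollutant series and mixes them with fixed weights; the weights and concentration values are invented for illustration and are not those derived in the paper.

    ```python
    import statistics

    def imsi(concentrations, weights):
        """Illustrative emission-weighted multipollutant indicator:
        each pollutant series is normalized by its own mean so species with
        different units can be mixed, then combined with emission-ratio weights
        (weights here are hypothetical, not those of the cited study)."""
        norm = {p: [x / statistics.mean(series) for x in series]
                for p, series in concentrations.items()}
        n = len(next(iter(norm.values())))
        return [sum(weights[p] * norm[p][i] for p in norm) for i in range(n)]

    conc = {
        "CO":  [0.4, 0.6, 0.5, 0.9],   # ppm, invented values
        "NOx": [20., 35., 25., 50.],   # ppb, invented values
        "EC":  [1.1, 1.8, 1.3, 2.6],   # ug/m3, invented values
    }
    weights = {"CO": 0.3, "NOx": 0.4, "EC": 0.3}  # hypothetical emission fractions
    series = imsi(conc, weights)
    ```

    Because the weights sum to one and each normalized series has unit mean, the indicator has unit mean by construction, which makes different candidate weightings comparable.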

  20. Macular Choroidal Thickness and Volume Measured by Swept-source Optical Coherence Tomography in Healthy Korean Children.

    Science.gov (United States)

    Lee, Jung Wook; Song, In Seok; Lee, Ju-hyang; Shin, Yong Un; Lim, Han Woong; Lee, Won June; Lee, Byung Ro

    2016-02-01

    To evaluate the thickness and volume of the choroid in healthy Korean children using swept-source optical coherence tomography. We examined 80 eyes of 40 healthy children and teenagers (choroidal thickness map. We also examined 44 eyes of 35 healthy adult volunteers (≥18 years) and compared adult measurements with the findings in children. The mean age of the children and teenagers was 9.47 ± 3.80 (4 to 17) vs. 55.04 ± 12.63 years (36 to 70 years) in the adult group (p choroid were thinner (p = 0.004, p = 0.002, respectively) than the surrounding areas. The mean choroidal volumes of the inner and outer nasal areas were smaller (p = 0.004, p = 0.003, respectively) than those of all the other areas in each circle. Among the nine subfields, all areas in the children, except the outer nasal subfield, were thicker than those in adults (p choroidal thickness (p choroidal thickness and volume in children and teenagers were significantly greater than in adults. The nasal choroid was significantly thinner than the surrounding areas. The pediatric subfoveal choroid is prone to thinning with increasing age, axial length, and refractive error. These differences should be considered when choroidal thickness is evaluated in children with chorioretinal diseases.

  1. Z - Source Multi Level Inverter Based PV Generation System

    Directory of Open Access Journals (Sweden)

    T. Lakhmi kanth

    2014-09-01

    Full Text Available In this paper a novel Z-source multilevel-inverter-based PV generation system is implemented and simulated in MATLAB-Simulink. Photovoltaic cells are a sound option for converting solar energy into electricity, but their high capital cost and low efficiency have kept them from being a fully attractive choice for electricity users. To enhance system performance, a Z-source multilevel inverter can be used in place of the conventional voltage source inverter (VSI) in a solar power generation system. The PV cell model is developed from circuit equations. The Z-source multilevel inverter is modeled to realize boosted DC-to-AC conversion (inversion) with low THD. The results show that the energy conversion efficiency of the ZSMLI is much improved compared with that of a conventional VSI, and FFT analysis is used to obtain the total THD.

  2. Line-Source Based X-Ray Tomography

    Directory of Open Access Journals (Sweden)

    Deepak Bharkhada

    2009-01-01

    Full Text Available Current computed tomography (CT) scanners, including micro-CT scanners, utilize a point x-ray source. As we target higher and higher spatial resolutions, the reduced x-ray focal spot size limits the temporal and contrast resolutions achievable. To overcome this limitation, in this paper we propose to use a line-shaped x-ray source so that many more photons can be generated in a given data acquisition interval. In reference to the simultaneous algebraic reconstruction technique (SART) for image reconstruction from projection data generated by an x-ray point source, here we develop a generalized SART algorithm for image reconstruction from projection data generated by an x-ray line source. Our numerical simulation results demonstrate the feasibility of our novel line-source based x-ray CT approach and the proposed generalized SART algorithm.
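    For orientation, a minimal SART iteration for a generic linear projection model can be sketched as follows; the system matrix, relaxation factor, and toy geometry are invented for illustration and do not reproduce the paper's line-source weighting, which would enter through the matrix A.

    ```python
    import numpy as np

    def sart(A, b, iterations=50, lam=1.0):
        """Minimal SART sketch for a linear projection model b = A @ x.
        A: (rays x pixels) system matrix, b: measured projections.
        The update divides each ray residual by that ray's total weight,
        backprojects, and divides each pixel by its total weight."""
        m, n = A.shape
        row_sums = A.sum(axis=1)          # total weight along each ray
        col_sums = A.sum(axis=0)          # total weight hitting each pixel
        row_sums[row_sums == 0] = 1.0
        col_sums[col_sums == 0] = 1.0
        x = np.zeros(n)
        for _ in range(iterations):
            residual = (b - A @ x) / row_sums
            x += lam * (A.T @ residual) / col_sums
        return x

    # Tiny 2-pixel, 3-ray example with a known solution x_true = [1, 2].
    A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
    x_true = np.array([1.0, 2.0])
    b = A @ x_true
    x = sart(A, b, iterations=200)
    ```

    For a consistent system like this toy example, the iteration converges to the exact solution; with noisy data a relaxation factor below one is typically used.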


  4. Collision Distance Detection Based on Swept Volume Strategy for Optimal Motion Plan

    Science.gov (United States)

    Huang, Tsai-Jeon

    A swept-volume strategy for detecting collision distances between a robot and obstacles is presented in this paper for motion planning based on optimization. The strategy uses the recursive quadratic programming optimization method to solve the motion planning problem, and is based on segmental swept volumes for convenient distance-to-contact calculation. Hermite interpolation is used to approximate the envelope bounding the swept volume. The new method can handle a modestly non-convex swept volume and yields accurate distance calculations. Examples are presented to illustrate and demonstrate the approach.

  5. Six transformer based asymmetrical embedded Z-source inverters

    DEFF Research Database (Denmark)

    Wei, Mo; Poh Chiang, Loh; Chi, Jin

    2013-01-01

    Embedded and asymmetrical embedded Z-source inverters were proposed to maintain a smooth input current/voltage across the dc source and within the impedance network, while retaining the shoot-through feature used to boost the dc-link voltage without adding a bulky filter at the input side. This paper introduces a class of transformer-based asymmetrical embedded Z-source inverters that keep the input current and voltage smooth while achieving enhanced voltage-boost capability. The presented inverters are verified experimentally with laboratory prototypes.

  6. Fiber-based swept-source terahertz radar.

    Science.gov (United States)

    Huang, Yu-Wei; Tseng, Tzu-Fang; Kuo, Chung-Chiu; Hwang, Yuh-Jing; Sun, Chi-Kuang

    2010-05-01

    We demonstrate an all-terahertz swept-source imaging radar operated at room temperature, using terahertz fibers for radiation delivery and a terahertz-fiber directional coupler acting as a Michelson interferometer. By taking advantage of the high water reflection contrast in the low terahertz regime, and by electrically sweeping a terahertz source at high speed combined with a fast rotating mirror, we obtained a living object's distance information at a high image frame rate. Our experiment showed that this fiber-based swept-source terahertz radar can be used in real time to locate concealed moving live objects with high stability.

  7. Multifunctional bulk plasma source based on discharge with electron injection

    Energy Technology Data Exchange (ETDEWEB)

    Klimov, A. S.; Medovnik, A. V. [Tomsk State University of Control Systems and Radioelectronics, Tomsk 634050 (Russian Federation); Tyunkov, A. V. [Tomsk State University of Control Systems and Radioelectronics, Tomsk 634050 (Russian Federation); Institute of High Current Electronics, Tomsk 634055 (Russian Federation); Savkin, K. P.; Shandrikov, M. V.; Vizir, A. V. [Institute of High Current Electronics, Tomsk 634055 (Russian Federation)

    2013-01-15

    A bulk plasma source, based on a high-current dc glow discharge with electron injection, is described. Electron injection and some special design features of the plasma arc emitter provide a plasma source with very long periods between maintenance down-times and a long overall lifetime. The source uses a sectioned sputter-electrode array with six individual sputter targets, each of which can be independently biased. This discharge assembly configuration provides multifunctional operation, including plasma generation from different gases (argon, nitrogen, oxygen, acetylene) and deposition of composite metal nitride and oxide coatings.

  8. Paul Scherrer Institute Scientific Report 1999. Volume VII: Swiss Light Source

    Energy Technology Data Exchange (ETDEWEB)

    Weyer, Heinz Josef; Bugmann, Marlen; Schuetz, Christine [eds.

    2000-07-01

    The Swiss Synchrotron Light Source (SLS) is a medium energy range light source that also provides light with high brilliance in the regime of hard X-rays. It is being constructed at PSI and scheduled to be operational in 2001. The progress of the construction of pre-injector, booster and storage ring as well as some of the details of new features that were adopted for the design and operation of this machine, are described in this annual report for 1999. An overview of the concept and status of the four SLS beamlines and the related infrastructure is also given. The last chapter contains 11 contributions which report on scientific activities of SLS staff members at synchrotron radiation facilities all over the world.

  9. Piezoelectric-based hybrid reserve power sources for munitions

    Science.gov (United States)

    Rastegar, J.; Kwok, P.

    2017-04-01

    Reserve power sources are used extensively in munitions and other devices, such as emergency devices or remote sensors that need to be powered only once and for a relatively short duration. Current chemical reserve power sources, including thermal batteries and liquid reserve batteries sometimes require more than 100 msec to become fully activated. In many applications, however, electrical energy is required in a few msec following the launch event. In such applications, other power sources are needed to provide power until the reserve battery is fully activated. The amount of electrical energy that is required by most munitions before chemical reserve batteries are fully activated is generally small and can be provided by properly designed piezoelectric-based energy harvesting devices. In this paper, the development of a hybrid reserve power source that is constructed by integration of a piezoelectric-based energy harvesting device with a reserve battery to provide power almost instantaneously upon munitions firing or other similar events is being reported. A review of the state of the art in piezoelectric-based electrical energy harvesting methods and devices and their charge collection electronics for use in the developed hybrid power sources is provided together with the results of testing of the piezoelectric component of the power source and its electronic safety and charge collection electronics.

  10. Piezoelectric-based hybrid reserve power sources for munitions

    Science.gov (United States)

    Rastegar, Jahangir; Pereira, Carlos M.; Feng, Dake

    2016-05-01

    Reserve power sources are used extensively in munitions and other devices, such as emergency devices or remote sensors, that have to be powered only once and for a relatively short duration. Current chemical reserve power sources, including thermal batteries and liquid reserve batteries, sometimes require in excess of 100 msec to become fully activated. In many applications, however, electrical energy is required within a few msec of the launch event. In such applications, other power sources must supply power until the reserve battery is fully activated. The amount of electrical energy required by most munitions before chemical reserve batteries are fully activated is generally small and can be provided by properly designed piezoelectric-based energy harvesting devices. In this paper the development of a hybrid reserve power source, obtained by integrating a piezoelectric-based energy harvesting device with a reserve battery, that can provide power almost instantaneously upon munitions firing or other similar events is reported. A review of the state of the art in piezoelectric-based electrical energy harvesting methods and devices, and their charge collection electronics, for use in the developed hybrid power sources is also provided, together with the results of testing the piezoelectric component of the power source and its electronic safety and charge collection electronics.

  11. Finite volume schemes for multi-dimensional hyperbolic systems based on the use of bicharacteristics

    OpenAIRE

    Lukácová-Medvid'ová, Maria; Saibertova, Jitka

    2004-01-01

    In this paper we present recent results for the bicharacteristic based finite volume schemes, the so-called finite volume evolution Galerkin (FVEG) schemes. These methods were proposed to solve multi-dimensional hyperbolic conservation laws. They combine the usually conflicting design objectives of using the conservation form and following the characteristics, or bicharacteristics. This is realized by combining the finite volume formulation with approximate evolution operators, which use bich...

  12. Finite volume schemes for multidimensional hyperbolic systems based on the use of bicharacteristics

    OpenAIRE

    Lukácová-Medvid'ová, Maria

    2003-01-01

    In this survey paper we present an overview of recent results for the bicharacteristics based finite volume schemes, the so-called finite volume evolution Galerkin (FVEG) schemes. These methods were proposed to solve multidimensional hyperbolic conservation laws. They combine the usually conflicting design objectives of using the conservation form and following the characteristics, or bicharacteristics. This is realized by combining the finite volume formulation with approximate evolution opera...

  13. Data-based matched-mode source localization for a moving source.

    Science.gov (United States)

    Yang, T C

    2014-03-01

    A data-based matched-mode source localization method is proposed in this paper for a moving source, using mode wavenumbers and depth functions estimated directly from the data, without requiring any environmental acoustic information or assuming any propagation model. The method is in theory free of the environmental mismatch problem because the mode replicas are estimated from the same data used to localize the source. Besides the estimation error due to the approximations made in deriving the data-based algorithms, the method has some inherent drawbacks: (1) it uses a smaller number of modes than theoretically possible because some modes are not resolved in the measurements, and (2) the depth search is limited to the depth span covered by the receivers. Using simulated data, it is found that the performance degradation due to the aforementioned approximations and limitations is marginal compared with the original matched-mode source localization method. The proposed method has the potential to estimate source range and depth from real data while remaining free of the environmental mismatch problem, noting that certain aspects of the estimation algorithms have previously been tested against data. The key issues are discussed in this paper.
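    The matched-mode idea, correlating measured modal amplitudes against replica amplitudes over a range/depth grid, can be illustrated with synthetic modes; the mode shapes, wavenumbers, and source position below are invented stand-ins for the data-derived quantities the paper uses.

    ```python
    import numpy as np

    # Two assumed (synthetic) modes over a 100 m water column.
    depths = np.linspace(0, 100, 201)
    k = np.array([0.70, 0.68])                     # assumed modal wavenumbers (rad/m)
    phi = np.array([np.sin(np.pi * depths / 100),  # assumed mode depth functions
                    np.sin(2 * np.pi * depths / 100)])

    def mode_amplitudes(src_depth, src_range):
        """Complex modal amplitudes excited by a unit point source."""
        phi_src = np.array([np.interp(src_depth, depths, p) for p in phi])
        return phi_src * np.exp(1j * k * src_range) / np.sqrt(k * src_range)

    true_z, true_r = 40.0, 5000.0
    data = mode_amplitudes(true_z, true_r)

    # Ambiguity surface: normalized correlation of data with replica amplitudes.
    zs = np.linspace(5, 95, 91)
    rs = np.linspace(3000, 7000, 81)
    amb = np.zeros((len(zs), len(rs)))
    for i, z in enumerate(zs):
        for j, r in enumerate(rs):
            rep = mode_amplitudes(z, r)
            amb[i, j] = abs(np.vdot(rep, data)) / (np.linalg.norm(rep) * np.linalg.norm(data))

    iz, ir = np.unravel_index(amb.argmax(), amb.shape)
    est_z, est_r = zs[iz], rs[ir]
    ```

    The peak of the ambiguity surface recovers the source position; with only two modes, periodic range sidelobes appear at intervals set by the wavenumber difference, which is one reason unresolved modes degrade the method.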

  14. High Performance GPU-Based Fourier Volume Rendering.

    Science.gov (United States)

    Abdellah, Marwan; Eldeib, Ayman; Sharawi, Amr

    2015-01-01

    Fourier volume rendering (FVR) is a significant visualization technique that has been used widely in digital radiography. As a result of its O(N² log N) time complexity, it provides a faster alternative to spatial domain volume rendering algorithms that are O(N³) computationally complex. Relying on the Fourier projection-slice theorem, this technique operates on the spectral representation of a 3D volume instead of processing its spatial representation to generate attenuation-only projections that look like X-ray radiographs. Due to the rapid evolution of its underlying architecture, the graphics processing unit (GPU) became an attractive competent platform that can deliver giant computational raw power compared to the central processing unit (CPU) on a per-dollar basis. The introduction of the compute unified device architecture (CUDA) technology enables embarrassingly-parallel algorithms to run efficiently on CUDA-capable GPU architectures. In this work, a high performance GPU-accelerated implementation of the FVR pipeline on CUDA-enabled GPUs is presented. This proposed implementation can achieve a speed-up of 117x compared to a single-threaded hybrid implementation that uses the CPU and GPU together by taking advantage of executing the rendering pipeline entirely on recent GPU architectures.

  15. High Performance GPU-Based Fourier Volume Rendering

    Directory of Open Access Journals (Sweden)

    Marwan Abdellah

    2015-01-01

    Full Text Available Fourier volume rendering (FVR) is a significant visualization technique that has been used widely in digital radiography. As a result of its O(N² log N) time complexity, it provides a faster alternative to spatial domain volume rendering algorithms that are O(N³) computationally complex. Relying on the Fourier projection-slice theorem, this technique operates on the spectral representation of a 3D volume instead of processing its spatial representation to generate attenuation-only projections that look like X-ray radiographs. Due to the rapid evolution of its underlying architecture, the graphics processing unit (GPU) became an attractive competent platform that can deliver giant computational raw power compared to the central processing unit (CPU) on a per-dollar basis. The introduction of the compute unified device architecture (CUDA) technology enables embarrassingly-parallel algorithms to run efficiently on CUDA-capable GPU architectures. In this work, a high performance GPU-accelerated implementation of the FVR pipeline on CUDA-enabled GPUs is presented. This proposed implementation can achieve a speed-up of 117x compared to a single-threaded hybrid implementation that uses the CPU and GPU together by taking advantage of executing the rendering pipeline entirely on recent GPU architectures.
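    The projection-slice theorem at the core of FVR can be checked in a few lines of NumPy: the central (zero-frequency) slice of a volume's 3D spectrum, inverse-transformed in 2D, equals the line-integral projection along the third axis. This CPU sketch only illustrates the theorem for an axis-aligned view; it is not the paper's CUDA implementation, and off-axis views would additionally require spectral interpolation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    vol = rng.random((32, 32, 32))

    # Spatial-domain projection: sum (line integral) along axis 0.
    proj_spatial = vol.sum(axis=0)

    # Fourier-domain route: 3D FFT, extract the central slice perpendicular
    # to the projection axis, then a 2D inverse FFT.
    spectrum = np.fft.fftn(vol)
    central_slice = spectrum[0, :, :]
    proj_fourier = np.fft.ifft2(central_slice).real
    ```

    The two projections agree to floating-point precision, which is why FVR can replace an O(N³) spatial accumulation with a slice extraction plus an O(N² log N) 2D inverse transform per view.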

  16. Open-source algorithm for automatic choroid segmentation of OCT volume reconstructions

    Science.gov (United States)

    Mazzaferri, Javier; Beaton, Luke; Hounye, Gisèle; Sayah, Diane N.; Costantino, Santiago

    2017-02-01

    The use of optical coherence tomography (OCT) to study ocular diseases associated with choroidal physiology is sharply limited by the lack of available automated segmentation tools. Current research largely relies on hand-traced, single B-Scan segmentations because commercially available programs require high quality images, and the existing implementations are closed, scarce and not freely available. We developed and implemented a robust algorithm for segmenting and quantifying the choroidal layer from 3-dimensional OCT reconstructions. Here, we describe the algorithm, validate and benchmark the results, and provide an open-source implementation under the General Public License for any researcher to use (https://www.mathworks.com/matlabcentral/fileexchange/61275-choroidsegmentation).

  17. Low-level radioactive waste source terms for the 1992 integrated data base

    Energy Technology Data Exchange (ETDEWEB)

    Loghry, S L; Kibbey, A H; Godbee, H W; Icenhour, A S; DePaoli, S M

    1995-01-01

    This technical manual presents updated generic source terms (i.e., unitized amounts and radionuclide compositions) developed for use in the Integrated Data Base (IDB) Program of the U.S. Department of Energy (DOE). These source terms were used in the IDB annual report, Integrated Data Base for 1992: Spent Fuel and Radioactive Waste Inventories, Projections, and Characteristics, DOE/RW-0006, Rev. 8, October 1992. They are useful as a basis for projecting future amounts (volume and radioactivity) of low-level radioactive waste (LLW) shipped for disposal at commercial burial grounds or sent for storage at DOE solid-waste sites. Commercial fuel cycle LLW categories include boiling-water reactor, pressurized-water reactor, fuel fabrication, and uranium hexafluoride (UF6) conversion. Commercial nonfuel cycle LLW includes institutional/industrial (I/I) waste. The LLW from DOE operations is categorized as uranium/thorium, fission product, induced activity, tritium, alpha, and "other". Fuel cycle commercial LLW source terms are normalized on the basis of net electrical output [MW(e)-year], except for UF6 conversion, which is normalized on the basis of heavy metal requirement [metric tons of initial heavy metal]. The nonfuel cycle commercial LLW source term is normalized on the basis of volume (cubic meters) and radioactivity (curies) for each subclass within the I/I category. The DOE LLW is normalized in a manner similar to that for commercial I/I waste. The revised source terms are based on the best available historical data through 1992.

  18. Source effects in analyzer-based X-ray phase contrast imaging with conventional sources

    Energy Technology Data Exchange (ETDEWEB)

    Hoennicke, M. G. [Universidade Federal da Integracao Latino-Americana, 85867-970 Foz do Iguacu, PR (Brazil); Manica, J. [Universidade Estadual do Oeste do Parana, 85867-970 Foz do Iguacu, PR (Brazil); Mazzaro, I.; Cusatis, C. [LORXI, Departamento de Fisica, Universidade Federal do Parana, Caixa Postal 19091, 81531-990 Curitiba, PR (Brazil); Huang, X.-R. [Advanced Photon Source, Argonne National Laboratory, Argonne, Illinois 60439 (United States)

    2012-11-15

    Several recent papers have shown the implementation of analyzer-based X-ray phase contrast imaging (ABI) with conventional X-ray sources. High flux is always a requirement to make the technique useful for biomedical applications. Here we present and discuss three important parameters that need to be taken into account when seeking high-flux ABI: anisotropic magnification, double image, and source-size spread due to intrinsically dispersive diffraction by asymmetrically cut crystals. If not well optimized, these parameters may introduce features into the acquired images that can mislead interpretation. A few ways to minimize these effects are implemented and discussed, including some experimental results.

  19. Tutorial on fiber-based sources for biophotonic applications

    Science.gov (United States)

    Taylor, James R.

    2016-06-01

    Fiber-based lasers and master oscillator power fiber amplifier configurations are described. These allow spectral versatility coupled with pulse width and pulse repetition rate selection in compact and efficient packages. This is enhanced through the use of nonlinear optical conversion in fibers and fiber-coupled nonlinear crystals, which can be integrated to provide all-fiber pump sources for diverse application. The advantages and disadvantages of sources based upon supercontinuum generation, stimulated Raman conversion, four-wave mixing, parametric generation and difference frequency generation, allowing spectral coverage from the UV to the mid-infrared, are considered.

  20. Volume-Averaged Model of Inductively-Driven Multicusp Ion Source

    Science.gov (United States)

    Patel, Kedar K.; Lieberman, M. A.; Graf, M. A.

    1998-10-01

    A self-consistent spatially averaged model of high-density oxygen and boron trifluoride discharges has been developed for a 13.56 MHz, inductively coupled multicusp ion source. We determine positive ion, negative ion, and electron densities, the ground state and metastable densities, and the electron temperature as functions of the control parameters: gas pressure, gas flow rate, input power and reactor geometry. Neutralization and fragmentation into atomic species are assumed for all ions hitting the wall. For neutrals, a wall recombination coefficient for oxygen atoms and a wall sticking coefficient for boron trifluoride (BF_3) and its dissociation products are the single adjustable parameters used to model the surface chemistry. For the aluminum walls of the ion source used in the Eaton ULE2 ion implanter, complete wall recombination of O atoms is found to give the best match to the experimental data for oxygen, whereas a sticking coefficient of 0.62 for all neutral species in a BF3 discharge was found to best match experimental data.

  1. Paul Scherrer Institute Scientific Report 2000. Volume VII: Swiss Light Source

    Energy Technology Data Exchange (ETDEWEB)

    Weyer, Heinz Josef; Bugmann, Marlen; Schuetz, Christine [eds.

    2001-07-01

    The Swiss Synchrotron Light Source (SLS) is a medium energy range light source that also provides light with high brilliance in the regime of hard X-rays. It is presently being constructed at PSI. The year 2000 was crucial for maintaining the project milestones: the start of storage ring commissioning at the beginning of 2001 and first light on the probe at the four phase-I beamlines in August 2001. The major goals of 2000 were the completion of accelerator installation, the commissioning of the linac and booster, and the beginning of beamline assembly. In the first half of the year, major fabrication procedures ran in parallel to the installation and had to be thoroughly followed up in order to guarantee their completion in time. The overview and detailed description of these developments are supplemented in this annual report by 8 contributions on scientific activities of SLS staff members at synchrotron radiation facilities all over the world. A list of scientific publications in 2000 is also provided.

  2. Source mask optimization study based on latest Nikon immersion scanner

    Science.gov (United States)

    Zhu, Jun; Wei, Fang; Chen, Lijun; Zhang, Chenming; Zhang, Wei; Nishinaga, Hisashi; El-Sewefy, Omar; Gao, Gen-Sheng; Lafferty, Neal; Meiring, Jason; Zhang, Recoo; Zhu, Cynthia

    2016-03-01

    The 2x nm logic foundry node has many challenges since critical levels are pushed close to the limits of low k1 ArF water immersion lithography. For these levels, improvements in lithographic performance can translate to decreased rework and increased yield. Source Mask Optimization (SMO) is one such route to realize these image fidelity improvements. During SMO, critical layout constructs are intensively optimized in both the mask and source domain, resulting in a solution for maximum lithographic entitlement. From the hardware side, advances in source technology have enabled free-form illumination. The approach allows highly customized illumination, enabling the practical application of SMO sources. The customized illumination sources can be adjusted for maximum versatility. In this paper, we present a study on a critical layer of an advanced foundry logic node using the latest ILT based SMO software, paired with state-of-the-art scanner hardware and intelligent illuminator. Performance of the layer's existing POR source is compared with the ideal SMO result and the installed source as realized on the intelligent illuminator of an NSR-S630D scanner. Both simulation and on-silicon measurements are used to confirm that the performance of the studied layer meets established specifications.

  3. A Parallax-based Distance Estimator for Spiral Arm Sources

    CERN Document Server

    Reid, M J; Menten, K M; Brunthaler, A

    2016-01-01

    The spiral arms of the Milky Way are being accurately located for the first time via trigonometric parallaxes of massive star forming regions with the BeSSeL Survey, using the Very Long Baseline Array and the European VLBI Network, and with the Japanese VERA project. Here we describe a computer program that leverages these results to significantly improve the accuracy and reliability of distance estimates to other sources that are known to follow spiral structure. Using a Bayesian approach, sources are assigned to arms based on their (l,b,v) coordinates with respect to arm signatures seen in CO and HI surveys. A source's kinematic distance, displacement from the plane, and proximity to individual parallax sources are also considered in generating a full distance probability density function. Using this program to estimate distances to large numbers of star forming regions, we generate a realistic visualization of the Milky Way's spiral structure as seen from the northern hemisphere.
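    The Bayesian combination described above, multiplying independent distance constraints into a single probability density function, can be sketched numerically; the Gaussian kinematic-distance likelihood and arm prior below are invented for illustration and do not reproduce the program's actual terms.

    ```python
    import numpy as np

    d = np.linspace(0.1, 20.0, 2000)          # distance grid (kpc)

    def gauss(x, mu, sigma):
        # Unnormalized Gaussian; normalization is applied to the product.
        return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

    # Invented constraints: a broad kinematic distance estimate and a
    # narrower prior from a spiral arm located by parallax sources.
    kinematic = gauss(d, 6.0, 1.5)
    arm_prior = gauss(d, 5.2, 0.4)

    # Posterior: product of independent terms, normalized to a PDF.
    posterior = kinematic * arm_prior
    posterior /= posterior.sum() * (d[1] - d[0])

    d_map = d[posterior.argmax()]             # maximum a posteriori distance
    ```

    The posterior peak lands between the two constraints but much closer to the tighter arm prior, which is exactly how well-measured parallax arms sharpen an otherwise ambiguous kinematic distance.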

  4. Phased-array sources based on nonlinear metamaterial nanocavities.

    Science.gov (United States)

    Wolf, Omri; Campione, Salvatore; Benz, Alexander; Ravikumar, Arvind P; Liu, Sheng; Luk, Ting S; Kadlec, Emil A; Shaner, Eric A; Klem, John F; Sinclair, Michael B; Brener, Igal

    2015-07-01

    Coherent superposition of light from subwavelength sources is an attractive prospect for the manipulation of the direction, shape and polarization of optical beams. This phenomenon constitutes the basis of phased arrays, commonly used at microwave and radio frequencies. Here we propose a new concept for phased-array sources at infrared frequencies based on metamaterial nanocavities coupled to a highly nonlinear semiconductor heterostructure. Optical pumping of the nanocavity induces a localized, phase-locked, nonlinear resonant polarization that acts as a source feed for a higher-order resonance of the nanocavity. Varying the nanocavity design enables the production of beams with arbitrary shape and polarization. As an example, we demonstrate two second harmonic phased-array sources that perform two optical functions at the second harmonic wavelength (∼5 μm): a beam splitter and a polarizing beam splitter. Proper design of the nanocavity and nonlinear heterostructure will enable such phased arrays to span most of the infrared spectrum.

  5. Source apportionment of fine particle number and volume concentration during severe haze pollution in Beijing in January 2013.

    Science.gov (United States)

    Liu, Zirui; Wang, Yuesi; Hu, Bo; Ji, Dongsheng; Zhang, Junke; Wu, Fangkun; Wan, Xin; Wang, Yonghong

    2016-04-01

    Extreme haze episodes repeatedly shrouded Beijing during the winter of 2012-2013, causing major environmental and health problems. To better understand these extreme events, particle number size distribution (PNSD) and particle chemical composition (PCC) data collected during an intensive winter campaign at an urban site in Beijing were used to investigate the sources of ambient fine particles. Positive matrix factorization (PMF) analysis resolved a total of eight factors: two traffic factors, combustion factors, secondary aerosol, two accumulation mode aerosol factors, road dust, and long-range transported (LRT) dust. Traffic emissions (54%) and combustion aerosol (27%) were the most important sources of particle number concentration, whereas combustion aerosol (33%) and accumulation mode aerosol (37%) dominated particle volume concentration. The chemical composition and sources of fine particles changed dynamically during the haze episodes. An enhanced role of secondary inorganic species was observed in the formation of haze pollution. Regional transport played an important role in the high particle loadings, contributing on average 24-49% during the haze episodes. Secondary aerosols from the urban background made the largest contribution (45%) to the rapid increase of fine particles in the severest haze episode. In addition, the invasion of LRT dust aerosols further elevated the fine particle loadings during the extreme haze episode. Our results show a clear impact of regional transport on local air pollution, underscoring the importance of regional-scale emission control measures in Beijing's air quality management.
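
    The factorization at the heart of PMF can be illustrated with a plain non-negative matrix factorization via Lee-Seung multiplicative updates (PMF additionally weights each matrix element by its measurement uncertainty, which this sketch omits; the matrix sizes and data below are synthetic).

```python
import numpy as np

def nmf(X, k, iters=500, seed=0):
    """Factor a non-negative data matrix X (samples x species) into
    source contributions W and factor profiles H, X ~= W @ H."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    W = rng.random((n, k)) + 1e-3
    H = rng.random((k, m)) + 1e-3
    eps = 1e-9  # guard against division by zero
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Synthetic two-source data set: 6 samples x 8 measured species
rng = np.random.default_rng(1)
X = rng.random((6, 2)) @ rng.random((2, 8))
W, H = nmf(X, 2)
rel_err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
```

    The non-negativity constraint is what lets the resolved factors be read as physical source profiles (traffic, combustion, dust) rather than arbitrary mathematical components.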

  6. Intensive neutrino source on the base of lithium converter

    CERN Document Server

    Lyashuk, V I

    2015-01-01

    An intense antineutrino source with a hard spectrum (energies up to 13 MeV, average energy 6.5 MeV) can be realized from the beta decay of the short-lived isotope 8Li (half-life 0.84 s). The 8Li isotope (generated by activation of 7Li) is a highly promising antineutrino source owing to its hard antineutrino spectrum and the quadratic dependence of the cross section on energy. Today nuclear reactors are the most intense neutrino sources, but reactor antineutrino spectra carry large uncertainties in the summed spectrum at energies E>6 MeV. Use of the 8Li isotope allows these uncertainties to be sharply reduced or excluded entirely. Intense neutron fluxes are required for rapid generation of 8Li. Installations based on accelerators can be an alternative to nuclear reactors as traditional neutron sources. A neutrino source of a fundamentally different kind is thus possible: one based on a tandem of accelerators, neutron-generating targets, and a lithium converter. An intensive neu...

  7. Calculation of Intercepted Volume of Sewer Overflows: a Model for Control of Nonpoint Pollution Sources in Urban Areas

    Institute of Scientific and Technical Information of China (English)

    S. C. Choi; D. I. Jung; C. H. Won; J. M. Rim

    2006-01-01

    The authors found large differences in the characteristics of overflows by calculating 1) the intercepting volume of sewer-system overflows using the SWMM model, which accounts for runoff and pollutants from rainfall, and 2) the intercepted volume within the total flow at an investigation site. The intercepting rate at the CSO investigation point was higher than at the SSDs. Modeling of the receiving water quality, after calculating the intercepted overflow volume from the outflow characteristics for proper management of rainfall-driven sewer overflows, showed that the BOD of the discharge decreased by 82.9%-94.0% after intercepting a specific amount of flow, compared with discharging unprocessed overflows.

  8. Micro-Power Sources Enabling Robotic Outpost Based Deep Space Exploration

    Science.gov (United States)

    West, W. C.; Whitacre, J. F.; Ratnakumar, B. V.; Brandon, E. J.; Studor, G. F.

    2001-01-01

    Robotic outpost based exploration represents a fundamental shift in mission design from conventional, single spacecraft missions towards a distributed risk approach with many miniaturized semi-autonomous robots and sensors. This approach can facilitate wide-area sampling and exploration, and may consist of a web of orbiters, landers, or penetrators. To meet the mass and volume constraints of deep space missions such as the Europa Ocean Science Station, the distributed units must be fully miniaturized to fully leverage the wide-area exploration approach. However, presently there is a dearth of available options for powering these miniaturized sensors and robots. This group is currently examining miniaturized, solid state batteries as candidates to meet the demand of applications requiring low power, mass, and volume micro-power sources. These applications may include powering microsensors, battery-backing rad-hard CMOS memory and providing momentary chip back-up power. Additional information is contained in the original extended abstract.

  9. New source performance standards for industrial boilers. Volume 2. Review of industry operating practices

    Energy Technology Data Exchange (ETDEWEB)

    Bryan, R.J.; Weisenberg, I.J.; Wilson, K.

    1980-09-01

    The applicability of several possible versions of a revised New Source Performance Standard (NSPS) for industrial boilers is evaluated against boilers operated according to typical industry practices. A survey of operating practices is presented, and it is concluded that an NSPS that includes too high a percent-removal requirement for SO/sub 2/ (90%) might be excessively costly and cause operating problems for the industrial operator. More field evaluations of low excess air and low-Btu gasification are required to validate these techniques for pollution control under industrial boiler operating conditions. The cost of two small boilers with no SO/sub 2/ controls was less than that of one large boiler of twice the capacity with SO/sub 2/ controls; the annual cost of operating and maintaining the control system accounted for the difference.

  10. H- extraction from electron-cyclotron-resonance-driven multicusp volume source operated in pulsed mode

    Science.gov (United States)

    Svarnas, P.; Bacal, M.; Auvray, P.; Béchu, S.; Pelletier, J.

    2006-03-01

    An H2 microwave (2.45 GHz) pulsed plasma is produced by seven elementary electron cyclotron resonance sources installed in the magnetic multipole chamber "Camembert III" (École Polytechnique, Palaiseau), from which H- extraction takes place. The extracted negative-ion and electron currents are studied through electrical measurements, and the plasma parameters by means of an electrostatic probe, under various experimental conditions. The roles of the plasma electrode bias and the discharge duty cycle in the extraction process are emphasized. The gas breakdown at the beginning of every pulse gives rise to variations of the characteristic plasma parameters compared with those established later in the pulse, where the electron temperature, the plasma potential, and the floating potential converge to the values obtained for a continuous plasma. The electron density is significantly enhanced in the pulsed mode.

  11. Advisory Committee on human radiation experiments. Supplemental Volume 2a, Sources and documentation appendices. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-01-01

    This large document provides a catalog of the location of large numbers of reports pertaining to the charge of the Presidential Advisory Committee on Human Radiation Research and is arranged as a series of appendices. Titles of the appendices are Appendix A- Records at the Washington National Records Center Reviewed in Whole or Part by DoD Personnel or Advisory Committee Staff; Appendix B- Brief Descriptions of Records Accessions in the Advisory Committee on Human Radiation Experiments (ACHRE) Research Document Collection; Appendix C- Bibliography of Secondary Sources Used by ACHRE; Appendix D- Brief Descriptions of Human Radiation Experiments Identified by ACHRE, and Indexes; Appendix E- Documents Cited in the ACHRE Final Report and other Separately Described Materials from the ACHRE Document Collection; Appendix F- Schedule of Advisory Committee Meetings and Meeting Documentation; and Appendix G- Technology Note.

  12. National Synchrotron Light Source annual report 1991. Volume 1, October 1, 1990--September 30, 1991

    Energy Technology Data Exchange (ETDEWEB)

    Hulbert, S.L.; Lazarz, N.M. [eds.]

    1992-04-01

    This report discusses the following research conducted at NSLS: atomic and molecular science; energy dispersive diffraction; lithography, microscopy and tomography; nuclear physics; UV photoemission and surface science; x-ray absorption spectroscopy; x-ray scattering and crystallography; x-ray topography; workshop on surface structure; workshop on electronic and chemical phenomena at surfaces; workshop on imaging; UV FEL machine reviews; VUV machine operations; VUV beamline operations; VUV storage ring parameters; x-ray machine operations; x-ray beamline operations; x-ray storage ring parameters; superconducting x-ray lithography source; SXLS storage ring parameters; the accelerator test facility; proposed UV-FEL user facility at the NSLS; global orbit feedback systems; and NSLS computer system.

  13. Status of the Linac based positron source at Saclay

    CERN Document Server

    Rey, J -M; Debu, P; Dzitko, H; Hardy, P; Liszkay, L; Lotrus, P; Muranaka, T; Noel, C; Perez, P; Pierret, O; Ruiz, N; Sacquin, Y

    2013-01-01

    Low energy positron beams are of major interest for fundamental science and materials science. IRFU has developed and built a slow positron source based on a compact, low energy (4.3 MeV) electron linac. The linac-based source will provide positrons for a magnetic storage trap and represents the first step of the GBAR experiment (Gravitational Behaviour of Antihydrogen at Rest), recently approved by CERN for installation in the Antiproton Decelerator hall. The installation built at Saclay is described with its main characteristics. The ultimate goal of the GBAR experiment is briefly presented, as well as the foreseen development of an industrial positron source dedicated to materials science laboratories.

  14. Note: Localization based on estimated source energy homogeneity

    Science.gov (United States)

    Turkaya, Semih; Toussaint, Renaud; Eriksen, Fredrik Kvalheim; Lengliné, Olivier; Daniel, Guillaume; Flekkøy, Eirik G.; Mâløy, Knut Jørgen

    2016-09-01

    Acoustic signal localization is a complex problem with a wide range of industrial and academic applications. Herein, we propose a localization method based on energy attenuation and inverted source amplitude comparison (termed estimated source energy homogeneity, or ESEH). This inversion is tested on both synthetic (numerical) data using a Lamb wave propagation model and experimental 2D plate data (recorded with 4 accelerometers sensitive up to 26 kHz). We compare the performance of this technique with classic source localization algorithms: arrival time localization, time reversal localization, and localization based on energy amplitude. Our technique is highly versatile and out-performs the conventional techniques in terms of error minimization and cost (both computational and financial).
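
    A toy version of the inversion can be sketched as follows, assuming an idealized 1/r^2 energy attenuation and a grid of candidate locations; the paper's Lamb-wave propagation model and sensor specifics are not reproduced. The candidate where the per-sensor back-projected source energies are most homogeneous is taken as the source.

```python
import numpy as np

def locate_eseh(sensors, amplitudes, candidates):
    """For each candidate point, back-project a source energy per sensor
    assuming 1/r^2 attenuation; return the point where they agree best."""
    best, best_spread = None, np.inf
    for p in candidates:
        r = np.linalg.norm(sensors - p, axis=1)
        energies = amplitudes * r**2                      # per-sensor estimates
        spread = np.std(energies) / (np.mean(energies) + 1e-12)
        if spread < best_spread:
            best, best_spread = p, spread
    return np.asarray(best)

# Synthetic test: 4 sensors at the corners of a unit square, source inside
sensors = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
source = np.array([0.3, 0.7])
amps = 1.0 / np.linalg.norm(sensors - source, axis=1) ** 2  # ideal amplitudes
xs = np.linspace(0.0, 1.0, 21)
candidates = np.array([[x, y] for x in xs for y in xs])
found = locate_eseh(sensors, amps, candidates)
```

    A gradient-based or multiscale search would replace the brute-force grid in practice; the homogeneity criterion itself is what distinguishes this approach from arrival-time methods.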

  15. Scan-based volume animation driven by locally adaptive articulated registrations.

    Science.gov (United States)

    Rhee, Taehyun; Lewis, J P; Neumann, Ulrich; Nayak, Krishna S

    2011-03-01

    This paper describes a complete system to create anatomically accurate example-based volume deformation and animation of articulated body regions, starting from multiple in vivo volume scans of a specific individual. In order to solve the correspondence problem across volume scans, a template volume is registered to each sample. The wide range of pose variations is first approximated by volume blend deformation (VBD), providing proper initialization of the articulated subject in different poses. A novel registration method is presented to efficiently reduce the computation cost while avoiding strong local minima inherent in complex articulated body volume registration. The algorithm highly constrains the degrees of freedom and search space involved in the nonlinear optimization, using hierarchical volume structures and locally constrained deformation based on the biharmonic clamped spline. Our registration step establishes a correspondence across scans, allowing a data-driven deformation approach in the volume domain. The results provide an occlusion-free person-specific 3D human body model, asymptotically accurate inner tissue deformations, and realistic volume animation of articulated movements driven by standard joint control estimated from the actual skeleton. Our approach also addresses the practical issues arising in using scans from living subjects. The robustness of our algorithms is tested by their applications on the hand, probably the most complex articulated region in the body, and the knee, a frequent subject area for medical imaging due to injuries.

  16. Solving Information-Based Problems: Evaluating Sources and Information

    Science.gov (United States)

    Brand-Gruwel, Saskia; Stadtler, Marc

    2011-01-01

    The focus of this special section is on the processes involved when solving information-based problems. Solving these problems requires from people that they are able to define the information problem, search and select usable and reliable sources and information and synthesise information into a coherent body of knowledge. An important aspect…

  18. Asymmetrical transformer-based embedded Z-source inverters

    DEFF Research Database (Denmark)

    Wei, Mo; Loh, Poh Chiang; Blaabjerg, Frede

    2013-01-01

    their performances, a number of asymmetrical transformer-based embedded Z-source inverters are proposed. Through theoretical derivation and experiments, the proposed inverters have been shown to draw a smooth input current and produce a high gain by varying the transformer turns ratio n. The range of variation for n...

  19. Fast Tunable Wavelength Sources Based on the Laser Diode Array

    Institute of Scientific and Technical Information of China (English)

    Sung-Chan; Cho; Hyun; Ha; Hong; Byoung-Whi; Kim

    2003-01-01

    We report a demonstration of a fast tunable wavelength source (TWS) based on a laser diode array coupled to an arrayed waveguide grating (AWG) multiplexer. The switching and optical characteristics of the TWS make it a candidate for implementing the wavelength-division space switch fabric for optical packet/burst switching.

  20. High brightness single photon sources based on photonic wires

    DEFF Research Database (Denmark)

    Claudon, J.; Bleuse, J.; Bazin, M.

    2009-01-01

    We present a novel single-photon-source based on the emission of a semiconductor quantum dot embedded in a single-mode photonic wire. This geometry ensures a very large coupling (> 95%) of the spontaneous emission to the guided mode. Numerical simulations show that a photon collection efficiency...

  1. Conceptional Design of the Laser Ion Source based Hadrontherapy Facility

    OpenAIRE

    Xie, Xiucui; Song, Mingtao; Zhang, Xiaohu

    2013-01-01

    A laser ion source (LIS), which can provide a carbon beam in a highly stripped charge state (C6+) and at high intensity (several tens of mA), would significantly change the overall design of a hadrontherapy facility. A LIS-based hadrontherapy facility is proposed, with the advantages of a short linac, a simple injection scheme, and a small synchrotron. Drawing on experience from the DPIS and HITFiL projects conducted at IMP, a conceptual design of the LIS based hadrontherapy facility will be pres...

  2. Definitions of database files and fields of the Personal Computer-Based Water Data Sources Directory

    Science.gov (United States)

    Green, J. Wayne

    1991-01-01

    This report describes the data-base files and fields of the personal computer-based Water Data Sources Directory (WDSD). The personal computer-based WDSD was derived from the U.S. Geological Survey (USGS) mainframe computer version. The mainframe version of the WDSD is a hierarchical data-base design. The personal computer-based WDSD is a relational data-base design. This report describes the data-base files and fields of the relational data-base design in dBASE IV (the use of brand names in this abstract is for identification purposes only and does not constitute endorsement by the U.S. Geological Survey) for the personal computer. The WDSD contains information on (1) the type of organization, (2) the major orientation of water-data activities conducted by each organization, (3) the names, addresses, and telephone numbers of offices within each organization from which water data may be obtained, (4) the types of data held by each organization and the geographic locations within which these data have been collected, (5) alternative sources of an organization's data, (6) the designation of liaison personnel in matters related to water-data acquisition and indexing, (7) the volume of water data indexed for the organization, and (8) information about other types of data and services available from the organization that are pertinent to water-resources activities.

  3. Polarized light source based on graphene-nanoribbon hybrid structure

    Science.gov (United States)

    Xu, Pengfei; Zhang, Han; Qian, Haoliang; Chen, Bigeng; Jiang, Xiaoshun; Wu, Yuanpeng; Liu, Xiaowei; Liu, Xu; Yang, Qing

    2017-07-01

    A nanoscale light source is the key element for an on-chip integrated optical communication system. As an important property of light sources, polarization can be exploited to improve the information capacity of optical communication and the sensitivity of optical sensing. We demonstrate a novel TE-polarized light source based on a graphene-nanoribbon (G-NR) hybrid structure. Thanks to the polarization-dependent absorption along the graphene layer, the randomly polarized emission of the nanoribbon (NR) can be transferred into the same TE polarization. In addition, lasing action in the G-NR hybrid structure is also investigated. We attribute the polarization control to the differential attenuation of electromagnetic modes in graphene. Our simulations reveal the electromagnetic field distribution and far-field polar images of the TE and TM modes in the nanoribbon, consistent with the experimental results. The compact G-NR hybrid structure offers a new way to realize polarization-controllable nanoscale light sources and to facilitate practical applications of nanowire and nanoribbon light sources.

  4. Temperature and relative density of atomic hydrogen in a multicusp H sup minus volume source

    Energy Technology Data Exchange (ETDEWEB)

    Bruneteau, A.M.; Hollos, G.; Bacal, M. (Laboratoire de Physique des Milieux Ionises, Laboratoire du Centre National de la Recherche Scientifique, Ecole Polytechnique, 91128 Palaiseau Cedex, (France)); Bretagne, J. (Laboratoire de Physique des Gaz et des Plasmas, LA73 du Centre National de la Recherche Scientifique, Universite de Paris-Sud, 91405 Orsay (France))

    1990-06-15

    The Balmer {beta} and {gamma} line shapes have been analyzed to determine the relative density and the temperature of hydrogen atoms in magnetic multicusp plasma generators. Results for a 90-V, 4--40-mTorr, 1--18-A conventional multicusp plasma generator and a 50-V, 4-mTorr, 1--15-A hybrid multicusp plasma generator are presented. The relative number density of hydrogen atoms increased smoothly with pressure and discharge current but never exceeded 10%. The absolute atomic number density in a 90-V 10-A discharge varied in proportion with pressure. The atomic temperature (in the 0.1--0.4-eV range) decreased with pressure and slowly increased with the discharge current. The role of atoms in the processes determining the H{sup {minus}} temperature and the H{sub 2} vibrational and rotational temperatures is discussed. The results confirm that in multicusp negative-ion sources collisional excitation of ground state atoms and molecules by energetic electrons is the dominant process in Balmer-{beta} and -{gamma} light emission.

  5. Atomic temperature and density in multicusp H sup minus volume sources

    Energy Technology Data Exchange (ETDEWEB)

    Bruneteau, A.M.; Hollos, G.; Leroy, R.; Berlemont, P.; Bacal, M. (Laboratoire du C.N.R.S., Ecole Polytechnique, 91128 Palaiseau Cedex (France)); Bertagne, J. (Laboratoire de Physique des Gaz et des Plasmas, LA73 du CNRS, Universite de Paris-Sud, 91405 Orsay (France))

    1990-08-05

    The Balmer {beta} and {gamma} line shapes have been analyzed to determine the relative density and the temperature of hydrogen atoms in magnetic multicusp plasma generators. Results for a 90 V, 4--40 mTorr, 1--18 A conventional multicusp plasma generator and a 50 V, 4 mTorr, 1--15 A hybrid multicusp plasma generator are presented. The relative number density of hydrogen atoms increases smoothly with pressure and discharge current but never exceeds 10%. The absolute atomic number density in a 90 V--10 A discharge varies in proportion with pressure. The atomic temperature (in the 0.1--0.4 eV range) decreases with pressure and slowly increases with the discharge current. The role of atoms in the processes determining the H{sup {minus}} temperature and the H{sub 2} vibrational and rotational temperatures is discussed. The results confirm that in multicusp negative ion sources collisional excitation of ground-state atoms and molecules by energetic electrons is the dominant process in Balmer {beta} and {gamma} light emission.

  6. New conformity indices based on the calculation of distances between the target volume and the volume of reference isodose

    Science.gov (United States)

    Park, J M; Park, S-Y; Ye, S-J; Kim, J H; Carlson, J

    2014-01-01

    Objective: To present conformity indices (CIs) based on the distance differences between the target volume (TV) and the volume of reference isodose (VRI). Methods: Points on the three-dimensional surfaces of the TV and the VRI were generated, and the averaged distances between the points on the TV and the VRI were calculated (CIdistance). The performance of the presented CIs was evaluated by analysing six situations: a perfect match; an expansion and a reduction of the distance from the centroid to the VRI, relative to the distance from the centroid to the TV, by 10%; a lateral shift of the VRI by 3 cm; a rotation of the VRI by 45°; and a spherical VRI having the same volume as the TV. The presented CIs were applied to clinical prostate and head and neck (H&N) plans. Results: For the perfect match, CIdistance was 0 with a standard deviation (SD) of 0. For the expansion and the reduction, CIdistance was 10 and −10, respectively, with SDs of 11. The average value of CIdistance in the prostate and H&N plans was 0.13 ± 7.44 and 6.04 ± 23.27, respectively. Conclusion: The performance of CIdistance was equal to or better than that of the conventional CIs. Advances in knowledge: Evaluating target conformity by the distances between the surfaces of the TV and the VRI could be more accurate than evaluation with volume information alone. PMID:25225915
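
    A simplified reading of the index can be checked numerically: if CIdistance is the mean percent difference of centroid-to-surface distances over matched directions, a perfect match scores 0 and a uniform 10% expansion scores 10, as reported. The function below is a sketch under that assumption, not the authors' implementation.

```python
import numpy as np

def ci_distance(r_tv, r_vri):
    """Mean and SD of the percent difference between centroid-to-surface
    distances of the VRI and the TV, sampled over matched directions
    (a simplified, hypothetical reading of the proposed index)."""
    diff = 100.0 * (r_vri - r_tv) / r_tv
    return float(diff.mean()), float(diff.std())

# Sanity checks on 500 sampled directions of a unit-radius target
r_tv = np.ones(500)
match = ci_distance(r_tv, r_tv)           # perfect match
expanded = ci_distance(r_tv, 1.1 * r_tv)  # uniform 10% expansion
```

    Non-uniform deformations (the paper's shift, rotation, and sphere cases) would show up as a nonzero SD even when the mean is small, which is the extra information this index carries over purely volumetric CIs.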

  7. Web-based volume slicer for 3D electron-microscopy data from EMDB.

    Science.gov (United States)

    Salavert-Torres, José; Iudin, Andrii; Lagerstedt, Ingvar; Sanz-García, Eduardo; Kleywegt, Gerard J; Patwardhan, Ardan

    2016-05-01

    We describe the functionality and design of the Volume slicer - a web-based slice viewer for EMDB entries. This tool uniquely provides the facility to view slices from 3D EM reconstructions along the three orthogonal axes and to rapidly switch between them and navigate through the volume. We have employed multiple rounds of user-experience testing with members of the EM community to ensure that the interface is easy and intuitive to use and the information provided is relevant. The impetus to develop the Volume slicer has been calls from the EM community to provide web-based interactive visualisation of 2D slice data. This would be useful for quick initial checks of the quality of a reconstruction. Again in response to calls from the community, we plan to further develop the Volume slicer into a fully-fledged Volume browser that provides integrated visualisation of EMDB and PDB entries from the molecular to the cellular scale.

  8. An accelerator-based epithermal photoneutron source for boron neutron capture therapy

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, H.E.

    1996-04-01

    Boron neutron capture therapy is an experimental binary cancer radiotherapy modality in which a boronated pharmaceutical that preferentially accumulates in malignant tissue is first administered, followed by exposing the tissue in the treatment volume to a thermal neutron field. Current usable beams are reactor-based but a viable alternative is the production of an epithermal neutron beam from an accelerator. Current literature cites various proposed accelerator-based designs, most of which are based on proton beams with beryllium or lithium targets. This dissertation examines the efficacy of a novel approach to BNCT treatments that incorporates an electron linear accelerator in the production of a photoneutron source. This source may help to resolve some of the present concerns associated with accelerator sources, including that of target cooling. The photoneutron production process is discussed as a possible alternate source of neutrons for eventual BNCT treatments for cancer. A conceptual design to produce epithermal photoneutrons by high-energy bremsstrahlung photons impinging on deuterium targets is presented along with computational and experimental neutron production data. A clinically acceptable filtered epithermal neutron flux on the order of 10{sup 7} neutrons per second per milliampere of electron current is shown to be obtainable. Additionally, the neutron beam is modified and characterized for BNCT applications by employing two unique moderating materials (an Al/AlF{sub 3} composite and a stacked Al/Teflon design) at various incident electron energies.

  9. An accelerator-based epithermal photoneutron source for boron neutron capture therapy

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Hannah E. [Georgia Inst. of Technology, Atlanta, GA (United States)

    1996-04-01

    Boron neutron capture therapy is an experimental binary cancer radiotherapy modality in which a boronated pharmaceutical that preferentially accumulates in malignant tissue is first administered, followed by exposing the tissue in the treatment volume to a thermal neutron field. Current usable beams are reactor-based but a viable alternative is the production of an epithermal neutron beam from an accelerator. Current literature cites various proposed accelerator-based designs, most of which are based on proton beams with beryllium or lithium targets. This dissertation examines the efficacy of a novel approach to BNCT treatments that incorporates an electron linear accelerator in the production of a photoneutron source. This source may help to resolve some of the present concerns associated with accelerator sources, including that of target cooling. The photoneutron production process is discussed as a possible alternate source of neutrons for eventual BNCT treatments for cancer. A conceptual design to produce epithermal photoneutrons by high-energy bremsstrahlung photons impinging on deuterium targets is presented along with computational and experimental neutron production data. A clinically acceptable filtered epithermal neutron flux on the order of 10^7 neutrons per second per milliampere of electron current is shown to be obtainable. Additionally, the neutron beam is modified and characterized for BNCT applications by employing two unique moderating materials (an Al/AlF3 composite and a stacked Al/Teflon design) at various incident electron energies.

  10. Laser wakefield accelerator based light sources: potential applications and requirements

    Energy Technology Data Exchange (ETDEWEB)

    Albert, F. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). NIF and Photon Sciences; Thomas, A. G. [Univ. of Michigan, Ann Arbor, MI (United States). Dept. of Nuclear Engineering and Radiological Sciences; Mangles, S. P.D. [Imperial College, London (United Kingdom). Blackett Lab.; Banerjee, S. [Univ. of Nebraska, Lincoln, NE (United States); Corde, S. [SLAC National Accelerator Lab., Menlo Park, CA (United States); Flacco, A. [ENSTA, CNRS, Ecole Polytechnique, Palaiseau (France); Litos, M. [SLAC National Accelerator Lab., Menlo Park, CA (United States); Neely, D. [Science and Technology Facilities Council (STFC), Oxford (United Kingdom). Rutherford Appleton Lab. (RAL). Central Laser Facility; Viera, J. [Univ. of Lisbon (Portugal). GoLP-Inst. de Plasmas e Fusao Nuclear-Lab. Associado; Najmudin, Z. [Imperial College, London (United Kingdom). Blackett Lab.; Bingham, R. [Science and Technology Facilities Council (STFC), Oxford (United Kingdom). Rutherford Appleton Lab. (RAL). Central Laser Facility; Joshi, C. [Univ. of California, Los Angeles, CA (United States). Dept. of Electrical Engineering; Katsouleas, T. [Duke Univ., Durham, NC (United States). Platt School of Engineering

    2015-01-15

    In this article we review the prospects of laser wakefield accelerators as next generation light sources for applications. This work arose as a result of discussions held at the 2013 Laser Plasma Accelerators Workshop. X-ray phase contrast imaging, X-ray absorption spectroscopy, and nuclear resonance fluorescence are highlighted as potential applications for laser-plasma based light sources. We discuss ongoing and future efforts to improve the properties of radiation from plasma betatron emission and Compton scattering using laser wakefield accelerators for these specific applications.

  11. A volume-based method for denoising on curved surfaces

    KAUST Repository

    Biddle, Harry

    2013-09-01

    We demonstrate a method for removing noise from images or other data on curved surfaces. Our approach relies on in-surface diffusion: we formulate both the Gaussian diffusion and Perona-Malik edge-preserving diffusion equations in a surface-intrinsic way. Using the Closest Point Method, a recent technique for solving partial differential equations (PDEs) on general surfaces, we obtain a very simple algorithm where we merely alternate a time step of the usual Gaussian diffusion (and similarly Perona-Malik) in a small 3D volume containing the surface with an interpolation step. The method uses a closest point function to represent the underlying surface and can treat very general surfaces. Experimental results include image filtering on smooth surfaces, open surfaces, and general triangulated surfaces. © 2013 IEEE.
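
    The planar analogue of the edge-preserving diffusion used here is easy to sketch; the Closest Point Method then runs exactly this kind of grid step in a small 3D volume around the surface, alternated with closest-point interpolation. A minimal flat-2D Perona-Malik step (periodic boundaries, illustrative parameters):

```python
import numpy as np

def perona_malik_step(u, dt=0.1, kappa=0.1):
    """One explicit step of Perona-Malik edge-preserving diffusion on a
    2D grid with periodic boundaries. The edge-stopping function g damps
    diffusion across large jumps (edges) while smoothing gentle noise."""
    dn = np.roll(u, -1, axis=0) - u
    ds = np.roll(u, 1, axis=0) - u
    de = np.roll(u, -1, axis=1) - u
    dw = np.roll(u, 1, axis=1) - u
    g = lambda d: np.exp(-(d / kappa) ** 2)  # edge-stopping function
    return u + dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)

# One step on a small noisy field: mean is conserved, variance shrinks
u0 = 0.05 * np.random.default_rng(0).standard_normal((16, 16))
u1 = perona_malik_step(u0)
```

    Setting g to 1 recovers plain Gaussian diffusion; the surface-intrinsic version in the paper replaces the flat-grid Laplacian stencil with the same computation in a narrow band of 3D grid points, followed by re-extension of values from each grid point's closest surface point.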

  12. Dynamic Gain Equalizer Based on the H-PDLC Volume Phase Grating

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    The structure and Bragg diffraction characteristics of volume phase gratings based on H-PDLC technology are presented, and the principles and simulation aided design of dynamic gain equalizers with the gratings are discussed.

  13. Annual Conference on HAN-Based Liquid Propellants. Volume 1

    Science.gov (United States)

    1989-05-01

    the methods showed excellent agreement. For mixtures we used a revised Redlich-Kwong equation [10] with eqn. (2). Henry’s constants for gases in...

  14. Volume-outcome relation for acute appendicitis: evidence from a nationwide population-based study.

    Directory of Open Access Journals (Sweden)

    Po-Li Wei

    Full Text Available BACKGROUND: Although procedures like appendectomy have been studied extensively, the relative importance of each surgeon's surgical volume-to-ruptured appendicitis has not been explored. The purpose of this study was to investigate the rate of ruptured appendicitis by surgeon-volume groups as a measure of quality of care for appendicitis by using a nationwide population-based dataset. METHODS: We identified 65,339 first-time hospitalizations with a discharge diagnosis of acute appendicitis (International Classification of Disease, Ninth Revision, Clinical Modification (ICD-9-CM) codes 540, 540.0, 540.1 and 540.9) between January 2007 and December 2009. We used "whether or not a patient had a perforated appendicitis" as the outcome measure. A conditional (fixed-effect) logistic regression model was performed to explore the odds of perforated appendicitis among surgeon case volume groups. RESULTS: Patients treated by low-volume surgeons had significantly higher morbidity rates than those treated by high-volume (28.1% vs. 26.1%, p<0.001) and very-high-volume surgeons (28.1% vs. 21.4%, p<0.001). After adjusting for surgeon practice location, teaching status of practice hospital, patient age, gender, Charlson Comorbidity Index, and hospital acute appendicitis volume, patients treated by low-volume surgeons had significantly higher rates of perforated appendicitis than those treated by medium-volume surgeons (OR = 1.09, p<0.001), high-volume surgeons (OR = 1.16, p<0.001), or very-high-volume surgeons (OR = 1.54, p<0.001). CONCLUSION: Our study suggested that surgeon volume is an important factor with regard to the rate of ruptured appendicitis.

  15. A Well-Clear Volume Based on Time to Entry Point

    Science.gov (United States)

    Narkawicz, Anthony J.; Munoz, Cesar A.; Upchurch, Jason M.; Chamberlain, James P.; Consiglio, Maria C.

    2014-01-01

    A well-clear volume is a key component of NASA's Separation Assurance concept for the integration of UAS in the NAS. This paper proposes a mathematical definition of the well-clear volume that uses, in addition to distance thresholds, a time threshold based on time to entry point (TEP). The mathematical model that results from this definition is more conservative than other candidate definitions of the well-clear volume that are based on range over closure rate and time to closest point of approach.
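As a toy illustration of combining a distance threshold with a TEP-style time threshold, the sketch below uses a simplified 2D relative geometry of our own devising (parameter names and values are invented), not the paper's exact definition:

```python
import math

def time_to_entry(sx, sy, vx, vy, D):
    """Earliest t >= 0 at which the relative track s + t*v enters radius D,
    or math.inf if it never does. (Illustrative; our own simplification.)"""
    a = vx * vx + vy * vy
    b = 2 * (sx * vx + sy * vy)
    c = sx * sx + sy * sy - D * D
    if c <= 0:
        return 0.0                        # already inside the distance threshold
    if a == 0:
        return math.inf                   # no relative motion
    disc = b * b - 4 * a * c
    if disc < 0:
        return math.inf                   # track never reaches the circle
    t = (-b - math.sqrt(disc)) / (2 * a)  # first crossing of the boundary
    return t if t >= 0 else math.inf

def well_clear_violation(sx, sy, vx, vy, D=1.0, T=60.0):
    # Violation when inside the distance threshold, or predicted to enter it
    # within the time threshold T (seconds)
    return time_to_entry(sx, sy, vx, vy, D) <= T

# Head-on geometry: 5 NM apart, closing at 0.1 NM/s -> entry in about 40 s
print(well_clear_violation(5.0, 0.0, -0.1, 0.0, D=1.0, T=60.0))  # True
```

A diverging track with the same separation (`vx = +0.1`) never enters the volume, so the predicate returns `False`.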

  16. Optimal Source-Based Filtering of Malicious Traffic

    CERN Document Server

    Soldo, Fabio; Markopoulou, Athina

    2010-01-01

    In this paper, we consider the problem of blocking malicious traffic on the Internet, via source-based filtering. In particular, we consider filtering via access control lists (ACLs): these are already available at the routers today but are a scarce resource because they are stored in the expensive ternary content addressable memory (TCAM). Aggregation (by filtering source prefixes instead of individual IP addresses) helps reduce the number of filters, but also comes at the cost of blocking legitimate traffic originating from the filtered prefixes. We show how to optimally choose which source prefixes to filter, for a variety of realistic attack scenarios and operators' policies. In each scenario, we design optimal, yet computationally efficient, algorithms. Using logs from Dshield.org, we evaluate the algorithms and demonstrate that they bring significant benefit in practice.
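The aggregation trade-off can be sketched with a toy greedy selector. The paper designs optimal algorithms; this simplified version, with a made-up penalty weight and a candidate set limited to /32 and /24 prefixes, only illustrates how filtering a prefix trades blocked attackers against collateral damage:

```python
import ipaddress

def candidate_filters(bad_ips):
    # Candidate ACL entries: each bad /32 plus its enclosing /24
    cands = set()
    for ip in bad_ips:
        cands.add(ipaddress.ip_network(ip + "/32"))
        cands.add(ipaddress.ip_network(ip + "/24", strict=False))
    return cands

def choose_filters(bad_ips, good_ips, budget, penalty=2.0):
    """Greedy sketch: repeatedly pick the filter with the best score
    (bad addresses newly blocked minus penalty * good addresses blocked)."""
    good = set(ipaddress.ip_address(i) for i in good_ips)
    remaining = set(ipaddress.ip_address(i) for i in bad_ips)
    cands = candidate_filters(bad_ips)
    chosen = []
    for _ in range(budget):
        def score(net):
            hits = sum(1 for i in remaining if i in net)   # attackers blocked
            harm = sum(1 for i in good if i in net)        # collateral damage
            return hits - penalty * harm
        best = max(cands, key=score)
        if score(best) <= 0:
            break
        chosen.append(best)
        remaining -= {i for i in remaining if i in best}
    return chosen

bad = ["10.0.0.%d" % i for i in range(1, 200)] + ["192.168.1.7"]
good = ["10.0.0.250", "192.168.1.1", "192.168.1.2"]
filters = choose_filters(bad, good, budget=2)
print([str(f) for f in filters])   # ['10.0.0.0/24', '192.168.1.7/32']
```

A dense cluster of attackers is worth aggregating into one /24 despite one blocked legitimate host, while the isolated attacker gets its own /32.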

  17. Open Source Web Based Geospatial Processing with OMAR

    Directory of Open Access Journals (Sweden)

    Mark Lucas

    2009-01-01

    Full Text Available The availability of geospatial data sets is exploding. New satellites, aerial platforms, video feeds, global positioning system tagged digital photos, and traditional GIS information are dramatically increasing across the globe. These raw materials need to be dynamically processed, combined and correlated to generate value added information products to answer a wide range of questions. This article provides an overview of OMAR web based geospatial processing. OMAR is part of the Open Source Software Image Map project under the Open Source Geospatial Foundation. The primary contributors of OSSIM make their livings by providing professional services to US Government agencies and programs. OMAR provides one example that open source software solutions are increasingly being deployed in US government agencies. We will also summarize the capabilities of OMAR and its plans for near term development.

  18. Transaction-Based Building Controls Framework, Volume 1: Reference Guide

    Energy Technology Data Exchange (ETDEWEB)

    Somasundaram, Sriram [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Pratt, Robert G. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Akyol, Bora A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fernandez, Nicholas [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Foster, Nikolas AF [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Katipamula, Srinivas [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Mayhorn, Ebony T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Somani, Abhishek [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Steckley, Andrew C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Taylor, Zachary T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-01

    This document proposes a framework concept to achieve the objectives of raising buildings’ efficiency and energy savings potential, benefitting building owners and operators. We call it a transaction-based framework, wherein mutually beneficial and cost-effective market-based transactions can be enabled between multiple players across different domains. Transaction-based building controls are one part of the transactional energy framework. While these controls realize benefits by enabling automatic, market-based intra-building efficiency optimizations, the transactional energy framework provides similar benefits using the same market-based structure, yet on a larger scale and beyond just buildings, to society at large.

  19. Transaction-Based Building Controls Framework, Volume 1: Reference Guide

    Energy Technology Data Exchange (ETDEWEB)

    Somasundaram, Sriram; Pratt, Robert G.; Akyol, Bora A.; Fernandez, Nicholas; Foster, Nikolas AF; Katipamula, Srinivas; Mayhorn, Ebony T.; Somani, Abhishek; Steckley, Andrew C.; Taylor, Zachary T.

    2014-04-28

    This document proposes a framework concept to achieve the objectives of raising buildings’ efficiency and energy savings potential, benefitting building owners and operators. We call it a transaction-based framework, wherein mutually beneficial and cost-effective market-based transactions can be enabled between multiple players across different domains. Transaction-based building controls are one part of the transactional energy framework. While these controls realize benefits by enabling automatic, market-based intra-building efficiency optimizations, the transactional energy framework provides similar benefits using the same market-based structure, yet on a larger scale and beyond just buildings, to society at large.

  20. A Novel Video Data-Source Authentication Model Based on Digital Watermarking and MAC in Multicast

    Institute of Scientific and Technical Information of China (English)

    ZHAO Anjun; LU Xiangli; GUO Lei

    2006-01-01

    A novel video data authentication model based on digital video watermarking and MAC (message authentication code) in multicast protocol is proposed in this paper. The digital watermark, which is composed of the MAC of the significant video content, the key, and instant authentication data, is embedded into the insignificant video component by the MLUT (modified look-up table) video watermarking technology. We explain a method that does not require storage of each data packet for a time, thus making the receiver not vulnerable to DOS (denial of service) attacks. The video packets can therefore be authenticated instantly without a large buffer at the receivers. TESLA (timed efficient stream loss-tolerant authentication) does not explain how to select a suitable value for d, an important parameter in multicast source authentication, so we give a method to calculate the key disclosure delay (number of intervals). Simulation results show that the proposed algorithms improve the performance of data source authentication in multicast.

  1. Cavity-Enhanced Single-Photon Source Based on the Silicon-Vacancy Center in Diamond

    Science.gov (United States)

    Benedikter, Julia; Kaupp, Hanno; Hümmer, Thomas; Liang, Yuejiang; Bommer, Alexander; Becher, Christoph; Krueger, Anke; Smith, Jason M.; Hänsch, Theodor W.; Hunger, David

    2017-02-01

    Single-photon sources are an integral part of various quantum technologies, and solid-state quantum emitters at room temperature appear to be a promising implementation. We couple the fluorescence of individual silicon-vacancy centers in nanodiamonds to a tunable optical microcavity to demonstrate a single-photon source with high efficiency, increased emission rate, and improved spectral purity compared to the intrinsic emitter properties. We use a fiber-based microcavity with a mode volume as small as 3.4 λ³ and a quality factor of 1.9 × 10⁴ and observe an effective Purcell factor of up to 9.2. Furthermore, we study modifications of the internal rate dynamics and propose a rate model that closely agrees with the measurements. We observe lifetime changes of up to 31%, limited by the finite quantum efficiency of the emitters studied here. With improved materials, our achieved parameters predict single-photon rates beyond 1 GHz.
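For orientation, the ideal Purcell factor implied by the quoted cavity figures follows from the textbook formula F = (3/4π²)(λ/n)³(Q/V). Evaluated with the quoted Q and mode volume (and n = 1, so the wavelength cancels), it comes out far above the observed effective factor of 9.2; the gap is the standard caveat (not stated in the record) that for broad room-temperature emitters the emitter linewidth, not the cavity linewidth, limits the enhancement:

```python
import math

# Ideal Purcell factor F = 3/(4*pi^2) * (lambda/n)^3 * Q/V; with the mode
# volume V quoted in units of lambda^3 (and n = 1), the wavelength cancels.
def purcell_ideal(Q, V_in_lambda3):
    return 3.0 / (4.0 * math.pi ** 2) * Q / V_in_lambda3

F_ideal = purcell_ideal(1.9e4, 3.4)
print(round(F_ideal))   # ~425, versus the measured effective factor of 9.2
```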

  2. Applications of laser wakefield accelerator-based light sources

    Science.gov (United States)

    Albert, Félicie; Thomas, Alec G. R.

    2016-11-01

    Laser-wakefield accelerators (LWFAs) were proposed more than three decades ago, and while they promise to deliver compact, high energy particle accelerators, they will also provide the scientific community with novel light sources. In a LWFA, where an intense laser pulse focused onto a plasma forms an electromagnetic wave in its wake, electrons can be trapped and are now routinely accelerated to GeV energies. From terahertz radiation to gamma-rays, this article reviews light sources from relativistic electrons produced by LWFAs, and discusses their potential applications. Betatron motion, Compton scattering and undulators respectively produce x-rays or gamma-rays by oscillating relativistic electrons in the wakefield behind the laser pulse, a counter-propagating laser field, or a magnetic undulator. Other LWFA-based light sources include bremsstrahlung and terahertz radiation. We first evaluate the performance of each of these light sources, and compare them with more conventional approaches, including radio frequency accelerators or other laser-driven sources. We then identify applications, which we discuss in detail, in a broad range of fields: medical and biological applications, military, defense and industrial applications, and condensed matter and high energy density science.

  3. Robust Source Localization in Shallow Water Based on Vector Optimization

    Institute of Scientific and Technical Information of China (English)

    SONG Hai-yan; SHI Jie; LIU Bo-sheng

    2013-01-01

    Owing to the multipath effect, the source localization in shallow water has been an area of active interest. However, most methods for source localization in shallow water are sensitive to the assumed model of the underwater environment and have poor robustness against the underwater channel uncertainty, which limit their further application in practical engineering. In this paper, a new method of source localization in shallow water, based on vector optimization concept, is described, which is highly robust against environmental factors affecting the localization, such as the channel depth, the bottom reflection coefficients, and so on. Through constructing the uncertainty set of the source vector errors and extracting the multi-path sound rays from the sea surface and bottom, the proposed method can accurately localize one or more sources in shallow water dominated by multipath propagation. It turns out that the natural formulation of our approach involves minimization of two quadratic functions subject to infinitely many nonconvex quadratic constraints. It shows that this problem (originally intractable) can be reformulated in a convex form as the so-called second-order cone program (SOCP) and solved efficiently by using the well-established interior point method, such as the software tool, SeDuMi. Computer simulations show better performance of the proposed method as compared with existing algorithms and establish a theoretical foundation for the practical engineering application.

  4. Source distance determination based on the spherical harmonics

    Science.gov (United States)

    Koutny, Adam; Jiricek, Ondrej; Thomas, Jean-Hugh; Brothanek, Marek

    2017-02-01

    This paper deals with the processing of signals measured by a spherical microphone array, focusing on the utilization of near-field information of such an array. The processing, based on the spherical harmonics decomposition, is performed in order to investigate the radially dependent spherical functions and extract their argument, the distance to the source. Using the low-frequency approximation of these functions, the source distance is explicitly expressed. The source distance is also determined from the original equation (using no approximation) by comparing both sides of this equation. The applicability of both methods is first presented on noiseless simulated data, then validated with data contaminated by additive white noise at different signal-to-noise ratios. Finally, both methods are tested on real data measured by a rigid spherical microphone array of radius 0.15 m, consisting of 36 microphones, for a point source represented by a small speaker. The possibility of determining the source distance using low-order spherical harmonics is shown.

  5. Robust source localization in shallow water based on vector optimization

    Science.gov (United States)

    Song, Hai-yan; Shi, Jie; Liu, Bo-sheng

    2013-06-01

    Owing to the multipath effect, the source localization in shallow water has been an area of active interest. However, most methods for source localization in shallow water are sensitive to the assumed model of the underwater environment and have poor robustness against the underwater channel uncertainty, which limit their further application in practical engineering. In this paper, a new method of source localization in shallow water, based on vector optimization concept, is described, which is highly robust against environmental factors affecting the localization, such as the channel depth, the bottom reflection coefficients, and so on. Through constructing the uncertainty set of the source vector errors and extracting the multi-path sound rays from the sea surface and bottom, the proposed method can accurately localize one or more sources in shallow water dominated by multipath propagation. It turns out that the natural formulation of our approach involves minimization of two quadratic functions subject to infinitely many nonconvex quadratic constraints. It shows that this problem (originally intractable) can be reformulated in a convex form as the so-called second-order cone program (SOCP) and solved efficiently by using the well-established interior point method, such as the software tool, SeDuMi. Computer simulations show better performance of the proposed method as compared with existing algorithms and establish a theoretical foundation for the practical engineering application.

  6. Source-Based Tasks in Writing Independent and Integrated Essays

    Directory of Open Access Journals (Sweden)

    Javad Gholami

    2017-07-01

    Full Text Available Integrated writing tasks have gained considerable attention in ESL and EFL writing assessment and are frequently needed and used in academic settings and daily life. However, they are very rarely practiced and promoted in writing classes. This paper explored the effects of source-based writing practice on EFL learners’ composing abilities and investigated the probable differences between those tasks and independent writing ones in improving Iranian EFL learners’ essay writing abilities. To this end, a quasi-experimental design was implemented to gauge EFL learners’ writing improvements using a pretest-posttest layout. Twenty female learners taking a TOEFL iBT preparation course were randomly divided into an only-writing group with just independent writing instruction and essay practice, and a hybrid-writing-approach group receiving instruction and practice on independent writing plus source-based essay writing for ten sessions. Based on the findings, the participants with hybrid writing practice outperformed their counterparts in integrated essay tests. Their superior performance was not observed in the case of traditional independent writing tasks. The present study calls for incorporating more source-based writing tasks in writing courses.

  7. Research of mine water source identification based on LIF technology

    Science.gov (United States)

    Zhou, Mengran; Yan, Pengcheng

    2016-09-01

    Because traditional chemical methods for mine water source identification take a long time, we put forward a rapid identification system for mine water inrush sources based on laser-induced fluorescence (LIF) technology. The basic principle of LIF is analyzed, the hardware composition of the LIF system is described, and the related modules are selected. Fluorescence spectra of coal mine water samples were obtained through fluorescence experiments in the LIF system. Traditional water source identification relies mainly on the concentrations of ions representative of the water, but these concentrations are hard to recover from fluorescence spectra. This paper therefore proposes a simple and practical method of rapid identification from the fluorescence spectrum: measure the spectral distance between the unknown water sample and the standard samples, and then obtain the category of the unknown sample by clustering analysis. Identification of unknown samples verified the reliability of the LIF system and addresses the current lack of real-time, online monitoring of water inrush in coal mines, which is of great significance for coal mine safety in production.
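The spectral-distance idea can be sketched as follows. The water-type names and Gaussian emission bands below are invented for illustration; a real system would use measured LIF spectra, normalization, and a proper clustering step over many samples:

```python
import numpy as np

def classify_spectrum(unknown, standards):
    """Assign the unknown sample to the water type whose standard fluorescence
    spectrum is nearest in Euclidean distance (nearest-neighbor sketch)."""
    dists = {name: float(np.linalg.norm(unknown - spec))
             for name, spec in standards.items()}
    return min(dists, key=dists.get), dists

wavelengths = np.linspace(400, 600, 50)        # nm, synthetic spectral axis
def band(center, width):                       # toy Gaussian emission band
    return np.exp(-((wavelengths - center) / width) ** 2)

# Hypothetical standard spectra for three water sources
standards = {
    "sandstone water": band(450, 20),
    "limestone water": band(520, 25),
    "goaf water":      band(560, 30),
}
# Unknown sample: a slightly shifted, noisy sandstone-like spectrum
unknown = band(455, 20) + 0.02 * np.random.default_rng(0).normal(size=50)
label, _ = classify_spectrum(unknown, standards)
print(label)
```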

  8. Fizeau simultaneous phase-shifting interferometry based on extended source

    Science.gov (United States)

    Wang, Shanshan; Zhu, Qiudong; Hou, Yinlong; Cao, Zheng

    2016-09-01

    The coaxial Fizeau simultaneous phase-shifting interferometer plays an important role in many fields owing to its long optical path, miniaturization, and elimination of reference-surface high-frequency error. Based on matching the coherence between an extended source and the interferometer, orthogonally polarized reference and measurement waves can be obtained by Fizeau interferometry with a Michelson interferometer preposed. By matching the spatial coherence length between the preposed interferometer and the primary interferometer, high-contrast interference fringes can be obtained and additional interference fringes can be eliminated. Thus, the problem of separating the measurement and reference surfaces in the common-path Fizeau interferometer is solved. Numerical simulation and a principle experiment were conducted to verify the feasibility of the extended-source interferometer. A simulation platform was established by using DDE (dynamic data exchange) to connect Zemax and Matlab: the extended-source interferometer is modeled in Zemax, while Matlab code automatically adjusts the field parameters of the optical system and computes the visibility of the interference fringes. Combined with the simulation, an experimental platform for the extended-source interferometer was established. After experimental research on how scattering-screen granularity influences the interference fringes, the granularity of the scattering screen was determined. Based on the simulation and experimental platforms, the impacts of imaging-system and collimation-system aberrations on phase measurement accuracy were analyzed. The measured visibility curves agree with the simulated ones, so the experimental result is in line with the theoretical result.

  9. Volume Based DTM Generation from Very High Resolution Photogrammetric Dsms

    Science.gov (United States)

    Piltz, B.; Bayer, S.; Poznanska, A. M.

    2016-06-01

    In this paper we propose a new algorithm for digital terrain model (DTM) reconstruction from very high spatial resolution digital surface models (DSMs). It combines multi-directional filtering with a new metric, which we call normalized volume above ground, to create an above-ground mask containing buildings and elevated vegetation. This mask can be used to interpolate a ground-only DTM. The presented algorithm works fully automatically, requiring only the processing parameters minimum height and maximum width in metric units. Since slope and breaklines are not decisive criteria, low, smooth, and even very extensive flat objects are recognized and masked. The algorithm was developed with the goal of generating the normalized DSM for automatic 3D building reconstruction and works reliably also in environments with distinct hillsides or terrace-shaped terrain where conventional methods would fail. A quantitative comparison with the ISPRS data sets Potsdam and Vaihingen shows that 98-99% of all building data points are identified and can be removed, while enough ground data points (~66%) are kept to be able to reconstruct the ground surface. Additionally, we discuss the concept of size dependent height thresholds and present an efficient scheme for pyramidal processing of data sets, reducing time complexity to linear in the number of pixels, O(WH).
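A drastically simplified stand-in for the above-ground masking step can be sketched with a grey-scale morphological opening in place of the paper's multi-directional filtering and normalized-volume metric. The function and parameter names below mirror the minimum height / maximum width inputs described above but are otherwise our own:

```python
import numpy as np
from scipy.ndimage import grey_opening

def above_ground_mask(dsm, min_height, max_width, cell_size=1.0):
    """Toy sketch: an opening with a window wider than max_width flattens
    objects narrower than the window, leaving a ground estimate; cells rising
    more than min_height above it are flagged as buildings or vegetation."""
    win = int(np.ceil(max_width / cell_size)) + 1
    ground = grey_opening(dsm, size=(win, win))   # crude DTM estimate
    return (dsm - ground) > min_height, ground

# Flat terrain at 100 m elevation with a 10 m tall, 6-cell-wide "building"
dsm = np.full((40, 40), 100.0)
dsm[10:16, 10:16] += 10.0
mask, ground = above_ground_mask(dsm, min_height=2.5, max_width=8.0)
print(int(mask.sum()))   # 36 building cells flagged
```

Unlike the paper's method, a plain opening introduces artifacts on sloped terrain, which is exactly the failure mode the multi-directional approach is designed to avoid.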

  10. Emergency department spirometric volume and base deficit delineate risk for torso injury in stable patients

    Directory of Open Access Journals (Sweden)

    Sipe Eilynn K

    2004-01-01

    Full Text Available Abstract Background We sought to determine torso injury rates and sensitivities associated with fluid-positive abdominal ultrasound, metabolic acidosis (increased base deficit and lactate), and impaired pulmonary physiology (decreased spirometric volume and PaO2/FiO2). Methods Level I trauma center prospective pilot and post-pilot study (2000–2001) of stable patients. Increased base deficit was ≥ 2.5 mmol/L in ethanol-negative and ≥ 3.0 mmol/L in ethanol-positive patients. Decreased PaO2/FiO2 was … Results Of 215 patients, 66 (30.7%) had a torso injury (abdominal/pelvic injury n = 35 and/or thoracic injury n = 43). Glasgow Coma Scale score was 14.8 ± 0.5 (13–15). Torso injury rates and sensitivities were: abdominal ultrasound negative and normal base deficit, lactate, PaO2/FiO2, and spirometric volume – 0.0% & 0.0%; normal base deficit and normal spirometric volume – 4.2% & 4.5%; chest/abdominal soft tissue injury – 37.8% & 47.0%; increased lactate – 39.7% & 47.0%; increased base deficit – 41.3% & 75.8%; increased base deficit and/or decreased spirometric volume – 43.8% & 95.5%; decreased PaO2/FiO2 – 48.9% & 33.3%; positive abdominal ultrasound – 62.5% & 7.6%; decreased spirometric volume – 73.4% & 71.2%; increased base deficit and decreased spirometric volume – 82.9% & 51.5%. Conclusions Trauma patients with normal base deficit and spirometric volume are unlikely to have a torso injury. Patients with increased base deficit or lactate, decreased spirometric volume, decreased PaO2/FiO2, or a positive FAST have substantial risk for torso injury. Increased base deficit and/or decreased spirometric volume are highly sensitive for torso injury. Base deficit and spirometric volume values are readily available and increase or decrease the suspicion for torso injury.

  11. Academy Sharing Knowledge (ASK). The NASA Source for Project Management Magazine, Volume 11, March 2003

    Science.gov (United States)

    2003-01-01

    APPL is a research-based organization that serves NASA program and project managers, as well as project teams, at every level of development. In 1997, APPL was created from an earlier program to underscore the importance that NASA places on project management and project teams through a wide variety of products and services, including knowledge sharing, classroom and online courses, career development guidance, performance support, university partnerships, and advanced technology tools. ASK Magazine grew out of APPL's Knowledge Sharing Initiative. The stories that appear in ASK are written by the 'best of the best' project managers, primarily from NASA, but also from other government agencies and industry. Contributors to this issue include: Teresa Bailey, a librarian at the Jet Propulsion Laboratory, Roy Malone, Deputy Director in the Safety and Mission Assurance (S&MA) Office at the NASA Marshall Space Flight Center (MSFC), W. Scott Cameron, Capital Systems Manager for the Food and Beverage Global Business Unit of Procter and Gamble, Ray Morgan, recent retiree as Vice President of AeroVironment, Inc., Marty Davis, Program Manager of the Geostationary Operational Environmental Satellite (GOES) at the NASA Goddard Space Flight Center (GSFC) in Greenbelt, Maryland, Todd Post, editor of ASK Magazine, who works for EduTech Ltd. in Silver Spring, Maryland, Dr. Owen Gadeken, professor of Engineering Management at the Defense Acquisition University, Ken Schwer, currently the Project Manager of Solar Dynamics Observatory, Dr. Edward Hoffman, Director of the NASA Academy of Program and Project Leadership, Frank Snow, a member of the NASA Explorer Program at Goddard Space Flight Center since 1992, Dr. Alexander Laufer, Editor-in-Chief of ASK Magazine and a member of the Advisory Board of the NASA Academy of Program and Project Leadership, Judy Stokley, presently Air Force Program Executive Officer for Weapons in Washington, D.C. and Terry Little, Director of the Kinetic

  12. Evidence Locator: sources of evidence-based dentistry information.

    Science.gov (United States)

    Frantsve-Hawley, Julie

    2008-09-01

    Multiple resources are available to help practitioners access the latest scientific evidence. Evidence-based dentistry (EBD) is an approach to clinical decision making that incorporates the most current and comprehensive scientific evidence with the practitioner's judgment and the patient's needs and preferences. One challenge in implementing this approach is access to evidence, and there are multiple online resources that can be used in this endeavor. This article presents the Evidence Locator, a list of Web sites that provide access to "secondary sources" of evidence. Such "secondary sources" are typically summaries of systematic reviews and evidence-based clinical recommendations or guidelines. Also presented is a list of other Web sites that may be useful to the practitioner in implementing EBD.

  13. Effect of hospital volume on processes of breast cancer care: A National Cancer Data Base study.

    Science.gov (United States)

    Yen, Tina W F; Pezzin, Liliana E; Li, Jianing; Sparapani, Rodney; Laud, Purushuttom W; Nattinger, Ann B

    2017-05-15

    The purpose of this study was to examine variations in delivery of several breast cancer processes of care that are correlated with lower mortality and disease recurrence, and to determine the extent to which hospital volume explains this variation. Women who were diagnosed with stage I-III unilateral breast cancer between 2007 and 2011 were identified within the National Cancer Data Base. Multiple logistic regression models were developed to determine whether hospital volume was independently associated with each of 10 individual process of care measures addressing diagnosis and treatment, and 2 composite measures assessing appropriateness of systemic treatment (chemotherapy and hormonal therapy) and locoregional treatment (margin status and radiation therapy). Among 573,571 women treated at 1755 different hospitals, 38%, 51%, and 10% were treated at high-, medium-, and low-volume hospitals, respectively. On multivariate analysis controlling for patient sociodemographic characteristics, treatment year and geographic location, hospital volume was a significant predictor for cancer diagnosis by initial biopsy (medium volume: odds ratio [OR] = 1.15, 95% confidence interval [CI] = 1.05-1.25; high volume: OR = 1.30, 95% CI = 1.14-1.49), negative surgical margins (medium volume: OR = 1.15, 95% CI = 1.06-1.24; high volume: OR = 1.28, 95% CI = 1.13-1.44), and appropriate locoregional treatment (medium volume: OR = 1.12, 95% CI = 1.07-1.17; high volume: OR = 1.16, 95% CI = 1.09-1.24). Diagnosis of breast cancer before initial surgery, negative surgical margins and appropriate use of radiation therapy may partially explain the volume-survival relationship. Dissemination of these processes of care to a broader group of hospitals could potentially improve the overall quality of care and outcomes of breast cancer survivors. Cancer 2017;123:957-66. © 2016 American Cancer Society.

  14. Projections for the Production of Bulk Volume Bio-Based Polymers in Europe and Environmental Implications

    NARCIS (Netherlands)

    Patel, M.K.; Crank, M.

    2007-01-01

    In this paper we provide an overview of the most important emerging groups of bio-based polymers for bulk volume applications and we discuss market projections for these types of bio-based polymers in the EU, thereby distinguishing between three scenarios. Bio-based polymers are projected to reach a

  15. Volume-Outcome Relation for Acute Appendicitis: Evidence from a Nationwide Population-Based Study

    Science.gov (United States)

    Wei, Po-Li; Liu, Shih-Ping; Keller, Joseph J.; Lin, Herng-Ching

    2012-01-01

    Background Although procedures like appendectomy have been studied extensively, the relative importance of each surgeon's surgical volume-to-ruptured appendicitis has not been explored. The purpose of this study was to investigate the rate of ruptured appendicitis by surgeon-volume groups as a measure of quality of care for appendicitis by using a nationwide population-based dataset. Methods We identified 65,339 first-time hospitalizations with a discharge diagnosis of acute appendicitis (International Classification of Disease, Ninth Revision, Clinical Modification (ICD-9-CM) codes 540, 540.0, 540.1 and 540.9) between January 2007 and December 2009. We used “whether or not a patient had a perforated appendicitis” as the outcome measure. A conditional (fixed-effect) logistic regression model was performed to explore the odds of perforated appendicitis among surgeon case volume groups. Results Patients treated by low-volume surgeons had significantly higher morbidity rates than those treated by high-volume (28.1% vs. 26.1%, p<0.001) and very-high-volume surgeons (28.1% vs. 21.4%, p<0.001). After adjusting for surgeon practice location, teaching status of practice hospital, patient age, gender, Charlson Comorbidity Index, and hospital acute appendicitis volume, patients treated by low-volume surgeons had significantly higher rates of perforated appendicitis than those treated by medium-volume surgeons (OR = 1.09, p<0.001), high-volume surgeons (OR = 1.16, p<0.001), or very-high-volume surgeons (OR = 1.54, p<0.001). Conclusion Our study suggested that surgeon volume is an important factor with regard to the rate of ruptured appendicitis. PMID:23300703

  16. VOXEL-BASED APPROACH FOR ESTIMATING URBAN TREE VOLUME FROM TERRESTRIAL LASER SCANNING DATA

    Directory of Open Access Journals (Sweden)

    C. Vonderach

    2012-07-01

    The importance of single trees and the determination of related parameters has been recognized in recent years, e.g. for forest inventories or management. For urban areas an increasing interest in the data acquisition of trees can be observed concerning aspects like urban climate, CO2 balance, and environmental protection. Urban trees differ significantly from natural systems with regard to site conditions (e.g. technogenic soils, contaminants, lower groundwater level, regular disturbance), climate (increased temperature, reduced humidity) and species composition and arrangement (habitus and health status), and therefore allometric relations cannot be transferred from natural sites to urban areas. To overcome this problem, an extended approach was developed for a fast and non-destructive extraction of branch volume, DBH (diameter at breast height) and height of single trees from point clouds of terrestrial laser scanning (TLS). For data acquisition, the trees were scanned with the highest scan resolution from several (up to five) positions located around the tree. The resulting point clouds (20 to 60 million points) are analysed with an algorithm based on a voxel (volume element) structure, leading to an appropriate data reduction. In a first step, two kinds of noise reduction are carried out: the elimination of isolated voxels as well as voxels with marginal point density. To obtain correct volume estimates, the voxels inside the stem and branches (interior voxels, where voxels contain no laser points) must be regarded. For this filling process, an easy and robust approach was developed based on a layer-wise (horizontal layers of the voxel structure) intersection of four orthogonal viewing directions. However, this procedure also generates several erroneous "phantom" voxels, which have to be eliminated. For this purpose the previous approach was extended by a special region growing algorithm. In a final step, the volume is determined layer-wise based on the
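    The four-direction filling step described above lends itself to a short sketch. The code below is a hypothetical single-layer (2D) illustration of the idea, not the authors' implementation: a cell is filled as an interior voxel only if occupied voxels are found in all four orthogonal viewing directions.

    ```python
    def fill_interior(occupied, size):
        """Layer-wise filling: a cell counts as interior if occupied cells
        exist in all four orthogonal directions (left, right, down, up)."""
        filled = set(occupied)
        for x in range(size):
            for y in range(size):
                if (x, y) in occupied:
                    continue
                left = any((i, y) in occupied for i in range(x))
                right = any((i, y) in occupied for i in range(x + 1, size))
                down = any((x, j) in occupied for j in range(y))
                up = any((x, j) in occupied for j in range(y + 1, size))
                if left and right and down and up:
                    filled.add((x, y))
        return filled

    # Toy layer: a ring of "stem surface" voxels with an empty interior
    ring = {(x, y) for x in range(1, 5) for y in range(1, 5)} - {(2, 2), (2, 3), (3, 2), (3, 3)}
    filled = fill_interior(ring, 6)
    print(sorted(filled - ring))  # → [(2, 2), (2, 3), (3, 2), (3, 3)]
    ```

    Concave layers can still produce the "phantom" voxels the abstract mentions, which is why the authors follow this step with a region-growing correction.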

  17. Low contrast medium-volume third-generation dual-source computed tomography angiography for transcatheter aortic valve replacement planning

    Energy Technology Data Exchange (ETDEWEB)

    Felmly, Lloyd M. [Medical University of South Carolina, Division of Cardiovascular Imaging, Department of Radiology and Radiological Science, Charleston, SC (United States); Medical University of South Carolina, Division of Cardiothoracic Surgery, Department of Surgery, Charleston, SC (United States); De Cecco, Carlo N.; Varga-Szemes, Akos; McQuiston, Andrew D. [Medical University of South Carolina, Division of Cardiovascular Imaging, Department of Radiology and Radiological Science, Charleston, SC (United States); Schoepf, U.J.; Litwin, Sheldon E.; Bayer, Richard R. [Medical University of South Carolina, Division of Cardiovascular Imaging, Department of Radiology and Radiological Science, Charleston, SC (United States); Medical University of South Carolina, Division of Cardiology, Department of Medicine, Charleston, SC (United States); Mangold, Stefanie [Medical University of South Carolina, Division of Cardiovascular Imaging, Department of Radiology and Radiological Science, Charleston, SC (United States); University Hospital of Tuebingen, Department of Diagnostic and Interventional Radiology, Tuebingen (Germany); Vogl, Thomas J. [University Hospital Frankfurt, Department of Diagnostic and Interventional Radiology, Frankfurt (Germany); Wichmann, Julian L. [Medical University of South Carolina, Division of Cardiovascular Imaging, Department of Radiology and Radiological Science, Charleston, SC (United States); University Hospital Frankfurt, Department of Diagnostic and Interventional Radiology, Frankfurt (Germany)

    2017-05-15

    To investigate feasibility, image quality and safety of low-tube-voltage, low-contrast-volume comprehensive cardiac and aortoiliac CT angiography (CTA) for planning transcatheter aortic valve replacement (TAVR). Forty consecutive TAVR candidates prospectively underwent combined CTA of the aortic root and vascular access route (270 mgI/ml iodixanol). Patients were assigned to group A (second-generation dual-source CT [DSCT], 100 kV, 60 ml contrast, 4.0 ml/s flow rate) or group B (third-generation DSCT, 70 kV, 40 ml contrast, 2.5 ml/s flow rate). Vascular attenuation, noise, signal-to-noise (SNR) and contrast-to-noise ratios (CNR) were compared. Subjective image quality was assessed by two observers. Estimated glomerular filtration rate (eGFR) at CTA and follow-up were measured. Besides a higher body-mass-index in group B (24.8±3.8 kg/m² vs. 28.1±5.4 kg/m², P=0.0339), patient characteristics between groups were similar (P≥0.0922). Aortoiliac SNR (P=0.0003) was higher in group B. Cardiac SNR (P=0.0003) and CNR (P=0.0181) were higher in group A. Subjective image quality was similar (P≥0.213) except for aortoiliac image noise (4.42 vs. 4.12, P=0.0374). TAVR-planning measurements were successfully obtained in all patients. There were no significant changes in eGFR among and between groups during follow-up (P≥0.302). TAVR candidates can be safely and effectively evaluated by a comprehensive CTA protocol with low contrast volume using low-tube-voltage acquisition. (orig.)

  18. Gamut Volume Index: a color preference metric based on meta-analysis and optimized colour samples.

    Science.gov (United States)

    Liu, Qiang; Huang, Zheng; Xiao, Kaida; Pointer, Michael R; Westland, Stephen; Luo, M Ronnier

    2017-07-10

    A novel metric named Gamut Volume Index (GVI) is proposed for evaluating the colour preference of lighting. This metric is based on the absolute gamut volume of optimized colour samples. The optimal colour set of the proposed metric was obtained by optimizing the weighted average correlation between the metric predictions and the subjective ratings for 8 psychophysical studies. The performance of 20 typical colour metrics was also investigated, which included colour difference based metrics, gamut based metrics, memory based metrics as well as combined metrics. It was found that the proposed GVI outperformed the existing counterparts, especially for the conditions where correlated colour temperatures differed.

  19. Towards Evidence-Based Understanding of Electronic Data Sources

    DEFF Research Database (Denmark)

    Chen, Lianping; Ali Babar, Muhammad; Zhang, He

    2010-01-01

    Identifying relevant papers from various Electronic Data Sources (EDS) is one of the key activities of conducting these kinds of studies. Hence, the selection of EDS for searching the potentially relevant papers is an important decision, which can affect a study’s coverage of relevant papers....... Researchers usually select EDS mainly based on personal knowledge, experience, and preferences and/or recommendations by other researchers. We believe that building an evidence-based understanding of EDS can enable researchers to make more informed decisions about the selection of EDS. This paper reports our...

  20. Conceptional design of the laser ion source based hadrontherapy facility

    Science.gov (United States)

    Xie, Xiu-Cui; Song, Ming-Tao; Zhang, Xiao-Hu

    2014-04-01

    A laser ion source (LIS), which can provide a carbon beam with a highly stripped state (C6+) and high intensity (several tens of mA), would significantly change the overall design of the hadrontherapy facility. The proposed LIS-based hadrontherapy facility has the advantages of short linac length, a simple injection scheme, and small synchrotron size. With the experience from the DPIS and HITFiL projects that have been conducted at IMP, a conceptional design of the LIS-based hadrontherapy facility will be presented, with special attention given to the APF type IH DTL design and simulation.

  1. Conceptional Design of the Laser Ion Source based Hadrontherapy Facility

    CERN Document Server

    Xie, Xiucui; Zhang, Xiaohu

    2013-01-01

    A laser ion source (LIS), which can provide a carbon beam with a highly stripped state (C6+) and high intensity (several tens of mA), would significantly change the overall design of the hadrontherapy facility. An LIS-based hadrontherapy facility is proposed with the advantages of short linac length, a simple injection scheme and small synchrotron size. With the experience from the DPIS and HITFiL projects that have been conducted at IMP, a conceptional design of the LIS-based hadrontherapy facility will be presented, with special attention given to the APF type IH DTL design and simulation.

  2. An ancient relation between units of length and volume based on a sphere.

    Directory of Open Access Journals (Sweden)

    Elena Zapassky

    Full Text Available The modern metric system defines units of volume based on the cube. We propose that the ancient Egyptian system of measuring capacity employed a similar concept, but used the sphere instead. When considered in ancient Egyptian units, the volume of a sphere, whose circumference is one royal cubit, equals half a hekat. Using the measurements of large sets of ancient containers as a database, the article demonstrates that this formula was characteristic of Egyptian and Egyptian-related pottery vessels but not of the ceramics of Mesopotamia, which had a different system of measuring length and volume units.
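    The claimed relation is easy to verify numerically. The sketch below assumes the commonly cited modern equivalents of roughly 52.5 cm for the royal cubit and roughly 4.8 litres for the hekat; neither value appears in the record, so treat both as assumptions.

    ```python
    import math

    CUBIT_CM = 52.5      # assumed length of one royal cubit, in cm
    HEKAT_CM3 = 4800.0   # assumed capacity of one hekat, in cm^3

    # Sphere whose circumference is one royal cubit: C = 2*pi*r
    r = CUBIT_CM / (2.0 * math.pi)
    volume = (4.0 / 3.0) * math.pi * r ** 3

    ratio = volume / HEKAT_CM3
    print(f"sphere volume: {volume:.0f} cm^3, fraction of a hekat: {ratio:.3f}")
    ```

    With these values the sphere holds about 0.51 hekat, i.e. within roughly 2% of the stated half hekat.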

  3. A Parallax-based Distance Estimator for Spiral Arm Sources

    Science.gov (United States)

    Reid, M. J.; Dame, T. M.; Menten, K. M.; Brunthaler, A.

    2016-06-01

    The spiral arms of the Milky Way are being accurately located for the first time via trigonometric parallaxes of massive star-forming regions with the Bar and Spiral Structure Legacy Survey, using the Very Long Baseline Array and the European VLBI Network, and with the Japanese VLBI Exploration of Radio Astrometry project. Here we describe a computer program that leverages these results to significantly improve the accuracy and reliability of distance estimates to other sources that are known to follow spiral structure. Using a Bayesian approach, sources are assigned to arms based on their (l, b, v) coordinates with respect to arm signatures seen in CO and H i surveys. A source's kinematic distance, displacement from the plane, and proximity to individual parallax sources are also considered in generating a full distance probability density function. Using this program to estimate distances to large numbers of star-forming regions, we generate a realistic visualization of the Milky Way's spiral structure as seen from the northern hemisphere.
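    The combination of an arm signature with a kinematic distance into a single distance probability density function can be sketched with a toy one-dimensional example. Everything below (grid, means, widths) is invented for illustration and is far simpler than the program's actual multi-component PDF.

    ```python
    import math

    def gaussian(x, mu, sigma):
        return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

    # Hypothetical inputs: a distance grid in kpc, a kinematic distance estimate,
    # and an arm distance anchored by parallax measurements (tighter uncertainty)
    grid = [0.05 * i for i in range(1, 401)]             # 0.05 .. 20 kpc
    kinematic = [gaussian(d, 4.2, 1.0) for d in grid]    # kinematic distance +/- error
    arm_prior = [gaussian(d, 3.8, 0.3) for d in grid]    # arm-signature prior

    # Posterior = product of the two densities, normalized over the grid
    posterior = [k * a for k, a in zip(kinematic, arm_prior)]
    norm = sum(posterior) * 0.05
    posterior = [p / norm for p in posterior]

    best = grid[max(range(len(grid)), key=lambda i: posterior[i])]
    print(f"most probable distance: {best:.2f} kpc")  # → 3.85 kpc
    ```

    The tighter arm prior pulls the estimate from 4.2 kpc toward 3.8 kpc, which is the qualitative effect the program exploits when a source can be confidently assigned to an arm.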

  4. Writable electrochemical energy source based on graphene oxide

    Science.gov (United States)

    Wei, Di

    2015-10-01

    Graphene oxide (GO) has mainly been used as a raw material for various types of reduced graphene oxide (rGO), as a cost-effective way to make graphene-like materials. However, applications of its own unique properties, such as extraordinary proton conductivity and super-permeability to water, have been overlooked. Here a GO-based battery-like planar energy source was demonstrated on an arbitrary insulating substrate (e.g. polymer sheet/paper) by coating PEDOT, GO ink and rGO on Ag charge collectors. The energy from such a GO battery depends on its length: one unit cell with a length of 0.5 cm can deliver an energy capacity of 30 Ah/L with a voltage up to 0.7 V when a room-temperature ionic liquid (RTIL) is added. With a power density up to 0.4 W/cm3 and an energy density of 4 Wh/L, the GO battery was demonstrated to drive an electrochromic device. This work is the first attempt to generate usable energy from the fast-transported water molecules inside GO. It provides a very safe energy source that enables applications traditional battery technology cannot, including a foldable energy source on paper and a platform for futuristic wearable electronics. A disposable energy source made of GO was also written on a plastic glove to demonstrate wearability.

  5. Wavelet-based localization of oscillatory sources from magnetoencephalography data.

    Science.gov (United States)

    Lina, J M; Chowdhury, R; Lemay, E; Kobayashi, E; Grova, C

    2014-08-01

    Transient brain oscillatory activities recorded with electroencephalography (EEG) or magnetoencephalography (MEG) are characteristic features in physiological and pathological processes. This study is aimed at describing, evaluating, and illustrating with clinical data a new method for localizing the sources of oscillatory cortical activity recorded by MEG. The method combines time-frequency representation and an entropic regularization technique in a common framework, assuming that brain activity is sparse in time and space. Spatial sparsity relies on the assumption that brain activity is organized among cortical parcels. Sparsity in time is achieved by transposing the inverse problem in the wavelet representation, for both data and sources. We propose an estimator of the wavelet coefficients of the sources based on the maximum entropy on the mean (MEM) principle. The full dynamics of the sources is obtained from the inverse wavelet transform, and principal component analysis of the reconstructed time courses is applied to extract oscillatory components. This methodology is evaluated using realistic simulations of single-trial signals, combining fast and sudden discharges (spike) along with bursts of oscillating activity. The method is finally illustrated with a clinical application using MEG data acquired on a patient with a right orbitofrontal epilepsy.

  6. [Modeling and analysis of volume conduction based on field-circuit coupling].

    Science.gov (United States)

    Tang, Zhide; Liu, Hailong; Xie, Xiaohui; Chen, Xiufa; Hou, Deming

    2012-08-01

    Numerical simulations of volume conduction can be used to analyze the process of energy transfer and explore the effects of some physical factors on energy transfer efficiency. We analyzed the 3D quasi-static electric field by the finite element method, and developed a 3D coupled field-circuit model of volume conduction based on the coupling between the circuit and the electric field. The model includes a circuit simulation of the volume conduction to provide direct theoretical guidance for energy transfer optimization design. A field-circuit coupling model with circular cylinder electrodes was established on the platform of the software FEM3.5. Based on this, the effects of electrode cross-section area, electrode distance and circuit parameters on the performance of the volume conduction system were obtained, which provided a basis for optimized design of energy transfer efficiency.

  7. Anatomical-based Partial Volume Correction for Low-dose Dedicated Cardiac SPECT/CT

    OpenAIRE

    Liu, Hui; Chan, Chung; Grobshtein, Yariv; Ma, Tianyu; Liu, Yaqiang; Wang, Shi; Stacy, Mitchel R.; Sinusas, Albert J.; Liu, Chi

    2015-01-01

    Due to the limited spatial resolution, partial volume effect (PVE) has been a major degrading factor on quantitative accuracy in emission tomography systems. This study aims to investigate the performance of several anatomical-based partial volume correction (PVC) methods for a dedicated cardiac SPECT/CT system (GE Discovery NM/CT 570c) with focused field-of-view (FOV) over a clinically relevant range of high and low count levels for two different radiotracer distributions. These PVC methods ...

  8. WaVPeak: Picking NMR peaks through wavelet-based smoothing and volume-based filtering

    KAUST Repository

    Liu, Zhi

    2012-02-10

    Motivation: Nuclear magnetic resonance (NMR) has been widely used as a powerful tool to determine the 3D structures of proteins in vivo. However, the post-spectra processing stage of NMR structure determination usually involves a tremendous amount of time and expert knowledge, which includes peak picking, chemical shift assignment and structure calculation steps. Detecting accurate peaks from the NMR spectra is a prerequisite for all following steps, and thus remains a key problem in automatic NMR structure determination. Results: We introduce WaVPeak, a fully automatic peak detection method. WaVPeak first smoothes the given NMR spectrum by wavelets. The peaks are then identified as the local maxima. The false positive peaks are filtered out efficiently by considering the volume of the peaks. WaVPeak has two major advantages over the state-of-the-art peak-picking methods. First, through wavelet-based smoothing, WaVPeak does not eliminate any data point in the spectra. Therefore, WaVPeak is able to detect weak peaks that are embedded in the noise level. NMR spectroscopists need the most help isolating these weak peaks. Second, WaVPeak estimates the volume of the peaks to filter the false positives. This is more reliable than intensity-based filters that are widely used in existing methods. We evaluate the performance of WaVPeak on the benchmark set proposed by PICKY (Alipanahi et al., 2009), one of the most accurate methods in the literature. The dataset comprises 32 2D and 3D spectra from eight different proteins. Experimental results demonstrate that WaVPeak achieves an average of 96%, 91%, 88%, 76% and 85% recall on 15N-HSQC, HNCO, HNCA, HNCACB and CBCA(CO)NH, respectively. When the same number of peaks are considered, WaVPeak significantly outperforms PICKY. The Author(s) 2012. Published by Oxford University Press.
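    As a self-contained illustration of the two ideas highlighted in the abstract (smooth without discarding data points, then filter candidate peaks by integrated volume rather than raw intensity), here is a hypothetical 1D sketch; a moving average stands in for the wavelet smoothing, and all numbers are invented.

    ```python
    def smooth(signal, width=3):
        # Stand-in for wavelet denoising: symmetric moving average (keeps every point)
        half = width // 2
        return [sum(signal[max(0, i - half):i + half + 1]) /
                len(signal[max(0, i - half):i + half + 1])
                for i in range(len(signal))]

    def pick_peaks(signal, volume_threshold):
        s = smooth(signal)
        peaks = []
        for i in range(1, len(s) - 1):
            if s[i] > s[i - 1] and s[i] > s[i + 1]:        # local maximum of smoothed data
                vol = sum(signal[max(0, i - 2):i + 3])      # integrated "volume" of the peak
                if vol >= volume_threshold:
                    peaks.append(i)
        return peaks

    # Toy spectrum: a strong peak at 10, a weak-but-broad peak at 25, a noise spike at 40
    spectrum = [0.0] * 50
    for i, v in [(9, 2.0), (10, 5.0), (11, 2.0), (24, 1.2), (25, 1.5), (26, 1.2), (40, 0.9)]:
        spectrum[i] = v
    print(pick_peaks(spectrum, volume_threshold=3.0))  # → [10, 25]
    ```

    The isolated spike at index 40 is flattened by smoothing and never passes the local-maximum test, while the weak-but-broad peak at index 25 survives because its integrated volume is large even though its raw intensity is low, the behaviour an intensity-only filter would get wrong.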

  9. Does mode mixing matter in EMD-based highlight volume methods for hydrocarbon detection? Experimental evidence

    Science.gov (United States)

    Xue, Ya-juan; Cao, Jun-xing; Du, Hao-kun; Zhang, Gu-lan; Yao, Yao

    2016-09-01

    Empirical mode decomposition (EMD)-based spectral decomposition methods have been successfully used for hydrocarbon detection. However, mode mixing that occurs during the sifting process of EMD causes the 'true' intrinsic mode function (IMF) to be extracted incorrectly and blurs the physical meaning of the IMF. We address the issue of how the mode mixing influences the EMD-based methods for hydrocarbon detection by introducing mode-mixing elimination methods, specifically ensemble EMD (EEMD) and complete ensemble EMD (CEEMD)-based highlight volumes, as feasible tools that can identify the peak amplitude above average volume and the peak frequency volume. Three schemes, that is, using all IMFs, selected IMFs or weighted IMFs, are employed in the EMD-, EEMD- and CEEMD-based highlight volume methods. When these methods were applied to seismic data from a tight sandstone gas field in Central Sichuan, China, the results demonstrated that the amplitude anomaly in the peak amplitude above average volume captured by EMD, EEMD and CEEMD combined with Hilbert transforms, whether using all IMFs, selected IMFs or weighted IMFs, are almost identical to each other. However, clear distinctions can be found in the peak frequency volume when comparing results generated using all IMFs, selected IMFs, or weighted IMFs. If all IMFs are used, the influence of mode mixing on the peak frequency volume is not readily discernable. However, using selected IMFs or a weighted IMFs' scheme affects the peak frequency in relation to the reservoir thickness in the EMD-based method. Significant improvement in the peak frequency volume can be achieved in EEMD-based highlight volumes using selected IMFs. However, if the weighted IMFs' scheme is adopted (i.e., if the undesired IMFs are included with reduced weights rather than excluded from the analysis entirely), the CEEMD-based peak frequency volume provides a more accurate reservoir thickness estimate compared with the other two methods. This

  10. [Research on living tree volume forecast based on PSO embedding SVM].

    Science.gov (United States)

    Jiao, You-Quan; Feng, Zhong-Ke; Zhao, Li-Xi; Xu, Wei-Heng; Cao, Zhong

    2014-01-01

    In order to establish a volume model, living trees have to be felled and divided into many sections, which is a destructive experiment. As a result, hundreds of thousands of trees are felled each year in China. To solve this problem, a new method for accurate living-tree volume measurement without felling was proposed in the present paper. In this method, new measuring and calculation procedures are applied using a photoelectric theodolite together with auxiliary manual measurement. The diameter at breast height and the diameter at the ground were measured manually, and diameters at other heights were obtained by the photoelectric theodolite. The volume and height of each tree were calculated by special software programmed by the authors. Zhonglin aspens No. 107 were selected as the experimental object, and 400 data records were obtained. Based on these data, a nonlinear intelligent living-tree volume prediction model with a Particle Swarm Optimization algorithm based on support vector machines (PSO-SVM) was established. Three hundred data records including tree height and diameter at breast height were randomly selected from the total of 400 data records as input data, with tree volume as output data; using the PSO-SVM toolbox of Matlab7.11, a tree volume model was thus obtained. One hundred data records were used to test the volume model. The results show that the complex correlation coefficient (R²) between predicted and measured values is 0.91, which is 2% higher than the value calculated by the classic Spurr binary volume model, and the mean absolute error rates were reduced by 0.44%. Compared with the Spurr binary volume model, the PSO-SVM model has self-learning and self-adaptation ability; moreover, with its high prediction accuracy, fast learning speed and small sample-size requirement, the PSO-SVM model shows good prospects for popularization and application.
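    The record's PSO-SVM pipeline is Matlab-specific; as a language-neutral illustration of the optimization half, the sketch below uses a bare-bones particle swarm to fit a hypothetical power-law volume model V = a·D^b to synthetic diameter/volume pairs. All data, bounds and coefficients are invented.

    ```python
    import random

    random.seed(0)

    # Synthetic (invented) diameter-at-breast-height -> volume pairs
    data = [(d, 1e-4 * d ** 2.4) for d in range(10, 41, 2)]

    def loss(params):
        a, b = params
        return sum((a * d ** b - v) ** 2 for d, v in data)

    # Bare-bones PSO: inertia plus attraction to personal and global bests
    n_particles, n_iters = 20, 200
    bounds = [(0.0, 1e-2), (0.5, 5.0)]          # keep a and b in a sane range
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=loss)

    for _ in range(n_iters):
        for i in range(n_particles):
            for dim in range(2):
                r1, r2 = random.random(), random.random()
                vel[i][dim] = (0.7 * vel[i][dim]
                               + 1.5 * r1 * (pbest[i][dim] - pos[i][dim])
                               + 1.5 * r2 * (gbest[dim] - pos[i][dim]))
                lo, hi = bounds[dim]
                pos[i][dim] = min(max(pos[i][dim] + vel[i][dim], lo), hi)
            if loss(pos[i]) < loss(pbest[i]):
                pbest[i] = pos[i][:]
        gbest = min(pbest, key=loss)

    print(f"fitted a={gbest[0]:.2e}, b={gbest[1]:.2f}")
    ```

    In the paper's setup the particle positions would instead encode SVM hyperparameters and the loss would be the cross-validated regression error; the swarm update rule is the same.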

  11. Prospect for application of compact accelerator-based neutron source to neutron engineering diffraction

    Science.gov (United States)

    Ikeda, Yoshimasa; Taketani, Atsushi; Takamura, Masato; Sunaga, Hideyuki; Kumagai, Masayoshi; Oba, Yojiro; Otake, Yoshie; Suzuki, Hiroshi

    2016-10-01

    A compact accelerator-based neutron source has lately been discussed for engineering applications such as transmission imaging and small-angle scattering as well as reflectometry. However, nobody considers using it for neutron diffraction experiments because of its low neutron flux. In this study, therefore, neutron diffraction experiments were carried out using the RIKEN Accelerator-driven Compact Neutron Source (RANS), to clarify the capability of a compact neutron source for neutron engineering diffraction. The diffraction pattern from a ferritic steel was successfully measured by suitable arrangement of the optical system to reduce the background noise, and it was confirmed that a recognizable diffraction pattern can be measured from a large sampling volume of a 10 mm cube within an acceptable measurement time, i.e. 10 min. The minimum resolution of the 110 reflection for RANS is approximately 2.5% at 8 μs of the proton pulse width, which is insufficient to perform strain measurement by neutron diffraction. The moderation time width at the wavelength corresponding to the 110 reflection is estimated to be approximately 30 μs, which is the most dominant factor determining the resolution. Therefore, refinements of the moderator system to decrease the moderation time, by decreasing the thickness of the moderator or by applying a decoupler system, or application of the angular dispersive neutron diffraction technique, are important to improve the resolution of diffraction experiments using the compact neutron source. In contrast, the texture evolution due to plastic deformation was successfully observed by measuring a change in the diffraction peak intensity with RANS. Furthermore, the volume fraction of the austenitic phase in the dual-phase mock specimen was also successfully evaluated by fitting the diffraction pattern using a Rietveld code. Consequently, RANS has been proved to be capable of neutron engineering diffraction aiming for the easy access

  12. Three-dimensional measurement of bubble volume based on dual perspective imaging

    Science.gov (United States)

    Xue, Ting; Zhang, Shao-jie; Wu, Bin

    2017-01-01

    This paper presents a new three-dimensional (3D) volume measurement approach for bubbles in gas-liquid two-phase flow. According to the dual perspective imaging principle, bubble feature images can be captured from two different view angles. The least squares ellipse fitting algorithm is used to determine the feature parameters from the captured images. Then the 3D volume of the bubble can be quantitatively measured. Compared with traditional volume estimation methods based on single perspective imaging, this approach effectively reduces the loss of bubble feature information. In the experiment, the 3D volume reconstruction of bubbles from dual perspective images is conducted, and the variation of bubble volume during the bubble rising process is studied. The results show that the measurement accuracy of the proposed 3D method is higher than that of traditional methods. The volume of a rising bubble changes periodically, which indicates that the bubble undergoes periodic rotation and deformation in the rising process.
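    The gain from the second perspective can be illustrated with an ellipsoid approximation: if each view yields a fitted ellipse, the two in-plane semi-axis pairs combine into three semi-axes. The geometry, axis naming and numbers below are a simplified assumption for illustration, not the paper's reconstruction method.

    ```python
    import math

    def ellipsoid_volume(view1_axes, view2_axes):
        """Combine two orthogonal projections of a bubble into a 3D volume.

        view1 (camera along z) sees semi-axes (a, b); view2 (camera along x)
        sees semi-axes (c, b'); the vertical semi-axis is seen by both views,
        so b and b' are averaged to reduce measurement noise.
        """
        a, b = view1_axes
        c, b2 = view2_axes
        b_avg = 0.5 * (b + b2)
        return (4.0 / 3.0) * math.pi * a * b_avg * c

    # Hypothetical semi-axes in mm extracted by ellipse fitting from each image
    v = ellipsoid_volume((2.0, 1.5), (1.8, 1.5))
    print(f"bubble volume: {v:.2f} mm^3")  # → 22.62 mm^3
    ```

    A single view would have to assume the unseen semi-axis (e.g. c = a, a rotationally symmetric bubble), which is exactly the information loss the dual-perspective setup avoids.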

  13. XMEGA-Based Implementation of Four-Switch, Three-Phase Voltage Source Inverter-Fed Induction Motor Drive

    OpenAIRE

    Abolfazl Halavei Niasar; Ehsan Boloor Kashani

    2013-01-01

    Induction motors offer many advantages and are therefore becoming very popular industrially and commercially. This paper presents the implementation of an XMEGA microcontroller-based PWM inverter control of a four-switch three-phase voltage source inverter (FSTPI) fed induction motor drive. The reduction of the number of power switches from six to four improves the cost effectiveness, volume-compactness and reliability of three-phase inverters, in addition to less complexity of contro...

  14. Collision and containment detection between biomechanically based eye muscle volumes.

    Science.gov (United States)

    Santana Sosa, Graciela; Kaltofen, Thomas

    2011-01-01

    Collision and containment detection between three-dimensional objects is a common requirement in simulation systems. However, few solutions exist when exclusively working with deformable bodies. In our ophthalmologic diagnostic software system, the extraocular eye muscles are represented by surface models, which have been reconstructed from magnetic resonance images. Those models are projected onto the muscle paths calculated by the system's biomechanical model. Due to this projection collisions occur. For their detection, three approaches have been implemented, which we present in this paper: one based on image-space techniques using OpenGL, one based on the Bullet physics library and one using an optimized space-array data structure together with software rendering. Finally, an outlook on a possible response to the detected collisions is given.
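    The space-array idea (one of the three approaches named above) reduces to a short sketch: hash sample points into integer voxel cells and report a collision when any cell is occupied by both surfaces. The points, voxel size and muscle names below are invented for illustration.

    ```python
    def voxelize(points, size):
        # Map each surface point to an integer voxel index (space-array idea)
        return {(int(x // size), int(y // size), int(z // size)) for x, y, z in points}

    def collide(points_a, points_b, size=0.5):
        # Two deformable surfaces collide if any voxel is occupied by both
        return bool(voxelize(points_a, size) & voxelize(points_b, size))

    # Hypothetical sample points on two muscle surface models
    muscle1 = [(0.0, 0.0, 0.0), (1.0, 0.2, 0.1), (2.0, 0.4, 0.2)]
    muscle2 = [(5.0, 5.0, 5.0), (2.1, 0.3, 0.1)]   # last point near muscle1's tip

    print(collide(muscle1, muscle2))  # → True (both map a point into voxel (4, 0, 0))
    ```

    Because deformable meshes change every frame, a rebuildable hash of occupied cells like this avoids the precomputed bounding-volume hierarchies that rigid-body collision detection usually relies on.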

  15. Knowledge sources for evidence-based practice in rheumatology nursing.

    Science.gov (United States)

    Neher, Margit; Ståhl, Christian; Ellström, Per-Erik; Nilsen, Per

    2015-12-01

    As rheumatology nursing develops and extends, knowledge about current use of knowledge in rheumatology nursing practice may guide discussions about future knowledge needs. To explore what perceptions rheumatology nurses have about their knowledge sources and about what knowledge they use in their practice, 12 nurses working in specialist rheumatology were interviewed using a semi-structured interview guide. The data were analyzed using conventional qualitative content analysis. The analysis yielded four types of knowledge sources in clinical practice: interaction with others in the workplace, contacts outside the workplace, written materials, and previous knowledge and experience. Colleagues, and physicians in particular, were important for informal learning in daily rheumatology practice. Evidence from the medical arena was accessed through medical specialists, while nursing research was used less. Facilitating informal learning and continuing formal education is proposed as a way toward a more evidence-based practice in extended roles. © The Author(s) 2014.

  16. An overview of an accelerator-based neutron spallation source

    Energy Technology Data Exchange (ETDEWEB)

    Lessner, E.S.

    1996-06-01

    An overview of the feasibility study of a 1-MW pulsed spallation source is presented. The machine delivers 1 MW of proton beam power to spallation targets where slow neutrons are produced. The slow neutrons can be used for isotope production, materials irradiation, and neutron scattering research. The neutron source facility is based on a rapid cycling synchrotron (RCS) and consists of a 400-MeV linac, a 30-Hz RCS that accelerates the 400-MeV beam to 2 GeV, and two neutron-generating target stations. The RCS accelerates an average proton beam current of 0.5 mA, corresponding to 1.04 x 10^14 protons per pulse. This intensity is about two times higher than that of existing machines. A key feature of this accelerator system design is that beam losses are minimized from injection to extraction, reducing activation to levels consistent with hands-on maintenance.
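    The quoted beam parameters are mutually consistent, which a few lines of arithmetic confirm; only the elementary charge is added here, everything else comes from the abstract.

    ```python
    E_CHARGE = 1.602176634e-19   # elementary charge in coulombs

    current = 0.5e-3             # average proton current, A
    rep_rate = 30.0              # pulse repetition rate, Hz
    energy_eV = 2.0e9            # beam energy after the RCS, eV

    charge_per_pulse = current / rep_rate
    protons_per_pulse = charge_per_pulse / E_CHARGE
    beam_power = current * energy_eV   # W, since energy in eV x current in A gives watts

    print(f"protons per pulse: {protons_per_pulse:.3e}")  # → 1.040e+14
    print(f"beam power: {beam_power / 1e6:.1f} MW")       # → 1.0 MW
    ```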

  17. Voronoi Diagram Based Optimization of Dynamic Reactive Power Sources

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Weihong [University of Tennessee (UT); Sun, Kai [University of Tennessee (UT); Qi, Junjian [University of Tennessee (UT); Xu, Yan [ORNL

    2015-01-01

    Dynamic var sources can effectively mitigate fault-induced delayed voltage recovery (FIDVR) issues or even voltage collapse. This paper proposes a new approach to optimization of the sizes of dynamic var sources at candidate locations by a Voronoi diagram based algorithm. It first disperses sample points of potential solutions in a searching space, evaluates a cost function at each point by barycentric interpolation for the subspaces around the point, and then constructs a Voronoi diagram about cost function values over the entire space. Accordingly, the final optimal solution can be obtained. Case studies on the WSCC 9-bus system and NPCC 140-bus system have validated that the new approach can quickly identify the boundary of feasible solutions in searching space and converge to the global optimal solution.

  18. A Bright Single Photon Source Based on a Diamond Nanowire

    CERN Document Server

    Babinec, T; Khan, M; Zhang, Y; Maze, J; Hemmer, P R; Loncar, M

    2009-01-01

    The development of a robust light source that emits one photon at a time is an outstanding challenge in quantum science and technology. Here, at the transition from many to single photon optical communication systems, fully quantum mechanical effects may be utilized to achieve new capabilities, most notably perfectly secure communication via quantum cryptography. Practical implementations place stringent requirements on the device properties, including fast and stable photon generation, efficient collection of photons, and room temperature operation. Single photon light emitting devices based on fluorescent dye molecules, quantum dots, nanowires, and carbon nanotube material systems have all been explored, but none have simultaneously demonstrated all criteria. Here, we describe the design, fabrication, and characterization of a bright source of single photons consisting of an individual Nitrogen-vacancy color center (NV center) in a diamond nanowire operating in ambient conditions. The nanowire plays a posit...

  19. 2005 Defense Base Closure and Realignment Commission Report. Volume 2

    Science.gov (United States)

    2005-01-01

    Taxation, Tourism and Procurement. He was also a member of the Foreign Affairs, Armed Services, and Intelligence Committees. He joined the firm of Kummer...the Transportation Management training to Fort Lee, VA. 123. JOINT CENTER OF EXCELLENCE FOR CULINARY TRAINING (E&T 8) a. Realign...Lackland Air Force Base, TX, by relocating Culinary Training to Fort Lee, VA, establishing it as a Joint Center of Excellence for Culinary Training. 124

  20. The Microgravity Research Experiments (MICREX) Data Base. Volume 1

    Science.gov (United States)

    Winter, C. A.; Jones, J.C.

    1996-01-01

    An electronic data base identifying over 800 fluids and materials processing experiments performed in a low-gravity environment has been created at NASA Marshall Space Flight Center. The compilation, called MICREX (MICrogravity Research Experiments), was designed to document all such experimental efforts performed (1) on U.S. manned space vehicles, (2) on payloads deployed from U.S. manned space vehicles, and (3) on all domestic and international sounding rockets (excluding those of China and the former U.S.S.R.). Data available on most experiments include (1) principal and co-investigators, (2) low-gravity mission, (3) processing facility, (4) experimental objectives and results, (5) identifying key words, (6) sample materials, (7) applications of the processed materials/research area, (8) experiment descriptive publications, and (9) contacts for more information concerning the experiment. This technical memorandum (1) summarizes the historical interest in reduced-gravity fluid dynamics, (2) describes the experimental facilities employed to examine reduced gravity fluid flow, (3) discusses the importance of a low-gravity fluids and materials processing data base, (4) describes the MICREX data base format and computational World Wide Web access procedures, and (5) documents (in hard-copy form) the descriptions of the first 600 fluids and materials processing experiments entered into MICREX.

  1. The Microgravity Research Experiments (MICREX) Data Base, Volume 4

    Science.gov (United States)

    Winter, C. A.; Jones, J. C.

    1996-01-01

    An electronic data base identifying over 800 fluids and materials processing experiments performed in a low-gravity environment has been created at NASA Marshall Space Flight Center. The compilation, called MICREX (MICrogravity Research Experiments), was designed to document all such experimental efforts performed (1) on U.S. manned space vehicles, (2) on payloads deployed from U.S. manned space vehicles, and (3) on all domestic and international sounding rockets (excluding those of China and the former U.S.S.R.). Data available on most experiments include (1) principal and co-investigators, (2) low-gravity mission, (3) processing facility, (4) experimental objectives and results, (5) identifying key words, (6) sample materials, (7) applications of the processed materials/research area, (8) experiment descriptive publications, and (9) contacts for more information concerning the experiment. This technical memorandum (1) summarizes the historical interest in reduced-gravity fluid dynamics, (2) describes the experimental facilities employed to examine reduced gravity fluid flow, (3) discusses the importance of a low-gravity fluids and materials processing data base, (4) describes the MICREX data base format and computational World Wide Web access procedures, and (5) documents (in hard-copy form) the descriptions of the first 600 fluids and materials processing experiments entered into MICREX.

  2. The Microgravity Research Experiments (MICREX) Data Base. Volume 2

    Science.gov (United States)

    Winter, C. A.; Jones, J. C.

    1996-01-01

    An electronic data base identifying over 800 fluids and materials processing experiments performed in a low-gravity environment has been created at NASA Marshall Space Flight Center. The compilation, called MICREX (MICrogravity Research Experiments), was designed to document all such experimental efforts performed (1) on U.S. manned space vehicles, (2) on payloads deployed from U.S. manned space vehicles, and (3) on all domestic and international sounding rockets (excluding those of China and the former U.S.S.R.). Data available on most experiments include (1) principal and co-investigators, (2) low-gravity mission, (3) processing facility, (4) experimental objectives and results, (5) identifying key words, (6) sample materials, (7) applications of the processed materials/research area, (8) experiment descriptive publications, and (9) contacts for more information concerning the experiment. This technical memorandum (1) summarizes the historical interest in reduced-gravity fluid dynamics, (2) describes the experimental facilities employed to examine reduced gravity fluid flow, (3) discusses the importance of a low-gravity fluids and materials processing data base, (4) describes the MICREX data base format and computational World Wide Web access procedures, and (5) documents (in hard-copy form) the descriptions of the first 600 fluids and materials processing experiments entered into MICREX.

  3. Design aspects of a compact, single-frequency, permanent magnet ECR ion source with a large uniformly distributed resonant plasma volume

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Y.; Alton, G.D.; Mills, G.D.; Reed, C.A.; Haynes, D.L.

    1997-09-01

    A compact, all-permanent-magnet single-frequency ECR ion source with a large uniformly distributed ECR plasma volume has been designed and is presently under construction at the Oak Ridge National Laboratory (ORNL). The central region of the field is designed to achieve a flat-field (constant mod-B) which extends over the length of the central field region along the axis of symmetry and radially outward to form a uniformly distributed ECR plasma volume. The magnetic field design strongly contrasts with those used in conventional ECR ion sources where the central field regions are approximately parabolic and the consequent ECR zones are surfaces. The plasma confinement magnetic field mirror has a mirror ratio B{sub max}/B{sub ECR} of slightly greater than two. The source is designed to operate at a nominal RF frequency of 6 GHz. The central flat magnetic field region can be easily adjusted by mechanical means to tune the source to the resonant conditions within the limits of 5.5 to 6.8 GHz. The RF injection system is broadband to ensure excitation of transverse electric (TE) modes so that the RF power is largely concentrated in the resonant plasma volume which lies along and surrounds the axis of symmetry of the source. Because of the much larger ECR zone, the probability for absorption of microwave power is dramatically increased thereby increasing the probability for acceleration of electrons, the electron temperature of the plasma and, consequently, the hot electron population within the plasma volume of the source. The creation of an ECR volume rather than a surface is commensurate with higher charge states and higher beam intensities within a particular charge state.
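
    The nominal operating point can be checked against the standard electron cyclotron resonance condition, f = eB/(2*pi*m_e); the sketch below (not from the paper) computes the resonant field for 6 GHz and the corresponding mirror peak for a mirror ratio of about two:

```python
import math

E_CHARGE = 1.602176634e-19     # elementary charge, C
M_ELECTRON = 9.1093837015e-31  # electron rest mass, kg

def ecr_field(freq_hz):
    """Magnetic flux density (tesla) at which electrons are cyclotron-resonant
    with microwaves of frequency freq_hz: f = e*B / (2*pi*m_e)."""
    return 2.0 * math.pi * freq_hz * M_ELECTRON / E_CHARGE

b_ecr = ecr_field(6.0e9)   # ~0.214 T at the nominal 6 GHz
b_max = 2.0 * b_ecr        # mirror peak for Bmax/B_ECR of about two
```

    The "flat-field" design keeps |B| at this resonant value over an extended volume, rather than only on a surface where a parabolic field profile crosses it.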

  4. Develop Direct Geo-referencing System Based on Open Source Software and Hardware Platform

    Science.gov (United States)

    Liu, H. S.; Liao, H. M.

    2015-08-01

    A direct geo-referencing system uses remote sensing technology to quickly capture images, GPS tracks, and camera positions. These data allow the construction of large volumes of images with geographic coordinates, so that users can take measurements directly on the images. For positions to be calculated properly, all the sensor signals must be synchronized. Traditional aerial photography uses a Position and Orientation System (POS) to integrate image, coordinate, and camera-position data. However, such systems are very expensive, and users cannot use the result immediately because the position information is not embedded in the image. For reasons of economy and efficiency, this study aims to develop a direct geo-referencing system based on an open source software and hardware platform. After using an Arduino microcontroller board to integrate the signals, we can calculate positions with the open source software OpenCV. Finally, we use the open source panorama browser Panini and integrate everything into the open source GIS software Quantum GIS. In this way a complete data collection and processing system can be constructed.
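
    The synchronization step, tagging each exposure with a camera position, amounts to interpolating the GPS track at the image timestamp. A hypothetical helper (names and track data invented for illustration, not part of the described system):

```python
import bisect

def position_at(t, track):
    """Linearly interpolate the camera position at time t from a GPS track,
    given as a time-sorted list of (t, lat, lon, alt) tuples."""
    times = [p[0] for p in track]
    i = bisect.bisect_left(times, t)
    if i == 0:
        return track[0][1:]
    if i == len(track):
        return track[-1][1:]
    t0, *p0 = track[i - 1]
    t1, *p1 = track[i]
    w = (t - t0) / (t1 - t0)
    return tuple(a + w * (b - a) for a, b in zip(p0, p1))

# toy two-fix track; an exposure halfway between fixes gets the midpoint
track = [(0.0, 24.00, 121.00, 500.0), (10.0, 24.01, 121.02, 520.0)]
lat, lon, alt = position_at(5.0, track)
```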

  5. GISCube, an Open Source Web-based GIS Application

    Science.gov (United States)

    Boustani, M.; Mattmann, C. A.; Ramirez, P.

    2014-12-01

    There are many Earth science projects and data systems being developed at the Jet Propulsion Laboratory, California Institute of Technology (JPL) that require the use of Geographic Information Systems (GIS). Three in particular are: (1) the JPL Airborne Snow Observatory (ASO), which measures the amount of water being generated from snow melt in mountains; (2) the Regional Climate Model Evaluation System (RCMES), which compares climate model outputs with remote sensing datasets in the context of model evaluation for the Intergovernmental Panel on Climate Change and the U.S. National Climate Assessment; and (3) the JPL Snow Server, which produces a snow and ice climatology for the Western US and Alaska for the U.S. National Climate Assessment. These three examples, like other Earth science projects, strongly need GIS and geoprocessing capabilities to process, visualize, manage and store geospatial data. Besides some open source GIS libraries and software like ArcGIS, there are comparatively few open source, web-based, easy-to-use applications capable of GIS processing and visualization. To address this, we present GISCube, an open source web-based GIS application that can store, visualize and process GIS and geospatial data. GISCube is powered by Geothon, an open source Python GIS cookbook. Geothon has a variety of geoprocessing tools, such as data conversion, processing, spatial analysis and data management tools. GISCube supports a variety of well-known GIS data formats in both vector and raster form, and the system is being expanded to support NASA's scientific data formats such as netCDF and HDF files. In this talk, we demonstrate how Earth science and other projects can benefit from GISCube and Geothon, their current goals, and our future work in the area.

  6. Online Monitoring of Volume Deformation of Cement-based Materials in Multiple Environments

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    After comparing and analyzing existing methods for measuring the volume deformation of cement-based materials in China and abroad, a continuous online monitor of cement-based material volume deformation in multiple environments was developed. The device is designed on the basis of environmental simulation technology, laser and eddy-current micro-distance measurement, and transmission-agent-based online monitoring of the deformation of multiple groups of samples. It offers high precision, low testing cost and intelligent operation, and can also be applied widely to other materials such as glass, ceramics and walling materials.

  7. A prediction model of radiation-induced necrosis for intracranial radiosurgery based on target volume.

    Science.gov (United States)

    Zhao, Bo; Wen, Ning; Chetty, Indrin J; Huang, Yimei; Brown, Stephen L; Snyder, Karen C; Siddiqui, Farzan; Movsas, Benjamin; Siddiqui, M Salim

    2017-08-01

    This study aims to extend the observation that the 12-Gy radiosurgical volume (V12Gy) correlates with the incidence of radiation necrosis in patients with intracranial tumors treated with radiosurgery, by using target volume to predict V12Gy. V12Gy based on the target volume was used to predict the radiation necrosis probability (P) directly. Also investigated was the reduction in radiation necrosis rates (ΔP) achievable by optimizing the prescription isodose lines for linac-based SRS. Twenty concentric spherical targets and 22 patients with brain tumors were retrospectively studied. For each case, a standard clinical plan and an optimized plan with prescription isodose lines based on gradient index were created. V12Gy was extracted from both plans to analyze the correlation between V12Gy and target volume. The necrosis probability P as a function of V12Gy was evaluated. To account for variation in prescription, the relation between V12Gy and prescription was also investigated. A prediction model for radiation-induced necrosis based on the retrospective study is presented. The model directly relates the typical prescribed dose and the target volume to the radionecrosis probability; V12Gy increased linearly with the target volume (R² > 0.99). The linear correlation was then integrated into a logistic model to predict P directly from the target volume. The change in V12Gy as a function of prescription was modeled using a single parameter, s (= -1.15). A relatively large ΔP was observed for target volumes between 7 and 28 cm³, with the maximum reduction (8-9%) occurring at approximately 18 cm³. Based on the model results, optimizing the prescription isodose line for target volumes between 7 and 28 cm³ results in a significant reduction in necrosis probability. V12Gy based on the target volume could provide clinicians with a predictor of radiation necrosis at the contouring stage, thus facilitating treatment decisions. © 2017 American Association of
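
    The structure of the model, a linear V12Gy-versus-target-volume fit feeding a logistic dose-response, can be sketched as below; the coefficients here are invented placeholders, not the paper's fitted values:

```python
import math

# Hypothetical coefficients for illustration only -- the paper fits its own
# linear V12Gy-vs-target-volume relation and logistic dose-response.
A_SLOPE, B_INTERCEPT = 2.0, 1.5   # V12Gy [cc] per target volume [cc], offset [cc]
C0, C1 = -4.0, 0.15               # logistic parameters

def v12_from_target_volume(tv_cc):
    """Linear relation V12Gy = a*TV + b (reported to fit with R^2 > 0.99)."""
    return A_SLOPE * tv_cc + B_INTERCEPT

def necrosis_probability(tv_cc):
    """Logistic model P = 1 / (1 + exp(-(c0 + c1 * V12Gy)))."""
    v12 = v12_from_target_volume(tv_cc)
    return 1.0 / (1.0 + math.exp(-(C0 + C1 * v12)))
```

    Composing the two maps gives the clinically useful shortcut the abstract describes: a necrosis probability readable directly from the contoured target volume.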

  8. Integrating Ontological Data Sources Using Viewpoints-Based Approach

    Directory of Open Access Journals (Sweden)

    Bouchra Boulkroun

    2016-12-01

    Full Text Available With the development of the Internet and intranets, information integration from various data sources has become an increasingly important and challenging issue. Recently, the trend in data integration has favored semantic integration using ontologies. However, existing ontology-based approaches do not support multiple representations of data, which is important in the development of multi-user applications. This paper addresses a novel semantic integration approach based on the ontology and viewpoint paradigms. The contribution combines the advantages of existing ontology-based integration approaches while avoiding their drawbacks. The proposed integration approach is evaluated using query processing. Profiles are introduced to offer answers to users according to their viewpoints and choices.

  9. Development of high intensity ion sources for a Tandem-Electrostatic-Quadrupole facility for Accelerator-Based Boron Neutron Capture Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Bergueiro, J. [Gerencia de Investigacion y Aplicaciones, Comision Nacional de Energia Atomica (Argentina)] [CONICET, Buenos Aires (Argentina); Igarzabal, M.; Suarez Sandin, J.C. [Gerencia de Investigacion y Aplicaciones, Comision Nacional de Energia Atomica (Argentina); Somacal, H.R. [Gerencia de Investigacion y Aplicaciones, Comision Nacional de Energia Atomica (Argentina)] [Escuela de Ciencia y Tecnologia, Universidad Nacional de San Martin (Argentina); Thatar Vento, V. [Gerencia de Investigacion y Aplicaciones, Comision Nacional de Energia Atomica (Argentina)] [CONICET, Buenos Aires (Argentina); Huck, H.; Valda, A.A. [Gerencia de Investigacion y Aplicaciones, Comision Nacional de Energia Atomica (Argentina)] [Escuela de Ciencia y Tecnologia, Universidad Nacional de San Martin (Argentina); Repetto, M. [Gerencia de Investigacion y Aplicaciones, Comision Nacional de Energia Atomica (Argentina)

    2011-12-15

    Several ion sources have been developed, and an ion source test stand has been assembled, for the first stage of a Tandem-Electrostatic-Quadrupole facility for Accelerator-Based Boron Neutron Capture Therapy. The first source to be designed, fabricated and tested is a dual-chamber, filament-driven, magnetically compressed volume-plasma proton ion source. A 4 mA beam has been accelerated and transported into a suppressed Faraday cup. Extensive simulations of the sources have been performed using both 2D and 3D self-consistent codes.

  10. Prediction of sonic boom from experimental near-field overpressure data. Volume 2: Data base construction

    Science.gov (United States)

    Glatt, C. R.; Reiners, S. J.; Hague, D. S.

    1975-01-01

    A computerized method for storing, updating and augmenting experimentally determined overpressure signatures has been developed. A data base of pressure signatures for a shuttle type vehicle has been stored. The data base has been used for the prediction of sonic boom with the program described in Volume I.

  11. A control volume based finite difference method for solving the equilibrium equations in terms of displacements

    DEFF Research Database (Denmark)

    Hattel, Jesper; Hansen, Preben

    1995-01-01

    This paper presents a novel control volume based FD method for solving the equilibrium equations in terms of displacements, i.e. the generalized Navier equations. The method is based on the widely used cv-FDM solution of heat conduction and fluid flow problems involving a staggered grid formulation...
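
    In one dimension the equilibrium equation reduces to E*u'' + f = 0; a minimal control-volume discretization of that case (a simplified sketch, not the paper's staggered-grid cv-FDM) balances the face fluxes E*du/dx of each cell against the integrated body force, and the resulting tridiagonal system is solved with the Thomas algorithm:

```python
def solve_bar(n=50, length=1.0, e_mod=1.0, load=1.0):
    """Control-volume discretization of E*u'' + f = 0 with u = 0 at both
    ends: each interior cell balances the fluxes E*du/dx on its two faces
    against the body force f*dx integrated over the cell."""
    dx = length / n
    m = n - 1                       # number of interior nodes
    a = [-e_mod / dx] * m           # sub-diagonal (left face flux)
    b = [2.0 * e_mod / dx] * m      # diagonal
    c = [-e_mod / dx] * m           # super-diagonal (right face flux)
    d = [load * dx] * m             # body force integrated over the cell
    # Thomas algorithm: forward elimination ...
    for i in range(1, m):
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    # ... and back substitution
    u = [0.0] * m
    u[-1] = d[-1] / b[-1]
    for i in range(m - 2, -1, -1):
        u[i] = (d[i] - c[i] * u[i + 1]) / b[i]
    return [0.0] + u + [0.0]        # displacements at the n+1 nodes

u = solve_bar()
u_mid = u[len(u) // 2]
```

    With E = f = L = 1 and fixed ends, the exact solution is the parabola u(x) = x(1 - x)/2, which the three-point scheme reproduces exactly, so the midpoint displacement is 0.125.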

  13. Effects of pore-scale dispersion, degree of heterogeneity, sampling size, and source volume on the concentration moments of conservative solutes in heterogeneous formations

    Science.gov (United States)

    Daniele Tonina; Alberto Bellin

    2008-01-01

    Pore-scale dispersion (PSD), aquifer heterogeneity, sampling volume, and source size influence solute concentrations of conservative tracers transported in heterogeneous porous formations. In this work, we developed a new set of analytical solutions for the concentration ensemble mean, variance, and coefficient of variation (CV), which consider the effects of all these...

  14. Developing seismogenic source models based on geologic fault data

    Science.gov (United States)

    Haller, Kathleen M.; Basili, Roberto

    2011-01-01

    Calculating seismic hazard usually requires input that includes seismicity associated with known faults, historical earthquake catalogs, geodesy, and models of ground shaking. This paper will address the input generally derived from geologic studies that augment the short historical catalog to predict ground shaking at time scales of tens, hundreds, or thousands of years (e.g., SSHAC 1997). A seismogenic source model, terminology we adopt here for a fault source model, includes explicit three-dimensional faults deemed capable of generating ground motions of engineering significance within a specified time frame of interest. In tectonically active regions of the world, such as near plate boundaries, multiple seismic cycles span a few hundred to a few thousand years. In contrast, in less active regions hundreds of kilometers from the nearest plate boundary, seismic cycles generally are thousands to tens of thousands of years long. Therefore, one should include sources having both longer recurrence intervals and possibly older times of most recent rupture in less active regions of the world rather than restricting the model to include only Holocene faults (i.e., those with evidence of large-magnitude earthquakes in the past 11,500 years) as is the practice in tectonically active regions with high deformation rates. During the past 15 years, our institutions independently developed databases to characterize seismogenic sources based on geologic data at a national scale. Our goal here is to compare the content of these two publicly available seismogenic source models compiled for the primary purpose of supporting seismic hazard calculations by the Istituto Nazionale di Geofisica e Vulcanologia (INGV) and the U.S. Geological Survey (USGS); hereinafter we refer to the two seismogenic source models as INGV and USGS, respectively. 
This comparison is timely because new initiatives are emerging to characterize seismogenic sources at the continental scale (e.g., SHARE in the

  15. A Hybrid 3D Learning-and-Interaction-based Segmentation Approach Applied on CT Liver Volumes

    Directory of Open Access Journals (Sweden)

    M. Danciu

    2013-04-01

    Medical volume segmentation in various imaging modalities using real 3D approaches (in contrast to slice-by-slice segmentation) is a current trend. The increase in acquisition resolution leads to large amounts of data, requiring solutions that reduce the dimensionality of the segmentation problem. In this context, real-time interaction with the large medical data volume represents another milestone. This paper addresses the twofold problem of 3D segmentation applied to large data sets and also describes an intuitive neuro-fuzzy-trained interaction method. We present a new hybrid semi-supervised 3D segmentation for liver volumes obtained from computed tomography scans. This is a challenging medical volume segmentation task, due to the acquisition and inter-patient variability of the liver parenchyma. The proposed solution combines a learning-based segmentation stage (employing the 3D discrete cosine transform and a probabilistic support vector machine classifier) with a post-processing stage (automatic and manual segmentation refinement). Optionally, the segmentation can be further optimized by level sets, initialized with the segmentation provided by the learning-based solution. The supervised segmentation is applied to elementary cubes into which the CT volume is decomposed by tiling, thus ensuring a significant reduction of the data to be classified by the support vector machine into liver/not liver. On real volumes, the proposed approach provides good segmentation accuracy, with a significant reduction in computational complexity.
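
    The tiling step, decomposing the CT volume into elementary cubes that the classifier then labels liver/not liver, might look like the following sketch (pure-Python nested lists stand in for a real image array; the DCT features and SVM are omitted):

```python
def tile_volume(volume, cube):
    """Split a 3D intensity array (nested lists, each dimension divisible
    by `cube`) into non-overlapping cube-shaped sub-volumes."""
    nz, ny, nx = len(volume), len(volume[0]), len(volume[0][0])
    cubes = []
    for z in range(0, nz, cube):
        for y in range(0, ny, cube):
            for x in range(0, nx, cube):
                cubes.append([[row[x:x + cube]
                               for row in volume[z + dz][y:y + cube]]
                              for dz in range(cube)])
    return cubes

# toy 4x4x4 volume with voxel value z*16 + y*4 + x -> eight 2x2x2 cubes
vol = [[[z * 16 + y * 4 + x for x in range(4)] for y in range(4)] for z in range(4)]
cubes = tile_volume(vol, 2)
```

    Classifying one feature vector per cube instead of one per voxel is what yields the dimensionality reduction the abstract emphasizes.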

  16. Open source based cadastral information system : ANCFCC-MOROCCO

    CERN Document Server

    Elasri, Hicham; Jamila, Aatab; Karima, Ganoun

    2012-01-01

    This project develops a geographic information system to support the cadastral business. The system, based on open source solutions, was developed within the National Agency of Land Registry, Cadastre and Cartography (ANCFCC); it enables monitoring and analysis of cadastral procedures and offers services consumable by other information systems, such as consultation and querying of spatial data. The project will also assist the various user profiles in completing production tasks and make it possible to eliminate the identified deficiencies, ensuring an optimal level of productivity.

  17. Fusion Based Neutron Sources for Security Applications: Energy Optimisation

    OpenAIRE

    Albright, S.; Seviour, Rebecca

    2014-01-01

    There is a growing interest in the use of neutrons for national security. The majority of work on security focuses on the use of either sealed tube DT fusors or fission sources, e.g. Cf-252. Fusion reactions enable the energy of the neutron beam to be chosen to suit the application, rather than the application being chosen based on the available neutron beam energy. In this paper we discuss simulations of fusion reactions demonstrating the broad range of energies available and methods f...

  18. An image-based skeletal dosimetry model for the ICRP reference newborn-internal electron sources

    Energy Technology Data Exchange (ETDEWEB)

    Pafundi, Deanna; Lee, Choonsik; Bolch, Wesley [Department of Nuclear and Radiological Engineering, University of Florida, Gainesville, FL (United States); Rajon, Didier [Department of Neurosurgery, University of Florida, Gainesville, FL (United States); Jokisch, Derek [Department of Physics and Astronomy, Francis Marion University, Florence, SC (United States)], E-mail: wbolch@ufl.edu

    2010-04-07

    In this study, a comprehensive electron dosimetry model of newborn skeletal tissues is presented. The model is constructed using the University of Florida newborn hybrid phantom of Lee et al (2007 Phys. Med. Biol. 52 3309-33), the newborn skeletal tissue model of Pafundi et al (2009 Phys. Med. Biol. 54 4497-531) and the EGSnrc-based Paired Image Radiation Transport code of Shah et al (2005 J. Nucl. Med. 46 344-53). Target tissues include the active bone marrow (surrogate tissue for hematopoietic stem cells), shallow marrow (surrogate tissue for osteoprogenitor cells) and unossified cartilage (surrogate tissue for chondrocytes). Monoenergetic electron emissions are considered over the energy range 1 keV to 10 MeV for the following source tissues: active marrow, trabecular bone (surfaces and volumes), cortical bone (surfaces and volumes) and cartilage. Transport results are reported as specific absorbed fractions according to the MIRD schema and are given as skeletal-averaged values in the paper with bone-specific values reported in both tabular and graphic format as electronic annexes (supplementary data). The method utilized in this work uniquely includes (1) explicit accounting for the finite size and shape of newborn ossification centers (spongiosa regions), (2) explicit accounting for active and shallow marrow dose from electron emissions in cortical bone as well as sites of unossified cartilage, (3) proper accounting of the distribution of trabecular and cortical volumes and surfaces in the newborn skeleton when considering mineral bone sources and (4) explicit consideration of the marrow cellularity changes for active marrow self-irradiation as applicable to radionuclide therapy of diseased marrow in the newborn child.

  19. A Hybrid Fresh Apple Export Volume Forecasting Model Based on Time Series and Artificial Neural Network

    Directory of Open Access Journals (Sweden)

    Lihua Yang

    2015-04-01

    Export volume forecasting of fresh fruits is a complex task due to the large number of factors affecting demand. In order to guide fruit growers' sales, decreasing the cultivation cost and increasing their incomes, a hybrid fresh apple export volume forecasting model is proposed. Using actual data on fresh apple export volumes, the Seasonal Decomposition (SD) model of time series and the Radial Basis Function (RBF) model of artificial neural networks are built. The predictive results are compared among the three forecasting models based on the criterion of Mean Absolute Percentage Error (MAPE). The result indicates that the proposed combined forecasting model is effective because it can improve the prediction accuracy of fresh apple export volumes.
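
    As a toy illustration of the seasonal-decomposition half of the hybrid (the paper's SD and RBF models are of course more elaborate), additive seasonal indices can be estimated as the mean deviation of each seasonal position from the overall mean:

```python
def seasonal_indices(series, period):
    """Additive seasonal index per position k: mean of the observations at
    that position minus the overall series mean."""
    mean = sum(series) / len(series)
    idx = []
    for k in range(period):
        vals = [v for i, v in enumerate(series) if i % period == k]
        idx.append(sum(vals) / len(vals) - mean)
    return idx

def deseasonalize(series, period):
    """Remove the seasonal component; the residual would be the input to a
    trend/RBF model in a hybrid forecaster."""
    idx = seasonal_indices(series, period)
    return [v - idx[i % period] for i, v in enumerate(series)]

# toy export volumes with a repeating two-step seasonal bump
series = [10, 14, 10, 14, 10, 14, 10, 14]
flat = deseasonalize(series, 2)
```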

  20. Cooling Load Estimation in the Building Based On Heat Sources

    Science.gov (United States)

    Chairani; Sulistyo, S.; Widyawan

    2017-05-01

    Heating, ventilation and air conditioning (HVAC) is the largest source of energy consumption. In this research, we discuss the cooling load in a room, considering different heat sources and the number of occupants. The cooling load is affected by external and internal heat sources. The external cooling load in this discussion includes exterior convection computed with the DOE-2 algorithm, heat calculation using the Thermal Analysis Research Program (TARP), and Conduction Transfer Functions (CTF). The internal cooling load is calculated from the activity of the occupants in the office, the number of occupants, heat gain from lighting, and heat gain from electric equipment. The weather data used are for Surakarta, and the design day used is for Jakarta. We use ASHRAE standards for the building materials and for the metabolic rates of the occupants during their activities. The results show that the number of occupants influences the cooling load: a large number of occupants causes a correspondingly large cooling load.
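
    The internal portion of the load is essentially a sum of heat gains; a minimal sketch (the 120 W per person metabolic figure is a typical office value for illustration, not taken from the paper):

```python
def internal_cooling_load(n_occupants, met_gain_w=120.0,
                          lighting_w=0.0, equipment_w=0.0):
    """Sum of internal heat gains in watts: occupant metabolic heat plus
    lighting plus electric equipment. A full model would add the external
    loads (exterior convection, conduction through the envelope, etc.)."""
    return n_occupants * met_gain_w + lighting_w + equipment_w

q_10 = internal_cooling_load(10, lighting_w=400.0, equipment_w=600.0)
q_20 = internal_cooling_load(20, lighting_w=400.0, equipment_w=600.0)
```

    Doubling occupancy raises the internal gain linearly, which matches the study's finding that occupancy drives the cooling load.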

  1. Mapping methane emission sources over California based on airborne measurements

    Science.gov (United States)

    Karl, T.; Guha, A.; Peischl, J.; Misztal, P. K.; Jonsson, H.; Goldstein, A. H.; Ryerson, T. B.

    2011-12-01

    The California Global Warming Solutions Act of 2006 (AB 32) has created a need to accurately characterize the emission sources of various greenhouse gases (GHGs) and verify the existing state GHG inventory. Methane (CH4) is a major GHG with a global warming potential of 20 times that of CO2 and currently constitutes about 6% of the total statewide GHG emissions on a CO2 equivalent basis. Some of the major methane sources in the state are area sources where methane is biologically produced (e.g. dairies, landfills and waste treatment plants) making bottom-up estimation of emissions a complex process. Other potential sources include fugitive emissions from oil extraction processes and natural gas distribution network, emissions from which are not well-quantified. The lack of adequate field measurement data to verify the inventory and provide independently generated estimates further contributes to the overall uncertainty in the CH4 inventory. In order to gain a better perspective of spatial distribution of major CH4 sources in California, a real-time measurement instrument based on Cavity Ring Down Spectroscopy (CRDS) was installed in a Twin Otter aircraft for the CABERNET (California Airborne BVOC Emissions Research in Natural Ecosystems Transects) campaign, where the driving research goal was to understand the spatial distribution of biogenic VOC emissions. The campaign took place in June 2011 and encompassed over forty hours of airborne CH4 and CO2 measurements during eight unique flights which covered much of the Central Valley and its eastern edge, the Sacramento-San Joaquin delta and the coastal range. The coincident VOC measurements, obtained through a high frequency proton transfer reaction mass spectrometer (PTRMS), aid in CH4 source identification. High mixing ratios of CH4 (> 2000 ppb) are observed consistently in all the flight transects above the Central Valley. These high levels of CH4 are accompanied by high levels of methanol which is an important

  2. Dose-volume relationships between enteritis and irradiated bowel volumes during 5-fluorouracil and oxaliplatin based chemoradiotherapy in locally advanced rectal cancer

    Energy Technology Data Exchange (ETDEWEB)

    Gunnlaugsson, Adalsteinn; Kjellen, Elisabeth; Bendahl, Paer-Ola; Johnsson, Anders [Dept. of Oncology, Lund Univ. Hospital, Lund (Sweden); Nilsson, Per [Dept. of Radiation Physics, Lund Univ. Hospital, Lund (Sweden); Willner, Julian [Dept. of Radiology, Lund Univ. Hospital, Lund (Sweden)

    2007-10-15

    Purpose. Radiation enteritis is the main acute side-effect during pelvic irradiation. The aim of this study was to quantify the dose-volume relationship between irradiated bowel volumes and acute enteritis during combined chemoradiotherapy for rectal cancer. Material and methods. Twenty-eight patients with locally advanced rectal cancer received chemoradiotherapy. The radiation therapy was given with a traditional multi-field technique to a total dose of 50 Gy, with concurrent 5-Fluorouracil (5-FU) and oxaliplatin (OXA) based chemotherapy. All patients underwent three-dimensional CT-based treatment planning. Individual loops of small and large bowel as well as a volume defined as 'whole abdomen' were systematically contoured on each CT slice, and dose-volume histograms were generated. Diarrhea during treatment was scored retrospectively according to the NCI Common Toxicity Criteria scale. Results. There was a strong correlation between the occurrence of grade 2+ diarrhea and irradiated small bowel volume, most notably at doses >15 Gy. Neither irradiated large bowel volume, nor irradiated 'whole abdomen' volume correlated significantly with diarrhea. Clinical or treatment related factors such as age, gender, hypertension, previous surgery, enterostomy, or dose fractionation (1.8 vs. 2.0 Gy/fraction) did not correlate with grade 2+ diarrhea. Discussion. This study indicates a strong dose-volume relationship between small bowel volume and radiation enteritis during 5-FU-OXA-based chemoradiotherapy. These findings support the application of maneuvers to minimize small bowel irradiation, such as using a 'belly board' or the use of IMRT technique aiming at keeping the small bowel volume receiving more than 15 Gy under 150 cc.
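
    The dose-volume quantity at issue, e.g. the small-bowel volume receiving more than 15 Gy, is obtained by summing voxel volumes above the dose threshold; a toy sketch assuming uniform voxels (numbers invented):

```python
def volume_above(doses_gy, voxel_cc, threshold_gy=15.0):
    """Dose-volume metric V_threshold: total volume (cc) of voxels whose
    dose exceeds the threshold, e.g. small-bowel V15Gy."""
    return sum(voxel_cc for d in doses_gy if d > threshold_gy)

doses = [5.0, 12.0, 16.0, 20.0, 48.0, 50.0]   # toy per-voxel doses, Gy
v15 = volume_above(doses, voxel_cc=10.0)       # four voxels exceed 15 Gy
```

    In a planning system the same quantity is read off the cumulative dose-volume histogram; keeping it under a limit such as 150 cc is the kind of constraint the study motivates.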

  3. Quantitative radiology: automated measurement of polyp volume in computed tomography colonography using Hessian matrix-based shape extraction and volume growing

    Science.gov (United States)

    Epstein, Mark L.; Obara, Piotr R.; Chen, Yisong; Liu, Junchi; Zarshenas, Amin; Makkinejad, Nazanin; Dachman, Abraham H.

    2015-01-01

    Background Current measurement of the single longest dimension of a polyp is subjective and has variations among radiologists. Our purpose was to develop a computerized measurement of polyp volume in computed tomography colonography (CTC). Methods We developed a 3D automated scheme for measuring polyp volume at CTC. Our scheme consisted of segmentation of the colon wall to confine polyp segmentation to the colon wall, extraction of a highly polyp-like seed region based on the Hessian matrix, a 3D volume growing technique under the minimum surface expansion criterion for segmentation of polyps, and sub-voxel refinement and surface smoothing for obtaining a smooth polyp surface. Our database consisted of 30 polyp views (15 polyps) in CTC scans from 13 patients. Each patient was scanned in the supine and prone positions. Polyp sizes measured in optical colonoscopy (OC) ranged from 6-18 mm with a mean of 10 mm. A radiologist outlined polyps in each slice and calculated volumes by summation of the volumes in each slice. The measurement study was repeated 3 times at least 1 week apart to minimize a memory-effect bias. We used the mean volume of the three studies as the “gold standard”. Results Our measurement scheme yielded a mean polyp volume of 0.38 cc (range, 0.15-1.24 cc), whereas the mean “gold standard” manual volume was 0.40 cc (range, 0.15-1.08 cc). The “gold-standard” manual and computer volumetry reached excellent agreement (intra-class correlation coefficient = 0.80), with no statistically significant difference [P(F≤f) = 0.42]. Conclusions We developed an automated scheme for measuring polyp volume at CTC based on Hessian matrix-based shape extraction and volume growing. Polyp volumes obtained by our automated scheme agreed excellently with “gold standard” manual volumes. Our fully automated scheme can efficiently provide accurate polyp volumes for radiologists; thus, it would help radiologists improve the accuracy and efficiency of polyp volume
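
    The radiologist's manual volume, obtained by summation of per-slice volumes, amounts to contour area times slice thickness summed over slices; a minimal sketch under that assumption:

```python
def volume_by_slice_summation(slice_areas_mm2, slice_thickness_mm):
    """Sum per-slice contour areas times slice thickness to get volume in mm^3."""
    return sum(a * slice_thickness_mm for a in slice_areas_mm2)

# Illustrative contoured areas (mm^2) on consecutive 1.0 mm CT slices
areas = [12.0, 30.0, 42.0, 30.0, 10.0]
vol_mm3 = volume_by_slice_summation(areas, 1.0)
vol_cc = vol_mm3 / 1000.0  # 1 cc = 1000 mm^3
print(vol_cc)
```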

  4. Narrow-Bandwidth Diode-Laser-Based Ultraviolet Light Source

    Institute of Scientific and Technical Information of China (English)

    PENG Yu; FANG Zhan-Jun; ZANG Er-Jun

    2011-01-01

    A compact, tunable and narrow-bandwidth laser source for ultraviolet radiation is presented. A grating-stabilized diode laser at 1064 nm is frequency-stabilized to below 10 kHz by using an ultra-low-expansion (ULE) cavity. Injecting light of the diode laser into a tapered amplifier yields a power of 290 mW. In a first frequency-doubling stage, about 47 mW of green light at 532 nm is generated by using a periodically poled KTP crystal. Subsequent second-harmonic generation employing a BBO crystal leads to about 30 μW of ultraviolet light at 266 nm. Hg is, so far, the heaviest nonradioactive atom that has been laser-cooled and trapped. Systematic evaluation of various sources of uncertainty for the Hg-based optical lattice clock shows that an accuracy of better than 10^-18 is attainable, an order of magnitude of improvement over Sr or Yb based clocks, because of the reduced susceptibility to the blackbody radiation field, which sets a major limitation on the accuracy of atomic clocks.[1] The 1S0-3P0 transition at 265.6 nm will be exploited as the clock transition.

  5. A silicon-based electrical source of surface plasmon polaritons.

    Science.gov (United States)

    Walters, R J; van Loon, R V A; Brunets, I; Schmitz, J; Polman, A

    2010-01-01

    After decades of process scaling driven by Moore's law, the silicon microelectronics world is now defined by length scales that are many times smaller than the dimensions of typical micro-optical components. This size mismatch poses an important challenge for those working to integrate photonics with complementary metal oxide semiconductor (CMOS) electronics technology. One promising solution is to fabricate optical systems at metal/dielectric interfaces, where electromagnetic modes called surface plasmon polaritons (SPPs) offer unique opportunities to confine and control light at length scales below 100 nm (refs 1, 2). Research groups working in the rapidly developing field of plasmonics have now demonstrated many passive components that suggest the potential of SPPs for applications in sensing and optical communication. Recently, active plasmonic devices based on III-V materials and organic materials have been reported. An electrical source of SPPs was recently demonstrated using organic semiconductors by Koller and colleagues. Here we show that a silicon-based electrical source for SPPs can be fabricated using established low-temperature microtechnology processes that are compatible with back-end CMOS technology.

  6. Error Sources in Proccessing LIDAR Based Bridge Inspection

    Science.gov (United States)

    Bian, H.; Chen, S. E.; Liu, W.

    2017-09-01

    Bridge inspection is a critical task in infrastructure management and is facing unprecedented challenges after a series of bridge failures. The prevailing visual inspection is insufficient for providing reliable, quantitative bridge information, even though a systematic quality management framework has been built to ensure the quality of visual inspection data and minimize errors during the inspection process. LiDAR-based remote sensing is recommended as an effective tool for overcoming some of the disadvantages of visual inspection. In order to evaluate the potential of applying this technology to bridge inspection, some of the error sources in LiDAR-based bridge inspection are analysed. Scanning angle variance during field data collection and differing algorithm designs in scan data processing were found to be factors that introduce errors into inspection results. Beyond studying the error sources, further consideration should be given to improving inspection data quality, and statistical analysis might be employed in the future to evaluate the inspection process, which contains a series of uncertain factors. Overall, the development of a reliable bridge inspection system requires not only improved data processing algorithms but also systematic measures to mitigate possible errors in the entire inspection workflow. If LiDAR or some other technology is accepted as a supplement to visual inspection, the current quality management framework will need to be modified or redesigned, a task as urgent as refining the inspection techniques themselves.

  7. Cardiac magnetic source imaging based on current multipole model

    Institute of Scientific and Technical Information of China (English)

    Tang Fa-Kuan; Wang Qian; Hua Ning; Lu Hong; Tang Xue-Zheng; Ma Ping

    2011-01-01

    It is widely accepted that the heart's current source can be reduced to a current multipole. By adopting three linear inverse methods, cardiac magnetic imaging is achieved in this article based on the current multipole model expanded to first-order terms. The imaging is realized on a reconstruction plane at the centre of the human heart, where a current dipole array is employed to represent the realistic cardiac current distribution. The current multipole, as a testing source, generates magnetic fields in the measuring plane, which serve as inputs to the cardiac magnetic inverse problem. In a heart-torso model constructed by the boundary element method, the current multipole magnetic field distribution is compared with that in homogeneous infinite space, and also with the single current dipole magnetic field distribution. Then the minimum-norm least-squares (MNLS) method, the optimal weighted pseudoinverse method (OWPIM), and the optimal constrained linear inverse method (OCLIM) are selected as the algorithms for inverse computation based on the current multipole model, and the imaging performance of these three inverse methods is compared. Besides, two reconstruction parameters, residual and mean residual, are also discussed, and their trends under MNLS, OWPIM and OCLIM, each as a function of SNR, are obtained and compared.
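
    The minimum-norm least-squares (MNLS) step can be sketched with a Moore-Penrose pseudoinverse, which returns the smallest-norm source vector fitting the measured field; the lead-field matrix and data below are synthetic stand-ins, not the article's heart-torso model:

```python
import numpy as np

def mnls_inverse(leadfield, measured_field):
    """Minimum-norm least-squares source estimate: the solution of
    leadfield @ q = measured_field with the smallest Euclidean norm."""
    return np.linalg.pinv(leadfield) @ measured_field

# Synthetic underdetermined problem: 4 sensors, 6 source parameters
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 6))      # stand-in lead-field matrix
q_true = rng.standard_normal(6)      # stand-in multipole coefficients
b = A @ q_true                       # simulated sensor readings
q_hat = mnls_inverse(A, b)
residual = np.linalg.norm(A @ q_hat - b)  # data-space fit error
print(residual)
```

    Because many source configurations reproduce the same field, the pseudoinverse picks the one of minimum norm, which is why the residual (rather than source recovery) is the natural per-method diagnostic, as in the article.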

  8. Iraqi Perspectives Project. Primary Source Materials for Saddam and Terrorism: Emerging Insights from Captured Iraqi Documents. Volume 4 (Redacted)

    Science.gov (United States)

    2007-11-01

    II series. Volume I examines the relationships between the regime of Saddam Hussein and terrorism in its local, regional, and global context. Volumes 2 through 4 contain the English translations and detailed summaries of ... be behind this accident because of the number of American casualties. Then, he expresses his suspicion that Israel is the real criminal, and then ...

  9. Iterative volume morphing and learning for mobile tumor based on 4DCT

    Science.gov (United States)

    Mao, Songan; Wu, Huanmei; Sandison, George; Fang, Shiaofen

    2017-02-01

    During image-guided cancer radiation treatment, three-dimensional (3D) tumor volumetric information is important for treatment success. However, it is typically not feasible to image a patient’s 3D tumor continuously in real time during treatment due to concern over excessive patient radiation dose. We present a new iterative morphing algorithm to predict the real-time 3D tumor volume based on time-resolved computed tomography (4DCT) acquired before treatment. An offline iterative learning process has been designed to derive a target volumetric deformation function from one breathing phase to another. Real-time volumetric prediction is performed to derive the target 3D volume during treatment delivery. The proposed iterative deformable approach for tumor volume morphing and prediction based on 4DCT is innovative because it makes three major contributions: (1) a novel approach to landmark selection on 3D tumor surfaces using a minimum bounding box; (2) an iterative morphing algorithm to generate the 3D tumor volume using mapped landmarks; and (3) an online tumor volume prediction strategy based on previously trained deformation functions utilizing 4DCT. The experimental performance showed that the maximum morphing deviations are 0.27% and 1.25% for original patient data and artificially generated data, which is promising. This newly developed algorithm and implementation will have important applications for treatment planning, dose calculation and treatment validation in cancer radiation treatment.
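
    A simplified reading of the landmark-selection idea (contribution 1 above) is to anchor landmarks where the tumor surface comes closest to the corners of its axis-aligned minimum bounding box; the procedure and data below are illustrative assumptions, not the authors' exact algorithm:

```python
import numpy as np

def bbox_corner_landmarks(surface_pts):
    """Pick one landmark per corner of the axis-aligned minimum bounding
    box: the surface point closest to that corner."""
    pts = np.asarray(surface_pts, dtype=float)
    lo, hi = pts.min(axis=0), pts.max(axis=0)
    corners = np.array([[x, y, z] for x in (lo[0], hi[0])
                                  for y in (lo[1], hi[1])
                                  for z in (lo[2], hi[2])])
    idx = [np.argmin(((pts - c) ** 2).sum(axis=1)) for c in corners]
    return pts[idx]

# Synthetic "tumor surface": points on a unit sphere centred at the origin
rng = np.random.default_rng(1)
raw = rng.standard_normal((500, 3))
sphere = raw / np.linalg.norm(raw, axis=1, keepdims=True)
landmarks = bbox_corner_landmarks(sphere)
print(landmarks.shape)  # 8 landmarks, one per bounding-box corner
```

    Mapping such landmarks between breathing phases is what drives the iterative morphing from one 4DCT phase to the next.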

  10. Future Synchrotron Light Sources Based on Ultimate Storage Rings

    Energy Technology Data Exchange (ETDEWEB)

    Cai, Yunhai; /SLAC

    2012-04-09

    The main purpose of this talk is to describe how far one might push the state of the art in storage ring design. The talk will start with an overview of the latest developments and advances in the design of synchrotron light sources based on the concept of an 'ultimate' storage ring. The review will establish how bright a ring based light source might be, where the frontier of technological challenges are, and what the limits of accelerator physics are. Emphasis will be given to possible improvements in accelerator design and developments in technology toward the goal of achieving an ultimate storage ring. An ultimate storage ring (USR), defined as an electron ring-based light source having an emittance in both transverse planes at the diffraction limit for the range of X-ray wavelengths of interest for a scientific community, would provide very high brightness photons having high transverse coherence that would extend the capabilities of X-ray imaging and probe techniques beyond today's performance. It would be a cost-effective, high-coherence 4th generation light source, competitive with one based on energy recovery linac (ERL) technology, serving a large number of users studying material, chemical, and biological sciences. Furthermore, because of the experience accumulated over many decades of ring operation, it would have the great advantage of stability and reliability. In this paper we consider the design of an USR having 10-pm-rad emittance. It is a tremendous challenge to design a storage ring having such an extremely low emittance, a factor of 100 smaller than those in existing light sources, especially such that it has adequate dynamic aperture and beam lifetime. In many ultra-low emittance designs, the injection acceptances are not large enough for accumulation of the electron beam, necessitating on-axis injection where stored electron bunches are completely replaced with newly injected ones. Recently, starting with the MAX-IV 7-bend

  11. Analysis of potential combustion source impacts on acid deposition using an independently derived inventory. Volume II, appendices

    Energy Technology Data Exchange (ETDEWEB)

    1983-12-01

    This document contains two appendices. The first documents the methodologies used to calculate production, unit energy consumption, fuel type and emission estimates for 16 industries and 35 types of facilities utilizing direct-fired industrial combustion processes, located in 26 states (and the District of Columbia) east of the Mississippi River. As discussed in the text of this report, a U.S. total of 16 industries and 45 types of facilities utilizing direct-fired combustion processes were identified by an elimination-type method that was developed based on evaluation of fuel use in industrial SIC codes 20-39 to identify pollutant sources contributing to acid rain. The final population included only plants that have direct-fired fuel consumption greater than or equal to 100 × 10^9 Btu/yr of equivalent energy consumption. The goal for this analysis was to provide at least a 1980 base year for the data. This was achieved for all of the industries; in fact, 1981 data were used for a number of the industries evaluated. The second appendix contains an analysis of all consumption of major fossil fuels to: (1) identify all fuel usage categories, and (2) identify the kinds of combustion equipment used within each category. This analysis provides a frame of reference for the balance of the study and permits using an energy accounting methodology to quantify the degree to which the inventoried sources in individual consuming sectors are complete and representative of the total population for the sector.

  12. Effect of cataract surgery volume constraints on recently graduated ophthalmologists: a population-based cohort study

    Science.gov (United States)

    Campbell, Robert J.; El-Defrawy, Sherif R.; Bell, Chaim M.; Gill, Sudeep S.; Hooper, Philip L.; Whitehead, Marlo; Campbell, Erica de L.P.; Nesdole, Robert; Warder, Daniel; ten Hove, Martin

    2017-01-01

    BACKGROUND: Across Canada, graduates from several medical and surgical specialties have recently had difficulty securing practice opportunities, especially in specialties dependent on limited resources such as ophthalmology. We aimed to investigate whether resource constraints in the health care system have a greater impact on the volume of cataract surgery performed by recent graduates than on established physicians. METHODS: We used population-based administrative data from Ontario for the period Jan. 1, 1994, to June 30, 2013, to compare health services provided by recent graduates and established ophthalmologists. The primary outcome was volume of cataract surgery, a resource-intensive service for which volume is controlled by the province. RESULTS: When cataract surgery volume in Ontario entered a period of government-mandated zero growth in 2007, the mean number of cataract operations performed by recent graduates dropped significantly (−46.37 operations/quarter, 95% confidence interval [CI] −62.73 to −30.00 operations/quarter), whereas the mean rate for established ophthalmologists remained stable (+5.89 operations/quarter, 95% CI −1.47 to +13.24 operations/quarter). Decreases in service provision among recent graduates did not occur for services without volume control. The proportion of recent graduates providing exclusively cataract surgery increased over the study period, and recent graduates in this group were 5.24 times (95% CI 2.15 to 12.76 times) more likely to fall within the lowest quartile for cataract surgical volume during the period of zero growth in provincial cataract volume (2007–2013) than in the preceding period (1996–2006). INTERPRETATION: Recent ophthalmology graduates performed many fewer cataract surgery procedures after volume controls were implemented in Ontario. Integrated initiatives involving multiple stakeholders are needed to address the issues facing recently graduated physicians in Canada. PMID:27920012

  13. Accelerating Time-Varying Hardware Volume Rendering Using TSP Trees and Color-Based Error Metrics

    Science.gov (United States)

    Ellsworth, David; Chiang, Ling-Jen; Shen, Han-Wei; Kwak, Dochan (Technical Monitor)

    2000-01-01

    This paper describes a new hardware volume rendering algorithm for time-varying data. The algorithm uses the Time-Space Partitioning (TSP) tree data structure to identify regions within the data that have spatial or temporal coherence. By using this coherence, the rendering algorithm can improve performance when the volume data is larger than the texture memory capacity by decreasing the amount of textures required. This coherence can also allow improved speed by appropriately rendering flat-shaded polygons instead of textured polygons, and by not rendering transparent regions. To reduce the polygonization overhead caused by the use of the hierarchical data structure, we introduce an optimization method using polygon templates. The paper also introduces new color-based error metrics, which more accurately identify coherent regions compared to the earlier scalar-based metrics. By showing experimental results from runs using different data sets and error metrics, we demonstrate that the new methods give substantial improvements in volume rendering performance.
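
    A color-based error metric of the kind introduced here can be sketched as a per-block coherence test in color space; the block partitioning, mean-color reference and max-deviation criterion below are illustrative assumptions, not the paper's exact metric:

```python
import numpy as np

def block_is_coherent(block_rgb, tol):
    """A block is 'coherent' if every voxel's post-transfer-function color
    stays within tol (max per-channel deviation) of the block's mean color.
    Coherent blocks can be drawn as flat-shaded polygons instead of textures."""
    flat = block_rgb.reshape(-1, 3)
    max_dev = np.abs(flat - flat.mean(axis=0)).max()
    return max_dev <= tol

uniform = np.full((4, 4, 4, 3), 0.5)   # flat-colored 4x4x4 RGB block
mixed = uniform.copy()
mixed[0, 0, 0] = [1.0, 0.0, 0.0]       # one outlier voxel breaks coherence
print(block_is_coherent(uniform, 0.1), block_is_coherent(mixed, 0.1))
```

    Judging coherence in color space rather than on raw scalar values is the point of the new metric: two scalar values far apart may map to nearly identical colors under the transfer function, and vice versa.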

  14. An evaluation of volume-based morphometry for prediction of mild cognitive impairment and Alzheimer's disease

    Directory of Open Access Journals (Sweden)

    Daniel Schmitter

    2015-01-01

    Full Text Available Voxel-based morphometry from conventional T1-weighted images has proved effective to quantify Alzheimer's disease (AD) related brain atrophy and to enable fairly accurate automated classification of AD patients, patients with mild cognitive impairment (MCI) and elderly controls. Little is known, however, about the classification power of volume-based morphometry, where features of interest consist of a few brain structure volumes (e.g. hippocampi, lobes, ventricles) as opposed to hundreds of thousands of voxel-wise gray matter concentrations. In this work, we experimentally evaluate two distinct volume-based morphometry algorithms (FreeSurfer and an in-house algorithm called MorphoBox) for automatic disease classification on a standardized data set from the Alzheimer's Disease Neuroimaging Initiative. Results indicate that both algorithms achieve classification accuracy comparable to the conventional whole-brain voxel-based morphometry pipeline using SPM for AD vs elderly controls and MCI vs controls, and higher accuracy for classification of AD vs MCI and early vs late AD converters, thereby demonstrating the potential of volume-based morphometry to assist diagnosis of mild cognitive impairment and Alzheimer's disease.

  15. Brain-volume changes in young and middle-aged smokers: a DARTEL-based voxel-based morphometry study.

    Science.gov (United States)

    Peng, Peng; Wang, Zhenchang; Jiang, Tao; Chu, Shuilian; Wang, Shuangkun; Xiao, Dan

    2015-09-25

    Many studies have reported brain-volume changes in smokers. However, the grey matter (GM) and white matter (WM) volume differences in young and middle-aged male smokers with different lifetime tobacco consumption (pack-years) remain uncertain. Our aim was to examine brain-volume change, in particular whether more pack-years of smoking are associated with smaller gray matter and white matter volumes in young and middle-aged male smokers. We used a 3T MR scanner and performed Diffeomorphic anatomical registration through exponentiated Lie algebra (DARTEL)-based voxel-based morphometry on 53 long-term male smokers (30.72 ± 4.19 years) and 53 male healthy non-smokers (30.83 ± 5.18 years). We separated smokers into light and heavy smokers by pack-years and compared brain volume between the smoker groups and non-smokers, and then ran an analysis of covariance (ANCOVA) between smokers and non-smokers with pack-years as a covariate. Light and heavy smokers both displayed smaller GM and WM volumes than non-smokers, more markedly so in heavy smokers. The main smaller areas in light and heavy smokers were the superior temporal gyrus, insula, middle occipital gyrus, posterior cingulate and precuneus in GM, and the posterior cingulate, thalamus and midbrain in WM; in addition, the ANCOVA showed that more pack-years of smoking were associated with smaller volumes in certain GM and WM regions. Young and middle-aged male smokers had many brain areas smaller than those of non-smokers. The volumes of some of these areas correlated negatively with pack-years, while others did not, which may reflect different pathophysiological roles of smoking. © 2015 John Wiley & Sons Ltd.

  16. Coronary revascularization treatment based on dual-source computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Dikkers, R.; Willems, T.P.; Jonge, G.J. de; Zaag-Loonen, H.J. van der; Ooijen, P.M.A. van; Oudkerk, M. [University of Groningen, Department of Radiology, Groningen (Netherlands); University Medical Center, Groningen (Netherlands); Piers, L.H.; Tio, R.A.; Zijlstra, F. [University of Groningen, Department of Cardiology, Groningen (Netherlands); University Medical Center, Groningen (Netherlands)

    2008-09-15

    Therapy advice based on dual-source computed tomography (DSCT) in comparison with coronary angiography (CAG) was investigated and the results evaluated after 1-year follow-up. Thirty-three consecutive patients (mean age 61.9 years) underwent DSCT and CAG and were evaluated independently. In an expert reading (the ''gold standard''), CAG and DSCT examinations were evaluated simultaneously by an experienced radiologist and cardiologist. Based on the presence of significant stenosis and current guidelines, therapy advice was given by all readers blinded from the results of other readings and clinical information. Patients were treated based on a multidisciplinary team evaluation including all clinical information. In comparison with the gold standard, CAG had a higher specificity (91%) and positive predictive value (PPV) (95%) compared with DSCT (82% and 91%, respectively). DSCT had a higher sensitivity (96%) and negative predictive value (NPV) (89%) compared with CAG (91% and 83%, respectively). The DSCT-based therapy advice did not lead to any patient being denied the revascularization they needed according to the multidisciplinary team evaluation. During follow-up, two patients needed additional revascularization. The high NPV for DSCT for revascularization assessment indicates that DSCT could be safely used to select patients benefiting from medical therapy only. (orig.)
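
    The sensitivity, specificity, PPV and NPV quoted above follow from the standard confusion-matrix definitions; a small sketch with illustrative counts, not the study's data:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard confusion-matrix metrics for a diagnostic test."""
    return {
        "sensitivity": tp / (tp + fn),  # fraction of true positives detected
        "specificity": tn / (tn + fp),  # fraction of true negatives detected
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Illustrative counts only
m = diagnostic_metrics(tp=24, fp=2, tn=9, fn=1)
print(m)
```

    A high NPV, as reported for DSCT here, means a negative test makes disease unlikely, which is why it supports safely ruling out revascularization.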

  17. NaI(Tl) Detector Efficiency Computation Using Radioactive Parallelepiped Sources Based on Efficiency Transfer Principle

    Directory of Open Access Journals (Sweden)

    Mohamed S. Badawi

    2015-01-01

    Full Text Available The efficiency transfer (ET) principle is considered a simple numerical simulation method, which can be used to calculate the full-energy peak efficiency (FEPE) of a 3″×3″ NaI(Tl) scintillation detector over a wide energy range. In this work, the calculations of FEPE are based on computing the effective solid angle ratio between a radioactive point source and parallelepiped sources located at various distances from the detector surface. Besides, the attenuation of the photon by the source-to-detector system (detector material, detector end cap, and holder material) was considered and determined. This method is directly useful for setting up the efficiency calibration curve of a NaI(Tl) scintillation detector when no volume-shaped calibration sources are available. The values of the efficiency calculations using the theoretical method are compared with the measured ones, and the results show that the discrepancies are in general, for all the measurements, less than 6%.
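
    A rough illustration of the effective-solid-angle ratio underlying the ET method, using a simple Monte Carlo ray count toward a disk-shaped detector face; the geometry, grid and sample counts are illustrative assumptions, and photon attenuation is omitted:

```python
import math, random

def hit_fraction(source_points, det_radius, det_z, n_dirs=5000, seed=0):
    """Fraction of isotropically emitted rays from the source that cross a
    disk of radius det_radius in the plane z = det_z: a geometric stand-in
    for the effective solid angle (attenuation ignored)."""
    rng = random.Random(seed)
    hits = total = 0
    for (sx, sy, sz) in source_points:
        for _ in range(n_dirs):
            # Isotropic direction on the unit sphere
            z = rng.uniform(-1.0, 1.0)
            phi = rng.uniform(0.0, 2.0 * math.pi)
            r = math.sqrt(1.0 - z * z)
            dx, dy, dz = r * math.cos(phi), r * math.sin(phi), z
            total += 1
            if dz <= 0:
                continue  # heading away from the detector plane
            t = (det_z - sz) / dz
            x, y = sx + t * dx, sy + t * dy
            if x * x + y * y <= det_radius ** 2:
                hits += 1
    return hits / total

# On-axis point source 5 cm below a 3.81 cm radius (3" diameter) detector
# face, vs. a coarse grid over a 2 x 2 x 1 cm parallelepiped source whose
# base sits at the same 5 cm distance
point = hit_fraction([(0.0, 0.0, 0.0)], 3.81, 5.0)
slab = [(x, y, z) for x in (-1.0, 0.0, 1.0)
                  for y in (-1.0, 0.0, 1.0)
                  for z in (0.0, 0.5, 1.0)]
volume = hit_fraction(slab, 3.81, 5.0)
ratio = volume / point  # geometry-only efficiency-transfer scaling factor
print(ratio)
```

    Multiplying a measured point-source FEPE by such a ratio (plus the attenuation corrections the paper includes) is the essence of transferring the calibration to a volume source.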

  18. Potential evaluation of biomass-based energy sources for Turkey

    OpenAIRE

    Mustafa Ozcan; Semra Öztürk; Yuksel Oguz

    2015-01-01

    Turkey has great potential with respect to renewable energy sources (RES) and, among such sources, “biomass energy” is of particular importance. The purpose of this study is to determine the primary electrical energy potential obtainable from the biomass potential, according to different biomass source types. In this study, the biomass sources of municipal solid wastes, energy crops, animal manure and urban wastewater treatment sludge are evaluated. For each source, individual biogas and biom...

  19. Design of a micro-irrigation system based on the control volume method

    Directory of Open Access Journals (Sweden)

    Chasseriaux G.

    2006-01-01

    Full Text Available A micro-irrigation system design based on the control volume method using the back-step procedure is presented in this study. The proposed numerical method is simple and consists of delimiting an elementary volume of the lateral equipped with an emitter, called a «control volume», on which the conservation equations of fluid hydrodynamics are applied. The control volume method is an iterative method that calculates velocity and pressure step by step throughout the micro-irrigation network, starting from an assumed pressure at the end of the line. A simple microcomputer program was used for the calculation, and convergence was very fast. Once the average water requirement of the plants has been estimated, it is easy to take the sum of the average emitter discharges as the total average flow rate of the network. The design consists of finding an economical and efficient network that delivers the input flow rate uniformly to all emitters. This program permitted the design of a large complex network of thousands of emitters very quickly. Three subroutines calculate velocity and pressure in a lateral pipe and a submain pipe. The control volume method had already been tested for lateral design, with results validated by other methods such as the finite element method, so it permits determination of the optimal design for such a micro-irrigation network.
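
    The back-step procedure can be sketched for a single lateral: assume a pressure head at the closed downstream end, then march upstream, adding each emitter's discharge to the segment flow and each segment's friction loss to the head. The emitter law, friction coefficient and dimensions below are illustrative assumptions, not the paper's values:

```python
def backstep_lateral(h_end, n_emitters, k=1e-6, x=0.5, seg_kf=5e7):
    """March from the downstream end of a lateral to its inlet.

    h_end      : assumed pressure head at the closed end (m)
    n_emitters : number of identical, evenly spaced emitters
    k, x       : emitter law q = k * h**x (q in m^3/s, h in m)
    seg_kf     : lumped friction coefficient so that head loss per
                 segment = seg_kf * Q**2 (illustrative quadratic law)
    Returns (inlet head, total inlet flow, per-emitter discharges,
    downstream-to-upstream).
    """
    h, q_total, discharges = h_end, 0.0, []
    for _ in range(n_emitters):
        q = k * h ** x              # emitter discharge at the local head
        discharges.append(q)
        q_total += q                # flow carried by the next upstream segment
        h += seg_kf * q_total ** 2  # friction raises the head required upstream
    return h, q_total, discharges

h_in, q_in, qs = backstep_lateral(h_end=10.0, n_emitters=20)
uniformity = min(qs) / max(qs)  # crude discharge-uniformity indicator
print(h_in, q_in, uniformity)
```

    Because the march starts from an assumed end pressure, the computed inlet head can then be compared with the available head and the end pressure re-guessed, which is the iterative loop the abstract describes.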

  20. Central Issues in the Use of Computer-Based Materials for High Volume Entrepreneurship Education

    Science.gov (United States)

    Cooper, Billy

    2007-01-01

    This article discusses issues relating to the use of computer-based learning (CBL) materials for entrepreneurship education at university level. It considers CBL as a means of addressing the increased volume and range of provision required in the current context. The issues raised in this article have importance for all forms of computer-based…

  1. Volumetry of human molars with flat panel-based volume CT in vitro

    NARCIS (Netherlands)

    Hannig, C.; Krieger, E.; Dullin, C.; Merten, H.A.; Attin, T.; Grabbe, E.; Heidrich, G.

    2006-01-01

    Flat panel-based volume computed tomography (fpVCT) is a new CT device applicable for the experimental, three-dimensional evaluation of teeth at a resolution of about 150 μm in the high-contrast region. The aim of this study was to investigate whether fpVCT was suitable for quantification of the

  2. SOLVENT-BASED TO WATERBASED ADHESIVE-COATED SUBSTRATE RETROFIT - VOLUME II: PROCESS OVERVIEW

    Science.gov (United States)

    This volume presents initial results of a study to identify the issues and barriers associated with retrofitting existing solvent-based equipment to accept waterbased adhesives as part of an EPA effort to improve equipment cleaning in the coated and laminated substrate manufactur...

  3. Social Studies Teachers' Use of Classroom-Based and Web-Based Historical Primary Sources

    Science.gov (United States)

    Hicks, David; Doolittle, Peter; Lee, John K.

    2004-01-01

    A limited body of research examines the extent to which social studies teachers are actually utilizing primary sources that are accessible in traditional classroom-based formats versus web-based formats. This paper initiates an exploration of this gap in the literature by reporting on the result of a survey of secondary social studies teachers,…

  4. Current Source Converter Based Wind Energy Conversion Systems

    Institute of Scientific and Technical Information of China (English)

    Samir Kouro; Jing-ya DAI; Bin WU

    2011-01-01

    The increase in the installed capacity of wind energy conversion systems (WECS) has triggered the development of more demanding grid codes and additional requirements on performance. In order to meet these requirements, the industry trend has shifted to full-scale power converter interfaces in modern multi-megawatt WECS. As a consequence, a wide variety of new power converter topologies and WECS configurations have been introduced in recent years. Among them, current source converter (CSC) based configurations have attracted attention due to a series of advantages: simple structure, grid-friendly waveforms, controllable power factor, and reliable grid short-circuit protection. This paper presents the latest developments in CSC interfaces for WECS and related technologies such as modulation methods, control schemes and grid code compatibility.

  5. Phase-Based Road Detection in Multi-Source Images

    Energy Technology Data Exchange (ETDEWEB)

    Sengupta, S K; Lopez, A S; Brase, J M; Paglieroni, D W

    2004-06-16

    The problem of robust automatic road detection in remotely sensed images is complicated by the fact that the sensor, spatial resolution, acquisition conditions, road width, road orientation and road material composition can all vary. A novel technique for detecting road pixels in multi-source remotely sensed images based on the phase (i.e., orientation or directional) information in edge pixels is described. A very dense map of edges extracted from the image is separated into channels, each containing edge pixels whose phases lie within a different range of orientations. The edge map associated with each channel is de-cluttered. A map of road pixels is formed by re-combining the de-cluttered channels into a composite edge image which is itself then separately de-cluttered. Road detection results are provided for DigitalGlobe and TerraServerUSA images. Road representations suitable for various applications are then discussed.
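
    The channel-separation step, assigning each edge pixel to an orientation bin computed from its gradient phase, can be sketched as follows; the synthetic image, central-difference gradients and four-bin split are illustrative assumptions, not the report's pipeline:

```python
import numpy as np

def phase_channels(gray, n_channels=4, mag_thresh=0.1):
    """Split edge pixels into orientation channels.

    Central-difference gradients give each pixel a phase folded into
    [0, pi); pixels whose gradient magnitude exceeds mag_thresh are
    assigned to one of n_channels equal orientation bins, others get -1."""
    gy, gx = np.gradient(gray.astype(float))
    mag = np.hypot(gx, gy)
    phase = np.mod(np.arctan2(gy, gx), np.pi)  # direction-agnostic orientation
    channels = np.full(gray.shape, -1, dtype=int)
    edge = mag > mag_thresh
    channels[edge] = (phase[edge] / np.pi * n_channels).astype(int) % n_channels
    return channels

# Synthetic scene: one vertical and one horizontal bright "road" stripe
img = np.zeros((32, 32))
img[:, 15:17] = 1.0   # vertical stripe -> horizontal gradient (phase near 0)
img[15:17, :] = 1.0   # horizontal stripe -> vertical gradient (phase near pi/2)
ch = phase_channels(img)
print(np.unique(ch))  # background label plus the populated channels
```

    Each channel's edge map can then be de-cluttered independently before the channels are recombined, so clutter with one orientation does not mask road edges with another.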

  6. Fuel-Cell Power Source Based on Onboard Rocket Propellants

    Science.gov (United States)

    Ganapathi, Gani; Narayan, Sri

    2010-01-01

    The use of onboard rocket propellants (dense liquids at room temperature) in place of conventional cryogenic fuel-cell reactants (hydrogen and oxygen) eliminates the mass penalties associated with cryocooling and boil-off. The high energy content and density of the rocket propellants also require no additional chemical processing. For a 30-day mission on the Moon that requires a continuous 100 watts of power, the reactant mass and volume would be reduced by 15 and 50 percent, respectively, even without accounting for boil-off losses. The savings increase further with increasing transit times. A high-temperature, solid oxide electrolyte-based fuel-cell configuration that can rapidly combine rocket propellants to produce electrical energy, in a monopropellant system with hydrazine or in bi-propellant systems such as monomethyl hydrazine/unsymmetrical dimethyl hydrazine (MMH/UDMH) with nitrogen tetroxide (NTO), overcomes the severe drawbacks of earlier attempts in 1963-1967 that used fuel reforming and aqueous media. The electrical energy available from such a fuel cell operating at 60-percent efficiency is estimated to be 1,500 Wh/kg of reactants. The proposed use of a zirconia-based oxide electrolyte at 800-1,000 C will permit continuous operation, very high power densities, and substantially increased conversion efficiency over any of the earlier attempts. The solid oxide fuel cell is also tolerant of a wide range of environmental temperatures. Such a system is built for easy refueling during exploration missions and for the ability to turn on after several years of transit. Specific examples of future missions are in-situ landers on Europa and Titan that will face extreme radiation and temperature environments, flyby missions to Saturn, and landed missions on the Moon with 14-day day/night cycles.

  7. Cloud based, Open Source Software Application for Mitigating Herbicide Drift

    Science.gov (United States)

    Saraswat, D.; Scott, B.

    2014-12-01

The spread of herbicide-resistant weeds has resulted in the need for clearly marked fields. In response to this need, the University of Arkansas Cooperative Extension Service launched a program named Flag the Technology in 2011. This program uses color-coded flags as a visual alert of the herbicide trait technology within a farm field. The flag-based program also serves to help avoid herbicide misapplication and prevent herbicide drift damage between fields with differing crop technologies. This program has been endorsed by the Southern Weed Science Society of America and is attracting interest from across the USA, Canada, and Australia. However, flags can be misplaced through mischief or lost to severe windstorms and thunderstorms. This presentation will discuss the design and development of a cloud-based, free application built on open-source technologies, called Flag the Technology Cloud (FTTCloud), that allows agricultural stakeholders to color-code their farm fields to indicate herbicide-resistant technologies. The developed software utilizes modern web development practices, widely used design technologies, and basic geographic information system (GIS) based interactive interfaces for representing, color-coding, searching, and visualizing fields. The application is also compatible with devices of different sizes: smartphones, tablets, desktops and laptops.

  8. Bit-Based Joint Source-Channel Decoding of Huffman Encoded Markov Multiple Sources

    Directory of Open Access Journals (Sweden)

    Weiwei Xiang

    2010-04-01

Full Text Available Multimedia transmission over time-varying channels such as wireless channels has recently motivated research on joint source-channel techniques. In this paper, we present a method for joint source-channel soft-decision decoding of Huffman-encoded multiple sources. By exploiting the a priori bit probabilities in the multiple sources, the decoding performance is greatly improved. Compared with the single-source decoding scheme addressed by Marion Jeanne, the proposed technique is more practical for wideband wireless communications. Simulation results show that our new method obtains substantial improvements with a minor increase in complexity. For two sources, the gain in SNR is around 1.5 dB with convolutional codes when the symbol-error rate (SER) reaches 10^-2, and around 2 dB with Turbo codes.

  9. Comprehensive data base of high-level nuclear waste glasses: September 1987 status report: Volume 2, Additional appendices

    Energy Technology Data Exchange (ETDEWEB)

    Kindle, C.H.; Kreiter, M.R.

    1987-12-01

    The Materials Characterization Center (MCC) is assembling a comprehensive data base (CDB) of experimental data collected for high-level nuclear waste package components. The status of the CDB is summarized in Volume I of this report. Volume II contains appendices that present data from the data base and an evaluation of glass durability models applied to the data base.

  10. Spatial Information Processing: Standards-Based Open Source Visualization Technology

    Science.gov (United States)

    Hogan, P.

    2009-12-01

Spatial information intelligence is a global issue that will increasingly affect our ability to survive as a species. Collectively we must better appreciate the complex relationships that make life on Earth possible. Providing spatial information in its native context can accelerate our ability to process that information. To maximize this ability to process information, three basic elements are required: data delivery (server technology), data access (client technology), and data processing (information intelligence). NASA World Wind provides open source client and server technologies based on open standards. The possibilities for data processing and data sharing are enhanced by this inclusive infrastructure for geographic information. It is interesting that this open source and open standards approach, unfettered by proprietary constraints, simultaneously provides for entirely proprietary use of this same technology. 1. WHY WORLD WIND? NASA World Wind began as a single program with specific functionality, to deliver NASA content. But as the possibilities for virtual globe technology became more apparent, we found that while enabling a new class of information technology, we were also getting in the way. Researchers, developers and even users expressed their desire for World Wind functionality in ways that would serve their specific needs. They want it in their web pages. They want to add their own features. They want to manage their own data. They told us that only with this kind of flexibility could their objectives and the potential for this technology be truly realized. World Wind client technology is a set of development tools, a software development kit (SDK), that allows a software engineer to create applications requiring geographic visualization technology. 2. MODULAR COMPONENTRY Accelerated evolution of a technology requires that the essential elements of that technology be modular components, such that each can advance independently of the others

  11. A novel correction factor based on extended volume to complement the conformity index.

    Science.gov (United States)

    Jin, F; Wang, Y; Wu, Y-Z

    2012-08-01

We propose a modified conformity index (MCI), based on extended volume, that improves on existing indices by correcting for their insensitivity to the shape of the reference dose, for assessing the quality of high-precision radiation therapy, and we present an evaluation of its application. The MCI is similar to the conformity index suggested by Paddick (CI(Paddick)) but uses a different correction factor. It is evaluated for three cases: with an extended target volume, with an extended reference dose volume, and without an extended volume. The extended volume is generated by expanding the original volume isotropically by 0.1-1.1 cm. In the simulation model, measurements of MCI employ a spherical target and three types of reference dose: a sphere, an ellipsoid and a cube. The potential advantage of the new index is assessed by comparing MCI with CI(Paddick). Measurements of MCI in head and neck cancers treated with intensity-modulated radiation therapy and volumetric-modulated arc therapy provide a window on its clinical use. Results of MCI for the simulation model and for clinical practice are presented, and the measurements are corrected for limited spatial resolution. The three types of MCI agree with each other, and comparisons between MCI and CI(Paddick) are also provided. Our analysis shows that the proposed MCI can provide a more objective and accurate conformity measurement for high-precision radiation therapy. In combination with a dose-volume histogram, it will be a more useful conformity index.
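The abstract does not give MCI's correction factor, but the baseline it modifies, Paddick's conformity index, has a standard form: CI(Paddick) = (TV∩PIV)² / (TV × PIV), where TV is the target volume and PIV the prescription isodose volume. A minimal sketch with made-up volumes:

```python
def ci_paddick(tv, piv, tv_piv):
    """Paddick conformity index: CI = (TV ∩ PIV)^2 / (TV * PIV).

    tv     : target volume
    piv    : prescription isodose volume
    tv_piv : volume of the target covered by the prescription isodose
    Perfect conformity gives 1.0; under- and over-coverage both lower it.
    """
    return (tv_piv ** 2) / (tv * piv)

# Perfectly conformal plan: dose volume exactly matches a 10 cm^3 target.
perfect = ci_paddick(10.0, 10.0, 10.0)
# Oversized 20 cm^3 dose volume that still fully covers the target.
loose = ci_paddick(10.0, 20.0, 10.0)
```

Note that CI(Paddick) is shape-insensitive: any reference dose of the right volumes scores the same, which is the insensitivity the proposed MCI corrects for.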

  12. Measurement of renal function in a kidney donor: a comparison of creatinine-based and volume-based GFRs

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Don Kyoung; Choi, See Min; Jeong, Byong Chang; Seo, Seong Il; Jeon, Seong Soo; Lee, Hyun Moo; Choi, Han-Yong; Jeon, Hwang Gyun [Sungkyunkwan University School of Medicine, Department of Urology, Samsung Medical Center, Seoul (Korea, Republic of); Park, Bong Hee [The Catholic University of Korea College of Medicine, Department of Urology, Incheon St. Mary' s Hospital, Seoul (Korea, Republic of)

    2015-11-15

We aimed to evaluate the performance of various GFR estimates compared with direct measurement of GFR (dGFR). We also sought to create a new formula for volume-based GFR (new-vGFR) using kidney volume determined by CT. GFR was measured using creatinine-based methods (MDRD, the Cockcroft-Gault equation, CKD-EPI formula, and the Mayo clinic formula) and the Herts method, which is volume-based (vGFR). We compared performance between GFR estimates and created a new vGFR model by multiple linear regression analysis. Among the creatinine-based GFR estimates, the MDRD and C-G equations were similarly associated with dGFR (correlation and concordance coefficients of 0.359 and 0.369 and 0.354 and 0.318, respectively). We developed the following new kidney-volume-based GFR formula: GFR = 217.48 - 0.39 × A + 0.25 × W - 0.46 × H - 54.01 × sCr + 0.02 × V - 19.89 (if female), where A = age, W = weight, H = height, sCr = serum creatinine level, and V = total kidney volume. The MDRD and CKD-EPI had relatively better accuracy than the other creatinine-based methods (30.7 % vs. 32.3 % within 10 % and 78.0 % vs. 73.0 % within 30 %, respectively). However, the new-vGFR formula had the most accurate results among all of the analyzed methods (37.4 % within 10 % and 84.6 % within 30 %). The new-vGFR can replace dGFR or creatinine-based GFR for assessing kidney function in donors and healthy individuals. (orig.)
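Transcribing the printed regression formula directly into code gives the following. The units are assumptions not stated in the abstract (age in years, weight in kg, height in cm, serum creatinine in mg/dL, total kidney volume in mL), and the input values in the example are invented:

```python
def new_vgfr(age, weight, height, scr, kidney_volume, female):
    """Volume-based GFR formula exactly as printed in the abstract.

    age           : A, years (assumed)
    weight        : W, kg (assumed)
    height        : H, cm (assumed)
    scr           : sCr, serum creatinine, mg/dL (assumed)
    kidney_volume : V, total kidney volume from CT, mL (assumed)
    female        : subtract the 19.89 sex term when True
    """
    gfr = (217.48 - 0.39 * age + 0.25 * weight - 0.46 * height
           - 54.01 * scr + 0.02 * kidney_volume)
    if female:
        gfr -= 19.89
    return gfr

# Hypothetical 40-year-old donor, 70 kg, 170 cm, sCr 0.9, 300 mL kidneys.
gfr_male = new_vgfr(40, 70, 170, 0.9, 300, female=False)
gfr_female = new_vgfr(40, 70, 170, 0.9, 300, female=True)
```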

  13. [Spatiotemporal variation of water source supply service in Three Rivers Source Area of China based on InVEST model].

    Science.gov (United States)

    Pan, Tao; Wu, Shao-Hong; Dai, Er-Fu; Liu, Yu-Jie

    2013-01-01

The Three Rivers Source Area is the largest ecological function region for water source supply and conservation in China. Affected by a variety of driving factors, the ecosystems in this region are seriously degraded, with definite impacts on the water source supply service. This paper examined the variation patterns of precipitation and runoff coefficient from 1981 to 2010, quantitatively estimated the water source supply of the region's ecosystems from 1980 to 2005 based on the InVEST model, and analyzed the spatiotemporal variation pattern of the water source supply in different periods and its causes. In 1981-2010, precipitation in the Three Rivers Source Area decreased and then increased, while the precipitation runoff coefficient presented an obvious decreasing trend, suggesting a reduced capability of runoff water source supply in this region. Potential evapotranspiration had a declining, though not obvious, trend, at a rate of -0.226 mm·a(-1). In 1980-2005, the water source supply of the region showed an overall decreasing trend, which was most obvious in the Yellow River Source Area. The spatiotemporal variation of the water source supply in the Three Rivers Source Area was the result of the combined effects of climate and land use change; the climate factors affected the water source supply mainly through precipitation and potential evapotranspiration. Climate and land use change induced ecosystem degradation and underlying surface change, which could be the main driving forces of the declining water source supply in the Three Rivers Source Area.

  14. A finite volume method for cylindrical heat conduction problems based on local analytical solution

    KAUST Repository

    Li, Wang

    2012-10-01

A new finite volume method for cylindrical heat conduction problems based on a local analytical solution is proposed in this paper with detailed derivation. The calculation results of this new method are compared with those of the traditional second-order finite volume method. The newly proposed method is more accurate than conventional ones, even though its discretized expression is slightly more complex than that of the second-order central finite volume method, so it costs more computation time on the same grid. Numerical results show that the total CPU time of the new method is significantly less than that of conventional methods for achieving the same level of accuracy. © 2012 Elsevier Ltd. All rights reserved.

  15. Three-Component Power Decomposition for Polarimetric SAR Data Based on Adaptive Volume Scatter Modeling

    Directory of Open Access Journals (Sweden)

    Sang-Eun Park

    2012-05-01

Full Text Available In this paper, a three-component power decomposition for polarimetric SAR (PolSAR) data with an adaptive volume scattering model is proposed. The volume scattering model is assumed to be reflection-symmetric but parameterized. For each image pixel, the decomposition starts by determining the adaptive parameter based on a matrix similarity metric. Then, the respective scattering power components are retrieved with the established procedure. It has been shown that the proposed method leads to complete elimination of negative powers as a result of the adaptive volume scattering model. Experiments with PolSAR data from both the NASA/JPL (National Aeronautics and Space Administration/Jet Propulsion Laboratory) Airborne SAR (AIRSAR) and the JAXA (Japan Aerospace Exploration Agency) ALOS-PALSAR also demonstrate that the proposed method not only obtains similar or better results in vegetated areas compared to the existing Freeman-Durden decomposition, but also helps to improve discrimination of urban regions.

  16. Fission, spallation or fusion-based neutron sources

    Indian Academy of Sciences (India)

    Kurt N Clausen

    2008-10-01

In this paper the most promising technologies for high-power neutron sources are briefly discussed. The conclusion is that the route to high-power neutron sources in the foreseeable future is spallation (short pulse, long pulse, or even CW); all of these sources will have areas in which they excel.

  17. Slope excavation quality assessment and excavated volume calculation in hydraulic projects based on laser scanning technology

    Directory of Open Access Journals (Sweden)

    Chao Hu

    2015-04-01

    Full Text Available Slope excavation is one of the most crucial steps in the construction of a hydraulic project. Excavation project quality assessment and excavated volume calculation are critical in construction management. The positioning of excavation projects using traditional instruments is inefficient and may cause error. To improve the efficiency and precision of calculation and assessment, three-dimensional laser scanning technology was used for slope excavation quality assessment. An efficient data acquisition, processing, and management workflow was presented in this study. Based on the quality control indices, including the average gradient, slope toe elevation, and overbreak and underbreak, cross-sectional quality assessment and holistic quality assessment methods were proposed to assess the slope excavation quality with laser-scanned data. An algorithm was also presented to calculate the excavated volume with laser-scanned data. A field application and a laboratory experiment were carried out to verify the feasibility of these methods for excavation quality assessment and excavated volume calculation. The results show that the quality assessment indices can be obtained rapidly and accurately with design parameters and scanned data, and the results of holistic quality assessment are consistent with those of cross-sectional quality assessment. In addition, the time consumption in excavation project quality assessment with the laser scanning technology can be reduced by 70%−90%, as compared with the traditional method. The excavated volume calculated with the scanned data only slightly differs from measured data, demonstrating the applicability of the excavated volume calculation method presented in this study.
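The paper's volume algorithm works on laser-scanned surfaces; as a simplified stand-in, a cut volume between gridded before/after surfaces (DEMs) can be computed cell by cell. The function, the grid values and the cut/fill convention below are invented for illustration, not taken from the paper:

```python
import numpy as np

def excavated_volume(dem_before, dem_after, cell_area):
    """Excavated (cut) volume between two gridded surfaces.

    Positive depth = material removed; fill (negative differences)
    is ignored in this simplified sketch.
    """
    depth = np.asarray(dem_before, float) - np.asarray(dem_after, float)
    return float(np.sum(np.clip(depth, 0.0, None)) * cell_area)

# 2 x 2 grid of 1 m^2 cells: 1 m cut in three cells, 0.5 m fill in one.
before = [[10.0, 10.0], [10.0, 10.0]]
after  = [[ 9.0,  9.0], [ 9.0, 10.5]]
v = excavated_volume(before, after, cell_area=1.0)
```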

  18. Validity and repeatability of a depth camera-based surface imaging system for thigh volume measurement.

    Science.gov (United States)

    Bullas, Alice M; Choppin, Simon; Heller, Ben; Wheat, Jon

    2016-10-01

Complex anthropometrics, such as area and volume, can identify changes in body size and shape that are not detectable with traditional anthropometrics of lengths, breadths, skinfolds and girths. However, taking these complex measurements with manual techniques (tape measurement and water displacement) is often unsuitable. Three-dimensional (3D) surface imaging systems are quick and accurate alternatives to manual techniques, but their use is restricted by cost, complexity and limited access. We have developed a novel low-cost, accessible and portable 3D surface imaging system based on consumer depth cameras. The aim of this study was to determine the validity and repeatability of the system in the measurement of thigh volume. The thigh volumes of 36 participants were measured with the depth camera system and a high-precision, commercially available 3D surface imaging system (3dMD). The depth camera system used in this study is highly repeatable (technical error of measurement (TEM) of <1.0% intra-calibration and ~2.0% inter-calibration) but systematically overestimates (~6%) thigh volume when compared to the 3dMD system. This suggests poor agreement yet a close relationship, which, once corrected, can yield a usable thigh volume measurement.
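The repeatability statistic quoted above (%TEM) has a standard two-trial form: TEM = sqrt(Σdᵢ² / 2n) over paired repeat measurements, expressed as a percentage of the grand mean. A sketch with made-up repeat measurements (the study's own data are not reproduced here):

```python
import math

def technical_error_of_measurement(trial1, trial2):
    """Two-trial TEM: sqrt(sum(d_i^2) / (2 n)), d_i = trial difference."""
    assert len(trial1) == len(trial2)
    d2 = sum((a - b) ** 2 for a, b in zip(trial1, trial2))
    return math.sqrt(d2 / (2 * len(trial1)))

def relative_tem_percent(trial1, trial2):
    """%TEM: TEM divided by the grand mean of all measurements."""
    tem = technical_error_of_measurement(trial1, trial2)
    mean = (sum(trial1) + sum(trial2)) / (2 * len(trial1))
    return 100.0 * tem / mean

# Hypothetical repeat thigh-volume measurements (litres), four participants.
t1 = [5.10, 4.80, 6.00, 5.50]
t2 = [5.00, 4.90, 6.10, 5.40]
tem = technical_error_of_measurement(t1, t2)
pct = relative_tem_percent(t1, t2)
```

A %TEM near 1% corresponds to the intra-calibration repeatability the study reports.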

  19. Correlation of choroidal thickness and volume measurements with axial length and age using swept source optical coherence tomography and optical low-coherence reflectometry.

    Science.gov (United States)

    Michalewski, Janusz; Michalewska, Zofia; Nawrocka, Zofia; Bednarski, Maciej; Nawrocki, Jerzy

    2014-01-01

To report choroidal thickness and volume in healthy eyes using swept source optical coherence tomography (SS-OCT). A prospective observational study of 122 patients examined with swept source OCT (DRI-OCT, Topcon, Japan). In each eye, we performed 256 horizontal scans, 12 mm in length and centered on the fovea. We calculated choroidal thickness manually with a built-in caliper and automatically using DRI-OCT mapping software. Choroidal volume was also automatically calculated. We measured axial length with optical low-coherence reflectometry (Lenstar LS 900, Haag-Streit, Switzerland). The choroid has focally increased thickness under the fovea. The choroid was thinnest in the outer nasal quadrant. In stepwise regression analysis, age was estimated as the most significant factor correlating with decreased choroidal thickness (F=23.146). The mapping software also produced choroidal thickness and volume maps. Choroidal thickness is increased at the fovea and is thinnest nasally. Age and axial length are critical for the estimation of choroidal thickness and volume. Choroidal measurements derived from SS-OCT images have potential value for objectively documenting disease-related choroidal thickness abnormalities and monitoring progressive changes over time.

  20. New aqueous rechargeable power sources based on intercalation compounds

    Energy Technology Data Exchange (ETDEWEB)

    Tian, S.; Liu, L.L.; Qu, Q.T.; Wu, Y.P. [Fudan Univ., New Energy and Materials Laboratory, Shanghai (China). Dept. of Chemistry and Shanghai Key Laboratory of Molecular Catalysis and Innovative Materials

    2010-07-01

Lithium ion batteries have gained global attention because of their intercalation mechanism. However, when the capacity is very large, as in large-scale storage of electricity, the safety of lithium ion batteries is a challenge. The safest option for large-scale energy storage is based on aqueous solutions. This paper reported on the latest developments in aqueous rechargeable power sources based on intercalation compounds, notably aqueous rechargeable lithium batteries (ARLBs) and hybrid supercapacitors. The paper provided background information on ARLBs and discussed the use of polypyrrole as an anode material. It was found that this polymer could be doped and un-doped during cycling, demonstrating excellent cycling behaviour. The paper also discussed the enhancement of the reversible capacity of lithium manganese oxide (LiMn{sub 2}O{sub 4}) and lithium cobalt dioxide (LiCoO{sub 2}) in ARLBs by adopting novel preparation technologies. It was concluded that ARLBs and the new hybrid supercapacitors show significant potential for practical applications in the large-scale energy storage needed to advance sustainable development. 7 refs.

  1. Potential evaluation of biomass-based energy sources for Turkey

    Directory of Open Access Journals (Sweden)

    Mustafa Ozcan

    2015-06-01

Full Text Available Turkey has great potential with respect to renewable energy sources (RES), and among such sources "biomass energy" is of particular importance. The purpose of this study is to determine the primary electrical energy potential obtainable from biomass, according to different biomass source types. In this study, the biomass sources of municipal solid waste, energy crops, animal manure and urban wastewater treatment sludge are evaluated. For each source, individual biogas and biomass energy potential calculations are made. Methods for energy conversion from wastes applicable to the conditions of Turkey, together with technical and economic parameters, are used. As a result of the calculations, the total primary energy value of biogas obtainable from the examined sources is 188.21 TWh/year, and the total primary energy value of the evaluated biomass sources is 278.40 TWh/year.

  2. Volume Averaging Theory (VAT) based modeling and closure evaluation for fin-and-tube heat exchangers

    Science.gov (United States)

    Zhou, Feng; Catton, Ivan

    2012-10-01

A fin-and-tube heat exchanger was modeled based on Volume Averaging Theory (VAT) in such a way that the details of the original structure were replaced by their averaged counterparts, so that the VAT-based governing equations can be efficiently solved for a wide range of parameters. To complete the VAT-based model, proper closure is needed, which is related to a local friction factor and a heat transfer coefficient of a Representative Elementary Volume (REV). The terms in the closure expressions are complex, and relating experimental data to the closure terms is sometimes difficult. In this work we use CFD to evaluate the rigorously derived closure terms over one of the selected REVs. The objective is to show how heat exchangers can be modeled as a porous medium and how CFD can be used in place of a detailed, often formidable, experimental effort to obtain closure for the model.

  3. SHTEREOM I SIMPLE WINDOWS® BASED SOFTWARE FOR STEREOLOGY. VOLUME AND NUMBER ESTIMATIONS

    Directory of Open Access Journals (Sweden)

    Emin Oğuzhan Oğuz

    2011-05-01

Full Text Available Stereology was earlier defined by Weibel (1970) as "a body of mathematical methods relating three-dimensional parameters defining the structure to two-dimensional measurements obtainable on sections of the structure." SHTEREOM I is simple Windows-based software for stereological estimation. In this first part, we describe the implementation of the number and volume estimation tools for unbiased design-based stereology. This software is written in Visual Basic and can be used on personal computers running Microsoft Windows® operating systems that are connected to a conventional camera attached to a microscope and a microcator or a simple dial gauge. Microsoft .NET Framework version 1.1 also needs to be installed for full use. The features of the SHTEREOM I software are illustrated through examples of stereological estimations of volume and particle number at different magnifications (4X–100X). Point-counting grids are available for area estimations and for use with the most efficient volume estimation tool, the Cavalieri technique, and are applied to lizard testicle volume. An unbiased counting frame system is available for number estimations of the objects under investigation, and an on-screen manual stepping module for number estimation through the optical fractionator method is also available, measuring increments along the X and Y axes of the microscope stage for the estimation of rat brain hippocampal pyramidal neurons.
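The Cavalieri technique mentioned above has a standard estimator: V = T · (a/p) · ΣP, where T is the spacing between sections, a/p the area associated with one grid point, and ΣP the total number of grid points hitting the structure. A toy example with invented counts:

```python
def cavalieri_volume(point_counts, section_spacing, area_per_point):
    """Cavalieri volume estimator: V = T * (a/p) * sum(P_i).

    point_counts    : grid points hitting the structure on each section
    section_spacing : distance T between consecutive sections
    area_per_point  : area a/p represented by one grid point
    """
    return section_spacing * area_per_point * sum(point_counts)

# Five sections 0.2 mm apart, 0.01 mm^2 per grid point (made-up data).
counts = [12, 18, 25, 19, 10]
v = cavalieri_volume(counts, section_spacing=0.2, area_per_point=0.01)
```

The estimate is unbiased provided the first section is placed at a uniformly random offset within the spacing T.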

  4. Field lens multiplexing in holographic 3D displays by using Bragg diffraction based volume gratings

    Science.gov (United States)

    Fütterer, G.

    2016-11-01

Applications that can profit from holographic 3D displays include the visualization of 3D data, computer-integrated manufacturing, 3D teleconferencing and mobile infotainment. However, one problem of holographic 3D displays, which are e.g. based on space-bandwidth-limited reconstruction of wave segments, is to realize a small form factor. Another problem is to provide a reasonably large volume for user placement, that is, to provide acceptable freedom of movement. Both problems should be solved without decreasing the image quality of virtual and real object points generated within the 3D display volume. A diffractive optical design using thick hologram gratings, which can be referred to as Bragg diffraction based volume gratings, can provide a small form factor and a high-definition, natural viewing experience of 3D objects. A large collimated wave can be provided by an anamorphic backlight unit. The complex-valued spatial light modulator adds local curvatures to the wave field it is illuminated with. The modulated wave field is focused onto the user plane by using a volume grating based field lens. Active-type liquid crystal gratings provide 1D fine tracking of approximately ±8°. Diffractive multiplexing has to be implemented for each color and for a set of focus functions providing coarse tracking. Boundary conditions of the diffractive multiplexing are explained, both with regard to the display layout and by using coupled wave theory (CWT). Aspects of diffractive cross talk and its suppression are discussed, including longitudinally apodized volume gratings.

  5. LiDAR-based volume assessment of the origin of the Wadena drumlin field, Minnesota, USA

    Science.gov (United States)

    Sookhan, Shane; Eyles, Nick; Putkinen, Niko

    2016-06-01

The Wadena drumlin field (WDF; ~ 7500 km2) in west-central Minnesota, USA, is bordered along its outer extremity by the till-cored Alexandria moraine marking the furthest extent of the southwesterly-flowing Wadena ice lobe at c. 15,000 yr BP. Newly available high-resolution Light Detection and Ranging (LiDAR) data reveal new information regarding the number, morphology and extent of streamlined bedforms in the WDF. In addition, a newly-developed quantitative methodology based on relief curvature analysis of LiDAR elevation-based raster data is used to evaluate sediment volumes represented by the WDF and its bounding end moraine. These data are used to evaluate models for the origin of drumlins. High-resolution LiDAR-based mapping doubles the streamlined footprint of the Wadena lobe to ~ 16,500 km2, increases the number of bedforms from ~ 2000 to ~ 6000, and, most significantly, reclassifies large numbers of bedforms mapped previously as 'drumlins' as 'mega-scale glacial lineations' (MSGLs), indicating that the Wadena ice lobe experienced fast ice flow. The total volume of sediment in the Alexandria moraine is ~ 71-110 km3, that in the drumlins and MSGLs is ~ 2.83 km3, and the volume of swales between these bedforms is ~ 74.51 km3. The moraine volume is equivalent to a till layer 6.8 m thick across the entire bed of the Wadena lobe, suggesting drumlinization and moraine formation were accompanied by widespread lowering of the bed. This supports the hypothesis that drumlins and MSGLs are residual erosional features carved from a pre-existing till; swales represent 'missing sediment' that was eroded subglacially and advected downglacier to build the Alexandria moraine during fast ice flow. Alternatively, the relatively small volume of sediment represented by subglacial bedforms indicates they could have formed rapidly by depositional processes.

  6. Reconstruction and prediction of multi-source acoustic field with the distributed source boundary point method based nearfield acoustic holography

    Institute of Scientific and Technical Information of China (English)

    BI; Chuanxing; CHEN; Jian; CHEN; Xinzhao

    2004-01-01

In a multi-source acoustic field, the actual measured pressure is the scalar sum of the pressures from all the sources; the pressure belonging to each individual source cannot be separated out with existing techniques. Consequently, routine formulas cannot be used to reconstruct the acoustic sources and predict the acoustic field directly. In this paper, a novel theoretical model for reconstruction and prediction of a multi-source acoustic field with the distributed source boundary point method (DSBPM) based nearfield acoustic holography (NAH) is established. Three different methods, namely a combination method with single-surface measurement, a combination method with multi-surface measurement, and an elimination method with multi-surface measurement, are proposed to realize the holographic reconstruction of the sources. With these methods, the problem of reconstructing and predicting an acoustic field containing multiple simultaneous coherent sources is solved effectively. Using the particular solutions constructed by the DSBPM to establish the vibro-acoustic transfer matrix, the calculation time, calculation precision and calculation stability are improved. These methods are valuable for localizing acoustic sources and predicting acoustic fields in engineering.

  7. An open source GIS-based tool to integrate the fragmentation mechanism in rockfall propagation

    Science.gov (United States)

    Matas, Gerard; Lantada, Nieves; Gili, Josep A.; Corominas, Jordi

    2015-04-01

Rockfalls are frequent instability processes in road cuts, open pit mines and quarries, steep slopes and cliffs. Even though the stability of rock slopes can be determined using analytical approaches, the assessment of large rock cliffs requires simplifying assumptions due to the difficulty of working with a large number of joints and the scatter of both their orientations and strength parameters. The attitude and persistence of joints within the rock mass define the size of kinematically unstable rock volumes. Furthermore, the rock block will eventually split into several fragments during its propagation downhill due to its impact with the ground surface. Knowledge of the size, energy, trajectory… of each block resulting from fragmentation is critical in determining the vulnerability of buildings and protection structures. The objective of this contribution is to present a simple and open source tool to simulate the fragmentation mechanism in rockfall propagation models and in the calculation of impact energies. This tool includes common modes of motion for falling boulders based on the previous literature. The tool is being implemented in a GIS (Geographic Information System) using open source Python programming. The tool under development is simple, modular, compatible with any GIS environment, open source, and able to model rockfall phenomena correctly. It could be used in any area susceptible to rockfalls after a prior adjustment of the parameters. Once the model parameters are adjusted to a given area, a simulation can be performed to obtain maps of kinetic energy, frequency, stopping density and passing heights. This GIS-based tool and the analysis of fragmentation laws using data collected from recent rockfalls have been developed within the RockRisk Project (2014-2016). The project is funded by the Spanish Ministerio de Economía y Competitividad and entitled "Rockfalls in cliffs: risk quantification and its prevention" (BIA2013-42582-P).

  8. A novel amblyopia treatment system based on LED light source

    Science.gov (United States)

    Zhang, Xiaoqing; Chen, Qingshan; Wang, Xiaoling

    2011-05-01

A novel LED (light emitting diode) light source of five different colors (white, red, green, blue and yellow) is adopted in place of conventional incandescent lamps for an amblyopia treatment system, and seven training methods for rectifying amblyopia are incorporated so as to achieve an integrated therapy. The LED light source is designed to provide uniform illumination, adjustable light intensity and alterable colors. Experimental tests indicate that the LED light source operates steadily and fulfills the technical demands of amblyopia treatment.

  9. Constraint-Based Integration of Geospatial and Online Sources

    Science.gov (United States)

    2007-09-10

    Rahul Bakshi. Integration and reasoning about online sources to accurately geocode addresses. In Proceedings of the 12th ACM International Symposium on Advances in Geographic Information Systems (ACM-GIS 2004). Rahul Bakshi. Integration and reasoning about online sources to accurately geocode addresses. Master's thesis, University of Southern California, 2004.

  10. Use of Lagrangian transport models and Sterilized High Volume Sampling to pinpoint the source region of Kawasaki disease and determine the etiologic agent

    Science.gov (United States)

    Curcoll Masanes, Roger; Rodó, Xavier; Anton, Jordi; Ballester, Joan; Jornet, Albert; Nofuentes, Manel; Sanchez-Manubens, Judith; Morguí, Josep-Anton

    2015-04-01

    Kawasaki disease (KD) is an acute coronary artery vasculitis of young children, and still a medical mystery after more than 40 years. A former study [Rodó et al. 2011] demonstrated that certain patterns of winds in the troposphere above the earth's surface flowing from Asia were associated with the times of the annual peak in KD cases and with days having anomalously high numbers of KD patients. In a later study [Rodó et al. 2014], we used residence times from an air transport model to pinpoint the source region for KD. Simulations were generated from locations spanning Japan, for days with either high or low KD incidence. In order to cope with the stationarity of synoptic situations, only trajectories for the winter months, when KD cases peak, were considered. Trajectories traced back in time 10 days for each dataset and location were generated using the flexible particle Lagrangian dispersion model (FLEXPART Version 8.23 [Stohl et al. 2005]) run in backward mode. The particles modeled were air tracers, with 10,000 particles used in each model run. The model output used was residence time, with an output grid of 0.5° latitude × longitude and a time resolution of 3 h. The data input for the FLEXPART model was gridded atmospheric wind velocity from the European Centre for Medium-Range Weather Forecasts Re-Analysis (ERA-Interim at 1°). Aggregates of winter-period back-trajectories were calculated for three different regions of Japan. A common source of wind air masses was located for periods with high Kawasaki disease incidence. Knowing the trajectories of winds from the air transport models, a sampling methodology was developed in order to capture the possible etiological agent or other tracers that could have been released together with it. This methodology is based on the sterilized filtering of high volumes of the transported air at medium tropospheric levels by aircraft sampling, and later analysis of these filters with adequate techniques. High purity
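
    The residence-time aggregation described above (0.5° output grid, 3 h time step, 10,000 particles per run) can be illustrated with a toy sketch that bins stored back-trajectory particle positions into a gridded residence-time field. This is not FLEXPART code; the drift pattern and all numbers below are invented for illustration.

```python
import numpy as np

def residence_time(lats, lons, dt_hours, lat_edges, lon_edges):
    """Accumulate stored particle positions into a residence-time grid.

    lats, lons: particle positions, shape (n_steps, n_particles)
    dt_hours:   time between stored positions (here 3 h)
    Returns hours per grid cell, normalised by the number of particles.
    """
    grid, _, _ = np.histogram2d(lats.ravel(), lons.ravel(),
                                bins=[lat_edges, lon_edges])
    return grid * dt_hours / lats.shape[1]

# toy run: 10,000 tracers released over Japan, drifting westward for 10 days
rng = np.random.default_rng(0)
n_steps, n_part = 80, 10_000            # 80 stored steps x 3 h = 10 days back
lons = 140.0 - 0.5 * np.arange(n_steps)[:, None] + rng.normal(0, 2, (n_steps, n_part))
lats = 36.0 + rng.normal(0, 1, (n_steps, n_part))
lat_edges = np.arange(20.0, 61.0, 0.5)  # 0.5 degree output grid
lon_edges = np.arange(80.0, 161.0, 0.5)
rt = residence_time(lats, lons, 3.0, lat_edges, lon_edges)
```

    High-residence cells upwind of the release sites would then be candidate source regions to compare between high- and low-incidence days.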

  11. A Muon Source Proton Driver at JPARC-based Parameters

    Energy Technology Data Exchange (ETDEWEB)

    Neuffer, David [Fermilab

    2016-06-01

    An "ultimate" high intensity proton source for neutrino factories and/or muon colliders was projected to be a ~4 MW multi-GeV proton source providing short, intense proton pulses at ~15 Hz. The JPARC ~1 MW accelerators provide beam at parameters that in many respects overlap these goals. Proton pulses from the JPARC Main Ring can readily meet the pulsed intensity goals. We explore these parameters, describing the overlap and consider extensions that may take a JPARC-like facility toward this "ultimate" source. JPARC itself could serve as a stage 1 source for such a facility.

  12. Tip-based electron source for femtosecond electron diffraction

    Energy Technology Data Exchange (ETDEWEB)

    Stein, Jan-Paul; Hoffrogge, Johannes; Schenk, Markus; Krueger, Michael; Baum, Peter; Hommelhoff, Peter [Max-Planck-Institut fuer Quantenoptik, Hans-Kopfermann-Strasse 1, 85748 Garching bei Muenchen (Germany)

    2012-07-01

    Illumination of a sharp tungsten tip with femtosecond laser pulses leads to the emission of ultrashort, high brightness electron pulses that are ideally suited for ultrafast electron diffraction (UED) experiments [1]. The tip's small virtual source size (~5 nm) results in a large transverse coherence length of the electron pulse and therefore better spatial resolution as compared to a conventional flat cathode design. The enhanced electric field at the tip apex (2 GV/m) is about two orders of magnitude larger than the maximum electric field applicable in a plate capacitor based setup (20 MV/m). This reduces the influence of the initial energy distribution on the pulse duration at the target and improves the timing jitter. Simulations show that a setup with a sharp tip as the cathode in combination with two anodes yields an electron pulse duration of about 50 fs at the sample. The electron energy is 30 keV and the gun to sample distance is 3 cm. We implemented the two anode setup with the tip experimentally. We present the experimental characteristics of the emitted electron beam both in static field emission and in laser triggered emission.

  13. Observation of Neutron Skyshine from an Accelerator Based Neutron Source

    Energy Technology Data Exchange (ETDEWEB)

    Franklyn, C. B. [Radiation Science Department, Necsa, PO Box 582, Pretoria 0001 (South Africa)

    2011-12-13

    A key feature of neutron based interrogation systems is the need for adequate provision of shielding around the facility. Accelerator facilities adapted for fast neutron generation are not necessarily suitably equipped to ensure complete containment of the vast quantity of neutrons generated, typically >10^11 n·s^-1. Simulating the neutron leakage from a facility is not a simple exercise since the energy and directional distribution can only be approximated. Although adequate horizontal, planar shielding provision is made for a neutron generator facility, it is sometimes the case that vertical shielding is minimized, due to structural and economic constraints. It is further justified by assuming the atmosphere above a facility functions as an adequate radiation shield. It has become apparent that multiple neutron scattering within the atmosphere can result in a measurable dose of neutrons reaching ground level some distance from a facility, an effect commonly known as skyshine. This paper describes a neutron detection system developed to monitor neutrons detected several hundred metres from a neutron source due to the effect of skyshine.

  14. Wave equation based microseismic source location and velocity inversion

    Science.gov (United States)

    Zheng, Yikang; Wang, Yibo; Chang, Xu

    2016-12-01

    Microseismic event locations and velocity information can be used to infer the stress field and guide the hydraulic fracturing process, as well as to image subsurface structures. How to obtain accurate microseismic event locations and an accurate velocity model is the principal problem in reservoir monitoring. For most location methods, the velocity model strongly affects the accuracy of the location results. The velocity obtained from log data is usually too rough to be used for location directly, so it is necessary to consider how to combine location with velocity inversion. Among the main techniques for locating microseismic events, time reversal imaging (TRI) based on the wave equation avoids traveltime picking and offers high-resolution locations. Frequency-dependent wave equation traveltime inversion (FWT) is an inversion method that can invert the velocity model with source uncertainty in a certain frequency band. We therefore combine TRI with FWT to produce improved event locations and velocity models. In the proposed approach, the location and model information are interactively used and updated. Through the proposed workflow, the inverted model is better resolved and the event locations are more accurate. We test this method on synthetic borehole data and field data from a hydraulic fracturing experiment. The results verify the effectiveness of the method and show its potential for real-time microseismic monitoring.

  15. Compact X-ray source based on Compton backscattering

    CERN Document Server

    Bulyak, E V; Zelinsky, A; Karnaukhov, I; Kononenko, S; Lapshin, V G; Mytsykov, A; Telegin, Yu P; Khodyachikh, A; Shcherbakov, A; Molodkin, V; Nemoshkalenko, V; Shpak, A

    2002-01-01

    The feasibility study of an intense X-ray source based on the interaction between the electron beam in a compact storage ring and a laser pulse accumulated in an optical resonator is carried out. We propose to reconstruct the 160 MeV electron storage ring N-100, which was shut down several years ago. A new magnetic lattice will provide a transverse electron beam size of ~35 μm at the point of electron beam-laser beam interaction. The proposed facility is to generate X-ray beams of intensity ~2.6×10^14 s^-1 and spectral brightness ~10^12 phot/0.1%bw/s/mm^2/mrad^2 in the energy range from 10 keV up to 0.5 MeV. These X-ray beam parameters meet the requirements of most technological and scientific applications. Besides, we plan to use the new facility for studying the laser cooling effect.

  16. Neutron Source Facility Training Simulator Based on EPICS

    Energy Technology Data Exchange (ETDEWEB)

    Park, Young Soo; Wei, Thomas Y.; Vilim, Richard B.; Grelle, Austin L.; Dworzanski, Pawel L.; Gohar, Yousry

    2015-01-01

    A plant operator training simulator has been developed for training plant operators as well as for design verification of the plant control system (PCS) and plant protection system (PPS) of the Kharkov Institute of Physics and Technology Neutron Source Facility. The simulator provides the operator interface for the whole plant, including the sub-critical assembly coolant loop, target coolant loop, secondary coolant loop, and other facility systems. The operator interface is implemented with the Experimental Physics and Industrial Control System (EPICS), a comprehensive software development platform for distributed control systems. Since its development at Argonne National Laboratory, EPICS has been widely adopted in the experimental physics community, e.g. for control of accelerator facilities; this work is the first implementation for a nuclear facility. The main parts of the operator interface are the plant control panel and the plant protection panel. The development involved implementation of the process variable database, sequence logic, and graphical user interface (GUI) for the PCS and PPS utilizing EPICS and related software tools, e.g. the sequencer for sequence logic and Control System Studio (CSS-BOY) for the graphical user interface. For functional verification of the PCS and PPS, a plant model is interfaced: a physics-based model of the facility coolant loops implemented as a numerical computer code. The training simulator was tested and demonstrated its effectiveness in various plant operation sequences, e.g. start-up, shut-down, maintenance, and refueling. It was also tested for verification of the plant protection system under various trip conditions.

  17. A Source Anonymity-Based Lightweight Secure AODV Protocol for Fog-Based MANET

    Directory of Open Access Journals (Sweden)

    Weidong Fang

    2017-06-01

    Full Text Available Fog-based MANET (Mobile Ad hoc Networks) is a novel paradigm of a mobile ad hoc network with the advantages of both mobility and fog computing. Meanwhile, as a traditional routing protocol, the ad hoc on-demand distance vector (AODV) routing protocol has been applied widely in fog-based MANET. Currently, improving transmission performance and enhancing security are the two major concerns in AODV research. However, joint energy efficiency and security have seldom been considered. In this paper, we propose a source anonymity-based lightweight secure AODV (SAL-SAODV) routing protocol to meet the above requirements. In the SAL-SAODV protocol, source-anonymous and secure transmitting schemes are proposed and applied. The scheme involves the following three parts: the source anonymity algorithm is employed to keep the source node from being tracked and located; an improved secure scheme based on the polynomial of CRC-4 is applied to substitute for the RSA digital signature of SAODV and guarantee data integrity, in addition to reducing computation and energy consumption; and the random delayed transmitting scheme (RDTM) is implemented to separate the check code from the transmitted data and achieve tamper-proof results. The simulation results show that the comprehensive performance of the proposed SAL-SAODV is a trade-off of transmission performance, energy efficiency, and security, and is better than that of AODV and SAODV.
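
    The integrity scheme above is described only as "based on the polynomial of CRC-4". As a hedged illustration of the underlying arithmetic, the sketch below implements plain CRC polynomial division with the ITU-T CRC-4 generator x^4 + x + 1 (an assumption; the paper's exact construction is not given here) and checks the defining property that a frame with its check bits appended is divisible by the generator.

```python
def crc4_remainder(bits, poly=0b10011):
    """Remainder of polynomial division of a bit list by x^4 + x + 1."""
    reg = 0
    for b in bits:
        reg = ((reg << 1) | b) & 0b11111  # shift in the next message bit
        if reg & 0b10000:                 # degree-4 term present: subtract g(x)
            reg ^= poly
    return reg & 0b1111

def crc4(bits):
    # multiply the message by x^4 (append 4 zero bits), then take the remainder
    return crc4_remainder(bits + [0, 0, 0, 0])

msg = [1, 0, 1, 1, 0, 0, 1, 0]
check = crc4(msg)
# appending the 4 check bits makes the whole frame divisible by the generator
frame = msg + [(check >> i) & 1 for i in (3, 2, 1, 0)]
```

    A CRC of this kind detects random bit errors cheaply but, unlike the RSA signature it replaces in SAODV, provides no cryptographic protection by itself; the paper pairs it with the random delayed transmitting scheme for that reason.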

  18. A Source Anonymity-Based Lightweight Secure AODV Protocol for Fog-Based MANET.

    Science.gov (United States)

    Fang, Weidong; Zhang, Wuxiong; Xiao, Jinchao; Yang, Yang; Chen, Wei

    2017-06-17

    Fog-based MANET (Mobile Ad hoc Networks) is a novel paradigm of a mobile ad hoc network with the advantages of both mobility and fog computing. Meanwhile, as a traditional routing protocol, the ad hoc on-demand distance vector (AODV) routing protocol has been applied widely in fog-based MANET. Currently, improving transmission performance and enhancing security are the two major concerns in AODV research. However, joint energy efficiency and security have seldom been considered. In this paper, we propose a source anonymity-based lightweight secure AODV (SAL-SAODV) routing protocol to meet the above requirements. In the SAL-SAODV protocol, source-anonymous and secure transmitting schemes are proposed and applied. The scheme involves the following three parts: the source anonymity algorithm is employed to keep the source node from being tracked and located; an improved secure scheme based on the polynomial of CRC-4 is applied to substitute for the RSA digital signature of SAODV and guarantee data integrity, in addition to reducing computation and energy consumption; and the random delayed transmitting scheme (RDTM) is implemented to separate the check code from the transmitted data and achieve tamper-proof results. The simulation results show that the comprehensive performance of the proposed SAL-SAODV is a trade-off of transmission performance, energy efficiency, and security, and is better than that of AODV and SAODV.

  19. State-of-the-Art in GPU-Based Large-Scale Volume Visualization

    KAUST Repository

    Beyer, Johanna

    2015-05-01

    This survey gives an overview of the current state of the art in GPU techniques for interactive large-scale volume visualization. Modern techniques in this field have brought about a sea change in how interactive visualization and analysis of giga-, tera- and petabytes of volume data can be enabled on GPUs. In addition to combining the parallel processing power of GPUs with out-of-core methods and data streaming, a major enabler for interactivity is making both the computational and the visualization effort proportional to the amount and resolution of data that is actually visible on screen, i.e. 'output-sensitive' algorithms and system designs. This leads to recent output-sensitive approaches that are 'ray-guided', 'visualization-driven' or 'display-aware'. In this survey, we focus on these characteristics and propose a new categorization of GPU-based large-scale volume visualization techniques based on the notions of actual output-resolution visibility and the current working set of volume bricks - the current subset of data that is minimally required to produce an output image of the desired display resolution. Furthermore, we discuss the differences and similarities of different rendering and data traversal strategies in volume rendering by putting them into a common context - the notion of address translation. For our purposes here, we view parallel (distributed) visualization using clusters as an orthogonal set of techniques that we do not discuss in detail but that can be used in conjunction with what we present in this survey. © 2015 The Eurographics Association and John Wiley & Sons Ltd.
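
    The survey's notion of address translation can be pictured with a toy, CPU-side analogue: a page table mapping virtual brick coordinates to slots of a fixed-size brick cache, with least-recently-used eviction. All names are hypothetical; real GPU implementations use multi-level page tables stored in texture memory.

```python
# one-level page table: virtual brick coordinate -> slot in a brick cache
class BrickCache:
    def __init__(self, n_slots):
        self.n_slots = n_slots
        self.page_table = {}   # virtual brick (bx, by, bz) -> cache slot
        self.lru = []          # least-recently-used order of resident bricks

    def translate(self, brick):
        """Return the cache slot for a brick, paging it in on a miss."""
        if brick in self.page_table:             # hit: refresh LRU position
            self.lru.remove(brick)
            self.lru.append(brick)
            return self.page_table[brick]
        if len(self.page_table) < self.n_slots:  # miss with a free slot
            slot = len(self.page_table)
        else:                                    # miss: evict the LRU brick
            victim = self.lru.pop(0)
            slot = self.page_table.pop(victim)
        self.page_table[brick] = slot
        self.lru.append(brick)
        return slot

cache = BrickCache(n_slots=2)
a = cache.translate((0, 0, 0))   # miss: paged into slot 0
b = cache.translate((1, 0, 0))   # miss: paged into slot 1
c = cache.translate((0, 0, 0))   # hit: slot 0 again
d = cache.translate((2, 0, 0))   # miss: evicts (1, 0, 0), reuses slot 1
```

    In a ray-guided renderer, only bricks actually touched by visible rays trigger such translations, which is what makes the working set proportional to on-screen resolution rather than to data size.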

  20. FLUID-BASED SIMULATION APPROACH FOR HIGH VOLUME CONVEYOR TRANSPORTATION SYSTEMS

    Institute of Scientific and Technical Information of China (English)

    Ying WANG; Chen ZHOU

    2004-01-01

    High volume conveyor systems in distribution centers have very large footprints and can handle large volumes and hold thousands of items. The traditional discrete-event, cell-based approach to simulating such networks becomes computationally challenging. An alternative approach, in which the traffic is represented by segments of fluid flow of different density instead of individual packages, is presented in this paper to address this challenge. The proposed fluid-based simulation approach is developed using a Hybrid Petri Nets framework. The underlying model is a combination of an extension of Batches Petri Nets (BPN) and Stochastic Petri Nets (SPN). The extensions are the inclusion of random elements and the relaxation of certain structural constraints. Some adaptations are also made to fit the target system modeling. The approach is presented with an example.

  1. Concept for Risk-based Prioritisation of Point Sources

    DEFF Research Database (Denmark)

    Overheu, N.D.; Troldborg, Mads; Tuxen, N.

    2010-01-01

    estimates on a local scale from all the sources, and 3D catchment-scale fate and transport modelling. It handles point sources at various knowledge levels and accounts for uncertainties. The tool estimates the impacts on the water supply in the catchment and provides an overall prioritisation of the sites...

  2. Voltage sag source location based on instantaneous energy detection

    DEFF Research Database (Denmark)

    Chen, Zhe; Xinzhou, Dong; Wei, Kong

    2007-01-01

    Voltage sag is a major power quality problem, which can disrupt the operation of sensitive equipment. This paper presents the application of the instantaneous energy direction for voltage sag source detection. Simulations have been performed to provide the analysis for systems with distributed generation units. The studies show that the presented method can effectively detect the location of the voltage sag source.
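
    A minimal sketch of the instantaneous-energy quantity involved: integrate p(t) = v(t)·i(t) at the monitoring point over the record. The convention for inferring an upstream versus downstream sag source from the sign of the disturbance energy is an assumption here; the paper's exact detection rule is not reproduced.

```python
import numpy as np

def disturbance_energy(v, i, dt):
    """Integrate instantaneous power p(t) = v(t) * i(t) over the record.

    Assumed convention (not the paper's exact rule): energy flowing toward
    the load during the sag suggests an upstream source; energy flowing
    back toward the supply suggests a downstream source.
    """
    p = v * i                # instantaneous power at the monitoring point
    return float(np.sum(p) * dt)

# toy steady-state record: 230 V / 10 A, 50 Hz, in phase, 10 kHz sampling
dt = 1e-4
t = np.arange(0.0, 0.2, dt)                        # exactly 10 cycles
v = 230 * np.sqrt(2) * np.sin(2 * np.pi * 50 * t)
i = 10 * np.sqrt(2) * np.sin(2 * np.pi * 50 * t)
e = disturbance_energy(v, i, dt)                   # 2300 W x 0.2 s = 460 J
```

    In a sag study one would compute this integral over the sag window only and compare it against the pre-sag baseline at each monitor.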

  3. 'Orbital volume restoration rate after orbital fracture'; a CT-based orbital volume measurement for evaluation of orbital wall reconstructive effect.

    Science.gov (United States)

    Wi, J M; Sung, K H; Chi, M

    2017-01-13

    Purpose: To evaluate the effect of orbital reconstruction, and factors related to that effect, by assessing orbital volume using orbital computed tomography (CT) in cases of orbital wall fracture. Methods: In this retrospective study, 68 patients with isolated blowout fractures were evaluated. The volumes of the orbits and herniated orbital tissues were determined from CT scans using a three-dimensional reconstruction technique (the Eclipse Treatment Planning System). Orbital CT was performed preoperatively, immediately after surgery, and at final follow-up (minimum of 6 months). We evaluated the reconstructive effect of surgery with a new measure, the 'orbital volume reconstruction rate', derived from the orbital volume differences between the fractured and contralateral orbits before surgery, immediately after surgery, and at final follow-up. Results: Mean volume of the fractured orbits before surgery was 23.01±2.60 cm³ and that of the contralateral orbits was 21.31±2.50 cm³ (P=0.005). Mean volume of the fractured orbits immediately after surgery was 21.29±2.42 cm³, and that of the contralateral orbits was 21.33±2.52 cm³ (P=0.921). Mean volume of the fractured orbits at final follow-up was 21.50±2.44 cm³, and that of the contralateral orbits was 21.32±2.50 cm³ (P=0.668). The mean orbital volume reconstruction rate was 100.47% immediately after surgery and 99.17% at final follow-up. No significant difference in orbital volume reconstruction rate was observed with respect to fracture site or orbital implant type. Patients who underwent operation within 14 days of trauma had a better reconstruction rate at final follow-up than patients who underwent operation more than 14 days after trauma (P=0.039). Conclusion: Computer-based measurements of orbital fracture volume can be used to evaluate the reconstructive effect of orbital implants and provide useful quantitative information. Significant reduction of orbital volume is observed immediately after orbital wall
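
    The abstract reports the rate but not its formula. One reading consistent with the reported means (an assumption, labeled as such) is the fraction of the pre-operative volume excess, relative to the contralateral orbit, that surgery corrected:

```python
def reconstruction_rate(v_fractured_pre, v_fractured_post, v_contralateral):
    """Assumed form of the 'orbital volume reconstruction rate' (%):
    the share of the pre-operative volume excess, measured against the
    healthy contralateral orbit, that surgery removed."""
    excess_pre = v_fractured_pre - v_contralateral   # pre-op enlargement
    corrected = v_fractured_pre - v_fractured_post   # volume change by surgery
    return 100.0 * corrected / excess_pre

# cohort means from the abstract (cm^3): fractured pre-op, fractured post-op,
# contralateral pre-op
rate_post = reconstruction_rate(23.01, 21.29, 21.31)   # roughly 101%
```

    Applied to the cohort means this gives about 101%, close to the reported patient-level mean of 100.47% (a mean of per-patient ratios and a ratio of means need not coincide).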

  4. Local digital estimators of intrinsic volumes for Boolean models and in the design based setting

    DEFF Research Database (Denmark)

    Svane, Anne Marie

    In order to estimate the specific intrinsic volumes of a planar Boolean model from a binary image, we consider local digital algorithms based on weighted sums of 2×2 configuration counts. For Boolean models with balls as grains, explicit formulas for the bias of such algorithms are derived...... for the bias obtained for Boolean models are applied to existing algorithms in order to compare their accuracy....

  5. Dose–Volume Relationships Associated With Temporal Lobe Radiation Necrosis After Skull Base Proton Beam Therapy

    Energy Technology Data Exchange (ETDEWEB)

    McDonald, Mark W., E-mail: markmcdonaldmd@gmail.com [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, Indiana (United States); Indiana University Health Proton Therapy Center, Bloomington, Indiana (United States); Linton, Okechukwu R. [Department of Radiation Oncology, Indiana University School of Medicine, Indianapolis, Indiana (United States); Calley, Cynthia S.J. [Department of Biostatistics, Indiana University School of Medicine, Indianapolis, Indiana (United States)

    2015-02-01

    Purpose: We evaluated patient and treatment parameters correlated with development of temporal lobe radiation necrosis. Methods and Materials: This was a retrospective analysis of a cohort of 66 patients treated for skull base chordoma, chondrosarcoma, adenoid cystic carcinoma, or sinonasal malignancies between 2005 and 2012, who had at least 6 months of clinical and radiographic follow-up. The median radiation dose was 75.6 Gy (relative biological effectiveness [RBE]). Analyzed factors included gender, age, hypertension, diabetes, smoking status, use of chemotherapy, and the absolute dose:volume data for both the right and left temporal lobes, considered separately. A generalized estimating equation (GEE) regression analysis evaluated potential predictors of radiation necrosis, and the median effective concentration (EC50) model estimated dose–volume parameters associated with radiation necrosis. Results: Median follow-up time was 31 months (range 6-96 months) and was 34 months in patients who were alive. The Kaplan-Meier estimate of overall survival at 3 years was 84.9%. The 3-year estimate of any grade temporal lobe radiation necrosis was 12.4%, and for grade 2 or higher radiation necrosis was 5.7%. On multivariate GEE, only dose–volume relationships were associated with the risk of radiation necrosis. In the EC50 model, all dose levels from 10 to 70 Gy (RBE) were highly correlated with radiation necrosis, with a 15% 3-year risk of any-grade temporal lobe radiation necrosis when the absolute volume of a temporal lobe receiving 60 Gy (RBE) (aV60) exceeded 5.5 cm³, or aV70 > 1.7 cm³. Conclusions: Dose–volume parameters are highly correlated with the risk of developing temporal lobe radiation necrosis. In this study the risk of radiation necrosis increased sharply when the temporal lobe aV60 exceeded 5.5 cm³ or aV70 > 1.7 cm³. Treatment planning goals should include constraints on the volume of temporal lobes receiving
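
    The aV60/aV70 metrics above are absolute volumes of a temporal lobe receiving at least the stated dose. A minimal sketch from a per-voxel dose array (the voxel size and dose values below are hypothetical, not the study's data):

```python
import numpy as np

def absolute_volume_at_dose(dose_gy, voxel_volume_cm3, threshold_gy):
    """Absolute volume (cm^3) of the structure receiving >= threshold dose."""
    return np.count_nonzero(dose_gy >= threshold_gy) * voxel_volume_cm3

# toy temporal-lobe dose array: 2 mm isotropic voxels -> 0.008 cm^3 each
rng = np.random.default_rng(1)
dose = rng.uniform(0, 75, size=20_000)        # Gy (RBE), hypothetical
aV60 = absolute_volume_at_dose(dose, 0.008, 60.0)
aV70 = absolute_volume_at_dose(dose, 0.008, 70.0)
risk_flag = aV60 > 5.5 or aV70 > 1.7          # thresholds from the abstract
```

    A planning system would evaluate the same thresholds on each temporal lobe separately, as the study did.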

  6. MULTI-SOURCE REMOTE SENSING IMAGE FUSION BASED ON SUPPORT VECTOR MACHINE

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Remote sensing image fusion is an effective way to use the large volume of data from multi-source images. This paper introduces a new method of remote sensing image fusion based on the support vector machine (SVM), using high spatial resolution data from SPIN-2 and multi-spectral remote sensing data from SPOT-4. Firstly, the new method is established by building a model of remote sensing image fusion based on SVM. Then, using SPIN-2 data and SPOT-4 data, image classification fusion is tested. Finally, an evaluation of the fusion result is made in two ways. 1) From subjective assessment, the spatial resolution of the fused image is improved compared to SPOT-4, and the texture of the fused image is clearly distinctive. 2) From quantitative analysis, the effect of classification fusion is better. As a whole, the results show that the accuracy of image fusion based on SVM is high and the SVM algorithm can be recommended for application in remote sensing image fusion processes.
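
    A hedged sketch of feature-level classification fusion with an SVM, using synthetic stand-ins for the multispectral and panchromatic inputs (scikit-learn's SVC; no SPIN-2/SPOT-4 data are used, and the feature construction is ours):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# synthetic "pixels": 3 multispectral bands + 1 high-resolution texture feature
n = 600
labels = rng.integers(0, 2, n)                         # two land-cover classes
ms = rng.normal(labels[:, None] * 1.5, 1.0, (n, 3))    # multispectral bands
pan = rng.normal(labels[:, None] * 2.0, 1.0, (n, 1))   # panchromatic texture
fused = np.hstack([ms, pan])                           # feature-level fusion

# train on fused features; the fused vector carries both spectral and
# spatial-texture information for each pixel
clf = SVC(kernel="rbf").fit(fused[:400], labels[:400])
accuracy = clf.score(fused[400:], labels[400:])
```

    In the paper's setting, each pixel's fused feature vector would combine resampled SPOT-4 bands with co-registered SPIN-2 texture, and the SVM output is the classified (fused) map.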

  7. MULTI-SOURCE REMOTE SENSING IMAGE FUSION BASED ON SUPPORT VECTOR MACHINE

    Institute of Scientific and Technical Information of China (English)

    ZHAO Shu-he; FENG Xue-zhi; et al.

    2002-01-01

    Remote sensing image fusion is an effective way to use the large volume of data from multi-source images. This paper introduces a new method of remote sensing image fusion based on the support vector machine (SVM), using high spatial resolution data from SPIN-2 and multi-spectral remote sensing data from SPOT-4. Firstly, the new method is established by building a model of remote sensing image fusion based on SVM. Then, using SPIN-2 data and SPOT-4 data, image classification fusion is tested. Finally, an evaluation of the fusion result is made in two ways. 1) From subjective assessment, the spatial resolution of the fused image is improved compared to SPOT-4, and the texture of the fused image is clearly distinctive. 2) From quantitative analysis, the effect of classification fusion is better. As a whole, the results show that the accuracy of image fusion based on SVM is high and the SVM algorithm can be recommended for application in remote sensing image fusion processes.

  8. Blind source separation based on generalized gaussian model

    Institute of Scientific and Technical Information of China (English)

    YANG Bin; KONG Wei; ZHOU Yue

    2007-01-01

    Since in most blind source separation (BSS) algorithms the estimates of the probability density functions (pdf) of the sources are fixed, or can only switch between one super-Gaussian and one sub-Gaussian model, they may not be efficient at separating sources with different distributions. To solve the problem of pdf mismatch and the separation of hybrid mixtures in BSS, the generalized Gaussian model (GGM) is introduced to model the pdf of the sources, since it provides a general structure for univariate distributions. Its great advantage is that only one parameter needs to be determined in modeling the pdf of different sources, so it is less complex than a Gaussian mixture model. By using a maximum likelihood (ML) approach, the convergence of the proposed algorithm is improved. Computer simulations show that it is more efficient and valid than conventional methods with fixed pdf estimation.
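
    A sketch of the idea under stated assumptions: the GGM density p(y) ∝ exp(-|y|^beta) yields a one-parameter score function, which can drive a generic natural-gradient ICA update. This is an illustrative update rule, not the paper's exact ML algorithm.

```python
import numpy as np

def ggm_score(y, beta):
    """Score function -d/dy log p(y) for the GGM density p(y) ~ exp(-|y|^beta)."""
    return beta * np.sign(y) * np.abs(y) ** (beta - 1.0)

def separate(x, beta, lr=0.01, n_iter=2000, batch=64, seed=0):
    """Natural-gradient ICA with a generalized-Gaussian source model.

    x: mixtures, shape (n_sources, n_samples); beta < 2 models
    super-Gaussian sources, beta > 2 sub-Gaussian ones.
    """
    rng = np.random.default_rng(seed)
    n = x.shape[0]
    w = np.eye(n)
    for _ in range(n_iter):
        y = w @ x[:, rng.integers(0, x.shape[1], batch)]
        # natural-gradient step: W += lr * (I - E[score(y) y^T]) W
        w = w + lr * (np.eye(n) - ggm_score(y, beta) @ y.T / batch) @ w
    return w

# two Laplacian (super-Gaussian) sources mixed by a fixed matrix
rng = np.random.default_rng(1)
s = rng.laplace(size=(2, 5000))
a = np.array([[1.0, 0.6], [0.4, 1.0]])
x = a @ s
w = separate(x, beta=1.0)   # beta = 1 matches the Laplacian pdf
y = w @ x                   # recovered sources (up to permutation/scale)
```

    The single shape parameter beta is what the GGM contributes: estimating it per source lets one update rule cover hybrid super-/sub-Gaussian mixtures.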

  9. Setting up of an Open Source based Private Cloud

    Directory of Open Access Journals (Sweden)

    G R Karpagam

    2011-05-01

    Full Text Available Cloud computing is an attractive concept in the IT field, since it allows resources to be provisioned according to user needs [11]. It provides services on virtual machines whereby users can share resources, software and other devices on demand. Cloud services are supported by both proprietary and open source systems. As proprietary products are very expensive, customers are not allowed to experiment with them, and security is a major issue in them, open source systems help solve these problems. Cloud computing has motivated many academic and non-academic members to develop open source cloud setups, where users are allowed to study the source code and experiment with it. This paper describes the configuration of a private cloud using Eucalyptus. Eucalyptus, an open source system, has been used to implement a private cloud using the available hardware and software without making any modification to them, and to provide various types of services to the cloud computing environment.

  10. The element-based finite volume method applied to petroleum reservoir simulation

    Energy Technology Data Exchange (ETDEWEB)

    Cordazzo, Jonas; Maliska, Clovis R.; Silva, Antonio F.C. da; Hurtado, Fernando S.V. [Universidade Federal de Santa Catarina (UFSC), Florianopolis, SC (Brazil). Dept. de Engenharia Mecanica

    2004-07-01

    In this work a numerical model for simulating petroleum reservoirs using the Element-based Finite Volume Method (EbFVM) is presented. The method employs unstructured grids using triangular and/or quadrilateral elements, such that complex reservoir geometries can be easily represented. Due to the control-volume approach, local mass conservation is enforced, permitting a direct physical interpretation of the resulting discrete equations. It is demonstrated that this method can deal with permeability maps without averaging procedures, since the scheme assumes uniform properties inside elements, instead of inside control volumes, avoiding the need to weight the permeability values at the control-volume interfaces. Moreover, it is easy to include the full permeability tensor in this method, which is an important issue in simulating heterogeneous and anisotropic reservoirs. Finally, a comparison is presented between the results obtained using the scheme proposed in this work in the EbFVM framework and those obtained employing the scheme commonly used in petroleum reservoir simulation. It is also shown that the proposed scheme is less susceptible to the grid orientation effect as the mobility ratio increases. (author)

  11. Sources

    OpenAIRE

    2015-01-01

    MANUSCRIPT SOURCES. Archives nationales. Tax rolls (rôles de taille), 1768/71: Z1G-344/18 Aulnay; Z1G-343a/02 Gennevilliers; Z1G-340/01 Ivry; Z1G-340/05 Orly; Z1G-334c/09 Saint-Remy-lès-Chevreuse; Z1G-344/18 Sevran; Z1G-340/05 Thiais. 1779/80: Z1G-391a/18 Aulnay; Z1G-380/02 Gennevilliers; Z1G-385/01 Ivry; Z1G-387b/05 Orly; Z1G-388a/09 Saint-Remy-lès-Chevreuse; Z1G-391a/18 Sevran; Z1G-387b/05 Thiais. 1788/89: Z1G-451/18 Aulnay; Z1G-452/21 Chennevières; Z1G-443b/02 Gennevilliers; Z1G-440a/01 Ivry; Z1G-452/17 Noiseau; Z1G-445b/05 ...

  12. Trabecular-Iris Circumference Volume in Open Angle Eyes Using Swept-Source Fourier Domain Anterior Segment Optical Coherence Tomography

    Directory of Open Access Journals (Sweden)

    Mohammed Rigi

    2014-01-01

    Full Text Available Purpose. To introduce a new anterior segment optical coherence tomography parameter, trabecular-iris circumference volume (TICV), which measures the integrated volume of the peripheral angle, and to establish a reference range in normal, open angle eyes. Methods. One eye of each participant with open angles and a normal anterior segment was imaged in 3D mode by the CASIA SS-1000 (Tomey, Nagoya, Japan). Trabecular-iris space area (TISA) and TICV at 500 and 750 µm were calculated. Analysis of covariance was performed to examine the effect of age and its interaction with spherical equivalent. Results. The study included 100 participants with a mean age of 50 (±15) years (range 20-79). TICV showed a normal distribution with a mean (±SD) value of 4.75 (±2.30) µL for TICV500 and a mean (±SD) value of 8.90 (±3.88) µL for TICV750. Overall, TICV showed an age-related reduction (P=0.035). In addition, angle volume increased with increased myopia for all age groups, except for those older than 65 years. Conclusions. This study introduces a new parameter to measure peripheral angle volume, TICV, with age-adjusted normal ranges for open angle eyes. Further investigation is warranted to determine the clinical utility of this new parameter.
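
    TICV is described as the integrated volume of the peripheral angle. An obvious discrete form (an assumption; the CASIA software's exact integration is not specified here) sums TISA samples taken at evenly spaced meridians around the circumference, weighted by arc length:

```python
import numpy as np

def ticv_from_tisa(tisa_mm2, radius_mm):
    """Integrate trabecular-iris space area around the angle circumference.

    tisa_mm2:  TISA samples (mm^2) at evenly spaced meridians
    radius_mm: assumed radial distance of the angle recess from the eye axis
    Returns volume in microliters (1 mm^3 == 1 uL).
    """
    m = len(tisa_mm2)
    d_arc = 2 * np.pi * radius_mm / m        # arc length per meridian (mm)
    return float(np.sum(tisa_mm2) * d_arc)   # mm^3 == uL

# toy example: constant TISA500 of 0.12 mm^2 at 128 meridians, radius 6 mm
ticv500 = ticv_from_tisa(np.full(128, 0.12), 6.0)   # about 4.5 uL
```

    With these invented but anatomically plausible numbers the result lands near the paper's mean TICV500 of 4.75 µL, which is only a sanity check, not a validation.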

  13. Creating Point Sources for Codling Moth (Lepidoptera: Tortricidae) with Low-Volume Sprays of a Microencapsulated Sex Pheromone Formulation

    Science.gov (United States)

    Studies were conducted to examine the deposition of microcapsules and the attractiveness of treated apple leaves for codling moth, Cydia pomonella (L.), following low-volume concentrated sprays of a microencapsulated (MEC) sex pheromone formulation (CheckMate CM-F). Nearly 30% of leaves collected f...

  14. Volume Measurement Algorithm for Food Product with Irregular Shape using Computer Vision based on Monte Carlo Method

    Directory of Open Access Journals (Sweden)

    Joko Siswantoro

    2014-11-01

    Full Text Available Volume is one of the important issues in the production and processing of food products. Traditionally, volume measurement can be performed using the water displacement method based on Archimedes' principle. The water displacement method is inaccurate and considered a destructive method. Computer vision offers an accurate and nondestructive method for measuring the volume of a food product. This paper proposes an algorithm for volume measurement of irregularly shaped food products using computer vision based on the Monte Carlo method. Five images of the object were acquired from five different views and then processed to obtain the silhouettes of the object. From the silhouettes, the Monte Carlo method was applied to approximate the volume of the object. The simulation results show that the algorithm produces high accuracy and precision for volume measurement.
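
The core Monte Carlo step described above — sampling random points uniformly in a bounding box and counting the fraction that fall inside the object — can be sketched as follows. In the paper the membership test checks whether a point projects inside all five silhouettes; here a unit sphere with an analytic inside test stands in for that, so the function names and tolerances are illustrative, not the authors' implementation:

```python
import numpy as np

def mc_volume(inside, bounds, n=200_000, seed=0):
    """Monte Carlo volume estimate: sample points uniformly in an
    axis-aligned bounding box and count the fraction accepted by
    the `inside` membership test."""
    rng = np.random.default_rng(seed)
    lo = np.asarray(bounds[0], dtype=float)
    hi = np.asarray(bounds[1], dtype=float)
    pts = lo + (hi - lo) * rng.random((n, 3))
    frac = np.mean(inside(pts))              # accepted fraction
    return float(np.prod(hi - lo) * frac)    # box volume x fraction

# toy membership test: a unit sphere (the paper instead tests whether
# a sampled point projects inside every one of the five silhouettes)
est = mc_volume(lambda p: (p**2).sum(axis=1) <= 1.0,
                ([-1, -1, -1], [1, 1, 1]))
```

The estimate converges as 1/sqrt(n), so the sample count controls the precision reported in the paper's simulations.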

  15. Real-time tunability of chip-based light source enabled by microfluidic mixing

    DEFF Research Database (Denmark)

    Olsen, Brian Bilenberg; Rasmussen, Torben; Balslev, Søren

    2006-01-01

    We demonstrate real-time tunability of a chip-based liquid light source enabled by microfluidic mixing. The mixer and light source are fabricated in SU-8 which is suitable for integration in SU-8-based laboratory-on-a-chip microsystems. The tunability of the light source is achieved by changing...

  16. The Chandra Local Volume Survey I: The X-ray Point Source Populations of NGC 55, NGC 2403, and NGC 4214

    CERN Document Server

    Binder, B; Eracleous, M; Plucinsky, P P; Gaetz, T J; Anderson, S F; Skillman, E D; Dalcanton, J J; Kong, A K H; Weisz, D R

    2015-01-01

    We present comprehensive X-ray point source catalogs of NGC 55, NGC 2403, and NGC 4214 as part of the Chandra Local Volume Survey. The combined archival observations have effective exposure times of 56.5 ks, 190 ks, and 79 ks for NGC 55, NGC 2403, and NGC 4214, respectively. When combined with our published catalogs for NGC 300 and NGC 404, our survey contains 629 X-ray sources total down to a limiting unabsorbed luminosity of $\sim5\times10^{35}$ erg s$^{-1}$ in the 0.35-8 keV band in each of the five galaxies. We present X-ray hardness ratios, spectral analysis, radial source distributions, and an analysis of the temporal variability for the X-ray sources detected at high significance. To constrain the nature of each X-ray source, we carried out cross-correlations with multi-wavelength data sets. We searched overlapping Hubble Space Telescope observations for optical counterparts to our X-ray detections to provide preliminary classifications for each X-ray source as a likely X-ray binary, background AGN, su...

  17. Independent component analysis based source number estimation and its comparison for mechanical systems

    Science.gov (United States)

    Cheng, Wei; Lee, Seungchul; Zhang, Zhousuo; He, Zhengjia

    2012-11-01

    It has been challenging to correctly separate mixed signals into source components when the source number is not known a priori. In this paper, we propose a novel source number estimation method based on independent component analysis (ICA) and clustering evaluation analysis. We investigate and benchmark three information-criterion-based source number estimation methods: the Akaike information criterion (AIC), minimum description length (MDL) and an improved Bayesian information criterion (IBIC). All of the above methods are comparatively studied in both numerical and experimental case studies with typical mechanical signals. The results demonstrate that the proposed ICA-based source number estimation with nonlinear dissimilarity measures performs more stably and robustly than the information-criterion-based methods for mechanical systems.
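
As a point of reference for the information-criterion approaches the abstract benchmarks, the classical Wax-Kailath MDL estimator picks the source number from the eigenvalues of the sample covariance matrix. A minimal numpy sketch (not the authors' code; the IBIC variant and the ICA/clustering method are not reproduced here):

```python
import numpy as np

def mdl_source_number(X):
    """Estimate the number of sources in multichannel data X
    (channels x samples) with the Wax-Kailath MDL criterion on the
    eigenvalues of the sample covariance matrix."""
    p, N = X.shape
    lam = np.linalg.eigvalsh(np.cov(X))[::-1]   # descending eigenvalues
    mdl = []
    for k in range(p):
        tail = lam[k:]                          # presumed noise eigenvalues
        geo = np.exp(np.mean(np.log(tail)))     # geometric mean
        arith = np.mean(tail)                   # arithmetic mean
        mdl.append(-N * (p - k) * np.log(geo / arith)
                   + 0.5 * k * (2 * p - k) * np.log(N))
    return int(np.argmin(mdl))                  # model-order estimate
```

The criterion trades off how "equal" the trailing eigenvalues look (noise subspace) against a complexity penalty that grows with the assumed source count.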

  18. A simplified CT-based definition of the supraclavicular and infraclavicular nodal volumes in breast cancer.

    Science.gov (United States)

    Atean, I; Pointreau, Y; Ouldamer, L; Monghal, C; Bougnoux, A; Bera, G; Barillot, I

    2013-02-01

    The available contouring guidelines for the supraclavicular and infraclavicular lymph nodes appeared to be inadequate for their delineation on non-enhanced computed tomography (CT) scans. For this purpose, we developed delineation guidelines for the clinical target volumes (CTV) of these lymph nodes on non-enhanced CT slices performed in the treatment position of breast cancer. A fresh female cadaver study, together with a review of delineations and anatomical descriptions, was performed to propose a simplified definition of the supra- and infraclavicular lymph nodes using readily identifiable anatomical structures. This definition was developed jointly by breast radiologists, breast surgeons, and radiation oncologists. To validate these guidelines, the primary investigator and seven radiation oncologists (observers) independently delineated 10 different nodal CTVs. The primary investigator's contours were considered to be the gold standard contours. Contour accuracy and concordance were evaluated. Written guidelines for the delineation of the supra- and infraclavicular lymph node CTVs were developed. Consistent contours with minimal variability existed between the delineated volumes; the mean kappa index was 0.83. The mean common contoured and additional contoured volumes were 84.6% and 18.5%, respectively. The mean overlap volume ratio was 0.71. A simplified CT-based atlas for delineation of the supra- and infraclavicular lymph nodes for locoregional irradiation of the breast on non-enhanced CT scans has been developed in this study. This atlas provides a consistent set of guidelines for delineating these volumes. Copyright © 2012 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.
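
The kappa index reported above measures chance-corrected agreement between two observers' delineations. For a pair of binary contour masks it can be computed as in this small sketch (illustrative only; the study's exact computation across eight observers is not specified here):

```python
import numpy as np

def cohens_kappa(a, b):
    """Cohen's kappa for two binary delineation masks (any shape)."""
    a, b = np.asarray(a).ravel(), np.asarray(b).ravel()
    po = np.mean(a == b)                       # observed agreement
    p_yes = np.mean(a) * np.mean(b)            # chance both-positive
    p_no = (1 - np.mean(a)) * (1 - np.mean(b)) # chance both-negative
    pe = p_yes + p_no                          # total chance agreement
    return (po - pe) / (1 - pe)
```

A kappa near 0.83, as reported, indicates strong agreement well above what overlapping by chance would produce.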

  19. Effect of Fixed-Volume and Weight-Based Dosing Regimens on the Cost and Volume of Administered Iodinated Contrast Material at Abdominal CT.

    Science.gov (United States)

    Davenport, Matthew S; Parikh, Kushal R; Mayo-Smith, William W; Israel, Gary M; Brown, Richard K J; Ellis, James H

    2017-03-01

    To determine the magnitude of subject-level and population-level cost savings that could be realized by moving from fixed-volume low-osmolality iodinated contrast material administration to an effective weight-based dosing regimen for contrast-enhanced abdominopelvic CT. HIPAA-compliant, institutional review board-exempt retrospective cohort study of 6,737 subjects undergoing contrast-enhanced abdominopelvic CT from 2014 to 2015. Subject height, weight, lean body weight (LBW), and body surface area (BSA) were determined. Twenty-six volume- and weight-based dosing strategies with literature support were compared with a fixed-volume strategy used at the study institution: 125 mL 300 mgI/mL for routine CT, 125 mL 370 mgI/mL for multiphasic CT (single-energy, 120 kVp). The predicted population- and subject-level effects on cost and contrast material utilization were calculated for each strategy and sensitivity analyses were performed. Most subjects underwent routine CT (91% [6,127/6,737]). Converting to lesser-volume higher-concentration contrast material had the greatest effect on cost; a fixed-volume 100 mL 370 mgI/mL strategy resulted in $132,577 in population-level savings with preserved iodine dose at routine CT (37,500 versus 37,000 mgI). All weight-based iodine-content dosing strategies (mgI/kg) with the same maximum contrast material volume (125 mL) were predicted to contribute mean savings compared with the existing fixed-volume algorithm ($4,053-$116,076/strategy in the overall study population, $1-$17/strategy per patient). Similar trends were observed in all sensitivity analyses. Large cost and material savings can be realized at abdominopelvic CT by adopting a weight-based dosing strategy and lowering the maximum volume of administered contrast material. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.
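
The weight-based strategies compared in the study all share the same arithmetic: administered volume = body weight x iodine dose per kg / concentration, capped at a maximum volume. A minimal sketch with illustrative parameter values (the 500 mgI/kg dose and 125 mL cap below are assumptions for demonstration, not the study's exact protocol):

```python
def contrast_volume(weight_kg, dose_mgi_per_kg=500,
                    conc_mgi_per_ml=370, max_volume_ml=125):
    """Weight-based contrast volume, capped at a maximum volume.
    Default dose, concentration and cap are illustrative placeholders."""
    volume = weight_kg * dose_mgi_per_kg / conc_mgi_per_ml
    return min(volume, max_volume_ml)
```

Compared with a fixed 125 mL administration, most patients below the cap weight receive less contrast material, which is where the predicted population-level savings come from.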

  20. Powerful nanosecond light sources based on LEDs for astroparticle physics experiments

    OpenAIRE

    Lubsandorzhiev, B. K.; Poleshuk, R. V.; Shaibonov, B. A. J.; Vyatchin, Y. E.

    2007-01-01

    Powerful nanosecond light sources based on LEDs have been developed for use in astroparticle physics experiments. The light sources use either matrices of ultra-bright blue LEDs or a new generation of high-power blue LEDs. It is shown that such light sources have a light yield of up to 10^10-10^12 photons per pulse with very fast light emission kinetics. The described light sources are important for use in the calibration systems of Cherenkov and scintillator detectors. The developed light sources ...

  1. Energy Economic Data Base (EEDB) Program: Phase I, Volume I. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1979-12-01

    The Energy Economic Data Base Program, which deals with the development of cost data for nuclear and comparison electric power generating stations, provides periodic updates of technical and cost (capital, fuel, and operating and maintenance) information of significance to DOE. The information allows for evaluation and monitoring of US civilian nuclear power programs and provides a consistent means of evaluating the nuclear option against alternatives. Currently, the EEDB contains 6 nuclear electrical generating plant technical models and 5 comparison coal-fired electrical generating plant technical models. Each of these technical plant models is a complete conceptual design for a single-unit, steam electric power generating station located on a standard, hypothetical Middletown site. A description of the site is provided in Appendix A-1 (Volume 2) for nuclear plants, and Appendix A-2 (Volume 2) for coal-fired plants. The EEDB also includes a conceptual design of a coal liquefaction plant for comparison purposes. Volume 1 provides a description of the current Data Base as of September 30, 1978: gives assumptions and ground rules for the initial-cost update; summarizes the initial cost update, with cost results tabulated; details the initial update of the technical conceptual design, the capital cost, the quantities of commodities and their unit costs, and craft labor man hours and costs for each EEDB program model; and details the fuel-cycle-cost initial update and the operating- and maintenance-cost initial update. Finally, an extensive list of references and a glossary are presented.

  2. Optimization of volume to point conduction problem based on a novel thermal conductivity discretization algorithm

    Institute of Scientific and Technical Information of China (English)

    Wenjing Du; Peili Wang; Lipeng Song; Lin Cheng

    2015-01-01

    A conduction heat transfer process is enhanced by filling a prescribed quantity of optimally shaped high-thermal-conductivity material into the substrate. Numerical simulations and analyses are performed on a volume-to-point conduction problem based on the principle of minimum entropy generation. In the optimization, the arrangement of the high-thermal-conductivity material is variable, its quantity is constrained, and the objective is to obtain the maximum heat conduction rate when entropy generation is minimal. A novel thermal conductivity discretization algorithm is proposed based on a large number of calculations. Compared with other algorithms in the literature, the new algorithm yields a lower average substrate temperature, while the highest temperature in the substrate remains within a reasonable range; the new algorithm is therefore feasible. The optimization of volume-to-point heat conduction is carried out in a rectangular model with a radiation boundary condition and a constant-surface-temperature boundary condition. The results demonstrate that the thermal conductivity discretization algorithm is applicable to volume-to-point heat conduction problems.
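
The volume-to-point setting described above can be illustrated with a small finite-volume sketch: uniform heat generation over a square domain, insulated walls, a single point sink, and a conductivity field in which a strip of high-conductivity material is placed. This is a generic Jacobi solver under assumed geometry and boundary conditions, not the authors' entropy-generation optimization:

```python
import numpy as np

def solve_conduction(k, n_iter=10000, q=1.0):
    """Jacobi iteration for steady conduction with uniform volumetric
    generation q, insulated boundaries, and a single point heat sink
    (T = 0) at the middle of the left edge. k is the cell-centred
    conductivity field; grid spacing is taken as 1."""
    n = k.shape[0]
    kp = np.pad(k, 1, mode="edge")
    # harmonic-mean face conductivities to the four neighbours
    w_e = 2 * k * kp[1:-1, 2:] / (k + kp[1:-1, 2:])
    w_w = 2 * k * kp[1:-1, :-2] / (k + kp[1:-1, :-2])
    w_s = 2 * k * kp[2:, 1:-1] / (k + kp[2:, 1:-1])
    w_n = 2 * k * kp[:-2, 1:-1] / (k + kp[:-2, 1:-1])
    wsum = w_e + w_w + w_s + w_n
    T = np.zeros_like(k)
    for _ in range(n_iter):
        Tp = np.pad(T, 1, mode="edge")   # mirror padding => zero-flux walls
        T = (w_e * Tp[1:-1, 2:] + w_w * Tp[1:-1, :-2]
             + w_s * Tp[2:, 1:-1] + w_n * Tp[:-2, 1:-1] + q) / wsum
        T[n // 2, 0] = 0.0               # point sink
    return T
```

Placing a high-conductivity strip leading into the sink lowers the peak substrate temperature, which is exactly the effect the discretization/optimization in the paper exploits.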

  4. A method to analyze "source-sink" structure of non-point source pollution based on remote sensing technology.

    Science.gov (United States)

    Jiang, Mengzhen; Chen, Haiying; Chen, Qinghui

    2013-11-01

    With the purpose of providing a scientific basis for environmental planning on non-point source pollution prevention and control, and of improving pollution regulation efficiency, this paper established a Grid Landscape Contrast Index based on the Location-weighted Landscape Contrast Index according to the "source-sink" theory. The spatial distribution of non-point source pollution in the Jiulongjiang Estuary could be worked out by utilizing high-resolution remote sensing images. The results showed that the "source" area of nitrogen and phosphorus in the Jiulongjiang Estuary was 534.42 km(2) in 2008, and the "sink" area was 172.06 km(2). The "source" of non-point source pollution was distributed mainly over Xiamen island, most of Haicang, the east of Jiaomei and the river banks of Gangwei and Shima; the "sink" was distributed over the southwest of Xiamen island and the west of Shima. Generally speaking, the intensity of the "source" weakens as the distance from the sea boundary increases, while the "sink" strengthens.

  5. The Einstein Observatory catalog of IPC x ray sources. Volume 7E: Right ascension range 20h 00m to 23h 59m

    Science.gov (United States)

    Harris, D. E.; Forman, W.; Gioia, I. M.; Hale, J. A.; Harnden, F. R., Jr.; Jones, C.; Karakashian, T.; Maccacaro, T.; Mcsweeney, J. D.; Primini, F. A.

    1993-01-01

    The Einstein Observatory (HEAO-2, launched November 13, 1978) achieved radically improved sensitivity over previous x-ray missions through the use of focusing optics which simultaneously afforded greatly reduced background and produced true images. During its 2.5-yr mission, the Einstein X-Ray Telescope was pointed toward some 5,000 celestial targets, most of which were detected, and discovered several thousand additional 'serendipitous' sources in the observed fields. This catalog contains contour diagrams and source data, obtained with the imaging proportional counter in the 0.16 to 3.5 keV energy band, and describes methods for recovering upper limits for any sky position within the observed images. The main catalog consists of six volumes (numbered 2 through 7) of right ascension ordered pages, each containing data for one observation. Along with the primary documentation describing how the catalog was constructed, volume 1 contains a complete source list, results for merged fields, a reference system to published papers, and data useful for calculating upper limits and fluxes.

  6. The Einstein Observatory catalog of IPC x ray sources. Volume 5E: Right ascension range 12h 00m to 15h 59m

    Science.gov (United States)

    Harris, D. E.; Forman, W.; Gioia, I. M.; Hale, J. A.; Harnden, F. R., Jr.; Jones, C.; Karakashian, T.; Maccacaro, T.; Mcsweeney, J. D.; Primini, F. A.

    1993-01-01

    The Einstein Observatory (HEAO-2, launched November 13, 1978) achieved radically improved sensitivity over previous x-ray missions through the use of focusing optics, which simultaneously afforded greatly reduced background and produced true images. During its 2.5-yr mission, the Einstein X-Ray Telescope was pointed toward some 5,000 celestial targets, most of which were detected, and discovered several thousand additional 'serendipitous' sources in the observed fields. This catalog contains contour diagrams and source data, obtained with the imaging proportional counter in the 0.16 to 3.5 keV energy band, and describes methods for recovering upper limits for any sky position within the observed images. The main catalog consists of six volumes (numbered 2 through 7) of right ascension ordered pages, each containing data for one observation. Along with the primary documentation describing how the catalog was constructed, volume 1 contains a complete source list, results for merged fields, a reference system to published papers, and data useful for calculating upper limits and fluxes.

  7. The Einstein Observatory catalog of IPC x ray sources. Volume 6E: Right ascension range 16h 00m to 19h 59m

    Science.gov (United States)

    Harris, D. E.; Forman, W.; Gioia, I. M.; Hale, J. A.; Harnden, F. R., Jr.; Jones, C.; Karakashian, T.; Maccacaro, T.; Mcsweeney, J. D.; Primini, F. A.

    1993-01-01

    The Einstein Observatory (HEAO-2 launched November 13, 1978) achieved radically improved sensitivity over previous x-ray missions through the use of focusing optics, which simultaneously afforded greatly reduced background and produced true images. During its 2.5-yr mission, the Einstein X-Ray Telescope was pointed toward some 5,000 celestial targets, most of which were detected, and discovered several thousand additional 'serendipitous' sources in the observed fields. This catalog contains contour diagrams and source data, obtained with the imaging proportional counter in the 0.16 to 3.5 keV energy band, and describes methods for recovering upper limits for any sky position within the observed images. The main catalog consists of six volumes (numbered 2 through 7) of right ascension ordered pages, each containing data for one observation. Along with the primary documentation describing how the catalog was constructed, volume 1 contains a complete source list, results for merged fields, a reference system to published papers, and data useful for calculating upper limits and fluxes.

  8. The Einstein Observatory catalog of IPC x ray sources. Volume 2E: Right ascension range 00h 00m to 03h 59m

    Science.gov (United States)

    Harris, D. E.; Forman, W.; Gioia, I. M.; Hale, J. A.; Harnden, F. R., Jr.; Jones, C.; Karakashian, T.; Maccacaro, T.; Mcsweeney, J. D.; Primini, F. A.

    1993-01-01

    The Einstein Observatory (HEAO-2, launched November 13, 1978) achieved radically improved sensitivity over previous x-ray missions through the use of focusing optics which simultaneously afforded greatly reduced background and produced true images. During its 2.5-yr mission, the Einstein X-Ray Telescope was pointed toward some 5,000 celestial targets, most of which were detected, and discovered several thousand additional 'serendipitous' sources in the observed fields. This catalog contains contour diagrams and source data, obtained with the imaging proportional counter in the 0.16 to 3.5 keV energy band, and describes methods for recovering upper limits for any sky position within the observed images. The main catalog consists of six volumes (numbered 2 through 7) of right ascension ordered pages, each containing data for one observation. Along with the primary documentation describing how the catalog was constructed, volume 1 contains a complete source list, results for merged fields, a reference system to published papers and data useful for calculating upper limits and fluxes.

  9. The Einstein Observatory catalog of IPC x ray sources. Volume 4E: Right ascension range 08h 00m to 11h 59m

    Science.gov (United States)

    Harris, D. E.; Forman, W.; Gioia, I. M.; Hale, J. A.; Harnden, F. R., Jr.; Jones, C.; Karakashian, T.; Maccacaro, T.; Mcsweeney, J. D.; Primini, F. A.

    1993-01-01

    The Einstein Observatory (HEAO-2, launched November 13, 1978) achieved radically improved sensitivity over previous x-ray missions through the use of focusing optics which simultaneously afforded greatly reduced background and produced true images. During its 2.5-yr mission, the Einstein X-Ray Telescope was pointed toward some 5,000 celestial targets, most of which were detected, and discovered several thousand additional 'serendipitous' sources in the observed fields. This catalog contains contour diagrams and source data, obtained with the imaging proportional counter in the 0.16 to 3.5 keV energy band, and describes methods for recovering upper limits for any sky position within the observed images. The main catalog consists of six volumes (numbered 2 through 7) of right ascension ordered pages, each containing data for one observation. Along with the primary documentation describing how the catalog was constructed, volume 1 contains a complete source list, results for merged fields, a reference system to published papers, and data useful for calculating upper limits and fluxes.

  10. The Einstein Observatory catalog of IPC x ray sources. Volume 3E: Right ascension range 04h 00m to 07h 59m

    Science.gov (United States)

    Harris, D. E.; Forman, W.; Gioia, I. M.; Hale, J. A.; Harnden, F. R., Jr.; Jones, C.; Karakashian, T.; Maccacaro, T.; Mcsweeney, J. D.; Primini, F. A.

    1993-01-01

    The Einstein Observatory (HEAO-2, launched November 13, 1978) achieved radically improved sensitivity over previous x-ray missions through the use of focusing optics which simultaneously afforded greatly reduced background and produced true images. During its 2.5-yr mission, the Einstein X-Ray Telescope was pointed toward some 5,000 celestial targets, most of which were detected, and discovered several thousand additional 'serendipitous' sources in the observed fields. This catalog contains contour diagrams and source data, obtained with the imaging proportional counter in the 0.16 to 3.5 keV energy band, and describes methods for recovering upper limits for any sky position within the observed images. The main catalog consists of six volumes (numbered 2 through 7) of right ascension ordered pages, each containing data for one observation. Along with the primary documentation describing how the catalog was constructed, volume 1 contains a complete source list, results for merged fields, a reference system to published papers and data useful for calculating upper limits and fluxes.

  11. Characteristics of Optical Fire Detector False Alarm Sources and Qualification Test Procedures to Prove Immunity. Phase 2. Volume 1

    Science.gov (United States)

    1993-09-01


  12. plas.io: Open Source, Browser-based WebGL Point Cloud Visualization

    Science.gov (United States)

    Butler, H.; Finnegan, D. C.; Gadomski, P. J.; Verma, U. K.

    2014-12-01

    Point cloud data, in the form of Light Detection and Ranging (LiDAR), RADAR, or semi-global matching (SGM) image processing, are rapidly becoming a foundational data type to quantify and characterize geospatial processes. Visualization of these data, due to their overall volume and irregular arrangement, is often difficult. Technological advancements in web browsers, in the form of WebGL and HTML5, have made ubiquitously available interactivity and visualization capabilities that once existed only in desktop software. plas.io is an open source JavaScript application that provides point cloud visualization, exploitation, and compression features in a web-browser platform, reducing the reliance on client-based desktop applications. The wide reach of WebGL and browser-based technologies means plas.io's capabilities can be delivered to a diverse list of devices -- from phones and tablets to high-end workstations -- with very little custom software development. These properties make plas.io an ideal open platform for researchers and software developers to communicate visualizations of complex and rich point cloud data on devices to which everyone has easy access.

  13. On source models for (192)Ir HDR brachytherapy dosimetry using model based algorithms.

    Science.gov (United States)

    Pantelis, Evaggelos; Zourari, Kyveli; Zoros, Emmanouil; Lahanas, Vasileios; Karaiskos, Pantelis; Papagiannis, Panagiotis

    2016-06-07

    A source model is a prerequisite of all model based dose calculation algorithms. Besides direct simulation, the use of pre-calculated phase space files (phsp source models) and parameterized phsp source models has been proposed for Monte Carlo (MC) to promote efficiency and ease of implementation in obtaining photon energy, position and direction. In this work, a phsp file for a generic (192)Ir source design (Ballester et al 2015) is obtained from MC simulation. This is used to configure a parameterized phsp source model comprising appropriate probability density functions (PDFs) and a sampling procedure. According to phsp data analysis 15.6% of the generated photons are absorbed within the source, and 90.4% of the emergent photons are primary. The PDFs for sampling photon energy and direction relative to the source long axis, depend on the position of photon emergence. Photons emerge mainly from the cylindrical source surface with a constant probability over  ±0.1 cm from the center of the 0.35 cm long source core, and only 1.7% and 0.2% emerge from the source tip and drive wire, respectively. Based on these findings, an analytical parameterized source model is prepared for the calculation of the PDFs from data of source geometry and materials, without the need for a phsp file. The PDFs from the analytical parameterized source model are in close agreement with those employed in the parameterized phsp source model. This agreement prompted the proposal of a purely analytical source model based on isotropic emission of photons generated homogeneously within the source core with energy sampled from the (192)Ir spectrum, and the assignment of a weight according to attenuation within the source. Comparison of single source dosimetry data obtained from detailed MC simulation and the proposed analytical source model show agreement better than 2% except for points lying close to the source longitudinal axis.
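
The purely analytical source model described in the last sentence — homogeneous, isotropic photon emission inside the core with a weight assigned according to attenuation along the path out of the source — can be sketched as follows. The cylinder dimensions and effective attenuation coefficient below are placeholders chosen only to illustrate the sampling and weighting steps, not the values of the generic source design of Ballester et al:

```python
import numpy as np

def sample_weights(n, R=0.03, H=0.35, mu=2.0, rng=None):
    """Sample photon origins uniformly in a cylindrical source core
    (radius R, length H, in cm), emit isotropically, and weight each
    photon by attenuation along its exit path: w = exp(-mu * d)."""
    rng = np.random.default_rng(rng)
    # uniform positions inside the cylinder
    r = R * np.sqrt(rng.random(n))
    th = 2 * np.pi * rng.random(n)
    x, y = r * np.cos(th), r * np.sin(th)
    z = (rng.random(n) - 0.5) * H
    # isotropic directions
    uz = 2 * rng.random(n) - 1
    ph = 2 * np.pi * rng.random(n)
    s = np.sqrt(1 - uz**2)
    ux, uy = s * np.cos(ph), s * np.sin(ph)
    # distance to the lateral surface (ray-cylinder intersection)
    a = ux**2 + uy**2
    b = 2 * (x * ux + y * uy)
    c = x**2 + y**2 - R**2
    with np.errstate(divide="ignore", invalid="ignore"):
        t_rad = (-b + np.sqrt(b**2 - 4 * a * c)) / (2 * a)
        t_rad = np.where(a > 1e-12, t_rad, np.inf)
        # distance to the end caps
        t_ax = np.where(uz > 0, H / 2 - z, -(H / 2 + z)) / uz
    d = np.minimum(t_rad, t_ax)   # path length inside the core
    return np.exp(-mu * d)
```

Averaging these weights gives the fraction of emitted photons that escape the core, the quantity the phsp analysis above reports as (1 - absorbed fraction).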

  14. Impacts of DEM uncertainties on critical source areas identification for non-point source pollution control based on SWAT model

    Science.gov (United States)

    Xu, Fei; Dong, Guangxia; Wang, Qingrui; Liu, Lumeng; Yu, Wenwen; Men, Cong; Liu, Ruimin

    2016-09-01

    The impacts of different digital elevation model (DEM) resolutions, sources and resampling techniques on nutrient simulations using the Soil and Water Assessment Tool (SWAT) model have not been well studied. The objective of this study was to evaluate the sensitivity of DEM resolutions (from 30 m to 1000 m), sources (ASTER GDEM2, SRTM and Topo-DEM) and resampling techniques (nearest neighbor, bilinear interpolation, cubic convolution and majority) in the identification of non-point source (NPS) critical source areas (CSAs) based on nutrient loads using the SWAT model. The Xiangxi River, one of the main tributaries of the Three Gorges Reservoir in China, was selected as the study area. The following findings were obtained: (1) Elevation and slope extracted from the DEMs were more sensitive to DEM resolution changes. Compared with the results of the 30 m DEM, the 1000 m DEM underestimated the elevation and slope by 104 m and 41.57°, respectively; (2) The numbers of subwatersheds and hydrologic response units (HRUs) were considerably influenced by DEM resolutions, but the total nitrogen (TN) and total phosphorus (TP) loads of each subwatershed showed higher correlations with different DEM sources; (3) DEM resolutions and sources had larger effects on CSA identification, and TN and TP CSAs showed different responses to DEM uncertainties. TN CSAs were more sensitive to resolution changes, exhibiting six distribution patterns across the DEM resolutions. TP CSAs were sensitive to source and resampling technique changes, exhibiting three distribution patterns for DEM sources and two distribution patterns for DEM resampling techniques. DEM resolution and source are the two most sensitive SWAT model DEM parameters that must be considered when nutrient CSAs are identified.
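
The nearest-neighbor and bilinear resampling rules compared in the study differ in how a resampled cell value is drawn from the source DEM: the first copies the closest cell, the second blends the four surrounding cells. A toy sketch of the two rules (not the SWAT/GIS implementations):

```python
import numpy as np

def bilinear(dem, row, col):
    """Bilinearly interpolate a DEM at a fractional (row, col)."""
    r0, c0 = int(np.floor(row)), int(np.floor(col))
    r1 = min(r0 + 1, dem.shape[0] - 1)
    c1 = min(c0 + 1, dem.shape[1] - 1)
    fr, fc = row - r0, col - c0
    top = dem[r0, c0] * (1 - fc) + dem[r0, c1] * fc
    bot = dem[r1, c0] * (1 - fc) + dem[r1, c1] * fc
    return top * (1 - fr) + bot * fr

def nearest(dem, row, col):
    """Nearest-neighbor resampling: copy the closest cell value."""
    return dem[int(round(row)), int(round(col))]
```

Bilinear smooths the terrain (affecting derived slope), while nearest-neighbor preserves original cell values but can shift features — one reason the two techniques yield different CSA patterns.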

  15. A pyramid-based approach to visual exploration of a large volume of vehicle trajectory data

    Institute of Scientific and Technical Information of China (English)

    Jing SUN; Xiang LI

    2012-01-01

    Advances in positioning and wireless communication technologies make it possible to collect large volumes of trajectory data of moving vehicles in a fast and convenient fashion. These data can be applied to traffic studies. Behind this application, a methodological issue that still requires particular attention is the way these data should be spatially visualized. Trajectory data physically consist of a large number of positioning points. With the dramatic increase of data volume, it becomes a challenge to display and explore these data. Existing commercial software often employs vector-based indexing structures to facilitate the display of a large volume of points, but their performance degrades quickly when the number of points is very large, for example, tens of millions. In this paper, a pyramid-based approach is proposed. The pyramid method was originally invented to facilitate the display of raster images through a tradeoff between storage space and display time. A pyramid is a set of images at different levels with different resolutions. In this paper, we convert vector-based point data into raster data and build a grid-based indexing structure in a 2D plane. Then, an image pyramid is built. Moreover, at each level of the pyramid, the image is segmented into mosaics with respect to the requirements of data storage and management. Algorithms and procedures for the grid-based indexing structure, image pyramid, image segmentation, and visualization operations are given in this paper. A case study with taxi trajectory data in Shanghai is conducted. Results demonstrate that the proposed method outperforms the existing commercial software.
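
The rasterize-then-aggregate idea described above can be sketched in a few lines: bin the trajectory points into a base-resolution count grid, then build each coarser pyramid level by summing 2x2 blocks, so every level preserves the total point count (the grid size and level count below are illustrative, not the paper's parameters):

```python
import numpy as np

def build_pyramid(x, y, extent, base=256, levels=4):
    """Rasterize points into a base-resolution count grid, then build
    coarser levels by summing 2x2 blocks (counts are preserved)."""
    xmin, xmax, ymin, ymax = extent
    grid, _, _ = np.histogram2d(y, x, bins=base,
                                range=[[ymin, ymax], [xmin, xmax]])
    pyramid = [grid]
    for _ in range(levels - 1):
        g = pyramid[-1]
        # aggregate 2x2 cell blocks into one coarser cell
        coarser = g.reshape(g.shape[0] // 2, 2,
                            g.shape[1] // 2, 2).sum(axis=(1, 3))
        pyramid.append(coarser)
    return pyramid
```

A viewer then picks the pyramid level matching the current zoom, which is the storage-for-display-time tradeoff the abstract refers to.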

  16. LIGHT SOURCE: Design of a new compact THz source based on Smith-Purcell radiation

    Science.gov (United States)

    Dai, Dong-Dong; Bei, Hua; Dai, Zhi-Min

    2009-06-01

    In recent years, much research effort has been dedicated to finding compact THz sources with high emission power. Smith-Purcell radiation is a candidate for coherent enhancement through the FEL mechanism. The compact experimental device is expected to produce THz radiation at the hundreds-of-mW level. An electron beam of good quality is provided by the optimized design of the electron gun. In addition, the grating is designed as an oscillator without any external feedback. As the beam passes over the grating surface, the beam bunching becomes strong and the second-harmonic enhancement is evident, as seen from the simulation results.

  17. Developing an open source-based spatial data infrastructure for integrated monitoring of mining areas

    Science.gov (United States)

    Lahn, Florian; Knoth, Christian; Prinz, Torsten; Pebesma, Edzer

    2014-05-01

    In all phases of mining campaigns, comprehensive spatial information is an essential requirement in order to ensure economically efficient but also safe mining activities as well as to reduce environmental impacts. Earth observation data acquired from various sources like remote sensing or ground measurements is important e.g. for the exploration of mineral deposits, the monitoring of mining induced impacts on vegetation or the detection of ground subsidence. The GMES4Mining project aims at exploring new remote sensing techniques and developing analysis methods on various types of sensor data to provide comprehensive spatial information during mining campaigns (BENECKE et al. 2013). One important task in this project is the integration of the data gathered (e.g. hyperspectral images, spaceborne radar data and ground measurements) as well as results of the developed analysis methods within a web-accessible data source based on open source software. The main challenges here are to provide various types and formats of data from different sensors and to enable access to analysis and processing techniques without particular software or licensing requirements for users. Furthermore, the high volume of the involved data (especially hyperspectral remote sensing images) makes data transfer a major issue in this use case. To address these problems, a spatial data infrastructure (SDI) including a web portal as a user frontend is being developed which allows users to access not only the data but also several analysis methods. The Geoserver software is used for publishing the data, which is then accessed and visualized in a JavaScript-based web portal. In order to perform descriptive statistics and some straightforward image processing techniques on the raster data (e.g. band arithmetic or principal component analysis) the statistics software R is implemented on a server and connected via Rserve.
The analysis is controlled and executed directly by the user through the web portal and

  18. Egg volume prediction using machine vision technique based on pappus theorem and artificial neural network.

    Science.gov (United States)

    Soltani, Mahmoud; Omid, Mahmoud; Alimardani, Reza

    2015-05-01

    Egg size is one of the important properties of an egg that is judged by customers. Accordingly, in egg sorting and grading, the size of eggs must be considered. In this research, a new method of egg volume prediction was proposed without the need to measure the egg's weight. An accurate and efficient image processing algorithm was designed and implemented for computing the major and minor diameters of eggs. Two methods of egg size modeling were developed. In the first method, a mathematical model was proposed based on the Pappus theorem. In the second method, an Artificial Neural Network (ANN) technique was used to estimate egg volume. The egg volumes determined by these methods were compared statistically with actual values. For the mathematical model, the R(2), mean absolute error and maximum absolute error values were obtained as 0.99, 0.59 cm(3) and 1.69 cm(3), respectively. To determine the best ANN, R(2)test and RMSEtest were used as selection criteria. The best ANN topology was 2-28-1, which had an R(2)test and RMSEtest of 0.992 and 0.66, respectively. After system calibration, the proposed models were evaluated. The results indicated that the mathematical model yielded more satisfying results, so this technique was selected for egg size determination.
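As a hedged sketch of the Pappus-theorem idea used above: the volume of a solid of revolution equals 2*pi*ybar*A, where A is the area of the revolved half-profile and ybar the distance of its centroid from the rotation axis. The code below applies this to radii sampled along an axis of symmetry; the sampling scheme and numbers are our own illustration, not the paper's model.

```python
import math

def pappus_volume(radii, dx):
    """Volume from revolving a half-profile given by radii sampled every dx.
    Each sample is treated as a thin rectangle of height r and width dx."""
    area = sum(r * dx for r in radii)
    if area == 0.0:
        return 0.0
    # Centroid height of each thin rectangle is r/2; area-weighted average.
    ybar = sum((r / 2.0) * (r * dx) for r in radii) / area
    return 2.0 * math.pi * ybar * area

# Sanity check against a sphere: the half-profile of a unit circle.
n = 100000
dx = 2.0 / n
radii = [math.sqrt(max(0.0, 1.0 - (-1.0 + (i + 0.5) * dx) ** 2))
         for i in range(n)]
v = pappus_volume(radii, dx)  # should approach (4/3)*pi
```

For an egg, the radii would come from the boundary extracted by the image processing step rather than an analytic circle.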

  19. Automated mass detection in contrast-enhanced CT colonography: an approach based on contrast and volume

    Energy Technology Data Exchange (ETDEWEB)

    Luboldt, W. [University Hospital Essen, Clinic and Policlinic of Angiology, Essen (Germany); Multiorgan Screening Foundation (Germany); Tryon, C. [Philips Medical Systems, Best (Netherlands); Kroll, M.; Vogl, T.J. [University Hospital Frankfurt, Department of Radiology, Frankfurt (Germany); Toussaint, T.L. [Multiorgan Screening Foundation (Germany); Holzer, K. [University Hospital Frankfurt, Department of Visceral and Vascular Surgery, Frankfurt (Germany); Hoepffner, N. [University Hospital Frankfurt, Department of Gastroenterology, Frankfurt (Germany)

    2005-02-01

    The purpose of this feasibility study was to design and test an algorithm for automating mass detection in contrast-enhanced CT colonography (CTC). Five patients with known colorectal masses underwent a pre-surgical contrast-enhanced (volume: 120 ml, injection rate: 1.6 g iodine/s, scan delay: 60 s) CTC in high spatial resolution (16-slice CT; collimation: 16 x 0.75 mm, table feed: 24 mm/0.5 s, reconstruction increment: 0.5 mm). A CT-density- and volume-based algorithm searched for masses in the colonic wall, which was extracted beforehand by segmenting and dilating the colonic air lumen and subtracting the inner air. A radiologist analyzed the detections and causes of false positives. All masses were detected, and false positives were easy to identify. Combining CT density with volume as a cut-off is a promising approach for automating mass detection that should be further refined and also tested in contrast-enhanced MR colonography. (orig.)

  20. Automated CT-based segmentation and quantification of total intracranial volume

    Energy Technology Data Exchange (ETDEWEB)

    Aguilar, Carlos; Wahlund, Lars-Olof; Westman, Eric [Karolinska Institute, Department of Neurobiology, Care Sciences and Society (NVS), Division of Clinical Geriatrics, Stockholm (Sweden); Edholm, Kaijsa; Cavallin, Lena; Muller, Susanne; Axelsson, Rimma [Karolinska Institute, Department of Clinical Science, Intervention and Technology, Division of Medical Imaging and Technology, Stockholm (Sweden); Karolinska University Hospital in Huddinge, Department of Radiology, Stockholm (Sweden); Simmons, Andrew [King' s College London, Institute of Psychiatry, London (United Kingdom); NIHR Biomedical Research Centre for Mental Health and Biomedical Research Unit for Dementia, London (United Kingdom); Skoog, Ingmar [Gothenburg University, Department of Psychiatry and Neurochemistry, The Sahlgrenska Academy, Gothenburg (Sweden); Larsson, Elna-Marie [Uppsala University, Department of Surgical Sciences, Radiology, Akademiska Sjukhuset, Uppsala (Sweden)

    2015-11-15

    To develop an algorithm to segment and obtain an estimate of total intracranial volume (tICV) from computed tomography (CT) images. Thirty-six CT examinations from 18 patients were included. Ten patients were examined twice on the same day and eight patients twice six months apart (these patients also underwent MRI). The algorithm combines morphological operations, intensity thresholding and mixture modelling. The method was validated against manual delineation and its robustness assessed from repeated imaging examinations. Using automated MRI software, the comparability with MRI was investigated. Volumes were compared based on average relative volume differences and their magnitudes; agreement was shown by a Bland-Altman analysis graph. We observed good agreement between our algorithm and manual delineation by a trained radiologist: the Pearson's correlation coefficient was r = 0.94, tICVml[manual] = 1.05 x tICVml[automated] - 33.78 (R{sup 2} = 0.88). Bland-Altman analysis showed a bias of 31 mL and a standard deviation of 30 mL over a range of 1265 to 1526 mL. tICV measurements derived from CT using our proposed algorithm have been shown to be reliable and consistent compared to manual delineation. However, it appears difficult to directly compare tICV measures between CT and MRI. (orig.)
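The Bland-Altman statistics reported above can be sketched as follows: the bias is the mean of the pairwise differences between the two methods, and the limits of agreement are bias +/- 1.96 standard deviations of those differences. The volumes below are made-up examples, not the study's data.

```python
import math

def bland_altman(a, b):
    """Return (bias, (lower limit, upper limit)) for paired measurements."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

manual    = [1400.0, 1350.0, 1500.0, 1290.0]  # hypothetical tICV in mL
automated = [1370.0, 1320.0, 1460.0, 1275.0]
bias, limits = bland_altman(manual, automated)
```

Plotting each pair's difference against its mean, with horizontal lines at the bias and the two limits, gives the agreement graph the abstract refers to.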

  1. A spatial discretization of the MHD equations based on the finite volume - spectral method

    Energy Technology Data Exchange (ETDEWEB)

    Miyoshi, Takahiro [Japan Atomic Energy Research Inst., Naka, Ibaraki (Japan). Naka Fusion Research Establishment

    2000-05-01

    Based on the finite volume - spectral method, we present new discretization formulae for the spatial differential operators in the full system of the compressible MHD equations. In this approach, the cell-centered finite volume method is adopted in a bounded plane (poloidal plane), while the spectral method is applied to the differential with respect to the periodic direction perpendicular to the poloidal plane (toroidal direction). Here, an unstructured grid system composed of arbitrary triangular elements is utilized for constructing the cell-centered finite volume method. In order to maintain the divergence-free constraint of the magnetic field numerically, only the poloidal component of the rotation is defined at the three edges of the triangular element. This poloidal component is evaluated under the assumption that the toroidal component of the operated vector times the radius, RA{sub {phi}}, is linearly distributed in the element. The present method will be applied to the nonlinear MHD dynamics in a realistic torus geometry without numerical singularities. (author)

  2. Model-Based Least Squares Reconstruction of Coded Source Neutron Radiographs: Integrating the ORNL HFIR CG1D Source Model

    Energy Technology Data Exchange (ETDEWEB)

    Santos-Villalobos, Hector J [ORNL; Gregor, Jens [University of Tennessee, Knoxville (UTK); Bingham, Philip R [ORNL

    2014-01-01

    At present, neutron sources cannot be fabricated small and powerful enough to achieve high resolution radiography while maintaining an adequate flux. One solution is to employ computational imaging techniques such as a Magnified Coded Source Imaging (CSI) system. A coded-mask is placed between the neutron source and the object. The system resolution is increased by reducing the size of the mask holes, and the flux is increased by increasing the size of the coded-mask and/or the number of holes. One limitation of such a system is that the resolution of current state-of-the-art scintillator-based detectors caps at around 50 um. To overcome this challenge, the coded-mask and object are magnified by making the distance from the coded-mask to the object much smaller than the distance from object to detector. In previous work, we have shown via synthetic experiments that our least squares method outperforms other methods in image quality and reconstruction precision because of the modeling of the CSI system components. However, the validation experiments were limited to simplistic neutron sources. In this work, we aim to model the flux distribution of a real neutron source and incorporate such a model in our least squares computational system. We provide a full description of the methodology used to characterize the neutron source and validate the method with synthetic experiments.

  3. SU-F-207-06: CT-Based Assessment of Tumor Volume in Malignant Pleural Mesothelioma

    Energy Technology Data Exchange (ETDEWEB)

    Qayyum, F; Armato, S; Straus, C; Husain, A; Vigneswaran, W; Kindler, H [The University of Chicago, Chicago, IL (United States)

    2015-06-15

    Purpose: To determine the potential utility of computed tomography (CT) scans in the assessment of physical tumor bulk in malignant pleural mesothelioma patients. Methods: Twenty-eight patients with malignant pleural mesothelioma were used for this study. A CT scan was acquired for each patient prior to surgical resection of the tumor (median time between scan and surgery: 27 days). After surgery, the ex-vivo tumor volume was measured by a pathologist using a water displacement method. Separately, a radiologist identified and outlined the tumor boundary on each CT section that demonstrated tumor. These outlines were then analyzed to determine the total volume of disease present, the number of sections with outlines, and the mean volume of disease per outlined section. Subsets of the initial patient cohort were defined based on these parameters, e.g., cases with at least 30 sections of disease and a mean disease volume of at least 3 mL per section. For each subset, the R-squared correlation between CT-based tumor volume and physical ex-vivo tumor volume was calculated. Results: The full cohort of 28 patients yielded a modest correlation between CT-based tumor volume and the ex-vivo tumor volume with an R-squared value of 0.66. In general, as the mean tumor volume per section increased, the correlation of CT-based volume with the physical tumor volume improved substantially. For example, when cases with at least 40 CT sections presenting a mean of at least 2 mL of disease per section were evaluated (n=20), the R-squared correlation increased to 0.79. Conclusion: While image-based volumetry for mesothelioma may not generally capture physical tumor volume as accurately as one might expect, there exists a set of conditions in which CT-based volume is highly correlated with the physical tumor volume. SGA receives royalties and licensing fees through the University of Chicago for computer-aided diagnosis technology.
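The R-squared statistic used above is the square of the Pearson correlation between the two measurement series. A minimal sketch with hypothetical volumes (not the study's data):

```python
def r_squared(x, y):
    """Square of the Pearson correlation coefficient between x and y."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov * cov / (vx * vy)

ct_volume = [120.0, 300.0, 210.0, 450.0, 90.0]   # hypothetical, in mL
exvivo    = [150.0, 280.0, 260.0, 500.0, 110.0]
r2 = r_squared(ct_volume, exvivo)  # 0 <= r2 <= 1; higher = stronger agreement
```

Recomputing this statistic on each cohort subset, as the study does, simply means filtering the paired lists before calling the function.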

  4. Visible and ultraviolet light sources based nonlinear interaction of lasers

    DEFF Research Database (Denmark)

    Andersen, Martin Thalbitzer; Tidemand-Lichtenberg, Peter; Jain, Mayank;

    Different light sources can be used for optically stimulated luminescence measurements, and usually a halogen lamp in combination with filters or light-emitting diodes (LEDs) is used to provide the desired stimulation wavelength. However, lasers can provide a much more well-defined beam, very...

  5. Quantum-dot-based integrated non-linear sources

    DEFF Research Database (Denmark)

    Bernard, Alice; Mariani, Silvia; Andronico, Alessio

    2015-01-01

    The authors report on the design and the preliminary characterisation of two active non-linear sources in the terahertz and near-infrared range. The former is associated to difference-frequency generation between whispering gallery modes of an AlGaAs microring resonator, whereas the latter is gra...

  6. Advanced RF Sources Based on Novel Nonlinear Transmission Lines

    Science.gov (United States)

    2015-01-26


  7. Towards Evidence-Based Understanding of Electronic Data Sources

    DEFF Research Database (Denmark)

    Chen, Lianping; Ali Babar, Muhammad; Zhang, He

    2010-01-01

    Identifying relevant papers from various Electronic Data Sources (EDS) is one of the key activities of conducting these kinds of studies. Hence, the selection of EDS for searching the potentially relevant papers is an important decision, which can affect a study’s coverage of relevant papers. Res...

  8. VOLUME STUDY WITH HIGH DENSITY OF PARTICLES BASED ON CONTOUR AND CORRELATION IMAGE ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tatyana Yu. Nikolaeva

    2014-11-01

    Full Text Available The subject of study is the techniques of particle statistics evaluation, in particular, processing methods of particle images obtained by coherent illumination. This paper considers the problem of recognition and statistical accounting for individual images of small scattering particles in an arbitrary section of the volume in case of high concentrations. For automatic recognition of focused particles images, a special algorithm for statistical analysis based on contouring and thresholding was used. By means of the mathematical formalism of the scalar diffraction theory, coherent images of the particles formed by the optical system with high numerical aperture were simulated. Numerical testing of the method proposed for the cases of different concentrations and distributions of particles in the volume was performed. As a result, distributions of density and mass fraction of the particles were obtained, and the efficiency of the method in case of different concentrations of particles was evaluated. At high concentrations, the effect of coherent superposition of the particles from the adjacent planes strengthens, which makes it difficult to recognize images of particles using the algorithm considered in the paper. In this case, we propose to supplement the method with calculating the cross-correlation function of particle images from adjacent segments of the volume, and evaluating the ratio between the height of the correlation peak and the height of the function pedestal in the case of different distribution characters. The method of statistical accounting of particles considered in this paper is of practical importance in the study of volume with particles of different nature, for example, in problems of biology and oceanography. Effective work in the regime of high concentrations expands the limits of applicability of these methods for practically important cases and helps to optimize determination time of the distribution character and

  9. In vivo evaluation of battery-operated light-emitting diode-based photodynamic therapy efficacy using tumor volume and biomarker expression as endpoints

    Science.gov (United States)

    Mallidi, Srivalleesha; Mai, Zhiming; Rizvi, Imran; Hempstead, Joshua; Arnason, Stephen; Celli, Jonathan; Hasan, Tayyaba

    2015-01-01

    In view of the increase in cancer-related mortality rates in low- to middle-income countries (LMIC), there is an urgent need to develop economical therapies that can be utilized at minimal infrastructure institutions. Photodynamic therapy (PDT), a photochemistry-based treatment modality, offers such a possibility provided that low-cost light sources and photosensitizers are available. In this proof-of-principle study, we focus on adapting the PDT light source to a low-resource setting and compare an inexpensive, portable, battery-powered light-emitting diode (LED) light source with a standard, high-cost laser source. The comparison studies were performed in vivo in a xenograft murine model of human squamous cell carcinoma subjected to 5-aminolevulinic acid-induced protoporphyrin IX PDT. We observed virtually identical control of the tumor burden by both the LED source and the standard laser source. Further insights into the biological response were evaluated by biomarker analysis of necrosis, microvessel density, and hypoxia [carbonic anhydrase IX (CAIX) expression] among groups of control, LED-PDT, and laser-PDT treated mice. There is no significant difference in the percent necrotic volume and CAIX expression in tumors that were treated with the two different light sources. These encouraging preliminary results merit further investigations in orthotopic animal models of cancers prevalent in LMICs. PMID:25909707

  10. In vivo evaluation of battery-operated light-emitting diode-based photodynamic therapy efficacy using tumor volume and biomarker expression as endpoints.

    Science.gov (United States)

    Mallidi, Srivalleesha; Mai, Zhiming; Rizvi, Imran; Hempstead, Joshua; Arnason, Stephen; Celli, Jonathan; Hasan, Tayyaba

    2015-04-01

    In view of the increase in cancer-related mortality rates in low- to middle-income countries (LMIC), there is an urgent need to develop economical therapies that can be utilized at minimal infrastructure institutions. Photodynamic therapy (PDT), a photochemistry-based treatment modality, offers such a possibility provided that low-cost light sources and photosensitizers are available. In this proof-of-principle study, we focus on adapting the PDT light source to a low-resource setting and compare an inexpensive, portable, battery-powered light-emitting diode (LED) light source with a standard, high-cost laser source. The comparison studies were performed in vivo in a xenograft murine model of human squamous cell carcinoma subjected to 5-aminolevulinic acid-induced protoporphyrin IX PDT. We observed virtually identical control of the tumor burden by both the LED source and the standard laser source. Further insights into the biological response were evaluated by biomarker analysis of necrosis, microvessel density, and hypoxia [carbonic anhydrase IX (CAIX) expression] among groups of control, LED-PDT, and laser-PDT treated mice. There is no significant difference in the percent necrotic volume and CAIX expression in tumors that were treated with the two different light sources. These encouraging preliminary results merit further investigations in orthotopic animal models of cancers prevalent in LMICs.

  11. Development of Laser-Produced Tin Plasma-Based EUV Light Source Technology for HVM EUV Lithography

    Directory of Open Access Journals (Sweden)

    Junichi Fujimoto

    2012-01-01

    Full Text Available Since 2002, we have been developing a carbon dioxide (CO2) laser-produced tin (Sn) plasma (LPP) extreme ultraviolet (EUV) light source, which is the most promising candidate for the 13.5 nm wavelength, high power (>200 W) light source required for high volume manufacturing EUV lithography, because of its high efficiency, power scalability, and spatial freedom around the plasma. We believe that the LPP scheme is the most feasible candidate for the EUV light source for industrial use. We have obtained several engineering results from our test tools, including a 93% Sn ionization rate, 98% Sn debris mitigation by a magnetic field, and a 68% CO2 laser energy absorption rate. The manner in which the Sn is dispersed by the prepulse laser is key to improving the conversion efficiency (CE). We focused on the prepulse laser pulse duration: by optimizing the pulse duration from nanosecond to picosecond, we obtained a maximum CE of 4.7% (CO2 laser to EUV) at 2 mJ EUV pulse energy; our previous figure was 3.8%. Based on these data we are developing our first production light source, the "GL200E." The latest data and an overview of EUV light sources for industrial EUV lithography are reviewed in this paper.

  12. Iraqi Perspectives Project. Primary Source Materials for Saddam and Terrorism: Emerging Insights from Captured Iraqi Documents. Volume 2 (Redacted)

    Science.gov (United States)

    2007-11-01


  13. Convergence of Cell Based Finite Volume Discretizations for Problems of Control in the Conduction Coefficients

    DEFF Research Database (Denmark)

    Evgrafov, Anton; Gregersen, Misha Marie; Sørensen, Mads Peter

    2011-01-01

    We present a convergence analysis of a cell-based finite volume (FV) discretization scheme applied to a problem of control in the coefficients of a generalized Laplace equation modelling, for example, a steady state heat conduction. Such problems arise in applications dealing with geometric optimal design, in particular shape and topology optimization, and are most often solved numerically utilizing a finite element approach. Within the FV framework for control in the coefficients problems, the main difficulty we face is the need to analyze the convergence of fluxes defined on the faces of cells...

  14. Direction of arrival estimation of coherent sources based on arbitrary plane arrays

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    A method of direction of arrival (DOA) estimation of coherent sources is proposed, based on arbitrary plane arrays. After constructing a mathematical model of the coherent sources, virtual array transformation and the MUSIC algorithm are used to realize the azimuth estimation of the coherent sources, which greatly improves the DOA estimation performance. Computer simulations confirm the validity of the method.

  15. Integrating source-language context into phrase-based statistical machine translation

    NARCIS (Netherlands)

    Haque, R.; Kumar Naskar, S.; Bosch, A.P.J. van den; Way, A.

    2011-01-01

    The translation features typically used in Phrase-Based Statistical Machine Translation (PB-SMT) model dependencies between the source and target phrases, but not among the phrases in the source language themselves. A swathe of research has demonstrated that integrating source context modelling dire

  16. The Great Patriotic War: the Problems of Forming the Source Base

    Directory of Open Access Journals (Sweden)

    Evgeny F. Krinko

    2015-07-01

    Full Text Available The Great Patriotic War was reflected in the different historical sources. The article is devoted to the formation of the source base of the problem. The author examines the dynamics of the situation in the archives and publication of documents. The main attention is paid to the modern study of the sources of the Great Patriotic War.

  17. Simulation of DNAPL migration in heterogeneous translucent porous media based on estimation of representative elementary volume

    Science.gov (United States)

    Wu, Ming; Wu, Jianfeng; Wu, Jichun

    2017-10-01

    When a dense nonaqueous phase liquid (DNAPL) enters the subsurface environment, its migration behavior is crucially affected by the permeability and entry pressure of the subsurface porous media. A prerequisite for accurately simulating DNAPL migration in aquifers is therefore the determination of the permeability, the entry pressure and the corresponding representative elementary volume (REV) of the porous media. However, these quantities are difficult to determine directly. This study utilizes the light transmission micro-tomography (LTM) method to determine the permeability and entry pressure of two-dimensional (2D) translucent porous media and combines the LTM with a criterion of relative gradient error to quantify the corresponding REV of the porous media. As a result, DNAPL migration in porous media can be accurately simulated by discretizing the model at the REV dimension. To validate the quantification methods, an experiment of perchloroethylene (PCE) migration was conducted in a two-dimensional heterogeneous bench-scale aquifer cell. Based on the permeability, entry pressure and REV scales of the 2D porous media determined by the LTM and the relative gradient error, models with different discretization grid sizes were used to simulate the PCE migration. The model based on the REV size agrees well with the experimental results over the entire migration period, including the calibration, verification and validation processes. This helps to better understand the microstructure of porous media and to accurately simulate DNAPL migration in aquifers based on the REV estimation.
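A hedged sketch of the relative-gradient-error idea behind REV estimation: as the averaging window grows, measure how fast the window-averaged property (e.g. porosity) still changes, and take the REV to be the smallest window whose normalized gradient drops below a tolerance. The normalization, window sizes, porosity values and threshold below are all illustrative assumptions, not the paper's data or exact criterion.

```python
def relative_gradient_errors(window_sizes, averaged_values):
    """Gradient |dP/dL| between successive window sizes, normalized by
    the local property-per-length scale P/L (an illustrative choice)."""
    errors = []
    for i in range(1, len(window_sizes)):
        dP = averaged_values[i] - averaged_values[i - 1]
        dL = window_sizes[i] - window_sizes[i - 1]
        errors.append(abs(dP / dL) / (averaged_values[i] / window_sizes[i]))
    return errors

def pick_rev(window_sizes, averaged_values, tol=0.05):
    """Smallest window size whose relative gradient error falls below tol."""
    errs = relative_gradient_errors(window_sizes, averaged_values)
    for size, err in zip(window_sizes[1:], errs):
        if err < tol:
            return size
    return None  # property never stabilized in the tested range

sizes    = [2, 4, 8, 16, 32]             # window edge length (grid cells)
porosity = [0.50, 0.42, 0.39, 0.385, 0.384]
rev = pick_rev(sizes, porosity)
```

In the study's workflow, the property field comes from LTM imagery and the chosen REV then sets the simulation grid size.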

  18. Amniotic fluid volume: Rapid MR-based assessment at 28-32 weeks gestation

    Energy Technology Data Exchange (ETDEWEB)

    Hilliard, N.J.; Hawkes, R.; Patterson, A.J.; Graves, M.J.; Priest, A.N.; Hunter, S.; Set, P.A.; Lomas, D.J. [Cambridge University Hospitals NHS Foundation Trust, Department of Radiology, Cambridge (United Kingdom); Lees, C. [Imperial College Healthcare NHS Trust, Department of Obstetrics and Fetal Medicine, London (United Kingdom)

    2016-10-15

    This work evaluates rapid magnetic resonance projection hydrography (PH) based amniotic fluid volume (AFV) estimates against established routine ultrasound single deepest vertical pocket (SDVP) and amniotic fluid index (AFI) measurements, in utero at 28-32 weeks gestation. Manual multi-section planimetry (MSP) based measurement of AFV is used as a proxy reference standard. Thirty-five women with a healthy singleton pregnancy (20-41 years) attending routine antenatal ultrasound were recruited. SDVP and AFI were measured using ultrasound, with same day MRI assessing AFV with PH and MSP. The relationships between the respective techniques were assessed using linear regression analysis and Bland-Altman method comparison statistics. When comparing estimated AFV, a highly significant relationship was observed between PH and the reference standard MSP (R{sup 2} = 0.802, p < 0.001). For the US measurements, SDVP measurement related most closely to amniotic fluid volume, (R{sup 2} = 0.470, p < 0.001), with AFI demonstrating a weaker relationship (R{sup 2} = 0.208, p = 0.007). This study shows that rapid MRI based PH measurement is a better predictor of AFV, relating more closely to our proxy standard than established US techniques. Although larger validation studies across a range of gestational ages are required this approach could form part of MR fetal assessment, particularly where poly- or oligohydramnios is suspected. (orig.)

  19. Volume estimation using food specific shape templates in mobile image-based dietary assessment

    Science.gov (United States)

    Chae, Junghoon; Woo, Insoo; Kim, SungYe; Maciejewski, Ross; Zhu, Fengqing; Delp, Edward J.; Boushey, Carol J.; Ebert, David S.

    2011-03-01

    As obesity concerns mount, dietary assessment methods for prevention and intervention are being developed. These methods include recording, cataloging and analyzing daily dietary records to monitor energy and nutrient intakes. Given the ubiquity of mobile devices with built-in cameras, one possible means of improving dietary assessment is through photographing foods and inputting these images into a system that can determine the nutrient content of the foods in the images. One of the critical issues in such an image-based dietary assessment tool is the accurate and consistent estimation of food portion sizes. The objective of our study is to automatically estimate food volumes through the use of food-specific shape templates. In our system, users capture food images using a mobile phone camera. Based on information (i.e., food name and code) determined through segmentation and classification of the food images, our system chooses a particular template shape corresponding to each segmented food. Finally, our system reconstructs the three-dimensional properties of the food shape from a single image by extracting feature points in order to size the food shape template. By employing this template-based approach, our system automatically estimates food portion size, providing a consistent method for estimating food volume.
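One way to picture the template-sizing step (an illustrative sketch with made-up numbers, not the paper's reconstruction pipeline): once a food is matched to a template of known reference volume, a linear scale factor recovered from image feature points scales the template volume by that factor cubed.

```python
def scaled_volume(template_volume_cm3, template_width_px, observed_width_px):
    """Volume of a shape template resized by a uniform linear scale factor."""
    s = observed_width_px / template_width_px  # linear scale factor
    return template_volume_cm3 * s ** 3        # volume scales as s cubed

# A hypothetical template of 250 cm^3 rendered at 100 px, observed at 120 px:
v = scaled_volume(250.0, 100.0, 120.0)  # 250 * 1.2**3 = 432 cm^3
```

The full system must also recover metric scale from the camera geometry; this sketch assumes the pixel-to-size calibration is already known.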

  20. NASIS data base management system: IBM 360 TSS implementation. Volume 8: Data base administrator user's guide

    Science.gov (United States)

    1973-01-01

    The Data Base Administrator User's Guide for the NASA Aerospace Safety Information System is presented. The subjects discussed are: (1) multi-terminal tasking, (2) data base executive, (3) utilities, (4) maintenance, (5) terminal support, and (6) retrieval subsystem.

  1. Forest stands volume estimation by using Finnish Multi-Source National Forest Inventory in Stołowe Mountains National Park

    Directory of Open Access Journals (Sweden)

    Pachana Przemko

    2016-03-01

    Full Text Available The purpose of the present study was to present the method and application of the Finnish Multi-Source National Forest Inventory (MS-NFI) that was devised at the Finnish Forest Research Institute. The study area is Stołowe Mountains National Park, located in south-western Poland near the border with the Czech Republic. To accomplish this aim, the following data were used: timber volume derived from field sample plots, a satellite image, digital map data and a digital elevation model. The Pearson correlation coefficient between the independent and dependent variables was verified. Furthermore, the non-parametric k-nearest neighbours (k-NN) technique and a genetic algorithm were used to estimate forest stand biomass at the pixel level. Error estimates were obtained by the leave-one-out cross-validation method. The main computed forest stand features were total and mean timber volume as well as the maximum and minimum biomass occurring in the examined area. In the final step, a timber volume map of the growing stock was created.
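    The k-NN estimation with leave-one-out cross-validation described above can be sketched as follows; the feature space and plot data here are hypothetical stand-ins for the spectral and field-plot variables used in the study:

    ```python
    import numpy as np

    def knn_predict(X_train, y_train, x, k=5):
        """Predict a plot's volume as the mean of the k nearest training
        plots in (hypothetical) spectral feature space."""
        d = np.linalg.norm(X_train - x, axis=1)
        nn = np.argsort(d)[:k]
        return y_train[nn].mean()

    def loocv_rmse(X, y, k=5):
        """Leave-one-out cross-validation error of the k-NN estimator:
        each plot is predicted from all the others."""
        n = len(y)
        errs = []
        for i in range(n):
            mask = np.arange(n) != i
            errs.append(knn_predict(X[mask], y[mask], X[i], k) - y[i])
        return float(np.sqrt(np.mean(np.square(errs))))
    ```

    The MS-NFI additionally weights neighbours and selects features (e.g. with a genetic algorithm); this sketch shows only the unweighted core of the estimator and its error assessment.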

  2. Accuracy and variability of right ventricular volumes and mass assessed by dual-source computed tomography: influence of slice orientation in comparison to magnetic resonance imaging

    Energy Technology Data Exchange (ETDEWEB)

    Jensen, Christoph J. [Elisabeth Hospital Essen, Department of Cardiology and Angiology, Essen (Germany); Duke University Medical Center, Duke Cardiovascular Magnetic Resonance Center, Durham, NC (United States); Wolf, Alexander; Eberle, Holger C.; Sabin, Georg V.; Bruder, Oliver [Elisabeth Hospital Essen, Department of Cardiology and Angiology, Essen (Germany); Forsting, Michael; Nassenstein, Kai; Lauenstein, Thomas C.; Schlosser, Thomas [University Hospital Essen, Department of Diagnostic and Interventional Radiology and Neuroradiology, Essen (Germany)

    2011-12-15

    To evaluate the accuracy and variability of right ventricular (RV) volumes and mass using dual-source computed tomography (DSCT) and the influence of slice orientation in comparison to cardiac magnetic resonance imaging (CMR). In 33 patients undergoing cardiac DSCT and CMR, RV parameters were calculated using the short-axis (DSCT, CMR) and axial orientation (DSCT). Intra- and interobserver variability were assessed by Bland-Altman analysis. Short-axis orientation: RV parameters of the two techniques were not statistically different. Axial orientation: RV volumes and mass were significantly overestimated compared with short-axis parameters whereas EF was similar. The short-axis approach resulted in low variability, although the axial orientation had the least amount of intra- and interobserver variability. RV parameters can be more accurately assessed by DSCT compared with CMR using short-axis slice orientation. RV volumes and mass are significantly higher using axial compared with short-axis slices, whereas EF is unaffected. RV parameters derived from both approaches yield high reproducibility. (orig.)

  3. Utilizing a Multi-Source Forest Inventory Technique, MODIS Data and Landsat TM Images in the Production of Forest Cover and Volume Maps for the Terai Physiographic Zone in Nepal

    Directory of Open Access Journals (Sweden)

    Kalle Eerikäinen

    2012-12-01

    Full Text Available An approach based on the nearest neighbors techniques is presented for producing thematic maps of forest cover (forest/non-forest and total stand volume for the Terai region in southern Nepal. To create the forest cover map, we used a combination of Landsat TM satellite data and visual interpretation data, i.e., a sample grid of visual interpretation plots for which we obtained the land use classification according to the FAO standard. These visual interpretation plots together with the field plots for volume mapping originate from an operative forest inventory project, i.e., the Forest Resource Assessment of Nepal (FRA Nepal project. The field plots were also used in checking the classification accuracy. MODIS satellite data were used as a reference in a local correction approach conducted for the relative calibration of Landsat TM images. This study applied a non-parametric k-nearest neighbor technique (k-NN to the forest cover and volume mapping. A tree height prediction approach based on a nonlinear, mixed-effects (NLME modeling procedure is presented in the Appendix. The MODIS image data performed well as reference data for the calibration approach applied to make the Landsat image mosaic. The agreement between the forest cover map and the field observed values of forest cover was substantial in Western Terai (KHAT 0.745 and strong in Eastern Terai (KHAT 0.825. The forest cover and volume maps that were estimated using the k-NN method and the inventory data from the FRA Nepal project are already appropriate and valuable data for research purposes and for the planning of forthcoming forest inventories. Adaptation of the methods and techniques was carried out using Open Source software tools.
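    The KHAT values reported above are Cohen's kappa statistics computed from a land-cover confusion matrix. A minimal sketch follows; the 2×2 forest/non-forest counts in the test are invented for illustration:

    ```python
    import numpy as np

    def khat(confusion):
        """Cohen's kappa (KHAT) from a square confusion matrix:
        (observed agreement - chance agreement) / (1 - chance agreement)."""
        c = np.asarray(confusion, dtype=float)
        n = c.sum()
        po = np.trace(c) / n                                  # observed agreement
        pe = (c.sum(axis=0) * c.sum(axis=1)).sum() / n ** 2   # chance agreement
        return (po - pe) / (1.0 - pe)
    ```

    On this scale, values around 0.745 and 0.825 (as reported for Western and Eastern Terai) are conventionally read as substantial and strong agreement.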

  4. Assessment of LWR spent fuel disposal options. Volume 3. Study bases and system design considerations (Appendices). Technical report

    Energy Technology Data Exchange (ETDEWEB)

    1979-07-01

    Volume 3 (Appendices) provides a tabulation of the bases and assumptions used in the study as well as preconceptual design description and cost estimates of the facilities and transportation systems necessary to implement the various study cases.

  5. SOLVENT-BASED TO WATERBASED ADHESIVE-COATED SUBSTRATE RETROFIT - VOLUME III: LABEL MANUFACTURING CASE STUDY: NASHUA CORPORATION

    Science.gov (United States)

    This volume discusses Nashua Corporation's Omaha facility, a label and label stock manufacturing facility that no longer uses solvent-based adhesives. Information obtained includes issues related to the technical, economic, and environmental barriers and opportunities associated ...

  6. Research on laser induced acoustic source based underwater communication system

    Science.gov (United States)

    Lei, Lihua; Zhou, Ju; Zhang, Lei; Wan, Xiaoyun

    2016-10-01

    Acoustic transducers are traditionally used to generate underwater acoustical energy with the device physically immersed in water. Novel methods are required for communicating from an in-air platform or surface vessel to a submerged vessel. One possible noncontact downlink communication system involves the use of a laser induced acoustic source. The most common mechanisms of opto-acoustic energy conversion are, in order of increasing laser energy density and efficiency, thermal expansion, surface evaporation and optical breakdown. The laser induced acoustic source inherently bears the obvious advantage of not requiring any physical transducer in the medium. At the same time, acoustic energy propagates efficiently in water, whereas optical energy propagates well in air, leading to a more efficient opto-acoustic communication method. In this paper, an opto-acoustic underwater communication system is described, aiming to study and analyze whether laser induced sound can achieve good performance for effective communication in practical applications.

  7. Control and Driving Methods for LED Based Intelligent Light Sources

    DEFF Research Database (Denmark)

    Beczkowski, Szymon

    High power light-emitting diodes allow the creation of luminaires capable of generating saturated colour light at very high efficacies. Contrary to traditional light sources like incandescent and high-intensity discharge lamps, where colour is generated using filters, LEDs use additive light mixing......, where the intensity of each primary colour diode has to be adjusted to the needed intensity to generate specified colour. The function of LED driver is to supply the diode with power needed to achieve the desired intensity. Typically, the drivers operate as a current source and the intensity...... current. The model can also be used to create highly accurate luminaire model. Finally, a dual interleaved buck converter has been proposed for driving high power light-emitting diodes. Interleaving two converters lowers the output ripple current thus lowering the requirement on the output capacitor...

  8. A battery-based, low-noise voltage source

    Science.gov (United States)

    Wagner, Anke; Sturm, Sven; Schabinger, Birgit; Blaum, Klaus; Quint, Wolfgang

    2010-06-01

    A highly stable, low-noise voltage source was designed to improve the stability of the electrode bias voltages of a Penning trap. To avoid excess noise and ground loops, the voltage source is completely independent of the public electric network and uses a 12 V car battery to generate output voltages of ±15 and ±5 V. First, the dc supply voltage is converted into ac-voltage and gets amplified. Afterwards, the signal is rectified, filtered, and regulated to the desired output value. Each channel can deliver up to 1.5 A. The current as well as the battery voltage and the output voltages can be read out via a universal serial bus (USB) connection for monitoring purposes. With the presented design, a relative voltage stability of 7×10⁻⁷ over 6.5 h and a noise level equal to or smaller than 30 nV/√Hz is achieved.

  9. Heavy Ion Injection Into Synchrotrons, Based On Electron String Ion Sources

    CERN Document Server

    Donets, E E; Syresin, E M

    2004-01-01

    The possibility of heavy ion injection into synchrotrons is discussed on the basis of two novel ion sources, which have been under development at JINR during the last decade: 1) the electron string ion source (ESIS), which is a modified version of a conventional electron beam ion source (EBIS) working in a reflex mode of operation, and 2) the tubular electron string ion source (TESIS). The Electron String Ion Source "Krion-2" (VBLHE, JINR, Dubna) with an applied confining magnetic field of 3 T was used for injection into the superconducting JINR synchrotron - Nuclotron, and during these runs the source provided a high pulse intensity of the highly charged ion beams: Ar16+

  10. Online blind source separation based on joint diagonalization

    Institute of Scientific and Technical Information of China (English)

    Li Ronghua; Zhou Guoxu; Fang Zuyuan; Xie Shengli

    2009-01-01

    A new algorithm is proposed for joint diagonalization. With a modified objective function, the new algorithm not only excludes trivial and unbalanced solutions successfully, but is also easily optimized. In addition, with the new objective function, the proposed algorithm can work well in online blind source separation (BSS) for the first time, although this family of algorithms has so far been thought to be valid only in batch-mode BSS. Simulations show that it is a very competitive joint diagonalization algorithm.
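    As an illustration of the family of second-order methods this paper belongs to, here is a minimal AMUSE-style separation that jointly diagonalizes the zero-lag and lag-τ covariance matrices. This is a generic textbook sketch for context, not the authors' online algorithm:

    ```python
    import numpy as np

    def amuse(X, tau=1):
        """AMUSE-style blind source separation: whiten the mixtures, then
        diagonalize the symmetrized lag-tau covariance of the whitened data.
        The zero-lag and lagged covariances are thereby jointly diagonalized."""
        X = X - X.mean(axis=1, keepdims=True)
        C0 = X @ X.T / X.shape[1]
        d, E = np.linalg.eigh(C0)
        W = E @ np.diag(1.0 / np.sqrt(d)) @ E.T        # whitening matrix
        Z = W @ X
        Ct = Z[:, :-tau] @ Z[:, tau:].T / (Z.shape[1] - tau)
        Ct = (Ct + Ct.T) / 2.0                          # symmetrize
        _, V = np.linalg.eigh(Ct)
        return V.T @ Z                                  # estimated sources
    ```

    The separated sources are recovered only up to permutation, sign, and scale, and the method requires the sources to have distinct autocorrelations at the chosen lag.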

  11. Pulsed neutron source based on accelerator-subcritical-assembly

    Energy Technology Data Exchange (ETDEWEB)

    Inoue, Makoto; Noda, Akira; Iwashita, Yoshihisa; Okamoto, Hiromi; Shirai, Toshiyuki [Kyoto Univ., Uji (Japan). Inst. for Chemical Research

    1997-03-01

    A new pulsed neutron source which consists of a 300 MeV proton linac and a nuclear fuel subcritical assembly is proposed. The proton linac produces pulsed spallation neutrons, which are multiplied by the subcritical assembly. A prototype proton linac that accelerates protons up to 7 MeV has been developed, and a high energy section of a DAW structure is studied with a power model. Halo formation in high intensity beams is also being studied. (author)

  12. Performance of positive ion based high power ion source of EAST neutral beam injector

    Science.gov (United States)

    Hu, Chundong; Xie, Yahong; Xie, Yuanlai; Liu, Sheng; Xu, Yongjian; Liang, Lizhen; Jiang, Caichao; Li, Jun; Liu, Zhimin

    2016-02-01

    The positive ion based source with a hot cathode based arc chamber and a tetrode accelerator was employed for a neutral beam injector on the experimental advanced superconducting tokamak (EAST). Four ion sources were developed, and each ion source has produced a 4 MW, 80 keV hydrogen beam on the test bed. Long pulse operation of 100 s with a modulated beam has also been tested on the test bed. The accelerator was upgraded from circular to diamond shaped in the latest two ion sources. In the latest EAST experimental campaign, the four ion sources injected more than 4 MW of deuterium beam with a beam energy of 60 keV into EAST.

  13. Contrast volume reduction using third generation dual source computed tomography for the evaluation of patients prior to transcatheter aortic valve implantation

    Energy Technology Data Exchange (ETDEWEB)

    Bittner, Daniel O. [University Hospital Erlangen, Friedrich-Alexander University Erlangen-Nuernberg (FAU), Department of Internal Medicine 2 (Cardiology), Erlangen (Germany); Harvard Medical School, Cardiac MR PET CT Program, Massachusetts General Hospital, Boston, MA (United States); Arnold, Martin; Klinghammer, Lutz; Schuhbaeck, Annika; Hell, Michaela M.; Muschiol, Gerd; Gauss, Soeren; Achenbach, Stephan; Marwan, Mohamed [University Hospital Erlangen, Friedrich-Alexander University Erlangen-Nuernberg (FAU), Department of Internal Medicine 2 (Cardiology), Erlangen (Germany); Lell, Michael; Uder, Michael [University Hospital Erlangen, Friedrich-Alexander University Erlangen-Nuernberg (FAU), Department of Radiology, Erlangen (Germany); Hoffmann, Udo [Harvard Medical School, Cardiac MR PET CT Program, Massachusetts General Hospital, Boston, MA (United States)

    2016-12-15

    Chronic renal failure is common in patients referred for transcatheter aortic valve implantation (TAVI). CT angiography is recommended and provides crucial information prior to TAVI. We evaluated the feasibility of a reduced contrast volume protocol for pre-procedural CT imaging. Forty consecutive patients were examined with prospectively ECG-triggered high-pitch spiral acquisition using a novel third-generation dual-source CT system; 38 ml contrast agent was used. Image quality was graded on a visual scale (1-4). Contrast attenuation was measured at the level of the aortic root and at the iliac bifurcation. Mean patient age was 82 ± 6 years (23 males; 58 %). Mean attenuation/average image quality was 285 ± 60 HU/1.5 at the aortic annulus compared to 289 ± 74 HU/1.8 at the iliac bifurcation (p = 0.77/p = 0.29). Mean estimated effective radiation dose was 2.9 ± 0.3 mSv. A repeat acquisition was necessary in one patient due to image quality. Out of the 35 patients who underwent TAVI, 31 (89 %) patients had no or mild aortic regurgitation. Thirty-two (91 %) patients were discharged successfully. Pre-procedural CTA with a total of 38 ml contrast volume is feasible and clinically useful, using third-generation dual-source CT, allowing comprehensive imaging for procedural success. (orig.)

  14. Comparative analysis of CO2-based transcritical Rankine cycle and HFC245fa-based subcritical organic Rankine cycle using low-temperature geothermal source

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    A detailed thermodynamic and techno-economic comparison is presented for a CO2-based transcritical Rankine cycle and a subcritical organic Rankine cycle (ORC) using HFC245fa (1,1,1,3,3-pentafluoropropane) as the working fluid, driven by a low-temperature geothermal source, in order to determine the configuration that presents the maximum net power output with a minimum investment. The evaluations of both Rankine cycles have been performed based on an equal thermodynamic mean heat rejection temperature by varying certain system operating parameters to achieve each Rankine cycle's optimum design at various geothermal source temperature levels ranging from 80 °C to 120 °C. The results obtained show that the optimum thermodynamic mean heat injection temperatures of both Rankine cycles lie in the range of 55% to 65% of a given geothermal source temperature level, and that the CO2-based transcritical Rankine cycle presents 3% to 7% higher net power output, an 84% reduction of turbine inlet volume flow rate, a 47% reduction of expansion ratio and 1.68 times higher total heat transfer capacity compared with the HFC245fa-based subcritical ORC. It is also indicated that using the CO2-based transcritical system can reduce the dimensions of the turbine design. However, it requires larger heat transfer areas with higher strength heat exchanger materials because of the higher system pressure.

  15. Quantitative estimation of a ratio of intracranial cerebrospinal fluid volume to brain volume based on segmentation of CT images in patients with extra-axial hematoma.

    Science.gov (United States)

    Nguyen, Ha Son; Patel, Mohit; Li, Luyuan; Kurpad, Shekar; Mueller, Wade

    2017-02-01

    Background The diminishing volume of intracranial cerebrospinal fluid (CSF) in patients with space-occupying masses has been attributed to unfavorable outcome, associated with reduction of cerebral perfusion pressure and subsequent brain ischemia. Objective The objective of this article is to employ a ratio of CSF volume to brain volume for longitudinal assessment of space-volume relationships in patients with extra-axial hematoma and to determine the variability of the ratio among patients with different types and stages of hematoma. Patients and methods In our retrospective study, we reviewed 113 patients with surgical extra-axial hematomas. We included 28 patients (age 61.7 +/- 17.7 years; 19 males, nine females) with an acute epidural hematoma (EDH) ( n = 5) and subacute/chronic subdural hematoma (SDH) ( n = 23). We excluded 85 patients, in order, due to acute SDH ( n = 76), concurrent intraparenchymal pathology ( n = 6), and bilateral pathology ( n = 3). Noncontrast CT images of the head were obtained using a CT scanner (2004 GE LightSpeed VCT CT system, tube voltage 140 kVp, tube current 310 mA, 5 mm section thickness) preoperatively, postoperatively (3.8 ± 5.8 hours from surgery), and at follow-up clinic visit (48.2 ± 27.7 days after surgery). Each CT scan was loaded into an OsiriX (Pixmeo, Switzerland) workstation to segment pixels based on radiodensity properties measured in Hounsfield units (HU). Based on HU values from -30 to 100, brain, CSF spaces, vascular structures, hematoma, and/or postsurgical fluid were segregated from bony structures, and subsequently hematoma and/or postsurgical fluid were manually selected and removed from the images. The remaining images represented overall brain volume, containing only CSF spaces, vascular structures, and brain parenchyma. Thereafter, the ratio between the total number of voxels representing CSF volume (based on values between 0 and 15 HU) to the total number of voxels
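    The HU-window voxel counting described above can be sketched with numpy on a synthetic array standing in for a real CT volume. The windows follow the text (-30 to 100 HU for overall brain content after hematoma removal, 0 to 15 HU for CSF); the hematoma-removal step itself was manual in the study and is omitted here:

    ```python
    import numpy as np

    def csf_brain_ratio(volume_hu, brain_range=(-30, 100), csf_range=(0, 15)):
        """Count voxels inside the HU windows described in the study and
        return the CSF-to-brain volume ratio."""
        v = np.asarray(volume_hu)
        brain = (v >= brain_range[0]) & (v <= brain_range[1])
        csf = (v >= csf_range[0]) & (v <= csf_range[1])
        return csf.sum() / brain.sum()
    ```

    Because both counts come from the same scan, the ratio is independent of voxel size and can be compared across the preoperative, postoperative, and follow-up time points.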

  16. Carbon nanotube based X-ray sources: Applications in pre-clinical and medical imaging

    Science.gov (United States)

    Lee, Yueh Z.; Burk, Laurel; Wang, Ko-Han; Cao, Guohua; Lu, Jianping; Zhou, Otto

    2011-08-01

    Field emission offers an alternate method of electron production for Bremsstrahlung-based X-ray tubes. Carbon nanotubes (CNTs) serve as very effective field emitters, allowing them to serve as electron sources for X-ray sources, with specific advantages over traditional thermionic tubes. CNT-derived X-ray sources can create X-ray pulses of any duration and frequency, gate the X-ray pulse to any source and allow the placement of many sources in close proximity. We have constructed a number of micro-CT systems based on CNT X-ray sources for applications in small animal imaging, specifically focused on imaging of the heart and lungs. This paper offers a review of the pre-clinical applications of the CNT-based micro-CT that we have developed. We also discuss some of the current and potential clinical applications of the CNT X-ray sources.

  17. TVA coal-gasification commercial demonstration plant project. Volume 5. Plant based on Koppers-Totzek gasifier. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1980-11-01

    This volume presents a technical description of a coal gasification plant, based on Koppers-Totzek gasifiers, producing a medium Btu fuel gas product. Foster Wheeler carried out a conceptual design and cost estimate of a nominal 20,000 TPSD plant based on TVA design criteria and information supplied by Krupp-Koppers concerning the Koppers-Totzek coal gasification process. Technical description of the design is given in this volume.

  18. Advanced NSCLC First Pass Perfusion at 64-slice CT: Reproducibility of Volume-based Quantitative Measurement

    Directory of Open Access Journals (Sweden)

    Jie HU

    2010-05-01

    Full Text Available Background and objective The aim of this study is to explore the reproducibility of volume-based quantitative measurement of non-small cell lung cancer (NSCLC) perfusion at 64-slice CT. Methods Fourteen patients with proven advanced NSCLC were enrolled in this dynamic first-pass volume-based CT perfusion (CTP) study (8×5 mm collimation), and they underwent a second scan within 24 h. According to the longest diameters, the patients were classified into ≤3 cm and >3 cm groups, with 7 patients in each group. Intraclass correlation coefficient (ICC) and Bland-Altman statistics were used to evaluate the reproducibility of CTP imaging. Results In both groups of advanced NSCLC, the reproducibility of BF, BV, and PS values was good (ICC >0.75) for all but the mean transit time (MTT) values. For advanced NSCLC (≤3 cm), the repeatability coefficient (RC) values for blood flow (BF), blood volume (BV), MTT and permeability surface area product (PS) were 56%, 45%, 114%, and 78%, respectively, and the 95% change intervals of RC were -39%-53%, -29%-62%, -83%-145%, and -57%-98%, respectively. For advanced NSCLC (>3 cm), those values were 46%, 30%, 59%, and 33%, respectively, and the 95% change intervals of RC were -48%-45%, -33%-26%, -54%-64%, and -18%-48%. Conclusion There is greater reproducibility for tumor sizes >3 cm than for those ≤3 cm. BF and BV could be addressed for reliable clinical application in antiangiogenesis therapeutic monitoring of advanced NSCLC patients.
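    The repeatability coefficients above are Bland-Altman statistics on paired scans. A minimal sketch follows, assuming RC is expressed as 1.96 × SD of the paired differences as a percentage of the overall mean; that normalization is an assumption (the paper does not spell it out here), and the paired values are synthetic:

    ```python
    import numpy as np

    def bland_altman_rc(scan1, scan2):
        """Repeatability coefficient as a percentage of the mean:
        1.96 * SD of paired differences, normalized by the grand mean of
        the paired averages."""
        a, b = np.asarray(scan1, float), np.asarray(scan2, float)
        diff = a - b
        rc = 1.96 * diff.std(ddof=1)
        return 100.0 * rc / np.mean((a + b) / 2.0)
    ```

    A smaller RC percentage means a repeat scan is expected to fall closer to the first, which is why the >3 cm group's lower values indicate better reproducibility.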

  19. LESTO: an Open Source GIS-based toolbox for LiDAR analysis

    Science.gov (United States)

    Franceschi, Silvia; Antonello, Andrea; Tonon, Giustino

    2015-04-01

    During the last five years different research institutes and private companies started to implement new algorithms to analyze and extract features from LiDAR data, but only a few of them also created publicly available software. In the field of forestry there are different examples of software that can be used to extract vegetation parameters from LiDAR data; unfortunately most of them are closed source (even if free), which means that the source code is not shared with the public for anyone to look at or make changes to. In 2014 we started the development of the library LESTO (LiDAR Empowered Sciences Toolbox Opensource): a set of modules for the analysis of LiDAR point clouds with an Open Source approach, with the aim of improving the performance of the extraction of the volume of biomass and other vegetation parameters over large areas for mixed forest structures. LESTO contains a set of modules for data handling and analysis implemented within the JGrassTools spatial processing library. The main subsections are dedicated to 1) preprocessing of LiDAR raw data mainly in LAS format (utilities and filtering); 2) creation of raster derived products; 3) flight-line identification and normalization of the intensity values; 4) tools for extraction of vegetation and buildings. The core of the LESTO library is the extraction of the vegetation parameters. We decided to follow the single-tree-based approach, starting with the implementation of some of the most used algorithms in the literature. These have been tweaked and applied on LiDAR-derived raster datasets (DTM, DSM) as well as point clouds of raw data. The methods range between the simple extraction of tops and crowns from local maxima, the region growing method, the watershed method and individual tree segmentation on point clouds. The validation procedure consists in finding the matching between field and LiDAR-derived measurements at individual tree and plot level. An automatic validation procedure has been developed
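    The local-maxima tree-top extraction mentioned above can be sketched on a small canopy height model raster; the 3×3 window and the 2 m minimum height are illustrative assumptions, not LESTO's actual defaults:

    ```python
    import numpy as np

    def tree_tops(chm, min_height=2.0):
        """Return (row, col) cells that are strict maxima of their 3x3
        neighbourhood and above a minimum canopy height."""
        tops = []
        rows, cols = chm.shape
        for r in range(1, rows - 1):
            for c in range(1, cols - 1):
                w = chm[r - 1:r + 2, c - 1:c + 2]
                if (chm[r, c] >= min_height
                        and chm[r, c] == w.max()
                        and (w == w.max()).sum() == 1):
                    tops.append((r, c))
        return tops
    ```

    In practice the window size would be adapted to crown diameter, and the detected tops would seed the region-growing or watershed crown delineation described in the abstract.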

  20. Tunable light source with GaN-based violet laser diode

    Science.gov (United States)

    Omori, Masaki; Mori, Naoki; Dejima, Norihiro

    2013-03-01

    GaN-based violet laser diodes have entered the industrial market on the strength of their uniquely high potential: they can replace gas lasers, dye lasers, SHG lasers, solid-state lasers and more. Diode-based lasers are extremely small and low cost at high production volumes. In addition, GaN lasers offer high quality with long lifetimes and can potentially cover the wide wavelength range between 375 and 520 nm. In general, however, diode-based lasers could only lase in longitudinal multimode, which limited their applicable fields and made them difficult to apply to analysis. Recently, single-longitudinal-mode lasing with a GaN diode has also been accomplished by Nichia Corporation using an external cavity. The external-cavity laser achieved an SMSR of much better than 20 dB. A feature of the installed laser is an AR coating on the front facet to avoid chip-mode lasing. In general, external-cavity lasers have required precise mechanical assembly and retention capability; Nichia has resolved this issue with its Intelligence Cavity and a YAG-laser-welding assembly technique. This laser also incorporates the unique feature that the longitudinal mode can be maintained in single-mode lasing by means of internal functional sensors in the tunable laser.*1,*2 This tunable laser source can lock to a particular wavelength anywhere in the 390 to 465 nm range. As a result, researchers will benefit in their own studies, and new markets are likely to be generated by this laser in the near future.

  1. A review of laser and synchrotron based X-ray sources

    Energy Technology Data Exchange (ETDEWEB)

    Smith, R. [Paris-Sud Univ., Orsay (France). LSAI; Key, M.H. [Paris-Sud Univ., Orsay (France). LSAI; Lawrence Livermore National Lab., CA (United States)

    2001-07-01

    The rapid development of laser technology and related progress in research using lasers is shifting the boundaries where laser based sources are preferred over other light sources particularly in the XUV and X-ray spectral region. Laser based sources have exceptional capability for short pulse and high brightness and with improvements in high repetition rate pulsed operation, such sources are also becoming more interesting for their average power capability. This study presents an evaluation of the current capabilities and near term future potential of laser based light sources and summarises, for the purpose of comparison, the characteristics and near term prospects of sources based on synchrotron radiation and free electron lasers. Relative comparisons are given within charts of peak brightness. (orig.)

  2. A Clustering-Based Automatic Transfer Function Design for Volume Visualization

    Directory of Open Access Journals (Sweden)

    Tianjin Zhang

    2016-01-01

    Full Text Available The two-dimensional transfer functions (TFs) designed based on the intensity-gradient magnitude (IGM) histogram are effective tools for the visualization and exploration of 3D volume data. However, traditional design methods usually depend on multiple rounds of trial and error. We propose a novel method for the automatic generation of transfer functions by performing the affinity propagation (AP) clustering algorithm on the IGM histogram. Compared with previous clustering algorithms that were employed in volume visualization, the AP clustering algorithm has a much faster convergence speed and can achieve more accurate clustering results. In order to obtain meaningful clustering results, we introduce two similarity measurements: IGM similarity and spatial similarity. These two similarity measurements can effectively bring the voxels of the same tissue together and differentiate the voxels of different tissues so that the generated TFs can assign different optical properties to different tissues. Before performing the clustering algorithm on the IGM histogram, we propose to remove noisy voxels based on the spatial information of voxels. Our method does not require users to input the number of clusters, and the classification and visualization process is automatic and efficient. Experiments on various datasets demonstrate the effectiveness of the proposed method.
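    Affinity propagation as used above can be sketched in plain numpy. This is a generic textbook implementation operating on an invented 2D feature set, not the authors' IGM-plus-spatial similarity pipeline; note that, as the abstract states, the number of clusters is not an input — it emerges from the preference values on the similarity-matrix diagonal:

    ```python
    import numpy as np

    def affinity_propagation(S, damping=0.9, iters=200):
        """Plain-numpy affinity propagation: alternate responsibility and
        availability message updates on a similarity matrix S, whose
        diagonal holds the exemplar preferences."""
        n = S.shape[0]
        R = np.zeros((n, n))
        A = np.zeros((n, n))
        idx = np.arange(n)
        for _ in range(iters):
            # responsibilities: r(i,k) = s(i,k) - max_{k'!=k}[a(i,k') + s(i,k')]
            AS = A + S
            best = AS.argmax(axis=1)
            first = AS[idx, best].copy()
            AS[idx, best] = -np.inf
            second = AS.max(axis=1)
            Rn = S - first[:, None]
            Rn[idx, best] = S[idx, best] - second
            R = damping * R + (1.0 - damping) * Rn
            # availabilities: a(i,k) = min(0, r(k,k) + sum_{i' not in {i,k}} max(0, r(i',k)))
            Rp = np.maximum(R, 0.0)
            Rp[idx, idx] = R[idx, idx]
            An = Rp.sum(axis=0)[None, :] - Rp
            dA = An[idx, idx].copy()
            An = np.minimum(An, 0.0)
            An[idx, idx] = dA
            A = damping * A + (1.0 - damping) * An
        exemplars = np.flatnonzero(np.diag(A + R) > 0)
        labels = exemplars[S[:, exemplars].argmax(axis=1)]
        return exemplars, labels
    ```

    In the paper's setting, S would combine the IGM similarity and spatial similarity between histogram bins, and each resulting cluster would receive its own optical properties in the transfer function.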

  3. Producing Terahertz Conherent Synchrotron Radiation Based On Hefei Light Source

    CERN Document Server

    De-Rong, Xu; Yan, Shao

    2014-01-01

    This paper theoretically proves that an electron storage ring can generate coherent radiation in the THz region using a quick kicker magnet and an AC sextupole magnet. When the vertical chromaticity is modulated by the AC sextupole magnet, the vertical beam collective motion excited by the kicker produces a wavy spatial structure after a number of longitudinal oscillation periods. We calculate the radiation spectral distribution from the wavy bunch in the Hefei Light Source (HLS). If the electron energy is reduced to 400 MeV, extremely strong coherent synchrotron radiation (CSR) at 0.115 THz can be produced.

  4. Silicon-Based Light Sources for Silicon Integrated Circuits

    Directory of Open Access Journals (Sweden)

    L. Pavesi

    2008-01-01

    Full Text Available Silicon, the material par excellence for electronics, is not used for sourcing light due to the lack of efficient light emitters and lasers. In this review, after introducing the basics of lasing, I will discuss the physical reasons why silicon is not a laser material and the approaches to make it lase. I will start with bulk silicon, then discuss silicon nanocrystals and Er3+-coupled silicon nanocrystals, where significant advances have been made in the past and can be expected in the near future. I will conclude with an optimistic note on silicon lasing.

  5. Polymer and small molecule based hybrid light source

    Science.gov (United States)

    Choong, Vi-En; Choulis, Stelios; Krummacher, Benjamin Claus; Mathai, Mathew; So, Franky

    2010-03-16

    An organic electroluminescent device, includes: a substrate; a hole-injecting electrode (anode) coated over the substrate; a hole injection layer coated over the anode; a hole transporting layer coated over the hole injection layer; a polymer based light emitting layer, coated over the hole transporting layer; a small molecule based light emitting layer, thermally evaporated over the polymer based light emitting layer; and an electron-injecting electrode (cathode) deposited over the electroluminescent polymer layer.

  6. Research on Passivity Based Controller of Three Phase Voltage Source PWM Rectifier

    Directory of Open Access Journals (Sweden)

    Yin Hongren

    2012-09-01

    Full Text Available An Euler-Lagrange (EL) model of the voltage source PWM rectifier is set up based on its model in synchronous dq coordinates. A passivity based controller is designed on the basis of passivity and the EL model of the voltage source PWM rectifier. Three switching functions are deduced from the passivity based controller; consequently, only one switching function is realized in engineering practice. A voltage source PWM rectifier using the passivity based controller has many advantages, such as a simpler structure, low total harmonic distortion, and good disturbance rejection performance. The passivity based control law is proved feasible by Simulink simulation.

  7. Developing a beta source based setup for pixel sensor characterization

    CERN Document Server

    Schouwenberg, Jeroen

    2014-01-01

    The main goal of this project is to provide mono-energetic minimum ionizing electrons from a $^{90}$Sr source using a magnetic monochromator, and thus provide a useful tool for in-lab sensor characterization. The monochromator is calibrated using a setup, with a heavy inorganic scintillator and a PMT, which has been calibrated with a $^{22}$Na gamma source. The average energy of the electrons as a function of the current in the monochromator coil is found to be $1.38\\pm0.01$ keV/mA, taking into consideration the effect of the magnetic field on the signal of the PMT. For integration into the pixel sensor test bench, scintillator-counters (a plastic scintillator connected to a PMT) are used. Their response to the electron energies is observed to follow a saturation curve, which leads to a more identical response for high energetic electrons. A preliminary pixel sensor test bench has been set up and properties such as voltage and discriminator settings have been studied as well as count rates for coincidence cou...

  8. ECR ion source based low energy ion beam facility

    Indian Academy of Sciences (India)

    P Kumar; G Rodrigues; U K Rao; C P Safvan; D Kanjilal; A Roy

    2002-11-01

    Mass analyzed highly charged ion beams of energy ranging from a few keV to a few MeV play an important role in various aspects of research in modern physics. In this paper a unique low energy ion beam facility (LEIBF) set up at Nuclear Science Centre (NSC) for providing low and medium energy multiply charged ion beams ranging from a few keV to a few MeV for research in materials science and atomic and molecular physics is described. One of the important features of this facility is the availability of relatively large currents of multiply charged positive ions from an electron cyclotron resonance (ECR) source placed entirely on a high voltage platform. All the electronic and vacuum systems related to the ECR source, including the 10 GHz ultra high frequency (UHF) transmitter and the high voltage power supplies for the extractor and Einzel lens, are placed on the high voltage platform. All the equipment is controlled using a personal computer at ground potential through optical fibers for high voltage isolation. Some of the experimental facilities available are also described.

  9. About the possible options for models of convective heat transfer in closed volumes with local heating source

    Directory of Open Access Journals (Sweden)

    Maksimov Vyacheslav I.

    2015-01-01

    Full Text Available Results of mathematical modeling of convective heat transfer in an air region surrounded on all sides by enclosing structures, in the presence of a heat source at the lower boundary of the domain, are presented. The system of unsteady Navier-Stokes differential equations is solved with the appropriate initial and boundary conditions. The process of convective heat transfer is calculated using the Prandtl and Prandtl-Reichardt turbulence models. Heat exchange between the region considered and the environment is taken into account. An analysis of the dimensionless heat transfer coefficient at the air-enclosure interfaces is carried out. Distributions of the average gas temperature over the region are obtained.
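The Prandtl mixing-length closure mentioned in the abstract can be sketched in a few lines. This is an illustrative stand-in (the friction velocity u_tau, wall distance y, and log-law shear below are hypothetical numbers), not the authors' solver:

```python
import math

# Prandtl mixing-length closure: eddy viscosity nu_t = l^2 * |du/dy|,
# with mixing length l = kappa * y near a wall (von Karman constant ~0.41).
KAPPA = 0.41

def eddy_viscosity(y, dudy):
    l = KAPPA * y                 # mixing length grows with wall distance
    return l * l * abs(dudy)

# Example: a log-law velocity profile u(y) = (u_tau/kappa) * ln(y/y0)
# has shear du/dy = u_tau / (kappa * y), so nu_t = kappa * u_tau * y,
# i.e. the eddy viscosity grows linearly with wall distance.
u_tau, y = 0.05, 0.01
dudy = u_tau / (KAPPA * y)
print(eddy_viscosity(y, dudy))
```

The closing identity nu_t = kappa * u_tau * y is the standard consistency check for this closure in a log-law layer.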

  10. Changes in the planning target volume and liver volume dose based on the selected respiratory phase in respiratory-gated radiation therapy for a hepatocellular carcinoma

    Science.gov (United States)

    Lee, Jae-Seung; Im, In-Chul; Kang, Su-Man; Goo, Eun-Hoe; Baek, Seong-Min

    2013-11-01

    The aim of this study was to quantitatively analyze the changes in the planning target volume (PTV) and liver volume dose based on the respiratory phase to identify the optimal respiratory phase for respiratory-gated radiation therapy for a hepatocellular carcinoma (HCC). Based on the standardized procedure for respiratory-gated radiation therapy, we performed a 4-dimensional computed tomography simulation for the 0-90%, 30-70%, and 40-60% respiratory phases to assess the respiratory stability (S_R) and the defined PTV_i for each respiratory phase i. A treatment plan was established, and the changes in the PTV_i and dose volume of the liver were quantitatively analyzed. Most patients (91.5%) passed the respiratory stability test (S_R = 0.111 ± 0.015). With standardized respiration training exercises, we were able to minimize the overall systematic error caused by irregular respiration. Furthermore, a quantitative analysis to identify the optimal respiratory phase revealed that when a short respiratory phase (40-60%) was used, the changes in the PTV were concentrated inside the center line; thus, we were able to obtain both a PTV margin accounting for respiration and a uniform radiation dose within the PTV.

  11. Statistical and systematic uncertainties in pixel-based source reconstruction algorithms for gravitational lensing

    CERN Document Server

    Tagore, Amitpal

    2014-01-01

    Gravitational lens modeling of spatially resolved sources is a challenging inverse problem with many observational constraints and model parameters. We examine established pixel-based source reconstruction algorithms for de-lensing the source and constraining lens model parameters. Using test data for four canonical lens configurations, we explore statistical and systematic uncertainties associated with gridding, source regularisation, interpolation errors, noise, and telescope pointing. Specifically, we compare two gridding schemes in the source plane: a fully adaptive grid that follows the lens mapping but is irregular, and an adaptive Cartesian grid. We also consider regularisation schemes that minimise derivatives of the source (using two finite difference methods) and introduce a scheme that minimises deviations from an analytic source profile. Careful choice of gridding and regularisation can reduce "discreteness noise" in the $\\chi^2$ surface that is inherent in the pixel-based methodology. With a grid...
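The regularised pixel-based inversion described above can be illustrated on a toy 1-D problem. Everything here (the local blur operator standing in for the lens mapping, the noise level, the regularisation strength lam) is a hypothetical sketch of derivative-minimising regularisation, not the authors' code:

```python
import numpy as np

# Recover a pixelated "source" s from blurred, noisy data d = L s + n,
# with first-derivative (gradient) regularisation on the source.
rng = np.random.default_rng(0)

n = 40
s_true = np.exp(-0.5 * ((np.arange(n) - 18) / 4.0) ** 2)  # smooth source profile

# Stand-in for the lens mapping: a simple local blur matrix.
L = np.zeros((n, n))
for i in range(n):
    for j in range(max(0, i - 2), min(n, i + 3)):
        L[i, j] = 1.0 / (1 + abs(i - j))
d = L @ s_true + 0.01 * rng.standard_normal(n)

# First-difference operator D penalises gradients of the source.
D = np.diff(np.eye(n), axis=0)

lam = 0.1  # regularisation strength (assumption; in practice set statistically)
# Normal equations of  min ||L s - d||^2 + lam ||D s||^2
s_hat = np.linalg.solve(L.T @ L + lam * D.T @ D, L.T @ d)

print(float(np.max(np.abs(s_hat - s_true))))
```

Larger lam smooths the reconstruction more; choosing it too large biases the recovered source, which is one face of the regularisation trade-offs the paper quantifies.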

  12. A GIS-based time-dependent seismic source modeling of Northern Iran

    Science.gov (United States)

    Hashemi, Mahdi; Alesheikh, Ali Asghar; Zolfaghari, Mohammad Reza

    2017-01-01

    The first step in any seismic hazard study is the definition of seismogenic sources and the estimation of magnitude-frequency relationships for each source. There is as yet no standard methodology for source modeling and many researchers have worked on this topic. This study is an effort to define linear and area seismic sources for Northern Iran. The linear or fault sources are developed based on tectonic features and characteristic earthquakes while the area sources are developed based on spatial distribution of small to moderate earthquakes. Time-dependent recurrence relationships are developed for fault sources using renewal approach while time-independent frequency-magnitude relationships are proposed for area sources based on Poisson process. GIS functionalities are used in this study to introduce and incorporate spatial-temporal and geostatistical indices in delineating area seismic sources. The proposed methodology is used to model seismic sources for an area of about 500 by 400 square kilometers around Tehran. Previous researches and reports are studied to compile an earthquake/fault catalog that is as complete as possible. All events are transformed to uniform magnitude scale; duplicate events and dependent shocks are removed. Completeness and time distribution of the compiled catalog is taken into account. The proposed area and linear seismic sources in conjunction with defined recurrence relationships can be used to develop time-dependent probabilistic seismic hazard analysis of Northern Iran.
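As a rough illustration of the time-independent (Poissonian) side of such area-source models, the sketch below computes a maximum-likelihood Gutenberg-Richter b-value (Aki's estimator with the standard binning correction) and a Poisson probability of exceedance. The catalog and rate are made-up numbers, not data from this study:

```python
import math

# Aki maximum-likelihood b-value from a catalog above completeness Mc.
def b_value(magnitudes, mc, dm=0.1):
    m = [x for x in magnitudes if x >= mc]
    mean_m = sum(m) / len(m)
    # dm/2 corrects for magnitude binning
    return math.log10(math.e) / (mean_m - (mc - dm / 2))

# Poisson probability of at least one event of annual rate lam in t years.
def p_exceed(lam, t):
    return 1.0 - math.exp(-lam * t)

catalog = [4.1, 4.3, 4.0, 5.2, 4.6, 4.0, 4.8, 4.2, 4.1, 5.5]  # hypothetical
b = b_value(catalog, mc=4.0)
print(round(b, 2), round(p_exceed(0.05, 50), 2))  # → 0.82 0.92
```

The fault (renewal) sources would instead condition the hazard on the time elapsed since the last characteristic event, which the Poisson model deliberately ignores.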

  13. Volume-assisted estimation of liver function based on Gd-EOB-DTPA-enhanced MR relaxometry

    Energy Technology Data Exchange (ETDEWEB)

    Haimerl, Michael; Schlabeck, Mona; Verloh, Niklas; Fellner, Claudia; Stroszczynski, Christian; Wiggermann, Philipp [University Hospital Regensburg, Department of Radiology, Regensburg (Germany); Zeman, Florian [University Hospital Regensburg, Center for Clinical Trials, Regensburg (Germany); Nickel, Dominik [MR Applications Development, Siemens AG, Healthcare Sector, Erlangen (Germany); Barreiros, Ana Paula [University Hospital Regensburg, Department of Internal Medicine I, Regensburg (Germany); Loss, Martin [University Hospital Regensburg, Department of Surgery, Regensburg (Germany)

    2016-04-15

    To determine whether liver function as determined by indocyanine green (ICG) clearance can be estimated quantitatively from hepatic magnetic resonance (MR) relaxometry with gadoxetic acid (Gd-EOB-DTPA). One hundred and seven patients underwent an ICG clearance test and Gd-EOB-DTPA-enhanced MRI, including MR relaxometry at 3 Tesla. A transverse 3D VIBE sequence with an inline T1 calculation was acquired prior to and 20 minutes post-Gd-EOB-DTPA administration. The reduction rate of T1 relaxation time (rrT1) between pre- and post-contrast images and the liver volume-assisted index of T1 reduction rate (LVrrT1) were evaluated. The plasma disappearance rate of ICG (ICG-PDR) was correlated with the liver volume (LV), rrT1 and LVrrT1, providing an MRI-based estimated ICG-PDR value (ICG-PDR{sub est}). Simple linear regression model showed a significant correlation of ICG-PDR with LV (r = 0.32; p = 0.001), T1{sub post} (r = 0.65; p < 0.001) and rrT1 (r = 0.86; p < 0.001). Assessment of LV and consecutive evaluation of multiple linear regression model revealed a stronger correlation of ICG-PDR with LVrrT1 (r = 0.92; p < 0.001), allowing for the calculation of ICG-PDR{sub est}. Liver function as determined using ICG-PDR can be estimated quantitatively from Gd-EOB-DTPA-enhanced MR relaxometry. Volume-assisted MR relaxometry has a stronger correlation with liver function than does MR relaxometry. (orig.)
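The simple linear regression step described above can be illustrated with a Pearson correlation on toy numbers (the rrT1 and ICG-PDR values below are hypothetical, not the study's data):

```python
import math

# Pearson correlation coefficient between two paired samples.
def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

rr_t1 = [0.55, 0.60, 0.70, 0.75, 0.80, 0.85]    # hypothetical T1 reduction rates
icg_pdr = [10.0, 12.5, 16.0, 18.0, 21.0, 23.5]  # hypothetical %/min
print(round(pearson_r(rr_t1, icg_pdr), 3))
```

The study's multiple regression additionally weights the index by liver volume (LVrrT1), which is what raised r from 0.86 to 0.92.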

  14. A characteristic based volume penalization method for general evolution problems applied to compressible viscous flows

    Science.gov (United States)

    Brown-Dymkoski, Eric; Kasimov, Nurlybek; Vasilyev, Oleg V.

    2014-04-01

    In order to introduce solid obstacles into flows, several different methods are used, including volume penalization methods which prescribe appropriate boundary conditions by applying local forcing to the constitutive equations. One well known method is Brinkman penalization, which models solid obstacles as porous media. While it has been adapted for compressible, incompressible, viscous and inviscid flows, it is limited in the types of boundary conditions that it imposes, as are most volume penalization methods. Typically, approaches are limited to Dirichlet boundary conditions. In this paper, Brinkman penalization is extended for generalized Neumann and Robin boundary conditions by introducing hyperbolic penalization terms with characteristics pointing inward on solid obstacles. This Characteristic-Based Volume Penalization (CBVP) method is a comprehensive approach to conditions on immersed boundaries, providing for homogeneous and inhomogeneous Dirichlet, Neumann, and Robin boundary conditions on hyperbolic and parabolic equations. This CBVP method can be used to impose boundary conditions for both integrated and non-integrated variables in a systematic manner that parallels the prescription of exact boundary conditions. Furthermore, the method does not depend upon a physical model, as with porous media approach for Brinkman penalization, and is therefore flexible for various physical regimes and general evolutionary equations. Here, the method is applied to scalar diffusion and to direct numerical simulation of compressible, viscous flows. With the Navier-Stokes equations, both homogeneous and inhomogeneous Neumann boundary conditions are demonstrated through external flow around an adiabatic and heated cylinder. Theoretical and numerical examination shows that the error from penalized Neumann and Robin boundary conditions can be rigorously controlled through an a priori penalization parameter η. The error on a transient boundary is found to converge as O
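The classical Dirichlet form of Brinkman penalization that the CBVP method generalises can be sketched on the 1-D heat equation. The obstacle mask, wall value, and penalization parameter eta below are illustrative choices; the stiff penalty term is treated implicitly so the time step need not resolve eta:

```python
import numpy as np

# Inside the obstacle mask chi, the term -(chi/eta)*(u - u_wall) drives the
# solution toward the wall value; eta is the a priori penalization parameter
# that controls the boundary error.
nx = 100
dx = 1.0 / (nx - 1)
x = np.linspace(0.0, 1.0, nx)
chi = ((x > 0.4) & (x < 0.6)).astype(float)  # obstacle occupies 0.4 < x < 0.6
u_wall, eta = 1.0, 1e-3

u = np.zeros(nx)          # initial field; ends stay at the outer Dirichlet value 0
dt = 0.2 * dx**2          # explicit diffusion stability limit
for _ in range(20000):
    lap = np.zeros(nx)
    lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
    # implicit treatment of the stiff penalization term
    u = (u + dt * lap + dt * chi / eta * u_wall) / (1 + dt * chi / eta)

print(float(u[nx // 2]))  # value at the obstacle centre, close to u_wall
```

The CBVP extension replaces this algebraic relaxation with hyperbolic terms whose characteristics point into the obstacle, which is what makes Neumann and Robin conditions expressible.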

  15. Animation framework using volume visualization

    Science.gov (United States)

    Fang, Wenxuan; Wang, Hongli

    2004-03-01

    With the development of computer graphics, scientific visualization, and advanced imaging scanner and sensor technology, making high quality animations from volume data sets has become a challenge in industry. A simple animation framework using current volume visualization techniques is proposed in this paper. The framework consists of two pipelines: one is a surface based method using the marching cubes algorithm, the other is a volume rendering method using the shear-warp method. The volume visualization results can not only be used as key frame sources in animation making, but can also be used directly as animation when the volume visualization is in stereoscopic mode. The proposed framework can be applied in fields such as medical education, film-making and archaeology.

  16. FEASIBILITY STUDY II OF A MUON BASED NEUTRINO SOURCE.

    Energy Technology Data Exchange (ETDEWEB)

    GALLARDO,J.C.; OZAKI,S.; PALMER,R.B.; ZISMAN,M.

    2001-06-30

    The concept of using a muon storage ring to provide a well characterized beam of muon and electron neutrinos (a Neutrino Factory) has been under study for a number of years now at various laboratories throughout the world. The physics program of a Neutrino Factory is focused on the relatively unexplored neutrino sector. In conjunction with a detector located a suitable distance from the neutrino source, the facility would make valuable contributions to the study of neutrino masses and lepton mixing. A Neutrino Factory is expected to improve the measurement accuracy of sin{sup 2}(2{theta}{sub 23}) and {Delta}m{sup 2}{sub 32} and provide measurements of sin{sup 2}(2{theta}{sub 13}) and the sign of {Delta}m{sup 2}{sub 32}. It may also be able to measure CP violation in the lepton sector.

  17. A novel convolution-based approach to address ionization chamber volume averaging effect in model-based treatment planning systems

    Science.gov (United States)

    Barraclough, Brendan; Li, Jonathan G.; Lebron, Sharon; Fan, Qiyong; Liu, Chihray; Yan, Guanghua

    2015-08-01

    The ionization chamber volume averaging effect is a well-known issue without an elegant solution. The purpose of this study is to propose a novel convolution-based approach to address the volume averaging effect in model-based treatment planning systems (TPSs). Ionization chamber-measured beam profiles can be regarded as the convolution between the detector response function and the implicit real profiles. Existing approaches address the issue by trying to remove the volume averaging effect from the measurement. In contrast, our proposed method imports the measured profiles directly into the TPS and addresses the problem by reoptimizing pertinent parameters of the TPS beam model. In the iterative beam modeling process, the TPS-calculated beam profiles are convolved with the same detector response function. Beam model parameters responsible for the penumbra are optimized to drive the convolved profiles to match the measured profiles. Since the convolved and the measured profiles are subject to identical volume averaging effect, the calculated profiles match the real profiles when the optimization converges. The method was applied to reoptimize a CC13 beam model commissioned with profiles measured with a standard ionization chamber (Scanditronix Wellhofer, Bartlett, TN). The reoptimized beam model was validated by comparing the TPS-calculated profiles with diode-measured profiles. Its performance in intensity-modulated radiation therapy (IMRT) quality assurance (QA) for ten head-and-neck patients was compared with the CC13 beam model and a clinical beam model (manually optimized, clinically proven) using standard Gamma comparisons. The beam profiles calculated with the reoptimized beam model showed excellent agreement with diode measurement at all measured geometries. Performance of the reoptimized beam model was comparable with that of the clinical beam model in IMRT QA. 
The average passing rates using the reoptimized beam model increased substantially from 92.1% to
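The convolution-based reoptimization idea can be sketched as follows. The error-function field edge, the 6 mm uniform detector response, and the brute-force grid search below stand in for the real beam model and optimizer and are assumptions of this illustration:

```python
from math import erf

import numpy as np

x = np.linspace(-20.0, 20.0, 801)  # off-axis position, mm

def profile(sigma):
    """Idealised field-edge profile; sigma models the penumbra width."""
    return np.array([0.5 * (1 - erf(xi / (sigma * np.sqrt(2)))) for xi in x])

# Detector response: uniform averaging over a ~6 mm chamber cavity.
dx = x[1] - x[0]
half = int(round(3.0 / dx))
kernel = np.ones(2 * half + 1) / (2 * half + 1)

def convolved(p):
    return np.convolve(p, kernel, mode="same")

sigma_true = 3.0
measured = convolved(profile(sigma_true))  # what the chamber reports

# Reoptimize sigma by matching the *convolved* model to the measured profile:
# both sides carry the same volume averaging, so the fit recovers sigma_true.
sigmas = np.arange(1.0, 6.01, 0.05)
errors = [np.sum((convolved(profile(s)) - measured) ** 2) for s in sigmas]
sigma_fit = float(sigmas[int(np.argmin(errors))])
print(round(sigma_fit, 2))  # → 3.0
```

Matching in the convolved domain is the key trick: no deconvolution of the noisy measurement is ever attempted.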

  18. Restraint of appetite and reduced regional brain volumes in anorexia nervosa: a voxel-based morphometric study

    Directory of Open Access Journals (Sweden)

    Brooks Samantha J

    2011-11-01

    Full Text Available Abstract Background Previous Magnetic Resonance Imaging (MRI) studies of people with anorexia nervosa (AN) have shown differences in brain structure. This study aimed to provide preliminary extensions of this data by examining how different levels of appetitive restraint impact on brain volume. Methods Voxel based morphometry (VBM), corrected for total intracranial volume, age, BMI, and years of education, in 14 women with AN (8 RAN and 6 BPAN) and 21 women (HC) was performed. Correlations between brain volume and dietary restraint were done using the Statistical Package for the Social Sciences (SPSS). Results Increased right dorsolateral prefrontal cortex (DLPFC) and reduced right anterior insular cortex, bilateral parahippocampal gyrus, left fusiform gyrus, left cerebellum and right posterior cingulate volumes in AN compared to HC. RAN compared to BPAN had reduced left orbitofrontal cortex, right anterior insular cortex, bilateral parahippocampal gyrus and left cerebellum. Age negatively correlated with right DLPFC volume in HC but not in AN; dietary restraint and BMI predicted 57% of variance in right DLPFC volume in AN. Conclusions In AN, brain volume differences were found in appetitive, somatosensory and top-down control brain regions. Differences in regional GMV may be linked to levels of appetitive restraint, but whether they are state or trait is unclear. Nevertheless, these discrete brain volume differences provide candidate brain regions for further structural and functional study in people with eating disorders.

  19. Dyslexia and voxel-based morphometry: correlations between five behavioural measures of dyslexia and gray and white matter volumes.

    Science.gov (United States)

    Tamboer, Peter; Scholte, H Steven; Vorst, Harrie C M

    2015-10-01

    In voxel-based morphometry studies of dyslexia, the relation between causal theories of dyslexia and gray matter (GM) and white matter (WM) volume alterations is still under debate. Some alterations are consistently reported, but others failed to reach significance. We investigated GM alterations in a large sample of Dutch students (37 dyslexics and 57 non-dyslexics) with two analyses: group differences in local GM and total GM and WM volume and correlations between GM and WM volumes and five behavioural measures. We found no significant group differences after corrections for multiple comparisons although total WM volume was lower in the group of dyslexics when age was partialled out. We presented an overview of uncorrected clusters of voxels (p  200) with reduced or increased GM volume. We found four significant correlations between factors of dyslexia representing various behavioural measures and the clusters found in the first analysis. In the whole sample, a factor related to performances in spelling correlated negatively with GM volume in the left posterior cerebellum. Within the group of dyslexics, a factor related to performances in Dutch-English rhyme words correlated positively with GM volume in the left and right caudate nucleus and negatively with increased total WM volume. Most of our findings were in accordance with previous reports. A relatively new finding was the involvement of the caudate nucleus. We confirmed the multiple cognitive nature of dyslexia and suggested that experience greatly influences anatomical alterations depending on various subtypes of dyslexia, especially in a student sample.

  20. Restraint of appetite and reduced regional brain volumes in anorexia nervosa: a voxel-based morphometric study.

    Science.gov (United States)

    Brooks, Samantha J; Barker, Gareth J; O'Daly, Owen G; Brammer, Michael; Williams, Steven C R; Benedict, Christian; Schiöth, Helgi B; Treasure, Janet; Campbell, Iain C

    2011-11-17

    Previous Magnetic Resonance Imaging (MRI) studies of people with anorexia nervosa (AN) have shown differences in brain structure. This study aimed to provide preliminary extensions of this data by examining how different levels of appetitive restraint impact on brain volume. Voxel based morphometry (VBM), corrected for total intracranial volume, age, BMI, years of education in 14 women with AN (8 RAN and 6 BPAN) and 21 women (HC) was performed. Correlations between brain volume and dietary restraint were done using Statistical Package for the Social Sciences (SPSS). Increased right dorsolateral prefrontal cortex (DLPFC) and reduced right anterior insular cortex, bilateral parahippocampal gyrus, left fusiform gyrus, left cerebellum and right posterior cingulate volumes in AN compared to HC. RAN compared to BPAN had reduced left orbitofrontal cortex, right anterior insular cortex, bilateral parahippocampal gyrus and left cerebellum. Age negatively correlated with right DLPFC volume in HC but not in AN; dietary restraint and BMI predicted 57% of variance in right DLPFC volume in AN. In AN, brain volume differences were found in appetitive, somatosensory and top-down control brain regions. Differences in regional GMV may be linked to levels of appetitive restraint, but whether they are state or trait is unclear. Nevertheless, these discrete brain volume differences provide candidate brain regions for further structural and functional study in people with eating disorders.

  1. A Latent Source Model for Patch-Based Image Segmentation.

    Science.gov (United States)

    Chen, George H; Shah, Devavrat; Golland, Polina

    2015-10-01

    Despite the popularity and empirical success of patch-based nearest-neighbor and weighted majority voting approaches to medical image segmentation, there has been no theoretical development on when, why, and how well these nonparametric methods work. We bridge this gap by providing a theoretical performance guarantee for nearest-neighbor and weighted majority voting segmentation under a new probabilistic model for patch-based image segmentation. Our analysis relies on a new local property for how similar nearby patches are, and fuses existing lines of work on modeling natural imagery patches and theory for nonparametric classification. We use the model to derive a new patch-based segmentation algorithm that iterates between inferring local label patches and merging these local segmentations to produce a globally consistent image segmentation. Many existing patch-based algorithms arise as special cases of the new algorithm.
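A minimal sketch of the weighted majority voting baseline discussed above, with Gaussian patch weights on synthetic two-class patches (illustrative only; not the paper's algorithm or data):

```python
import numpy as np

# Each training patch votes for its label with weight
# exp(-||patch - query||^2 / (2 h^2)); the label with the larger
# total weight wins (binary case).
def weighted_vote(query, train_patches, train_labels, h=1.0):
    d2 = np.sum((train_patches - query) ** 2, axis=1)
    w = np.exp(-d2 / (2 * h * h))
    score1 = np.sum(w[train_labels == 1])
    score0 = np.sum(w[train_labels == 0])
    return 1 if score1 > score0 else 0

rng = np.random.default_rng(1)
# Two patch populations: dark background (label 0) and bright organ (label 1).
bg = rng.normal(0.2, 0.05, size=(50, 9))   # 3x3 patches, flattened
fg = rng.normal(0.8, 0.05, size=(50, 9))
patches = np.vstack([bg, fg])
labels = np.array([0] * 50 + [1] * 50)

print(weighted_vote(np.full(9, 0.75), patches, labels))  # → 1
```

The paper's contribution is a probabilistic model under which this kind of voting provably works, plus an iterative algorithm that makes the per-pixel votes globally consistent.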

  2. The application of large volume airgun sources to the onshore-offshore seismic surveys: implication of the experimental results in northern South China Sea

    Institute of Scientific and Technical Information of China (English)

    QIU XueLin; CHEN Yong; ZHU RiXiang; XU HuiLong; SHI XiaoBin; YE ChunMing; ZHAO MingHui; XIA ShaoHong

    2007-01-01

    Onshore-offshore seismic experiments were carried out for the first time in the northern South China Sea using large volume airgun sources at sea and seismic stations on land. The experimental results indicate that seismic signals from the new airgun array of R/V Shiyan 2 can be detected as far as 255 km away. The signal effective area reaches nearly 50000 km2, which covers Hong Kong and the Pearl River Delta. Compared with the old airgun array, the signal amplitude, propagation distance and effective area of the new airgun array have increased notably, which demonstrates that the upgrade of the airgun source was successful. Comparisons with previous experimental results in other regions show that the shooting effect of the new airgun array is comparable to the best airgun sources in the world. In particular, using permanent onshore seismic stations to record long-distance offshore airgun signals is a new breakthrough, which has great significance for the realization of the "seismic radar" concept and for 3D seismic surveys of crustal structure in coastal areas.

  3. Beacon system based on light-emitting diode sources for runways lighting

    Science.gov (United States)

    Montes, Mario González; Vázquez, Daniel; Fernandez-Balbuena, Antonio A.; Bernabeu, Eusebio

    2014-06-01

    New aeronautical ground lighting techniques are becoming increasingly important to ensure safety and reduce the maintenance costs of airport runways. Until recently, runways had embedded lighting systems whose sources were based on incandescent lamps. But incandescent lamps have several disadvantages: high energy consumption and frequent breakdowns that result in high maintenance costs (lamp average lifetime is ˜1500 operating hours), and the lamp technology lacks new lighting functions, such as signal handling and modification. To solve these problems, the industry has developed systems based on light-emitting diode (LED) technology with improved features: (1) LED lighting consumes one tenth the power, (2) it improves preventive maintenance (an LED's lifetime ranges between 25,000 and 100,000 hours), and (3) LED lighting technology can be controlled remotely according to the needs of the runway configuration. LEDs have been in use for more than three decades, but only recently, around 2002, have they begun to be used as visual aids, representing the greatest potential change for airport lighting since its inception in the 1920s. Currently, embedded LED systems are not broadly used due to the specific constraints of airport rules and regulations (beacon dimensions, power system technology, etc.). The fundamental requirements for embedded lighting systems are to fit into a volume whose dimensions are usually critical and to integrate all the components essential for operation. An embedded architecture that meets the lighting regulations for airport runways is presented. The present work is divided into three main tasks: development of an optical system to optimize lighting according to the International Civil Aviation Organization, manufacturing of a prototype, and model validation.

  4. Modifications to ORIGEN2 for generating N Reactor source terms. Volume 3: ORIGEN2 N-Reactor output files

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-06-01

    This text is intended to be a brief outline of the ORIGEN2 computer code which is a revised and updated version of the ORIGEN documented in report ORNL-4628 (May 1973). Included here are: a brief description of the functions of ORIGEN2; a listing of the major data sources; a listing of the published documentation concerning ORIGEN2; and an outline of the ORIGEN2 output organization. ORIGEN2 is available from the ORNL Radiation Shielding Information Center (RSIC). Past experience has indicated that many users encounter considerable difficulty in finding the desired information in an ORIGEN2 output which is sometimes rather massive. This section is intended as a brief outline of the organization of ORIGEN2 output.

  5. Modification to ORIGEN2 for generating N Reactor source terms. Volume 2: ORIGEN2 N-Reactor output files

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-06-01

    This text is intended to be a brief outline of the ORIGEN2 computer code, which is a revised and updated version of the ORIGEN documented in report ORNL-4628 (May 1973). Included here are: a brief description of the functions of ORIGEN2; a listing of the major data sources; a listing of the published documentation concerning ORIGEN2; and an outline of the ORIGEN2 output organization. ORIGEN2 is available from the ORNL Radiation Shielding Information Center (RSIC). Past experience has indicated that many users encounter considerable difficulty in finding the desired information in an ORIGEN2 output which is sometimes rather massive. This section is intended as a brief outline of the organization of ORIGEN2 output.

  7. Gray Matter Volume Decreases in Elderly Patients with Schizophrenia: A Voxel-based Morphometry Study

    Science.gov (United States)

    Schuster, Caroline; Schuller, Anne Marie; Paulos, Carlos; Namer, Izzie; Pull, Charles; Danion, Jean Marie; Foucher, Jack René

    2012-01-01

    Background: Aged patients (>50 years old) with residual schizophrenic symptoms differ from young patients. They represent a subpopulation with a more unfavorable Kraepelinian course and have an increased risk (up to 30%) for dementia of unknown origin. However, our current understanding of age-related brain changes in schizophrenia is derived from studies that included less than 17% of patients who were older than 50 years of age. This study investigated the anatomical distribution of gray matter (GM) brain deficits in aged patients with ongoing schizophrenia. Methods: Voxel-based morphometry was applied to 3D-T1 magnetic resonance images obtained from 27 aged patients with schizophrenia (mean age of 60 years) and 40 age-matched normal controls. Results: Older patients with schizophrenia showed a bilateral reduction of GM volume in the thalamus, the prefrontal cortex, and in a large posterior region centered on the occipito-temporo-parietal junction. Only the latter region showed accelerated GM volume loss with increasing age. None of these results could be accounted for by institutionalization, antipsychotic medication, or cognitive scores. Conclusions: This study replicated most common findings in patients with schizophrenia with regard to thalamic and frontal GM deficits. However, it uncovered an unexpected large region of GM atrophy in the posterior tertiary cortices. The latter observation may be specific to this aged and chronically symptomatic subpopulation, as atrophy in this region is rarely reported in younger patients and is accelerated with age. PMID:21205677

  8. Robins Air Force Base integrated resource assessment. Volume 3, Resource assessment

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, G.P.; Keller, J.M.; Stucky, D.J.; Wahlstrom, R.R.; Larson, L.L.

    1993-10-01

    The US Air Force Materiel Command (AFMC) has tasked the US Department of Energy (DOE) Federal Energy Management Program (FEMP), supported by the Pacific Northwest Laboratory (PNL), to identify, evaluate, and assist in acquiring all cost-effective energy projects at Robins Air Force Base (AFB). This is part of a model program that PNL is designing to support energy-use decisions in the federal sector. This report provides the results of the fossil fuel and electric energy resource opportunity (ERO) assessments performed by PNL at the AFMC Robins AFB facility located approximately 15 miles south of Macon, Georgia. It is a companion report to Volume 1, Executive Summary, and Volume 2, Baseline Detail. The results of the analyses of EROs are presented in 13 common energy end-use categories (e.g., boilers and furnaces, service hot water, and building lighting). A narrative description of each ERO is provided, including information on the installed cost and energy and dollar savings; impacts on operation and maintenance (O&M); and, when applicable, a discussion of energy supply and demand, energy security, and environmental issues. A description of the evaluation methodologies and technical and cost assumptions is also provided for each ERO. Summary tables present the cost-effectiveness of energy end-use equipment before and after the implementation of each ERO and present the results of the life-cycle cost (LCC) analysis indicating the net present value (NPV) and savings to investment ratio (SIR) of each ERO.

  9. SpringSaLaD: A Spatial, Particle-Based Biochemical Simulation Platform with Excluded Volume.

    Science.gov (United States)

    Michalski, Paul J; Loew, Leslie M

    2016-02-02

    We introduce Springs, Sites, and Langevin Dynamics (SpringSaLaD), a comprehensive software platform for spatial, stochastic, particle-based modeling of biochemical systems. SpringSaLaD models biomolecules in a coarse-grained manner as a group of linked spherical sites with excluded volume. This mesoscopic approach bridges the gap between highly detailed molecular dynamics simulations and the various methods used to study network kinetics and diffusion at the cellular level. SpringSaLaD is a standalone tool that supports model building, simulation, visualization, and data analysis, all through a user-friendly graphical user interface that should make it more accessible than tools built into more comprehensive molecular dynamics infrastructures. Importantly, for bimolecular reactions we derive an exact expression relating the macroscopic on-rate to the various microscopic parameters with the inclusion of excluded volume; this makes SpringSaLaD more accurate than other tools, which rely on approximate relationships between these parameters.

  10. Streaming Model Based Volume Ray Casting Implementation for Cell Broadband Engine

    Directory of Open Access Journals (Sweden)

    Jusub Kim

    2009-01-01

    Interactive, high-quality volume rendering is becoming increasingly important as the amount of complex volumetric data steadily grows. While a number of volume rendering techniques are widely used, ray casting has been recognized as an effective approach for generating high-quality visualizations. However, for most users, ray casting has been limited to very small datasets because of its high demands on computational power and memory bandwidth. The recent introduction of the Cell Broadband Engine (Cell B.E.) processor, which consists of 9 heterogeneous cores designed to handle extremely demanding computations with large streams of data, provides an opportunity to put ray casting into practical use. In this paper, we introduce an efficient parallel implementation of volume ray casting on the Cell B.E. The implementation is designed to take full advantage of the computational power and memory bandwidth of the Cell B.E. through an intricate orchestration of the ray casting computation on the available heterogeneous resources. Specifically, we introduce streaming-model-based schemes and techniques to efficiently implement acceleration techniques for ray casting on the Cell B.E. In addition to ensuring effective SIMD utilization, our method provides two key benefits: there is no cost for empty space skipping, and there is no memory bottleneck in moving volumetric data for processing. Our experimental results show that we can interactively render practical datasets on a single Cell B.E. processor.
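
    The acceleration techniques mentioned above hinge on front-to-back compositing: once a ray's accumulated opacity saturates, the remaining samples cannot contribute and can be skipped. A minimal sketch of this idea (a generic illustration, not the paper's Cell B.E. implementation; the function name is hypothetical):

```python
def composite_ray(samples, early_term=0.99):
    """Front-to-back alpha compositing with early ray termination.

    samples: (color, opacity) pairs ordered front to back along one ray.
    """
    color, alpha = 0.0, 0.0
    for c, a in samples:
        color += (1.0 - alpha) * a * c   # contribution attenuated by material in front
        alpha += (1.0 - alpha) * a
        if alpha >= early_term:          # early ray termination: skip remaining samples
            break
    return color, alpha
```

    The sketch shows only the per-ray logic; on the Cell B.E. such loops would additionally be vectorized and streamed across the heterogeneous cores.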

  11. Patrick Air Force Base integrated resource assessment. Volume 3, Resource assessment

    Energy Technology Data Exchange (ETDEWEB)

    Sandusky, W.F.; Parker, S.A.; King, D.A.; Wahlstrom, R.R.; Elliott, D.B.; Shankle, S.A.

    1993-12-01

    The US Air Force has tasked the Pacific Northwest Laboratory (PNL), in support of the US Department of Energy Federal Energy Management Program, to identify, evaluate, and assist in acquiring all cost-effective energy projects at Patrick Air Force Base (AFB). This is part of a model program that PNL is designing to support energy-use decisions in the federal sector. This report provides the results of the fossil fuel and electric energy resource opportunity (ERO) assessments performed by PNL at Patrick AFB, which is located south of Cocoa Beach, Florida. It is a companion report to Volume 1, Executive Summary, and Volume 2, Baseline Detail. The results of the analyses of EROs are presented in 11 common energy end-use categories. A narrative description of each ERO is provided, including information on the installed cost; energy and dollar savings; impacts on operations and maintenance; and, when applicable, a discussion of energy supply and demand, energy security, and environmental issues. A description of the evaluation methodologies and technical and cost assumptions is also provided for each ERO. Summary tables present the cost-effectiveness of energy end-use equipment before and after the implementation of each ERO and present the results of the life-cycle cost analysis, indicating the net present value and value index of each ERO.

  12. Precise measurement of liquid petroleum tank volume based on data cloud analysis

    Science.gov (United States)

    Wang, Jintao; Liu, Ziyong; Zhang, Long; Guo, Ligong; Bao, Xuesong; Tong, Lin

    2010-08-01

    Metal tanks are generally used for the measurement of liquid petroleum products in fiscal or custody transfer applications. A precise tank volume measurement method based on the analysis of a data cloud acquired by laser scanning was studied. Distance measurement by laser phase shift and angular measurement by optical grating were applied to acquire the coordinates of points on the tank shell under the control of a servo system. The Direct Iterative Method (DIM) and the Section Area Method (SAM) were used to process the measured data for vertical and horizontal tanks, respectively. In a comparison experiment, one 1000 m3 vertical tank and one 30 m3 horizontal tank were used as test objects. In the vertical tank experiment, the largest measured radius difference between the new laser method and the strapping method (the international arbitrary standard) was 2.8 mm. In the horizontal tank experiment, the calibration result from the laser scanning method was closer to the reference than that of the manual geometric method; the mean deviations over the full-scale range for the former and latter methods were 75 L and 141 L, respectively. With increasing liquid level, the relative errors of both methods became smaller, with mean relative errors of 0.6% and 1.5%, respectively. The method discussed improves the efficiency of tank volume calibration.
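
    For the horizontal tank, the Section Area Method amounts to integrating the liquid cross-section area along the tank axis. A sketch under the simplifying assumption of a perfectly cylindrical shell (in a real calibration each section would use its own scanned radius; the function names are illustrative):

```python
import math

def segment_area(R, h):
    """Area of the liquid cross-section (a circular segment) at fill height h."""
    h = min(max(h, 0.0), 2.0 * R)
    return R * R * math.acos((R - h) / R) - (R - h) * math.sqrt(2.0 * R * h - h * h)

def horizontal_tank_volume(R, L, h, n=1000):
    """Section Area Method: sum section area times slice thickness along the axis."""
    dx = L / n
    # A scanned tank would use the measured radius of each section; here R is constant.
    return sum(segment_area(R, h) * dx for _ in range(n))
```

    With a constant radius the sum reduces to area times length, so the sketch can be checked against the exact cylinder volume at full and half fill.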

  13. You save money when you buy in bulk: does volume-based pricing cause people to buy more beer?

    Science.gov (United States)

    Bray, Jeremy W; Loomis, Brett R; Engelen, Mark

    2009-05-01

    This paper uses supermarket scanner data to estimate brand- and packaging-specific own- and cross-price elasticities for beer. We find that brand- and packaging-specific beer sales are highly price elastic. Cross-price elasticity estimates suggest that individuals are more likely to buy a higher-volume package of the same brand of beer than they are to switch brands. Policy simulations suggest that regulation of volume-based price discounts is potentially more effective than a tax increase at reducing beer consumption. Our results suggest that volume-based price discounting induces people to buy larger-volume packages of beer and may lead to increased overall beer consumption. (c) 2008 John Wiley & Sons, Ltd.
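
    The own-price elasticities reported here are, in essence, slopes of log quantity against log price. A toy sketch on hypothetical noise-free scanner data (the paper's actual demand model also controls for brand, packaging, and cross-prices):

```python
import numpy as np

# Hypothetical scanner observations: unit price and cases sold per week.
price = np.array([8.0, 9.0, 10.0, 11.0, 12.0])
quantity = 5000.0 * price ** -1.8     # demand curve built with elasticity -1.8

# Own-price elasticity = d log(q) / d log(p): the slope of a log-log fit.
elasticity, _ = np.polyfit(np.log(price), np.log(quantity), 1)
```

    An elasticity below -1, as estimated in the paper, means a price cut raises revenue-weighted sales more than proportionally, which is why volume discounts can increase total consumption.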

  14. Fixed-point blind source separation algorithm based on ICA

    Institute of Scientific and Technical Information of China (English)

    Hongyan LI; Jianfen MA; Deng'ao LI; Huakui WANG

    2008-01-01

    This paper introduces a fixed-point learning algorithm based on independent component analysis (ICA); the model and process of this algorithm and simulation results are presented. Kurtosis was adopted as the estimation criterion of independence. The experimental results show that, compared with the traditional ICA algorithm based on random gradients, this algorithm converges faster and requires no dynamic parameters. The algorithm is a highly efficient and reliable method for blind signal separation.
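
    A fixed-point ICA iteration of the kind described, using a kurtosis contrast on whitened data, can be sketched as follows (a simplified two-source demonstration in the style of FastICA, not the authors' code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent non-Gaussian sources: a binary sign signal and a uniform signal.
n = 20000
s = np.vstack([np.sign(rng.standard_normal(n)),
               rng.uniform(-1.0, 1.0, n)])
A = np.array([[1.0, 0.6], [0.4, 1.0]])    # mixing matrix (unknown in practice)
x = A @ s                                  # observed mixtures

# Whitening: zero mean, identity covariance.
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))
z = (E / np.sqrt(d)) @ E.T @ x

# Fixed-point iteration with a kurtosis contrast: w <- E[z (w.z)^3] - 3w.
w = rng.standard_normal(2)
w /= np.linalg.norm(w)
for _ in range(200):
    w_new = (z * (w @ z) ** 3).mean(axis=1) - 3.0 * w
    w_new /= np.linalg.norm(w_new)
    converged = abs(abs(w_new @ w) - 1.0) < 1e-12
    w = w_new
    if converged:
        break

y = w @ z   # recovered component (matches one source up to sign and scale)
```

    Unlike gradient-based ICA, the update has no learning-rate parameter, which is the "no dynamic parameters" advantage the abstract refers to.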

  15. Dynamically reconfigurable directionality of plasmon-based single photon sources

    CERN Document Server

    Chen, Yuntian; Koenderink, A Femius

    2010-01-01

    We propose a plasmon-based reconfigurable antenna to controllably distribute emission from single quantum emitters in spatially separated channels. Our calculations show that crossed particle arrays can split the stream of photons from a single emitter into multiple narrow beams. We predict that beams can be switched on and off by switching host refractive index. The design method is based on engineering the dispersion relations of plasmon chains and is generally applicable to traveling wave antennas. Controllable photon delivery has potential applications in classical and quantum communication.

  16. Dynamically reconfigurable directionality of plasmon-based single photon sources

    DEFF Research Database (Denmark)

    Chen, Yuntian; Lodahl, Peter; Koenderink, A. Femius

    2010-01-01

    We propose a plasmon-based reconfigurable antenna to controllably distribute emission from single quantum emitters in spatially separated channels. Our calculations show that crossed particle arrays can split the stream of photons from a single emitter into multiple narrow beams. We predict that beams can be switched on and off by switching the host refractive index. The design method is based on engineering the dispersion relations of plasmon chains and is generally applicable to traveling wave antennas. Controllable photon delivery has potential applications in classical and quantum communication.

  17. Heat and power sources based on nuclear shipbuilding technologies

    Energy Technology Data Exchange (ETDEWEB)

    Veshnyakov, K.; Fadeev, Y.; Panov, Y.; Polunichev, V. [JSC Afrikantov OKBM, Nizhny Novgorod (Russian Federation)

    2009-07-01

    The report gives information on the application of power units with small-power nuclear reactors as advanced energy sources to provide world consumers with electric power, domestic and industrial heat, and fresh water. The report describes the technical concept of the ABV unified reactor plant (RP) for floating and ground small power plants (SPPs) developed at JSC 'Afrikantov OKBM'. The report contains the technical specification of the ABV RP, which utilizes an integral water-cooled reactor with a thermal power of 38 to 45 MW, natural coolant circulation, and improved inherent safety, as well as the main characteristics of the reactor and core fuel ensuring acceptable mobility of the RP and NPP as a whole. The indicated refueling interval is 10-12 years. The report gives a detailed description of the concept for RP safety provision and compliance with international radiation and nuclear safety requirements, as well as a description of the passive and other safety systems ensuring stability against low-probability internal events, personnel errors, and external impacts. The report provides data on the application and technological properties of floating and ground SPPs with a unified ABV RP: the absence of spent fuel and radioactive waste at floating nuclear power plants (FNPPs); FNPP transportation to consumers in a ready-to-operate state; and arrangement, operation, and disposal requirements.

  18. Alternative current source based Schottky contact with additional electric field

    Science.gov (United States)

    Mamedov, R. K.; Aslanova, A. R.

    2017-07-01

    An additional electric field (AEF) arises in Schottky contacts (SCs), covering the peripheral region of wide contacts and the complete region of narrow contacts (as in TMBS diodes). Under the influence of the AEF, free electrons generated in the semiconductor at a given temperature are redistributed and a space charge region (SCR) is formed. The superposition of the SCR field and the AEF produces a resulting electric field (REF). The REF is distributed along a straight line perpendicular to the contact surface, so that its intensity (and potential) has a minimum value at the metal surface and a maximum value deep within the SCR, far from the metal surface. Under the influence of the AEF, acting as an extraneous force, the metal becomes the negative pole and the semiconductor the positive pole; an SC with an AEF therefore becomes an alternative current source (ACS). Ni-nSi SCs with different diameters (20-1000 μm) under the influence of the AEF have become ACSs with electromotive forces on the order of 0.1-1.0 mV, generating electric currents in the range of 10^-9 to 10^-7 A through an external resistance of 1000 Ohm.

  19. Adaptive Source Localization Based Station Keeping of Autonomous Vehicles

    KAUST Repository

    Guler, Samet

    2016-10-26

    We study the problem of driving a mobile sensory agent to a target whose location is specified only in terms of the distances to a set of sensor stations, or beacons. The beacon positions are unknown, but the agent can continuously measure its distances to them as well as its own position. This problem has two particular applications: (1) capturing a target signal source whose distances to the beacons are measured by these beacons and broadcast to a surveillance agent; (2) merging a single agent into an autonomous multi-agent system so that the new agent is positioned at desired distances from the existing agents. The problem is solved using an adaptive control framework integrating a parameter estimator, which produces beacon location estimates, and an adaptive motion control law fed by these estimates to steer the agent toward the target. For location estimation, a least-squares adaptive law is used. The motion control law aims to minimize a convex cost function with a unique minimizer at the target location, and is further augmented for persistence of excitation. Stability and convergence analysis is provided, as well as simulation results demonstrating performance and transient behavior.
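
    The distance-based localization at the core of this problem is easiest to see in the static case with known beacon positions, where it reduces to a linear least-squares problem; the paper's adaptive estimator effectively performs such an estimate recursively while the beacon positions themselves are still being estimated. A noise-free sketch with hypothetical coordinates:

```python
import numpy as np

# Hypothetical beacon positions (known here for illustration) and a target.
beacons = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
target = np.array([3.0, 7.0])
d = np.linalg.norm(beacons - target, axis=1)    # noise-free range measurements

# Subtracting ||p - b_1||^2 = d_1^2 from the other range equations linearizes them:
#   2 (b_i - b_1) . p = d_1^2 - d_i^2 + ||b_i||^2 - ||b_1||^2
A = 2.0 * (beacons[1:] - beacons[0])
b = (d[0] ** 2 - d[1:] ** 2
     + np.sum(beacons[1:] ** 2, axis=1) - np.sum(beacons[0] ** 2))
p_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
```

    With noisy ranges the same least-squares solve yields the best linear estimate, which is what a recursive adaptive law refines over time.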

  20. A preference-based multiple-source rough set model

    NARCIS (Netherlands)

    M.A. Khan; M. Banerjee

    2010-01-01

    We propose a generalization of Pawlak’s rough set model for the multi-agent situation, where information from an agent can be preferred over that of another agent of the system while deciding membership of objects. Notions of lower/upper approximations are given which depend on the knowledge base of

  1. Export Control and the U.S. Defense Industrial Base - Revised. Volume 1: Summary Report and Volume 2: Appendices

    Science.gov (United States)

    2008-10-01

    in China. 6 Major microelectronics firms based in several countries—Motorola, Intel, Samsung, Toshiba, TSMC and others—are undertaking Chinese...batches of parts were obsolete. Lean Manufacturing called such machines “monuments,” the worst possible way to manage inventory. In their place, Toyota

  2. Density and molar volumes of imidazolium-based ionic liquid mixtures and prediction by the Jouyban-Acree model

    Science.gov (United States)

    Ghani, Noraini Abd; Sairi, Nor Asrina; Mat, Ahmad Nazeer Che; Khoubnasabjafari, Mehry; Jouyban, Abolghasem

    2016-11-01

    The densities of mixtures of an imidazolium-based ionic liquid, 1-ethyl-3-methylimidazolium diethylphosphate, with sulfolane were measured at atmospheric pressure. The experiments were performed at T = (293-343) K over the complete mole fraction range. Physical and thermodynamic properties such as molar volumes, V0, and excess molar volumes, VE, for these binary mixtures were derived from the experimental density data. The Jouyban-Acree model was used to correlate the physicochemical properties (PCPs) of the binary mixtures at various mole fractions and temperatures.
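
    The excess molar volume referred to here is the measured molar volume minus the ideal-mixing value, V^E = (x1 M1 + x2 M2)/rho_mix - (x1 M1/rho1 + x2 M2/rho2). A sketch with illustrative placeholder numbers (the molar masses are approximate and the densities are assumed, not the paper's measurements):

```python
# Illustrative placeholder values (not the paper's measurements).
M1, M2 = 264.26, 120.17        # approx. molar masses, g/mol: ionic liquid, sulfolane
rho1, rho2 = 1.147, 1.261      # assumed pure-component densities, g/cm^3
x1 = 0.4                       # ionic-liquid mole fraction
rho_mix = 1.215                # assumed measured mixture density, g/cm^3

V_mix = (x1 * M1 + (1.0 - x1) * M2) / rho_mix        # molar volume, cm^3/mol
V_ideal = x1 * M1 / rho1 + (1.0 - x1) * M2 / rho2    # ideal-mixing value
V_excess = V_mix - V_ideal                           # excess molar volume
```

    A negative V^E, as in this toy case, indicates the mixture packs more tightly than ideal mixing would predict.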

  3. Integration of monolithic porous polymer with droplet-based microfluidics on a chip for nano/picoliter volume sample analysis

    OpenAIRE

    Kim, Jin-Young; Chang, Soo-Ik; Andrew J deMello; O’Hare, Danny

    2014-01-01

    In this paper, a porous polymer nanostructure has been integrated with droplet-based microfluidics in a single planar format. Monolithic porous polymer (MPP) was formed selectively within a microfluidic channel. The resulting analyte bands were sequentially compartmentalised into droplets. This device reduces band broadening and the effects of post-column dead volume by the combination of the two techniques. Moreover, it offers precise control of nano/picoliter-volume samples.

  4. Distribution and determinants of choroidal thickness and volume using automated segmentation software in a population-based study.

    Science.gov (United States)

    Gupta, Preeti; Jing, Tian; Marziliano, Pina; Cheung, Carol Y; Baskaran, Mani; Lamoureux, Ecosse L; Wong, Tien Yin; Cheung, Chui Ming Gemmy; Cheng, Ching-Yu

    2015-02-01

    To objectively quantify choroidal thickness and choroidal volume using fully automated choroidal segmentation software applied to images obtained from enhanced depth imaging spectral-domain optical coherence tomography (EDI SD OCT) in a population-based study, and to evaluate the ocular and systemic determinants of choroidal thickness and choroidal volume. Prospective cross-sectional study. Participants ranging in age from 45 to 85 years were recruited from the Singapore Malay Eye Study-2 (SiMES-2), a follow-up population-based study. All participants (n = 540) underwent a detailed ophthalmic examination, including EDI SD OCT for measurements of the thickness and volume of the choroid. The intrasession repeatability of choroidal thickness at the 5 measured horizontal locations and of macular choroidal volume using the automated choroidal segmentation software was excellent (intraclass correlation coefficient, 0.97-0.99). The choroid was significantly thicker under the fovea (242.28 ± 97.58 μm), followed by the location 3 mm temporal (207.65 ± 80.98 μm), and was thinnest at the location 3 mm nasal (142.44 ± 79.19 μm). The mean choroidal volume in the central macular region (within a circle of 1 mm diameter) was 0.185 ± 0.69 mm3. Among the range of ocular and systemic factors studied, age, sex, and axial length were the only significant predictors of choroidal thickness and choroidal volume. Using the fully automated choroidal segmentation software, we provide fast, reliable, and objective measurements of choroidal thickness and volume in a population-based sample. Male sex, younger age, and shorter axial length are the factors independently associated with a thicker choroid and larger choroidal volume. These factors should be taken into consideration when interpreting EDI SD OCT-based choroidal thickness measurements in clinics. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. Comparison of imaging-based gross tumor volume and pathological volume determined by whole-mount serial sections in primary cervical cancer

    Directory of Open Access Journals (Sweden)

    Zhang Y

    2013-07-01

    Ying Zhang,1,* Jing Hu,1,* Jianping Li,1 Ning Wang,1 Weiwei Li,1 Yongchun Zhou,1 Junyue Liu,1 Lichun Wei,1 Mei Shi,1 Shengjun Wang,2 Jing Wang,2 Xia Li,3 Wanling Ma4 1Department of Radiation Oncology, 2Department of Nuclear Medicine, 3Department of Pathology, 4Department of Radiology, Xijing Hospital, Xi'an, People's Republic of China; *These authors contributed equally to this work. Objective: To investigate the accuracy of imaging-based gross tumor volume (GTV) compared with pathological volume in cervical cancer. Methods: Ten patients with International Federation of Gynecology and Obstetrics stage I–II cervical cancer were eligible for investigation and underwent surgery in this study. Magnetic resonance imaging (MRI) and fluorine-18 fluorodeoxyglucose positron emission tomography/computed tomography (18F-FDG PET/CT) scans were taken the day before surgery. The GTVs under MRI and 18F-FDG PET/CT (GTV-MRI, GTV-PET, GTV-CT) were calculated automatically by Eclipse treatment-planning systems. Specimens of the excised uterine cervix and cervical cancer were consecutively sliced and divided into whole-mount serial sections. The tumor border of hematoxylin and eosin-stained sections was outlined under a microscope by an experienced pathologist. GTV from the pathological images (GTV-path) was calculated with Adobe Photoshop. Results: The GTVs (average ± standard deviation) delineated and calculated under CT, MRI, PET, and histopathological sections were 19.41 ± 11.96 cm3, 12.66 ± 10.53 cm3, 11.07 ± 9.44 cm3, and 10.79 ± 8.71 cm3, respectively. The volume of GTV-CT or GTV-MRI was bigger than GTV-path, and the difference was statistically significant (P < 0.05). Spearman correlation analysis showed that GTV-CT, GTV-MRI, and GTV-PET were significantly correlated with GTV-path (P < 0.01). There was no significant difference in the lesion coverage factor among the three modalities. Conclusion: The present study showed that GTV defined under 40% of maximum standardized

  6. Multi-GPU-based acceleration of the explicit time domain volume integral equation solver using MPI-OpenACC

    KAUST Repository

    Feki, Saber

    2013-07-01

    An explicit marching-on-in-time (MOT)-based time-domain volume integral equation (TDVIE) solver has recently been developed for characterizing transient electromagnetic wave interactions on arbitrarily shaped dielectric bodies (A. Al-Jarro et al., IEEE Trans. Antennas Propag., vol. 60, no. 11, 2012). The solver discretizes the spatio-temporal convolutions of the source fields with the background medium's Green function using nodal discretization in space and linear interpolation in time. The Green tensor, which involves second-order spatial and temporal derivatives, is computed using finite differences on the temporal and spatial grid. A predictor-corrector algorithm is used to maintain the stability of the MOT scheme. The simplicity of the discretization scheme permits the computation of the discretized spatio-temporal convolutions on the fly during time marching; no 'interaction' matrices are pre-computed or stored, resulting in a memory-efficient scheme. As a result, most often the applicability of this solver to the characterization of wave interactions on electrically large structures is limited by the computation time but not the memory. © 2013 IEEE.

  7. Optimization Design and Simulation of a Multi-Source Energy Harvester Based on Solar and Radioisotope Energy Sources

    Directory of Open Access Journals (Sweden)

    Hao Li

    2016-12-01

    A novel multi-source energy harvester based on solar and radioisotope energy sources is designed and simulated in this work. We established calculation formulas for the short-circuit current and open-circuit voltage, and then studied and analyzed the optimal semiconductor thickness, doping concentration, and junction depth by simulating the transport of β particles in the semiconductor material with the Monte Carlo simulation program MCNP (version 5, Radiation Safety Information Computational Center, Oak Ridge, TN, USA). In order to improve the efficiency of converting solar light energy into electric power, we adopted PC1D (version 5.9, University of New South Wales, Sydney, Australia) to optimize the parameters, and selected the best parameters for converting both radioisotope energy and solar energy into electricity. The results concluded that the best parameters for the multi-source energy harvester are as follows: Na is 1 × 10^19 cm^-3, Nd is 3.8 × 10^16 cm^-3, the PN junction depth is 0.5 μm (using the 147Pm radioisotope source), and so on. Under these parameters, the proposed harvester can achieve a conversion efficiency of 5.05% for the 147Pm radioisotope source (with an activity of 9.25 × 10^8 Bq) and 20.8% for solar light radiation (AM1.5). Such a design and parameters are valuable for some unique micro-power fields, such as applications in space, isolated terrestrial applications, and smart dust in battlefields.

  8. Region-Based Partial Volume Correction Techniques for PET Imaging: Sinogram Implementation and Robustness

    Directory of Open Access Journals (Sweden)

    Mike Sattarivand

    2013-01-01

    Background/Purpose. Limited spatial resolution of positron emission tomography (PET) requires partial volume correction (PVC). Region-based PVC methods are based on a geometric transfer matrix implemented either in image space (GTM) or sinogram space (GTMo), both with similar performance. Although GTMo is slower, it more closely simulates the 3D PET image acquisition, accounts for local variations of the point spread function, and can be implemented for iterative reconstructions. A recent image-based symmetric GTM (sGTM) has shown improvement in noise characteristics and robustness to misregistration over GTM. This study implements the sGTM method in sinogram space (sGTMo), validates it, and evaluates its performance. Methods. Two 3D sphere and brain digital phantoms and a physical sphere phantom were used. All four region-based PVC methods (GTM, GTMo, sGTM, and sGTMo) were implemented and their performance was evaluated. Results. All four PVC methods had similar accuracies. Both the noise propagation and robustness of the sGTMo method were similar to those of the sGTM method, while they were better than those of the GTMo method, especially for smaller objects. Conclusion. The sGTMo method was implemented and validated. The performance of sGTMo in terms of noise characteristics and robustness to misregistration is similar to that of the sGTM method and improved compared to the GTMo method.
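
    All GTM-family corrections share the same core step: the vector of observed regional means equals the transfer matrix times the vector of true regional activities, so correction is a linear solve. A toy sketch with a hypothetical 3-region matrix (illustrative numbers only, not from the paper's phantoms):

```python
import numpy as np

# Hypothetical geometric transfer matrix for three regions: entry G[i, j] is the
# fraction of region j's true activity that the scanner blur assigns to region i.
G = np.array([[0.85, 0.10, 0.02],
              [0.12, 0.80, 0.05],
              [0.03, 0.10, 0.90]])
true_activity = np.array([4.0, 2.0, 1.0])
observed = G @ true_activity      # partial-volume-degraded regional means

# Partial volume correction inverts the spill-over model.
corrected = np.linalg.solve(G, observed)
```

    The GTM, GTMo, sGTM, and sGTMo variants differ in how the entries of G are computed (image space vs. sinogram space, asymmetric vs. symmetric), not in this final solve.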

  9. GPU-Based Volume Rendering of Noisy Multi-Spectral Astronomical Data

    CERN Document Server

    Hassan, Amr H; Barnes, David G

    2010-01-01

    Traditional analysis techniques may not be sufficient for astronomers to make the best use of the data sets that current and future instruments, such as the Square Kilometre Array and its Pathfinders, will produce. By utilizing the incredible pattern-recognition ability of the human mind, scientific visualization provides an excellent opportunity for astronomers to gain valuable new insight and understanding of their data, particularly when used interactively in 3D. The goal of our work is to establish the feasibility of a real-time 3D monitoring system for data going into the Australian SKA Pathfinder archive. Based on CUDA, an increasingly popular development tool, our work utilizes the massively parallel architecture of modern graphics processing units (GPUs) to provide astronomers with an interactive 3D volume rendering for multi-spectral data sets. Unlike other approaches, we are targeting real time interactive visualization of datasets larger than GPU memory while giving special attention to data with l...

  10. SMILE Microscopy : fast and single-plane based super-resolution volume imaging

    CERN Document Server

    Mondal, Partha Pratim

    2016-01-01

    Fast 3D super-resolution imaging is essential for decoding rapidly occurring biological processes. Encoding single molecules to their respective planes enables simultaneous multi-plane super-resolution volume imaging. This saves data-acquisition time and, as a consequence, reduces the radiation dose that leads to photobleaching and other undesirable photochemical reactions. Detection and subsequent identification of the locus of each individual molecule (both on the focal plane and in off-focal planes) holds the key. Experimentally, this is achieved by accurate calibration of the system PSF size and its natural spread in off-focal planes using sub-diffraction fluorescent beads. Subsequently, the identification and sorting of single molecules that belong to different axial planes is carried out (by setting multiple cut-offs on the respective PSFs). The Simultaneous Multiplane Imaging based Localization Encoded (SMILE) microscopy technique eliminates the need for multiple z-plane scanning and thereby provides a truly simultaneous multip...

  11. Transfer function design based on user selected samples for intuitive multivariate volume exploration

    KAUST Repository

    Zhou, Liang

    2013-02-01

    Multivariate volumetric datasets are important to both science and medicine. We propose a transfer function (TF) design approach based on user-selected samples in the spatial domain to make multivariate volumetric data visualization more accessible for domain users. Specifically, the user starts the visualization by probing features of interest on slices, and the data values are instantly queried by user selection. The queried sample values are then used to automatically and robustly generate high-dimensional transfer functions (HDTFs) via kernel density estimation (KDE). Alternatively, 2D Gaussian TFs can be automatically generated in the dimensionality-reduced space using these samples. With the extracted features rendered in the volume rendering view, the user can further refine these features using segmentation brushes. Interactivity is achieved in our system and different views are tightly linked. Use cases show that our system has been successfully applied to simulation and complicated seismic data sets. © 2013 IEEE.
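
    The KDE step can be illustrated directly: the user-picked multivariate sample values define a density, and voxels whose attribute vectors land in high-density regions belong to the feature. A sketch with hypothetical two-attribute values and a fixed bandwidth (the paper's HDTF generation is more elaborate; the function name is illustrative):

```python
import numpy as np

def kde_density(samples, query, bandwidth=1.0):
    """Gaussian kernel density of the user-selected sample values at query points."""
    diff = query[:, None, :] - samples[None, :, :]      # (n_query, n_samples, dims)
    d2 = (diff ** 2).sum(axis=-1) / (2.0 * bandwidth ** 2)
    norm = len(samples) * (2.0 * np.pi * bandwidth ** 2) ** (samples.shape[1] / 2.0)
    return np.exp(-d2).sum(axis=1) / norm

# Hypothetical (intensity, gradient magnitude) pairs probed on a slice ...
picked = np.array([[10.0, 2.0], [11.0, 2.5], [9.5, 1.8]])
# ... and two voxels to classify: one near the picked feature, one far away.
voxels = np.array([[10.2, 2.1], [40.0, 9.0]])
density = kde_density(picked, voxels)
```

    Thresholding this density (or mapping it to opacity) turns the user's probed samples into a transfer function without manual widget editing.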

  12. Community-based risk assessment of water contamination from high-volume horizontal hydraulic fracturing.

    Science.gov (United States)

    Penningroth, Stephen M; Yarrow, Matthew M; Figueroa, Abner X; Bowen, Rebecca J; Delgado, Soraya

    2013-01-01

    The risk of contaminating surface water and groundwater as a result of shale gas extraction using high-volume horizontal hydraulic fracturing (HVHHF, or fracking) has not been assessed using conventional risk assessment methodologies. Baseline (pre-fracking) data on relevant water quality indicators, needed for meaningful risk assessment, are largely lacking. To fill this gap, the nonprofit Community Science Institute (CSI) partners with community volunteers who perform regular sampling of more than 50 streams in the Marcellus and Utica Shale regions of upstate New York; samples are analyzed for parameters associated with HVHHF. Similar baseline data on regional groundwater come from CSI's testing of private drinking water wells. Analytic results for groundwater (with permission) and surface water are made publicly available in an interactive, searchable database. Baseline concentrations of potential contaminants from shale gas operations are found to be low, suggesting that early community-based monitoring is an effective foundation for assessing later contamination due to fracking.

  13. Control theory based airfoil design for potential flow and a finite volume discretization

    Science.gov (United States)

    Reuther, J.; Jameson, A.

    1994-01-01

    This paper describes the implementation of optimization techniques based on control theory for airfoil design. In previous studies it was shown that control theory could be used to devise an effective optimization procedure for two-dimensional profiles in which the shape is determined by a conformal transformation from a unit circle, and the control is the mapping function. The goal of our present work is to develop a method which does not depend on conformal mapping, so that it can be extended to treat three-dimensional problems. Therefore, we have developed a method which can address arbitrary geometric shapes through the use of a finite volume method to discretize the potential flow equation. Here the control law serves to provide computationally inexpensive gradient information to a standard numerical optimization method. Results are presented, where both target speed distributions and minimum drag are used as objective functions.

  14. ITAC volume assessment through a Gaussian hidden Markov random field model-based algorithm.

    Science.gov (United States)

    Passera, Katia M; Potepan, Paolo; Brambilla, Luca; Mainardi, Luca T

    2008-01-01

    In this paper, a semi-automatic segmentation method for volume assessment of intestinal-type adenocarcinoma (ITAC) is presented and validated. The method is based on a Gaussian hidden Markov random field (GHMRF) model, an advanced version of a finite Gaussian mixture (FGM) model that encodes spatial information through the mutual influences of neighboring sites. To fit the GHMRF model, an expectation-maximization (EM) algorithm is used. We applied the method to magnetic resonance data sets (each composed of T1-weighted, contrast-enhanced T1-weighted, and T2-weighted images) for a total of 49 tumor-containing slices. We tested GHMRF performance with respect to FGM by both a numerical and a clinical evaluation. Results show that the proposed method has higher accuracy in quantifying lesion area than FGM and can be applied in the evaluation of tumor response to therapy.
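
    The FGM baseline that the GHMRF extends is a plain Gaussian mixture fitted with EM; the hidden Markov random field additionally couples neighboring voxels through a spatial prior, which this sketch omits. A minimal two-class EM on synthetic intensities:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 1D voxel intensities drawn from two tissue classes.
x = np.concatenate([rng.normal(100.0, 10.0, 600), rng.normal(160.0, 12.0, 400)])

# EM for a two-component Gaussian mixture (the FGM baseline; the GHMRF would
# add a spatial smoothness prior on the class labels in the E-step).
weights = np.array([0.5, 0.5])
mu = np.array([80.0, 180.0])
sd = np.array([20.0, 20.0])
for _ in range(100):
    # E-step: class responsibilities for every voxel.
    pdf = (weights / (sd * np.sqrt(2.0 * np.pi))
           * np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2))
    resp = pdf / pdf.sum(axis=1, keepdims=True)
    # M-step: re-estimate mixture weights, means, and standard deviations.
    nk = resp.sum(axis=0)
    weights = nk / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    sd = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
```

    Segmentation then assigns each voxel to the class with the highest responsibility; the GHMRF improves on this by penalizing label configurations where neighbors disagree.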

  15. Distributed location-based query processing on large volumes of moving items

    Institute of Scientific and Technical Information of China (English)

    JEON Se-gil; LEE Chung-woo; NAH Yunmook; KIM Moon-hae; HAN Ki-joon

    2004-01-01

    Recently, new techniques to efficiently manage the current and past location information of moving objects have received significant interest in the area of moving object databases and location-based service systems. In this paper, we exploit query processing schemes for location management systems that consist of multiple data processing nodes to handle massive volumes of moving objects, such as cellular phone users. To show the usefulness of the proposed schemes, we present experimental results on the performance factors of distributed query processing. In our experiments, we use two kinds of data sets: one generated by the extended GSTD simulator and another generated by a real-time data generator that produces location sensing reports for various types of users with different movement patterns.
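A minimal sketch of the multi-node arrangement described above: location reports are routed to one of several data processing nodes by hashing the object id, and a region query is scattered to all nodes and the results merged. The node count, report format, and query shape here are illustrative assumptions, not the paper's scheme.

```python
NUM_NODES = 4
nodes = [dict() for _ in range(NUM_NODES)]  # node -> {object_id: (x, y)}

def insert_report(obj_id, x, y):
    # Route the latest location report to its owning node by hashing the id.
    nodes[hash(obj_id) % NUM_NODES][obj_id] = (x, y)

def range_query(x0, y0, x1, y1):
    # Scatter the rectangular query to every node and gather matching ids.
    hits = []
    for node in nodes:
        hits.extend(oid for oid, (x, y) in node.items()
                    if x0 <= x <= x1 and y0 <= y <= y1)
    return sorted(hits)

insert_report("u1", 1.0, 1.0)
insert_report("u2", 5.0, 5.0)
insert_report("u3", 2.0, 2.5)
```

Hash partitioning balances insert load across nodes but forces region queries to visit every node; spatial partitioning would trade the reverse, which is exactly the kind of performance factor the experiments measure.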

  16. Semi-Automatic Anatomical Tree Matching for Landmark-Based Elastic Registration of Liver Volumes

    Directory of Open Access Journals (Sweden)

    Klaus Drechsler

    2010-01-01

    One promising approach to registering liver volume acquisitions is based on the branching points of the vessel trees, anatomical landmarks inherently available in the liver. Automated tree matching algorithms have been proposed to automatically find pair-wise correspondences between two vessel trees. However, to the best of our knowledge, none of the existing automatic methods is completely error free. After a review of the current literature and methodologies on the topic, we propose an efficient interaction method that can be employed to support tree matching algorithms with important pre-selected correspondences, or to manually correct wrongly matched nodes after an automatic matching. We used this method in combination with a promising automatic tree matching algorithm, also presented in this work. The proposed method was evaluated by four participants on a CT dataset from which we derived multiple artificial datasets.
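The core of a tree matching step can be sketched as pairing branching points from two acquisitions by mutual nearest neighbors in 3-D. This distance-only sketch is an assumption for illustration; real tree matching algorithms, including the one in this work, also exploit the trees' topology (parent/child relations), and unmatched points are exactly the cases the proposed interaction method lets a user resolve.

```python
import numpy as np

def mutual_nearest_matches(points_a, points_b):
    # Pairwise Euclidean distances between the two branching-point sets.
    d = np.linalg.norm(points_a[:, None, :] - points_b[None, :, :], axis=2)
    a_to_b = d.argmin(axis=1)  # best partner in B for each point in A
    b_to_a = d.argmin(axis=0)  # best partner in A for each point in B
    # Keep only mutually-best pairs; the rest stay unmatched for review.
    return [(i, j) for i, j in enumerate(a_to_b) if b_to_a[j] == i]

tree_a = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 10.0, 0.0]])
tree_b = tree_a + 0.5  # second acquisition, slightly shifted
matches = mutual_nearest_matches(tree_a, tree_b)
```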

  17. Ablation Properties of the Carbon-Based Composites Used in Artificial Heat Source Under Fire Accident

    Institute of Scientific and Technical Information of China (English)

    TANG Xian; HUANG Jin-ming; ZHOU Shao-jian; LUO Zhi-fu

    2012-01-01

    The ablation properties of the carbon-based composites used in an artificial heat source under a fire accident were investigated with an arc heater. In this work, we tested the carbon-based composites referring to Fig. 1. Their linear/mass ablation rates and ablation morphologies were studied. The results showed that the carbon-based composites used in the artificial heat source behaved well

  18. Theoretical Sources and Bases of Pedagogy of Collective Creative Education

    Directory of Open Access Journals (Sweden)

    Kaplunovich I. Ya.

    2014-01-01

    The well-known pedagogical concept of I. P. Ivanov is considered and analyzed from the perspective of two sciences: psychology and cybernetics. It is shown that the basic principles of this pedagogy of common concern are implicitly based on, and can be explained in particular by, fundamental positions of the two classical disciplines (Ashby's law, the second principle, the initial threshold of complexity, etc. in cybernetics; the cultural-historical and activity approach in psychology).

  19. Environmental Monitoring and Characterization of Radiation Sources on UF Campus Using a Large Volume NaI Detector

    Science.gov (United States)

    Bruner, Jesse A.; Gardiner, Hannah E.; Jordan, Kelly A.; Baciak, James E.

    2016-09-01

    Environmental radiation surveys are important for applications such as safety and regulatory compliance. This is especially true for areas exposed to emissions from nuclear reactors, such as the University of Florida Training Reactor (UFTR). At the University of Florida, surveys are performed using the RSX-1 NaI detector, developed by Radiation Solutions Inc. The detector processes incoming gamma rays with an Advanced Digital Spectrometer module to produce a linear energy spectrum. These spectra can then be analyzed in real time on a personal computer using the built-in software, RadAssist. We report on radiation levels around the University of Florida campus using two mobile detection platforms, car-borne and cart-borne. The car-borne surveys provide a larger, broader map of campus radiation levels, whereas the cart-borne surveys provide a more detailed radiation map because the cart can reach places on campus that cars cannot. Throughout the survey data, there are consistent radon decay product energy peaks, in addition to other sources such as medical I-131 found in a large crowd of people. Finally, we investigate further applications of this mobile detection platform, such as tracking the Ar-41 plume emitted from the UFTR and detecting potential environmental hazards.
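The step from channel histogram to energy peak can be sketched as follows. The linear calibration constants and the synthetic spectrum (a falling continuum plus a Gaussian photopeak near the 1461 keV K-40 line) are illustrative assumptions, not RSX-1 specifics.

```python
import numpy as np

def channel_to_energy(channel, gain=3.0, offset=0.0):
    # Linear energy calibration: E [keV] = gain * channel + offset.
    # These constants are placeholders; real values come from calibration.
    return gain * channel + offset

channels = np.arange(1024)
# Synthetic spectrum: exponential continuum plus a Gaussian photopeak at
# channel 487 (~1461 keV with the assumed 3 keV/channel gain).
counts = 1000.0 * np.exp(-channels / 300.0)
counts += 500.0 * np.exp(-0.5 * ((channels - 487) / 5.0) ** 2)

# Locate the photopeak as the maximum inside a search window, then map
# the channel back to energy through the calibration.
peak_channel = int(np.argmax(counts[300:700])) + 300
peak_energy = channel_to_energy(peak_channel)
```

On real survey data, peak identification would use background subtraction and fitting rather than a bare argmax, but the calibration-then-locate pattern is the same.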

  20. Correlation of centroid-based breast size, surface-based breast volume, and asymmetry-score-based breast symmetry in three-dimensional breast shape analysis

    Directory of Open Access Journals (Sweden)

    Henseler, Helga

    2016-06-01

    Objective: The aim of this study was to investigate correlations among the size, volume, and symmetry of the female breast after reconstruction, based on previously published data. Methods: The centroid, namely the geometric center of a three-dimensional (3D) breast-landmark-based configuration, was used to calculate the size of the breast. The surface data of the 3D breast images were used to measure the volume. Breast symmetry was assessed by the Procrustes analysis method, which uses the 3D coordinates of the breast landmarks to produce an asymmetry score. The relationship among the three measurements was investigated. For this purpose, the data of 44 patients who underwent unilateral breast reconstruction with an extended latissimus dorsi flap were analyzed. The breast was captured by a validated 3D imaging system using multiple cameras. Four landmarks on each breast and two landmarks marking the midline were used. Results: There was a significant positive correlation between the centroid-based breast size of the unreconstructed breast and the measured asymmetry (p=0.024; correlation coefficient, 0.34). There was also a significant relationship between the surface-based breast volume of the unaffected side and the overall asymmetry score (p<0.001; correlation coefficient, 0.556). An increase in the size, and especially in the volume, of the unreconstructed breast correlated positively with an increase in breast asymmetry in a linear relationship. Conclusions: In breast shape analysis, the use of more detailed surface-based data should be preferred to centroid-based size data. As the breast size increases, the latissimus dorsi flap for unilateral breast reconstruction increasingly falls short of matching the healthy breast, in a linear relationship. Other reconstructive options should be considered for larger breasts. Generally, plastic surgeons should view the two breasts as a single unit when assessing breast aesthetics and not view each
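A Procrustes-style asymmetry score of the kind used above can be sketched for two landmark configurations (e.g. one side's landmarks and the mirrored other side): both sets are centered and unit-scaled, an optimal rotation is found by SVD, and the residual misfit is the score. The four landmarks below are illustrative, not patient data, and the exact standardization in the study may differ.

```python
import numpy as np

def procrustes_score(a, b):
    # Center and unit-scale both landmark configurations.
    a = a - a.mean(axis=0)
    b = b - b.mean(axis=0)
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    # Optimal rotation aligning b to a (orthogonal Procrustes via SVD).
    u, _, vt = np.linalg.svd(a.T @ b)
    r = u @ vt
    # Residual misfit after alignment serves as the asymmetry score.
    return np.linalg.norm(a - b @ r.T)

left = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
                 [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
right = left.copy()  # a perfectly symmetric counterpart scores ~0
```

Because translation, scale, and rotation are factored out, the score reflects shape difference only, which is what makes it comparable across breasts of different sizes.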