Improved algorithm of ray tracing in ICF cryogenic targets
Zhang, Rui; Yang, Yongying; Ling, Tong; Jiang, Jiabin
2016-10-01
High-precision ray tracing inside inertial confinement fusion (ICF) cryogenic targets plays an important role in reconstructing the three-dimensional density distribution with the algebraic reconstruction technique (ART). The traditional Runge-Kutta method, restricted by the precision of the grid division and the ray-tracing step size, cannot compute accurately where the refractive index changes abruptly. In this paper, we propose an improved ray-tracing algorithm based on the Runge-Kutta method and Snell's law of refraction to achieve high tracing precision. On refractive-index boundaries, we apply Snell's law together with a contact-point search algorithm to ensure the accuracy of the simulation. Inside the cryogenic target, the Runge-Kutta method is combined with a self-adaptive step algorithm. The original refractive-index data used to mesh the target can be obtained by experimental measurement or from an a priori refractive-index distribution function. A finite-difference method computes the refractive-index gradient at the mesh nodes, and distance-weighted average interpolation yields the refractive index and its gradient at each point in space. In the simulation, we take an ideal ICF target, a Luneburg lens, and a graded-index rod as models and calculate the spot diagram and wavefront map. Comparison with Zemax shows that the improved algorithm, based on the fourth-order Runge-Kutta method and Snell's law of refraction, exhibits high accuracy: the relative error of the spot diagram is 0.2%, and the peak-to-valley (PV) and root-mean-square (RMS) errors of the wavefront map are less than λ/35 and λ/100, respectively.
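The scheme described, RK4 integration of the gradient-index ray equation d/ds(n dr/ds) = ∇n inside the medium with the vector form of Snell's law applied at refractive-index boundaries, can be sketched as follows. This is a minimal illustration, not the authors' code; `n_func` and `grad_n_func` are assumed callables standing in for the interpolated index and its gradient:

```python
import numpy as np

def trace_ray(r0, d0, n_func, grad_n_func, ds=1e-3, n_steps=1000):
    """Integrate the GRIN ray equation d/ds(n dr/ds) = grad(n) with RK4.

    State is (r, T) with T = n * dr/ds (the optical ray vector)."""
    r = np.asarray(r0, float)
    T = n_func(r) * np.asarray(d0, float) / np.linalg.norm(d0)

    def deriv(r, T):
        return T / n_func(r), grad_n_func(r)

    for _ in range(n_steps):
        k1r, k1T = deriv(r, T)
        k2r, k2T = deriv(r + 0.5*ds*k1r, T + 0.5*ds*k1T)
        k3r, k3T = deriv(r + 0.5*ds*k2r, T + 0.5*ds*k2T)
        k4r, k4T = deriv(r + ds*k3r, T + ds*k3T)
        r = r + ds/6.0 * (k1r + 2*k2r + 2*k3r + k4r)
        T = T + ds/6.0 * (k1T + 2*k2T + 2*k3T + k4T)
    return r, T

def snell_refract(d, n_hat, n1, n2):
    """Vector Snell's law at an index discontinuity (n_hat points toward
    the incident side); returns None on total internal reflection."""
    d = d / np.linalg.norm(d)
    cos_i = -np.dot(d, n_hat)
    s = (n1 / n2)**2 * (1.0 - cos_i**2)
    if s > 1.0:
        return None
    return (n1/n2)*d + (n1/n2*cos_i - np.sqrt(1.0 - s)) * n_hat
```

In a homogeneous region (∇n = 0) the integrator reproduces a straight line, which is a convenient sanity check before applying it to a measured index field.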
Fox, Christopher; Romeijn, H Edwin; Dempsey, James F
2006-05-01
We present work on combining three algorithms to improve ray-tracing efficiency in radiation therapy dose computation: an improved point-in-polygon algorithm, an incremental voxel ray-tracing algorithm, and stereographic projection of beamlets for voxel truncation. The point-in-polygon and incremental voxel ray-tracing algorithms have been used in computer graphics and nuclear medicine applications, while the stereographic projection algorithm was developed by our group. These algorithms demonstrate significant improvements over the current standard algorithms in the peer-reviewed literature, i.e., the polygon and voxel ray-tracing algorithms of Siddon for voxel classification (point-in-polygon testing) and dose computation, respectively, and radius testing for voxel truncation. The presented polygon ray-tracing technique was tested on 10 intensity-modulated radiation therapy (IMRT) treatment planning cases that required the classification of between 0.58 and 2.0 million voxels on a 2.5 mm isotropic dose grid into 1-4 targets and 5-14 structures represented as extruded polygons (a.k.a. Siddon prisms). Incremental voxel ray tracing and voxel truncation employing virtual stereographic projection were tested on the same IMRT treatment planning cases, where voxel dose was required for 230-2400 beamlets using a finite-size pencil-beam algorithm. A 100- to 360-fold CPU-time improvement over Siddon's method was observed for the polygon ray-tracing algorithm in classifying voxels for target and structure membership. A 2.6- to 3.1-fold reduction in CPU time over current algorithms was found for the implementation of incremental ray tracing. Additionally, voxel truncation via stereographic projection was observed to be 11-25 times faster than the radial-testing beamlet-extent approach and was further improved 1.7-2.0-fold by point classification using the method of translation rather than the cross-product technique.
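The voxel-classification step rests on a point-in-polygon test: each voxel center is tested against each structure's polygon in the relevant slice. A minimal even-odd (crossing-number) version, a generic textbook form rather than the improved variant benchmarked in the abstract, looks like:

```python
def point_in_polygon(x, y, verts):
    """Even-odd rule: count crossings of a rightward horizontal ray
    from (x, y) with the polygon's edges; odd count means inside."""
    inside = False
    n = len(verts)
    for i in range(n):
        x1, y1 = verts[i]
        x2, y2 = verts[(i + 1) % n]
        if (y1 > y) != (y2 > y):            # edge spans the ray's height
            # x-coordinate where the edge crosses the ray's height
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```

Per-voxel calls like this dominate classification cost, which is why the incremental and improved variants described above pay off at the scale of millions of voxels.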
Study of improved ray tracing parallel algorithm for CGH of 3D objects on GPU
Cong, Bin; Jiang, Xiaoyu; Yao, Jun; Zhao, Kai
2014-11-01
An improved parallel algorithm for computing holograms of three-dimensional objects is presented. Based on the physical characteristics and mathematical properties of the original ray-tracing algorithm for computer-generated holograms (CGH), and using transform approximation and numerical analysis, we extract the parts of the ray-tracing algorithm that are amenable to parallelization and implement them on a graphics processing unit (GPU). Through careful design of the parallel numerical procedure, the two-dimensional slices of the three-dimensional object are processed in parallel with CUDA. Based on the experiments, an effective method of handling the occlusion problem in ray tracing is proposed, as well as a way of generating holograms of 3D objects with the additive property. Our results indicate that the improved algorithm effectively shortens the computing time: depending on the relative sizes of the spatial object points and hologram pixels, the speed is increased by 20 to 70 times compared with the original ray-tracing algorithm.
Validation of Three-Dimensional Ray-Tracing Algorithm for Indoor Wireless Propagations
Majdi Salem; Mahamod Ismail; Norbahiah Misran
2011-01-01
A 3D ray-tracing simulator has been developed for indoor wireless networks. The simulator uses geometrical optics (GO) to propagate electromagnetic waves inside buildings. The prediction technique takes into account multiple reflections and transmissions of the propagated waves. An interpolation prediction method (IPM) has been proposed to predict the propagated signal and to make the ray-tracing algorithm faster, more accurate, and simpler. The measurements have been achieved by using a s...
HUANG Yueqin; ZHANG Jianzhong
2008-01-01
A three-dimensional (3D) sound ray tracing algorithm for heterogeneous media is studied. The algorithm comprises two steps: the first computes the wavefront traveltimes forward; the second traces the sound rays backward. In the first step, the wavefront traveltimes at discrete grid points are computed from the sound source, based on solutions of the eikonal equation, using the group marching method (GMM), a level-set-based wavefront marching scheme. In the second step, sound rays are traced cell by cell from the receiver back toward the source, using the traveltimes computed in the first step; the traveltime at an arbitrary position within each cuboid cell is expressed by linear interpolation of the wavefront traveltimes at that cell's grid points. The simulation results indicate that this method greatly improves both the accuracy and the efficiency of 3D sound ray tracing in heterogeneous media.
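The backward step of such a two-stage scheme can be sketched as descent on the traveltime field: from the receiver, repeatedly step against the traveltime gradient until the source is reached. The sketch below is a simplified 2D stand-in that uses nearest-node gradients rather than the per-cell linear interpolation the abstract describes:

```python
import numpy as np

def backtrace(T, src, rcv, h=1.0, step=0.5, max_iter=10000, tol=1.0):
    """Trace a ray from receiver back to source by stepping against the
    traveltime gradient (2D sketch; T is the precomputed traveltime grid)."""
    g0, g1 = np.gradient(T, h)                    # dT/di, dT/dj on the grid
    p = np.asarray(rcv, float)
    path = [p.copy()]
    for _ in range(max_iter):
        i, j = int(round(p[0])), int(round(p[1]))  # nearest-node gradient
        g = np.array([g0[i, j], g1[i, j]])
        gn = np.linalg.norm(g)
        if gn == 0.0:
            break
        p = p - step * g / gn                     # move toward smaller traveltime
        path.append(p.copy())
        if np.linalg.norm(p - src) < tol:
            break
    return np.array(path)

# homogeneous-medium check: traveltime from a source at grid node (0, 0)
# is just the distance, so the backtraced ray is the straight line
N = 50
ii, jj = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
T = np.sqrt(ii**2 + jj**2)
path = backtrace(T, src=np.array([0.0, 0.0]), rcv=(40.0, 30.0))
```

In a homogeneous medium the recovered path is the straight line from receiver to source; curved rays emerge automatically once T comes from an eikonal solver on a heterogeneous velocity model.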
GPU-based ray tracing algorithm for high-speed propagation prediction in typical indoor environments
Guo, Lixin; Guan, Xiaowei; Liu, Zhongyu
2015-10-01
A fast 3D ray-tracing propagation prediction model based on a virtual source tree is presented in this paper, with geometrical optics (GO) and the uniform theory of diffraction (UTD) as its theoretical foundations. For a typical single-room indoor scene, taking both geometrical and electromagnetic information into account, several acceleration techniques are adopted to raise the efficiency of the ray-tracing algorithm. The simulation results indicate that the runtime of the ray-tracing algorithm increases sharply once the number of objects in the room becomes large, so GPU acceleration technology is used to address this problem. GPUs favor arithmetic computation over logical branching, so tens of thousands of threads in CUDA programs can compute simultaneously, achieving massively parallel acceleration. Finally, a typical single room with several objects is simulated using the serial ray-tracing algorithm and the parallel one; the results show that the GPU-based algorithm achieves markedly greater efficiency than the serial one.
Ray trace algorithm description for the study of pump power absorption in double clad fibers
Narro, R.; Rodriguez, E.; Ponce, L.; de Posada, E.; Flores, T.; Arronte, M.
2011-09-01
An algorithm for the analysis of double-clad fiber designs is presented. The algorithm, developed in the MATLAB computing language, is based on a ray-tracing method applied to three-dimensional figures composed of a set of planes. It can evaluate thousands of ray paths in sequence and the corresponding pump absorption in each element of the fiber according to the Lambert-Beer law. The beam path is evaluated in three dimensions, accounting for losses by reflection and refraction at the faces and within the fiber. Due to its flexibility, the algorithm can be used to study ray propagation in single-mode or multimode fibers, bending effects in fibers, variable geometries of the inner cladding and the core, and could also be used to study tapers.
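The per-element bookkeeping the abstract describes reduces, for each traced ray, to applying the Lambert-Beer law segment by segment along the path. A sketch, assuming the path has already been split into segments with known absorption coefficients:

```python
import numpy as np

def absorbed_fraction(segment_lengths, alphas):
    """Pump power absorbed in each ray segment by the Lambert-Beer law.

    segment_lengths[k] is the path length of the ray in segment k and
    alphas[k] the absorption coefficient there (consistent length units)."""
    p = 1.0                        # power entering the fiber, normalized
    absorbed = []
    for L, a in zip(segment_lengths, alphas):
        p_out = p * np.exp(-a * L)
        absorbed.append(p - p_out)  # power deposited in this segment
        p = p_out
    return absorbed, p              # per-segment absorption, transmitted power
```

Summing the per-segment deposits plus the transmitted power always recovers the input power, which is a useful invariant to assert while tracing thousands of rays.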
Tichý, Vladimír; Hudec, René; Němcová, Šárka
2016-06-01
The algorithm presented is intended mainly for lobster eye optics. This type of optics (and some similar types) allows a simplification of the classical ray-tracing procedure, which requires a great many rays to simulate. The method presented simulates only a few rays and is therefore extremely efficient. Moreover, a specific mathematical formalism is used to simplify the equations. Only a few simple equations are needed, so the program code can be simple as well. The paper also outlines how to apply the method to some other reflective optical systems.
Vertex shading of the three-dimensional model based on ray-tracing algorithm
Hu, Xiaoming; Sang, Xinzhu; Xing, Shujun; Yan, Binbin; Wang, Kuiru; Dou, Wenhua; Xiao, Liquan
2016-10-01
The ray-tracing algorithm is one of the research hotspots in photorealistic graphics and an important light-and-shadow technique in many industries that work with three-dimensional (3D) structure, such as aerospace, games, and video. Unlike the traditional ray-tracing method of shading pixels, a novel ray-tracing algorithm is presented that colors and renders the vertices of the 3D model directly, so the rendering results depend on the degree of subdivision of the model. A good light-and-shade effect is achieved by using a quad-tree data structure to adaptively subdivide a triangle according to the brightness difference of its vertices. The uniform grid algorithm is adopted to improve rendering efficiency. Besides, the rendering time is independent of the screen resolution. In principle, as long as the model is subdivided finely enough, effects equal to those of pixel shading can be obtained; in practice, a compromise can be struck between efficiency and quality.
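The adaptive-subdivision idea, splitting a triangle into four via its edge midpoints whenever the brightness difference between its vertices is too large, can be sketched as follows. The `brightness` callable is a hypothetical stand-in for the ray-traced vertex shade:

```python
def subdivide(tri, brightness, threshold, depth=0, max_depth=6):
    """Recursively split a triangle into four sub-triangles via edge
    midpoints until the brightness spread over its vertices falls
    below threshold (or max_depth is hit)."""
    a, b, c = tri
    vals = [brightness(p) for p in tri]
    if depth >= max_depth or max(vals) - min(vals) <= threshold:
        return [tri]
    mid = lambda p, q: tuple((pi + qi) / 2.0 for pi, qi in zip(p, q))
    ab, bc, ca = mid(a, b), mid(b, c), mid(c, a)
    tris = []
    # one corner triangle per vertex, plus the central triangle
    for t in [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]:
        tris += subdivide(t, brightness, threshold, depth + 1, max_depth)
    return tris
```

Because the stopping test is local, subdivision concentrates where shading varies quickly (shadow edges, highlights) and leaves flat regions coarse.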
Microcellular propagation prediction model based on an improved ray tracing algorithm.
Liu, Z-Y; Guo, L-X; Fan, T-Q
2013-11-01
Two-dimensional (2D) and two-and-one-half-dimensional ray-tracing (RT) algorithms based on the uniform theory of diffraction and geometrical optics are widely used for channel prediction in urban microcellular environments because of their high efficiency and reliable prediction accuracy. In this study, an improved RT algorithm based on the "orientation face set" concept and on an improved 2D polar sweep algorithm is proposed. The goal is to accelerate point-to-point prediction, thereby making RT prediction attractive and convenient. In addition, threshold control of each ray path and careful handling of the grid points visible to reflection and diffraction sources are adopted, improving the efficiency of coverage prediction over large areas. Measured results and computed predictions are compared for urban scenarios. The results indicate that the proposed prediction model works well and is a useful tool for microcellular communication applications.
Betremieux, Yan
2015-01-01
Atmospheric refraction affects, to various degrees, exoplanet transit, lunar eclipse, and stellar occultation observations. Exoplanet retrieval algorithms often use analytical expressions for the column abundance along a ray traversing the atmosphere and for the deflection of that ray; these are first-order approximations valid for low densities in a spherically symmetric, homogeneous, isothermal atmosphere. We derive new analytical formulae for both quantities, valid at higher densities, and use them to refine and validate a new ray-tracing algorithm that can be used for arbitrary atmospheric temperature-pressure profiles. We illustrate with simple isothermal atmospheric profiles the consequences of our model for different planets: temperate Earth-like and Jovian-like planets, as well as HD189733b and GJ1214b. We find that, for both hot exoplanets, our treatment of refraction does not make much of a difference to pressures as high as 10 atmospheres, but that it is important to ...
N. H. Abd Rahman
2014-01-01
Reflector antennas have been widely used in many areas. In the implementation of a parabolic reflector antenna for broadcasting-satellite applications, it is essential for the spacecraft antenna to provide a precise contoured beam that effectively serves the required region, which requires combining more than one beam. Therefore, a tool utilizing the ray-tracing method is developed to calculate precise off-axis beams for a multibeam antenna system. In the multibeam system, each beam is fed from a different feed position so that the main beam radiates in the exact direction on the coverage area. Thus, a detailed study of the caustics of a parabolic reflector antenna is performed and presented in this paper, investigating the behaviour of the rays and their relation to various antenna parameters. In order to produce accurate data for the analysis, the caustic behaviours are investigated in two distinct modes: the scanning plane and the transverse plane. This paper presents detailed discussions of the derivation of the ray-tracing algorithms, the establishment of the equations of the caustic loci, and the verification of the method through calculation of the radiation pattern.
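Such caustic analysis builds on the elementary reflection step at the paraboloid. As a sanity check for any such ray tracer, an axis-parallel ray reflected off the paraboloid z = (x² + y²)/(4f) must cross the axis at the focus (0, 0, f). A minimal sketch of that check, not the paper's off-axis beam code:

```python
import numpy as np

def reflect(d, n):
    """Specular reflection of direction d about surface normal n."""
    n = n / np.linalg.norm(n)
    return d - 2.0 * np.dot(d, n) * n

def paraboloid_hit_focus(x0, y0, f):
    """Reflect an axis-parallel ray (traveling in -z) off the paraboloid
    z = (x^2 + y^2)/(4f) at (x0, y0) and return the point where the
    reflected ray crosses the z-axis (off-axis hit points only)."""
    z0 = (x0**2 + y0**2) / (4.0 * f)
    n = np.array([-x0 / (2.0 * f), -y0 / (2.0 * f), 1.0])  # surface normal
    d = reflect(np.array([0.0, 0.0, -1.0]), n)
    p = np.array([x0, y0, z0])
    t = -p[0] / d[0] if d[0] != 0 else -p[1] / d[1]        # reach the axis
    return p + t * d
```

Off-axis feeds break this perfect convergence, which is precisely why the caustic loci studied in the paper become the object of interest.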
Construction of Virtual Turning Scene Based on Local Ray Tracing Algorithm
王国锋; 王子良; 王太勇
2003-01-01
According to the features of turning simulation, a simplified Whitted lighting model is proposed based on analysis of the Phong and other local illumination models. Moreover, in order to obtain natural lighting effects, a local ray-tracing algorithm is given to calculate the light intensity at every position during the simulation. This method can calculate the refresh area before calculating the intersection line, simulate the machining environment accurately, and reduce the computing time. Finally, an example of a virtual cutting scene is shown to demonstrate the effects of the global illumination model: with a 1.3 GHz CPU and 128 MB of memory, the refresh time of the virtual turning scene is reduced ninefold. This study plays an important role in enriching virtual manufacturing theory and promoting the development of advanced manufacturing technology.
Investigation of propagation algorithms for ray-tracing simulation of polarized neutrons
Bergbäck Knudsen, Erik; Tranum-Rømer, A.; Willendrup, Peter Kjær
2014-01-01
Ray-tracing of polarized neutrons faces a challenge when the neutron propagates through an inhomogeneous magnetic field. This affects simulations of novel instruments using encoding of energy or angle into the neutron spin. We here present a new implementation of propagation of polarized neutrons...
A Sub-band Divided Ray Tracing Algorithm Using the DPS Subspace in UWB Indoor Scenarios
Gan, Mingming; Xu, Zhinan; Hofer, Markus
2015-01-01
Sub-band divided ray tracing (SDRT) is one technique that has been extensively used to obtain the channel characteristics for ultra-wideband (UWB) radio wave propagation in realistic indoor environments. However, the computational complexity of SDRT scales directly with the number of sub-bands. A...
Gatland, Ian R.
2002-01-01
Proposes a ray tracing approach to thin lens analysis based on a vector form of Snell's law for paraxial rays as an alternative to the usual approach in introductory physics courses. The ray tracing approach accommodates skew rays and thus provides a complete analysis. (Author/KHR)
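In the paraxial limit, the vector rule reduces, at a thin lens, to subtracting the transverse position divided by the focal length from the ray direction, and this handles skew rays with no extra machinery. A brief sketch of that complete analysis (assumed notation, not the article's own code):

```python
import numpy as np

def thin_lens_trace(p, d, f, z_lens, z_out):
    """Paraxial skew-ray trace through a thin lens using the vector rule
    d_out = d_in - (x, y, 0)/f applied at the lens plane.

    p = (x, y, z) is a point on the ray, d its direction with dz ~ 1."""
    p = np.asarray(p, float)
    d = np.asarray(d, float)
    t = (z_lens - p[2]) / d[2]
    p = p + t * d                          # propagate to the lens plane
    d = d - np.array([p[0], p[1], 0.0]) / f   # thin-lens direction change
    t = (z_out - p[2]) / d[2]
    return p + t * d                       # propagate to the output plane
```

Every axis-parallel ray, meridional or skew, lands on the axis at the focal plane z = z_lens + f, which is the textbook focusing property recovered by the vector form.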
Lam, Wai Sze Tiffany
Optical components made of anisotropic materials, such as crystal polarizers and crystal waveplates, are widely used in many complex optical systems, such as display systems, microlithography, and biomedical imaging, and induce more complex aberrations than components made of isotropic materials. The goal of this dissertation is to accurately simulate the performance of optical systems with anisotropic materials using polarization ray tracing. This work extends the polarization ray tracing calculus to incorporate ray tracing through anisotropic materials, including uniaxial, biaxial and optically active materials. The 3D polarization ray tracing calculus is an invaluable tool for analyzing the polarization properties of an optical system. The 3x3 polarization ray tracing P matrix developed for anisotropic ray tracing assists in tracking the 3D polarization transformations along a ray path through a series of surfaces in an optical system. To better represent anisotropic light-matter interactions, the definition of the P matrix is generalized to incorporate not only the polarization change at a refraction/reflection interface, but also the optical phase accumulated as light propagates through the anisotropic medium. This enables realistic modeling of crystalline polarization elements, such as crystal waveplates and crystal polarizers. The wavefront and polarization aberrations of these anisotropic components are more complex than those of isotropic optical components and can be evaluated from the resultant P matrix for each eigen-wavefront as well as for the overall image. One incident ray refracting or reflecting into an anisotropic medium produces two eigenpolarizations or eigenmodes propagating in different directions. The associated ray parameters of these modes necessary for the anisotropic ray trace are described in Chapter 2. The algorithms to calculate the P matrix from these ray parameters are described in Chapter 3 for
Song Fu
2015-01-01
Although the uniform theory of diffraction (UTD) can in principle be applied to arbitrarily shaped convex objects modeled by non-uniform rational B-splines (NURBS), one of the great challenges in calculating UTD surface-diffracted fields is determining the geodesic paths along which the creeping waves propagate on arbitrarily shaped NURBS surfaces. In differential geometry, geodesic paths satisfy the geodesic differential equation (GDE). Hence, in this paper, a general and efficient adaptive variable-step Euler method is introduced for solving the GDE on arbitrarily shaped NURBS surfaces. In contrast with the conventional Euler method, the proposed method employs a shape factor (SF) ξ to efficiently enhance the accuracy of tracing and extends the application of UTD to practical engineering. The validity and usefulness of the algorithm are verified by the numerical results.
Niccolini, Gilles; De Souza, Armando Domiciano
2010-01-01
The physical interpretation of spectro-interferometric data is strongly model-dependent. On one hand, models involving elaborate radiative transfer solvers are too time consuming in general to perform an automatic fitting procedure and derive astrophysical quantities and their related errors. On the other hand, using simple geometrical models does not give sufficient insights into the physics of the object. We propose to stand in between these two extreme approaches by using a physical but still simple parameterised model for the object under consideration. Based on this philosophy, we developed a numerical tool optimised for mid-infrared (mid-IR) interferometry, the fast ray-tracing algorithm for circumstellar structures (FRACS) which can be used as a stand-alone model, or as an aid for a more advanced physical description or even for elaborating observation strategies. FRACS is based on the ray-tracing technique without scattering, but supplemented with the use of quadtree meshes and the full symmetries of ...
RAY TRACING IMPLEMENTATION IN JAVA PROGRAMMING LANGUAGE
Aybars UĞUR
2002-01-01
In this paper, realism in computer graphics and the components providing it are discussed first, covering illumination models, surface-rendering methods, and light sources. Ray tracing, a technique for creating a two-dimensional image of a three-dimensional virtual environment, is then explained briefly, and a simple ray-tracing algorithm is given. "SahneIzle", a ray-tracing program implemented in the Java programming language that can be used on the internet, is introduced. Finally, the importance of network-centric ray-tracing software is discussed.
Lo, Ch. K.; Lim, Y. S.; Tan, S. G.; Rahman, F. A. [Faculty of Engineering and Science, University Tunku Abdul Rahman, Jalan Genting Klang, 53300, Kuala Lumpur (Malaysia)
2010-12-15
A Luminescent Solar Concentrator (LSC) is a transparent plate containing luminescent material with photovoltaic (PV) cells attached to its edges. Sunlight entering the plate is absorbed by the luminescent material, which in turn emits light. The emitted light propagates through the plate and arrives at the PV cells through total internal reflection. The ratio of the area of the relatively cheap polymer plate to that of the expensive PV cells is increased, and the cost per unit of solar electricity can be reduced by 75%. To improve the emission performance of LSCs, simulation modeling becomes essential. Ray-tracing modeling is a popular approach for simulating LSCs due to its ability to model various LSC structures under direct and diffuse sunlight. However, this approach requires a substantial amount of measurement input data, and the simulation time is enormous because it is a forward ray-tracing method that traces all rays propagating from the light source to the concentrator. The thermodynamic approach, on the other hand, requires substantially fewer input parameters and less simulation time, but can only model simple LSC designs under direct sunlight. Therefore, a new hybrid model was developed to perform various simulation studies effectively without the issues arising from the existing ray-tracing and thermodynamic models. The simulation results show that at least 60% of the total output irradiance of an LSC is contributed by the light trapped and channeled by the LSC. The novelty of this hybrid model is the concept of integrating the thermodynamic model with a well-developed Radiance ray-tracing model, making it a fast, powerful and cost-effective tool for the design of LSCs. (authors)
Fast Ray Tracing NURBS Surfaces
秦开怀; 龚明伦; et al.
1996-01-01
In this paper, a new algorithm with an extrapolation process for computing the ray/surface intersection is presented. A ray is defined as the intersection of two planes, which are non-orthogonal in general, in such a way that the number of multiplication operations is reduced. In the preprocessing step, NURBS surfaces are subdivided adaptively into rational Bezier patches, and parallelepipeds are used to enclose the respective patches as tightly as possible. Therefore, for each ray that hits the enclosure (i.e., parallelepiped) of a patch, the intersection points with the parallelepiped's faces yield a good starting point for the subsequent iteration. The improved Newton iteration with extrapolation saves CPU time by reducing the number of iteration steps. The intersection scheme is faster than previous methods for which published performance data allow reliable comparison. The method may also be used to speed up tracing the intersection of two parametric surfaces and other operations that need Newton iteration.
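The two-plane representation is the central trick: intersecting the ray with a parametric patch S(u, v) becomes a 2x2 Newton solve of the two plane equations. A sketch with caller-supplied patch and derivative callables, demonstrated on a flat patch rather than a rational Bezier patch, and without the paper's extrapolation step:

```python
import numpy as np

def ray_planes(o, d):
    """Represent the ray through o with direction d as the intersection
    of two planes n_k . x = delta_k, both containing the ray."""
    d = d / np.linalg.norm(d)
    a = np.array([1.0, 0.0, 0.0]) if abs(d[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    n1 = np.cross(d, a); n1 /= np.linalg.norm(n1)
    n2 = np.cross(d, n1)
    return (n1, n1 @ o), (n2, n2 @ o)

def intersect(o, d, S, Su, Sv, uv0, iters=20):
    """Newton iteration for ray/patch intersection: solve the two plane
    equations n_k . S(u, v) = delta_k for (u, v). A singular Jacobian
    (patch tangent plane containing the ray) would need safeguarding."""
    (n1, d1), (n2, d2) = ray_planes(np.asarray(o, float), np.asarray(d, float))
    u, v = uv0
    for _ in range(iters):
        F = np.array([n1 @ S(u, v) - d1, n2 @ S(u, v) - d2])
        J = np.array([[n1 @ Su(u, v), n1 @ Sv(u, v)],
                      [n2 @ Su(u, v), n2 @ Sv(u, v)]])
        du, dv = np.linalg.solve(J, -F)
        u, v = u + du, v + dv
        if abs(du) + abs(dv) < 1e-12:
            break
    return u, v
```

Each Newton step costs only a handful of dot products and a 2x2 solve, which is where the reduced multiplication count comes from.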
Slot Thing, Rune; Bernchou, Uffe; Mainegra-Hing, Ernesto;
2013-01-01
Abstract Purpose. Cone beam computed tomography (CBCT) image quality is limited by scattered photons. Monte Carlo (MC) simulations provide the ability to predict the patient-specific scatter contamination in clinical CBCT imaging. Lengthy simulations prevent MC-based scatter correction from … and pelvis scan were simulated within 2% statistical uncertainty in two hours per scan. Within the same time, the ray-tracing algorithm provided the primary signal for each of the projections. Thus, all the data needed for MC-based scatter correction in clinical CBCT imaging was obtained within two hours per …
Development of ray tracing visualization program by Monte Carlo method
Higuchi, Kenji; Otani, Takayuki [Japan Atomic Energy Research Inst., Tokyo (Japan); Hasegawa, Yukihiro
1997-09-01
The ray-tracing algorithm is a powerful method for synthesizing three-dimensional computer graphics. In conventional ray-tracing algorithms, the view point is used as the starting point: rays are tracked from it, through the center of each pixel on the view screen, back to the light sources in order to calculate the pixel intensities. This manner, however, makes it difficult to define the configuration of the light source as well as to strictly simulate the reflections of the rays. To resolve these problems, we have developed a new ray-tracing approach that traces rays from a light source rather than from a view point, using the Monte Carlo method widely applied in nuclear fields. Moreover, we apply variance reduction techniques in the program, using the specialized machine (Monte-4) for particle-transport Monte Carlo, so that the computational time is successfully reduced. (author)
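Tracing from the light source means sampling emission directions at random, which is the Monte Carlo element. A minimal sketch of that sampling step, assuming an isotropic point source and a disk detector; the hit fraction estimates the detector's solid-angle fraction, a quantity with a closed form to check against:

```python
import numpy as np

def mc_source_rays(n_rays, rng):
    """Emit rays isotropically from a point source at the origin and
    count those reaching a detector disk; the hit fraction estimates
    the disk's solid angle divided by 4*pi."""
    # uniform directions on the sphere
    u = rng.uniform(-1.0, 1.0, n_rays)           # cos(theta)
    phi = rng.uniform(0.0, 2.0 * np.pi, n_rays)
    s = np.sqrt(1.0 - u**2)
    dirs = np.stack([s * np.cos(phi), s * np.sin(phi), u], axis=1)
    # detector: disk of radius R at z = L, centered on the axis
    L, R = 10.0, 1.0
    fwd = dirs[:, 2] > 0                         # rays heading toward it
    t = L / dirs[fwd, 2]
    x = dirs[fwd, 0] * t
    y = dirs[fwd, 1] * t
    hits = int((x**2 + y**2 <= R**2).sum())
    return hits / n_rays

frac = mc_source_rays(200000, np.random.default_rng(0))
```

Because so few source rays reach any given pixel, plain source-side tracing is wasteful, which is exactly why the variance reduction techniques mentioned in the abstract matter.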
Interactive Ray Tracing for Virtual TV Studio Applications
Philipp Slusallek
2005-12-01
In recent years, the well-known ray-tracing algorithm has gained new popularity with the introduction of interactive ray-tracing methods. Its high modularity and ability to produce highly realistic images make ray tracing an attractive alternative to raster graphics hardware. Interactive ray tracing has also proved its potential in the field of Mixed Reality rendering and provides novel methods for the seamless integration of real and virtual content. Actor insertion methods, a subdomain of Mixed Reality closely related to virtual television studio techniques, can use ray tracing to achieve high output quality together with appropriate visual cues such as shadows and reflections at interactive frame rates. In this paper, we show how interactive ray-tracing techniques can provide new ways of implementing future virtual studio applications.
IONORT: IONOsphere Ray-Tracing
Bianchi, C.; Settimi, A.; Azzarone, A.
2010-01-01
The application package "IONORT" for ray-tracing computation can be used on the Windows operating system. Its graphical user interface is implemented in MATLAB. The program actually launches an executable, written in Fortran, that integrates the system of differential equations and imports its output into the MATLAB program, which generates the plots and other information about the ray. To complete this introduction, it should be said ...
Kovács, Z.; Harko, T.
2011-11-01
We present a fully general relativistic numerical code for estimating the energy-momentum deposition rate (EMDR) from neutrino pair annihilation (νν̄ → e⁻e⁺). The source of the neutrinos is assumed to be a neutrino-cooled accretion disc around neutron and quark stars. We calculate the neutrino trajectories by using a ray-tracing algorithm with the general relativistic Hamilton equations for neutrinos, and derive the spatial distribution of the EMDR due to the annihilation of neutrinos and antineutrinos around rotating neutron and quark stars. We obtain the EMDR for several classes of rotating neutron stars, described by different equations of state of the neutron matter, and for quark stars, described by the Massachusetts Institute of Technology (MIT) bag model equation of state and in the colour-flavour-locked (CFL) phase. The distribution of the total annihilation rate of the neutrino-antineutrino pairs around rotating neutron and quark stars is studied for isothermal discs and for accretion discs in thermodynamic equilibrium. We demonstrate that both the differences in the equations of state for neutron and quark matter and the general relativistic effects of rotation significantly modify the EMDR of the electrons and positrons generated by neutrino-antineutrino pair annihilation around compact stellar objects, as measured at infinity.
Three-dimensional polarization ray-tracing calculus II: retardance.
Yun, Garam; McClain, Stephen C; Chipman, Russell A
2011-06-20
The concept of retardance is critically analyzed for ray paths through optical systems described by a three-by-three polarization ray-tracing matrix. Algorithms are presented to separate the effects of retardance from geometric transformations. The geometric transformation described by a "parallel transport matrix" characterizes nonpolarizing propagation through an optical system, and also provides a proper relationship between sets of local coordinates along the ray path. The proper retardance is calculated by removing this geometric transformation from the three-by-three polarization ray-tracing matrix. Two rays with different ray paths through an optical system can have the same polarization ray-tracing matrix but different retardances. The retardance and diattenuation of an aluminum-coated three fold-mirror system are analyzed as an example.
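The separation of retardance from the geometric transformation described in this abstract can be sketched with numpy. The sketch below is an illustrative reading of the method, not the authors' code: for a single fold of the propagation vector, the parallel transport matrix Q is taken to be the minimal rotation carrying k_in to k_out, and the proper retardance is obtained by removing Q from the polarization ray-tracing matrix P.

```python
import numpy as np

def minimal_rotation(a, b):
    """Rotation matrix taking unit vector a to unit vector b about a x b
    (Rodrigues' formula); identity when a and b are parallel."""
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    v = np.cross(a, b)
    s2 = v @ v           # squared sine of the rotation angle
    c = a @ b            # cosine of the rotation angle
    if s2 < 1e-24:
        return np.eye(3)
    vx = np.array([[0.0, -v[2], v[1]],
                   [v[2], 0.0, -v[0]],
                   [-v[1], v[0], 0.0]])
    return np.eye(3) + vx + vx @ vx * ((1.0 - c) / s2)

def proper_retardance(P, k_in, k_out):
    """Strip the geometric (parallel-transport) part Q from a 3x3
    polarization ray-tracing matrix P: M_R = Q^-1 @ P."""
    Q = minimal_rotation(np.asarray(k_in, float), np.asarray(k_out, float))
    return np.linalg.inv(Q) @ P
```

For a nonpolarizing fold (P equal to the parallel transport itself), the extracted retardance matrix is the identity, as the abstract's definition requires.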
Reverse ray tracing for transformation optics.
Hu, Chia-Yu; Lin, Chun-Hung
2015-06-29
Ray tracing is an important technique for predicting optical system performance. In the field of transformation optics, the Hamiltonian equations of motion for ray tracing are well known. The numerical solutions to the Hamiltonian equations of motion are affected by the complexities of the inhomogeneous and anisotropic indices of the optical device. To the best of our knowledge, no previous work has been conducted on ray tracing for transformation optics with extreme inhomogeneity and anisotropy. In this study, we present the use of 3D reverse ray tracing in transformation optics. The reverse ray tracing is derived from Fermat's principle based on a sweeping method instead of finding the full solution to ordinary differential equations. The sweeping method is employed to obtain the eikonal function. The wave vectors are then obtained from the gradient of that eikonal function map in the transformed space to acquire the illuminance. Because only the rays in the points of interest have to be traced, the reverse ray tracing provides an efficient approach to investigate the illuminance of a system. This approach is useful in any form of transformation optics where the material property tensor is a symmetric positive definite matrix. The performance and analysis of three transformation optics with inhomogeneous and anisotropic indices are explored. The ray trajectories and illuminances in these demonstration cases are successfully solved by the proposed reverse ray tracing method.
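The sweeping method used above to obtain the eikonal function can be illustrated with the standard isotropic fast-sweeping update; the paper's version handles the anisotropic transformed-space metric, while this sketch assumes the plain eikonal equation |∇u| = s on a uniform 2D grid with a point source.

```python
import numpy as np

def fast_sweep_eikonal(slowness, h, src, n_passes=6):
    """First-order fast-sweeping solver for |grad u| = s on a uniform
    2D grid with spacing h; src is the (i, j) index of the point source."""
    n, m = slowness.shape
    u = np.full((n, m), np.inf)
    u[src] = 0.0
    orders = [(range(n), range(m)),
              (range(n - 1, -1, -1), range(m)),
              (range(n), range(m - 1, -1, -1)),
              (range(n - 1, -1, -1), range(m - 1, -1, -1))]
    for _ in range(n_passes):
        for rows, cols in orders:          # alternate the four sweep orderings
            for i in rows:
                for j in cols:
                    if (i, j) == src:
                        continue
                    a = min(u[i - 1, j] if i > 0 else np.inf,
                            u[i + 1, j] if i < n - 1 else np.inf)
                    b = min(u[i, j - 1] if j > 0 else np.inf,
                            u[i, j + 1] if j < m - 1 else np.inf)
                    if not np.isfinite(min(a, b)):
                        continue
                    sh = slowness[i, j] * h
                    if abs(a - b) >= sh:   # causal one-sided update
                        cand = min(a, b) + sh
                    else:                  # two-sided quadratic update
                        cand = 0.5 * (a + b + np.sqrt(2 * sh * sh - (a - b) ** 2))
                    u[i, j] = min(u[i, j], cand)
    return u
```

Once u is known, the wave vectors mentioned in the abstract follow from its numerical gradient (e.g. np.gradient(u, h)).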
Validation of Ray Tracing Code Refraction Effects
Heath, Stephanie L.; McAninch, Gerry L.; Smith, Charles D.; Conner, David A.
2008-01-01
NASA's current predictive capabilities using the ray tracing program (RTP) are validated using helicopter noise data taken at Eglin Air Force Base in 2007. By including refractive propagation effects due to wind and temperature, the ray tracing code is able to explain large variations in the data observed during the flight test.
Real time ray tracing of skeletal implicit surfaces
Rouiller, Olivier; Bærentzen, Jakob Andreas
Modeling and rendering in real time is usually done via rasterization of polygonal meshes. We present a method to model with skeletal implicit surfaces and an algorithm to ray trace these surfaces in real time on the GPU. Our skeletal representation of the surfaces allows the creation of smooth models...
The Alba ray tracing code: ART
Nicolas, Josep; Barla, Alessandro; Juanhuix, Jordi
2013-09-01
The Alba ray tracing code (ART) is a suite of Matlab functions and tools for the ray tracing simulation of x-ray beamlines. The code is structured in different layers, which allow its usage as part of optimization routines as well as an easy control from a graphical user interface. Additional tools for slope error handling and for grating efficiency calculations are also included. Generic characteristics of ART include the accumulation of rays to improve statistics without memory limitations, while still providing normalized values of flux and resolution in physically meaningful units.
Benthin, Carsten; Wald, Ingo; Woop, Sven; Ernst, Manfred; Mark, William R
2012-09-01
Wide-SIMD hardware is power and area efficient, but it is challenging to efficiently map ray tracing algorithms to such hardware especially when the rays are incoherent. The two most commonly used schemes are either packet tracing, or relying on a separate traversal stack for each SIMD lane. Both work great for coherent rays, but suffer when rays are incoherent: The former experiences a dramatic loss of SIMD utilization once rays diverge; the latter requires a large local storage, and generates multiple incoherent streams of memory accesses that present challenges for the memory system. In this paper, we introduce a single-ray tracing scheme for incoherent rays that uses just one traversal stack on 16-wide SIMD hardware. It uses a bounding-volume hierarchy with a branching factor of four as the acceleration structure, exploits four-wide SIMD in each box and primitive intersection test, and uses 16-wide SIMD by always performing four such node or primitive tests in parallel. We then extend this scheme to a hybrid tracing scheme that automatically adapts to varying ray coherence by starting out with a 16-wide packet scheme and switching to the new single-ray scheme as soon as rays diverge. We show that on the Intel Many Integrated Core architecture this hybrid scheme consistently, and over a wide range of scenes and ray distributions, outperforms both packet and single-ray tracing.
Ray tracing reconstruction investigation for C-arm tomosynthesis
Malalla, Nuhad A. Y.; Chen, Ying
2016-04-01
C-arm tomosynthesis is a three dimensional imaging technique. Both the x-ray source and the detector are mounted on a C-arm wheeled structure to provide a wide variety of movement around the object. In this paper, C-arm tomosynthesis was introduced to provide three dimensional information over a limited view angle (less than 180°) to reduce radiation exposure and examination time. Reconstruction algorithms based on the ray tracing method, such as ray tracing back projection (BP), simultaneous algebraic reconstruction technique (SART) and maximum likelihood expectation maximization (MLEM), were developed for C-arm tomosynthesis. C-arm tomosynthesis projection images of a simulated spherical object were simulated with a virtual geometric configuration with a total view angle of 40 degrees. This study demonstrated the sharpness of the in-plane reconstructed structure and the effectiveness of removing out-of-plane blur for each reconstruction algorithm. Results showed the ability of ray-tracing-based reconstruction algorithms to provide three dimensional information with limited angle C-arm tomosynthesis.
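The algebraic reconstruction family (ART/SART) used above updates the image estimate by projecting onto one ray equation at a time. A minimal Kaczmarz-style sketch, with a small dense system standing in for the sparse ray-projection matrix of a real tomosynthesis setup:

```python
import numpy as np

def art_reconstruct(A, b, n_sweeps=200, relax=1.0):
    """Kaczmarz-style ART: the estimate x is repeatedly projected onto
    each ray's hyperplane a_i . x = b_i; relax is the relaxation factor."""
    x = np.zeros(A.shape[1])
    for _ in range(n_sweeps):
        for i in range(A.shape[0]):
            a = A[i]
            nrm2 = a @ a
            if nrm2 > 0.0:
                # move x toward the hyperplane of ray i
                x += relax * (b[i] - a @ x) / nrm2 * a
    return x
```

On a consistent system the iteration recovers the underlying object exactly; with noisy, inconsistent projections one would lower the relaxation factor or switch to the SART/MLEM variants the abstract names.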
Backward ray tracing for ultrasonic imaging
Breeuwer, R.
1990-01-01
Focused ultrasonic beams frequently pass one or more media interfaces, strongly affecting the ultrasonic beamshape and focusing. A computer program, based on backward ray tracing was developed to compute the shape of a corrected focusing mirror. This shape is verified with another program; then the
A Fast Ray-Tracing Using Bounding Spheres and Frustum Rays for Dynamic Scene Rendering
Suzuki, Ken-Ichi; Kaeriyama, Yoshiyuki; Komatsu, Kazuhiko; Egawa, Ryusuke; Ohba, Nobuyuki; Kobayashi, Hiroaki
Ray tracing is one of the most popular techniques for generating photo-realistic images. Extensive research and development work has made interactive static scene rendering realistic. This paper deals with interactive dynamic scene rendering in which not only the eye point but also the objects in the scene change their 3D locations every frame. In order to realize interactive dynamic scene rendering, RTRPS (Ray Tracing based on Ray Plane and Bounding Sphere), which utilizes the coherency in rays, objects, and grouped-rays, is introduced. RTRPS uses bounding spheres as the spatial data structure which utilizes the coherency in objects. By using bounding spheres, RTRPS can ignore the rotation of moving objects within a sphere, and shorten the update time between frames. RTRPS utilizes the coherency in rays by merging rays into a ray-plane, assuming that the secondary rays and shadow rays are shot through an aligned grid. Since a pair of ray-planes shares an original ray, the intersection for the ray can be completed using the coherency in the ray-planes. Because of the three kinds of coherency, RTRPS can significantly reduce the number of intersection tests for ray tracing. Further acceleration techniques for ray-plane-sphere and ray-triangle intersection are also presented. A parallel projection technique converts a 3D vector inner product operation into a 2D operation and reduces the number of floating point operations. Techniques based on frustum culling and binary-tree structured ray-planes optimize the order of intersection tests between ray-planes and a sphere, resulting in 50% to 90% reduction of intersection tests. Two ray-triangle intersection techniques are also introduced, which are effective when a large number of rays are packed into a ray-plane. Our performance evaluations indicate that RTRPS gives 13 to 392 times speed up in comparison with a ray tracing algorithm without organized rays and spheres. We found out that RTRPS also provides competitive
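Bounding-sphere schemes such as RTRPS ultimately reduce to many ray-sphere tests. The routine below is the standard quadratic-formula intersection, a textbook building block rather than RTRPS's optimized ray-plane-sphere variant:

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Smallest positive ray parameter t with |o + t*d - c| = r,
    or None on a miss; direction is assumed to be unit length."""
    ox, oy, oz = (o - c for o, c in zip(origin, center))
    dx, dy, dz = direction
    # coefficients of t^2 + b*t + c = 0 (leading coefficient 1 for unit d)
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    sq = math.sqrt(disc)
    for t in ((-b - sq) / 2.0, (-b + sq) / 2.0):
        if t > 0.0:
            return t            # nearest hit in front of the origin
    return None
```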
A three-dimensional sound ray tracing method by deploying regular tetrahedrons
JIANG Wei; LI Taibao
2005-01-01
A sound ray tracing algorithm is presented, which helps to rapidly find sound ray trajectories in three-dimensional (3-D) space. At each step of ray tracing, a small regular tetrahedron is constructed in front of a ray, so that the sound speed field inside it may be regarded as approximately linear. Since a ray trajectory in a linear sound speed field always lies on a plane, it may be obtained by the two-dimensional (2-D) sound ray tracing method by deploying triangles. The theoretical derivation is given and a numerical model is discussed. It shows that the algorithm is fast and precise. It is also more concise and reliable than traditional 3-D algorithms, and may be used to avoid the loss of precision caused by acoustic refraction in 3-D ultrasound computerized tomography.
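The key fact the method exploits — that a ray in a locally linear sound-speed field stays in a plane and obeys Snell's invariant cos θ / c = const — can be checked numerically. A hedged 2D sketch with invented ocean-like parameters (c0, gradient g), not the paper's tetrahedron construction:

```python
import math

def trace_ray_linear_c(x0, z0, theta0, c0, g, ds=1.0, steps=2000):
    """RK4 trace of a 2D acoustic ray in the linear profile c(z) = c0 + g*z.
    State is (x, z, theta); theta is the grazing angle from horizontal.
    The ray ODE dtheta/ds = -(g/c) cos(theta) preserves cos(theta)/c."""
    def deriv(state):
        x, z, th = state
        c = c0 + g * z
        return (math.cos(th), math.sin(th), -(g / c) * math.cos(th))

    def rk4(state):
        k1 = deriv(state)
        k2 = deriv(tuple(s + 0.5 * ds * k for s, k in zip(state, k1)))
        k3 = deriv(tuple(s + 0.5 * ds * k for s, k in zip(state, k2)))
        k4 = deriv(tuple(s + ds * k for s, k in zip(state, k3)))
        return tuple(s + ds / 6.0 * (a + 2 * b + 2 * c_ + d)
                     for s, a, b, c_, d in zip(state, k1, k2, k3, k4))

    path = [(x0, z0, theta0)]
    for _ in range(steps):
        path.append(rk4(path[-1]))
    return path
```

Conservation of the Snell invariant along the computed path is what justifies replacing 3-D stepping with the 2-D in-plane construction the abstract describes.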
Three-dimensional polarization ray-tracing calculus I: definition and diattenuation.
Yun, Garam; Crabtree, Karlton; Chipman, Russell A
2011-06-20
A three-by-three polarization ray-tracing matrix method for polarization ray tracing in optical systems is presented for calculating the polarization transformations associated with ray paths through optical systems. The method is a three-dimensional generalization of the Jones calculus. Reflection and refraction algorithms are provided. Diattenuation of the optical system is calculated via singular value decomposition. Two numerical examples, a three fold-mirror system and a hollow corner cube, demonstrate the method.
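The diattenuation-via-SVD calculation described above can be sketched in a few lines of numpy. The assumption that the largest singular value of the 3x3 matrix corresponds to the nonpolarizing direction mapping (true when field transmittances are at most one) is mine, made for the sketch:

```python
import numpy as np

def diattenuation(P):
    """Diattenuation of a 3x3 polarization ray-tracing matrix via SVD.
    Assumes the singular value tied to the ray-direction mapping is the
    largest; the two remaining singular values, squared, give the
    maximum and minimum intensity transmittances."""
    s = np.linalg.svd(P, compute_uv=False)   # returned in descending order
    tmax, tmin = s[1] ** 2, s[2] ** 2
    return (tmax - tmin) / (tmax + tmin)
```

For a hypothetical element transmitting field amplitudes 0.9 and 0.5 in its two transverse eigendirections, the formula gives D = (0.81 - 0.25) / (0.81 + 0.25).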
Trans-Ionospheric High Frequency Signal Ray Tracing
Wright, S.; Gillespie, R. J.
2012-09-01
All electromagnetic radiation undergoes refraction as it propagates through the atmosphere. Tropospheric refraction is largely governed by interaction of the radiation with bounded electrons; ionospheric refraction is primarily governed by free electron interactions. The latter phenomenon is important for propagation and refraction of High Frequency (HF) through Extremely High Frequency (EHF) signals. The degree to which HF to EHF signals are bent is dependent upon the integrated refractive effect of the ionosphere: a result of the signal's angle of incidence with the boundaries between adjacent ionospheric regions, the magnitude of change in electron density between two regions, as well as the frequency of the signal. In the case of HF signals, the ionosphere may bend the signal so much that it is directed back down towards the Earth, making over-the-horizon HF radio communication possible. Ionospheric refraction is a major challenge for space-based geolocation applications, where the ionosphere is typically the biggest contributor to geolocation error. Accurate geolocation requires an algorithm that accurately reflects the physical process of a signal transiting the ionosphere, and an accurate specification of the ionosphere at the time of the signal transit. Currently implemented solutions are limited by both the algorithm chosen to perform the ray trace and by the accuracy of the ionospheric data used in the calculations. This paper describes a technique for adapting a ray tracing algorithm to run on a General-Purpose Graphics Processing Unit (GPGPU or GPU), and using a physics-based model specifying the ionosphere at the time of signal transit. This technique allows simultaneous geolocation of significantly more signals than an equivalently priced Central Processing Unit (CPU) based system. Additionally, because this technique makes use of the most widely accepted numeric algorithm for ionospheric ray tracing and a timely physics-based model of the ionosphere
Research of ray-tracing algorithm based on GPU and uniform grid method
童星; 袁道华
2011-01-01
With the dramatic improvement in GPU (graphics processing unit) performance and the maturing of GPU programmability, GPU-based ray tracing has become an active research topic; ray tracing demands a large amount of computation. On this basis, the basic principle of the ray-tracing algorithm is analyzed, and the algorithm is implemented in NVIDIA's CUDA (Compute Unified Device Architecture) environment using a uniform grid as the acceleration structure. Experimental results show that this computing model achieves a faster overall speed than the traditional CPU-based ray-tracing algorithm, and that GPUs are well suited to high-density data computation. Ray tracing is the technique of rendering images from a three-dimensional model of a scene by projecting it onto a two-dimensional image plane. In past decades, the development of computer graphics (especially raster graphics systems) emphasized building high-efficiency, low-cost, large graphics systems. Because ray tracing involves a large amount of mathematical calculation, large-scale parallel processing technologies play an important role in graphics composition. The principle of the ray-tracing algorithm is introduced, a parallel ray-tracing processing model is built through research on GPU stream processing and MPICH, and it is shown that applying this model effectively reduces computation time while producing images of the same quality as those from a traditional single machine.
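A uniform grid acceleration structure is traversed with a DDA walk from cell to cell. The sketch below is the classic Amanatides-Woo scheme, reduced to 2D and serial Python for brevity; the paper's CUDA version is 3D and massively parallel:

```python
def grid_traverse(origin, direction, nx, ny, cell=1.0):
    """Amanatides-Woo style DDA (2D for brevity): list the grid cells a
    ray visits, in order, until it leaves an nx-by-ny uniform grid.
    Assumes the origin lies inside the grid."""
    x, y = origin
    dx, dy = direction
    i, j = int(x // cell), int(y // cell)
    step_i = 1 if dx > 0 else -1
    step_j = 1 if dy > 0 else -1

    def t_to_boundary(p, d, idx, step):
        if d == 0:
            return float("inf")
        edge = (idx + (1 if step > 0 else 0)) * cell
        return (edge - p) / d

    t_max_x = t_to_boundary(x, dx, i, step_i)
    t_max_y = t_to_boundary(y, dy, j, step_j)
    t_delta_x = abs(cell / dx) if dx else float("inf")
    t_delta_y = abs(cell / dy) if dy else float("inf")
    cells = []
    while 0 <= i < nx and 0 <= j < ny:
        cells.append((i, j))
        if t_max_x < t_max_y:   # vertical cell boundary is closer: step in x
            i += step_i
            t_max_x += t_delta_x
        else:                   # horizontal boundary is closer (ties step y)
            j += step_j
            t_max_y += t_delta_y
    return cells
```

Each visited cell holds a short list of primitives to test, which is what makes the uniform grid an effective acceleration structure for coherent scenes.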
RAY TRACING RENDER MENGGUNAKAN FRAGMENT ANTI ALIASING
Febriliyan Samopa
2008-07-01
Rendering is generating surface and three-dimensional effects on an object displayed on a monitor screen. Ray tracing, a rendering method that traces a ray for each image pixel, has a drawback: aliasing (the jaggies effect). There are several methods for performing anti-aliasing. One of them is OGSS (Ordered Grid Super Sampling). OGSS performs anti-aliasing well, but it requires more computation time, since the sampling of every pixel in the image is increased. Fragment Anti Aliasing (FAA) is a new alternative method that can cope with this drawback. FAA checks the image while rendering a scene: the jaggies effect only occurs on curved and gradient objects, so only these parts of an object undergo sampling magnification. After this sampling magnification and the computation of the pixel values, downsampling is performed to retrieve the original pixel values. Experimental results show that the software can implement ray tracing well in order to form images, and that it can implement the FAA and OGSS techniques to perform anti-aliasing. In general, rendering using FAA is faster than using OGSS
IL RAY-TRACING NELLA IONOSFERA
Azzarone, A.; Bianchi, C.; Settimi, A.
2010-01-01
The application package "IONORT" for ray-tracing computation can be used under the Windows operating system. It is a program whose graphical user interface is implemented in MATLAB. In practice, the program launches an executable, written in Fortran, that integrates the system of differential equations, and imports its output into the MATLAB program, which generates plots and other information about the ray. To complete this introduction, it should be noted that...
IONORT: IONOsphere Ray-Tracing - Ray-tracing program in ionospheric magnetoplasma
Bianchi, Cesidio; Settimi, Alessandro; Azzarone, Adriano
2010-01-01
The application package "IONORT" for the calculation of ray-tracing can be used by users of the Windows operating system. It is a program whose user interface is created in MATLAB. In fact, the program launches an executable that integrates the system of differential equations, written in Fortran, and imports the output into the MATLAB program, which generates graphics and other information on the ray. This work is inspired mainly by the program of Jones and Stephenson, widespre...
RayTrace: A Simplified Ray Tracing Software for use in AutoCad
Reimann, Gregers Peter; Tang, C.K.
2005-01-01
A design aid tool for testing and development of daylighting systems was developed. A simplified ray tracing software was programmed in Lisp for AutoCad. Only fully specularly reflective, fully transparent and fully absorbent surfaces can be defined in the software. The software is therefore best...
Algorithms and analysis for underwater vehicle plume tracing.
Byrne, Raymond Harry; Savage, Elizabeth L. (Texas A&M University, College Station, TX); Hurtado, John Edward (Texas A&M University, College Station, TX); Eskridge, Steven E.
2003-07-01
The goal of this research was to develop and demonstrate cooperative 3-D plume tracing algorithms for miniature autonomous underwater vehicles. Applications for this technology include Lost Asset and Survivor Location Systems (L-SALS) and Ship-in-Port Patrol and Protection (SP3). This research was a joint effort that included Nekton Research, LLC, Sandia National Laboratories, and Texas A&M University. Nekton Research developed the miniature autonomous underwater vehicles while Sandia and Texas A&M developed the 3-D plume tracing algorithms. This report describes the plume tracing algorithm and presents test results from successful underwater testing with pseudo-plume sources.
Powerful scriptable ray tracing package xrt
Klementiev, Konstantin; Chernikov, Roman
2014-09-01
We present an open-source, Python-based ray tracing tool that offers several useful features in graphical presentation, material properties, advanced calculations of synchrotron sources, implementation of diffractive and refractive elements, complex (also closed) surfaces, and multiprocessing. The package has many usage examples, which are supplied together with the code and visualized on its web page. We exemplify the present version by modeling (i) a curved crystal analyzer, (ii) a quarter wave plate, (iii) Bragg-Fresnel optics and (iv) multiple reflective and non-sequential optics (polycapillary). The present version implements the use of the OpenCL framework, which executes calculations on both CPUs and GPUs. Currently, the calculations of an undulator source on a GPU show a gain of about two orders of magnitude in computing time. The development version is successful in modeling wavefront propagation. Two examples of diffraction on a plane mirror and a plane blazed grating are given for a beam with a finite energy band.
Tracing Analytic Ray Curves for Light and Sound Propagation in Non-Linear Media.
Mo, Qi; Yeh, Hengchin; Manocha, Dinesh
2016-11-01
The physical world consists of spatially varying media, such as the atmosphere and the ocean, in which light and sound propagate along non-linear trajectories. This presents a challenge to existing ray-tracing based methods, which are widely adopted to simulate propagation due to their efficiency and flexibility, but assume linear rays. We present a novel algorithm that traces analytic ray curves computed from local media gradients, and utilizes the closed-form solutions of both the intersections of the ray curves with planar surfaces, and the travel distance. By constructing an adaptive unstructured mesh, our algorithm is able to model general media profiles that vary in three dimensions with complex boundaries consisting of terrains and other scene objects such as buildings. Our analytic ray curve tracer with the adaptive mesh improves the efficiency considerably over prior methods. We highlight the algorithm's application on simulation of visual and sound propagation in outdoor scenes.
High performance dosimetry calculations using adapted ray-tracing
Perrotte, Lancelot; Saupin, Guillaume
2010-11-01
When preparing interventions on nuclear sites, it is interesting to study different scenarios to identify the most appropriate one for the operator(s). Using virtual reality tools is a good way to simulate the potential scenarios, and taking advantage of very efficient computation times can help the user study different complex scenarios by immediately evaluating the impact of any changes. In the field of radiation protection, people often use computation codes based on the straight-line attenuation method with build-up factors. As for other approaches, geometrical computations (finding all the interactions between radiation rays and the scene objects) remain the bottleneck of the simulation. We present in this paper several optimizations used to speed up these geometrical computations, using innovative GPU ray-tracing algorithms. For instance, we manage to compute every intersection between 600 000 rays and a huge 3D industrial scene in a fraction of a second. Moreover, our algorithm works the same way for both static and dynamic scenes, allowing easier study of complex intervention scenarios (where everything moves: the operator(s), the shielding objects, the radiation sources).
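The straight-line attenuation method with build-up factors mentioned above amounts to a point-kernel formula per ray. A sketch with an invented linear build-up factor — real dosimetry codes interpolate tabulated B values per shield material and photon energy:

```python
import math

def dose_rate(source, mu, thickness, distance, a=1.0):
    """Point-kernel estimate in the straight-line attenuation method:
    inverse-square geometry times exponential attenuation through a
    shield of the given thickness (linear attenuation coefficient mu),
    corrected by a build-up factor.  The linear form B = 1 + a*mu*t is
    a placeholder, not a tabulated physical build-up factor."""
    mfp = mu * thickness                  # shield thickness in mean free paths
    buildup = 1.0 + a * mfp
    return source * buildup * math.exp(-mfp) / (4.0 * math.pi * distance ** 2)
```

The geometrical work the paper accelerates on the GPU is finding, for each source-detector ray, the thicknesses it crosses in each shielding object before this formula is applied.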
Snellenburg, J.J.; Braaf, B.; Hermans, E.A.; Heijde, van der R.G.L.; Sicam, V.A.
2010-01-01
A forward ray tracing (FRT) model is presented to determine the exact image projection in a general corneal topography system. Consequently, the skew ray error in Placido-based topography is demonstrated. A quantitative analysis comparing FRT-based algorithms and Placido-based algorithms in reconstr
Real-time ray tracing of implicit surfaces on the GPU.
Singh, Jag Mohan; Narayanan, P J
2010-01-01
Compact representation of geometry using a suitable procedural or mathematical model and a ray-tracing mode of rendering fit the programmable graphics processor units (GPUs) well. Several such representations including parametric and subdivision surfaces have been explored in recent research. The important and widely applicable category of the general implicit surface has received less attention. In this paper, we present a ray-tracing procedure to render general implicit surfaces efficiently on the GPU. Though only the fourth or lower order surfaces can be rendered using analytical roots, our adaptive marching points algorithm can ray trace arbitrary implicit surfaces without multiple roots, by sampling the ray at selected points till a root is found. Adapting the sampling step size based on a proximity measure and a horizon measure delivers high speed. The sign test can handle any surface without multiple roots. The Taylor test that uses ideas from interval analysis can ray trace many surfaces with complex roots. Overall, a simple algorithm that fits the SIMD architecture of the GPU results in high performance. We demonstrate the ray tracing of algebraic surfaces up to order 50 and nonalgebraic surfaces including a Blinn's blobby with 75 spheres at better than interactive frame rates.
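The sign test plus root refinement at the heart of marching-points ray tracing can be sketched as follows; the fixed step and plain bisection are simplifications of the paper's adaptive step size and GPU-friendly evaluation, and (as the authors note for the sign test) tangent or multiple roots are missed:

```python
def ray_trace_implicit(f, origin, direction, t_max=20.0, step=0.05, tol=1e-9):
    """March sample points along the ray and apply the sign test: a sign
    change of f between consecutive samples brackets a root, which
    bisection then refines.  Returns the ray parameter t, or None."""
    ox, oy, oz = origin
    dx, dy, dz = direction

    def g(t):
        return f(ox + t * dx, oy + t * dy, oz + t * dz)

    t_prev, f_prev = 0.0, g(0.0)
    t = step
    while t <= t_max:
        f_cur = g(t)
        if f_prev * f_cur <= 0.0:          # sign change: root bracketed
            lo, hi = t_prev, t
            while hi - lo > tol:
                mid = 0.5 * (lo + hi)
                if g(lo) * g(mid) <= 0.0:
                    hi = mid
                else:
                    lo = mid
            return 0.5 * (lo + hi)
        t_prev, f_prev = t, f_cur
        t += step
    return None
```

Applied to a unit sphere, the first root from a ray starting at z = -3 along +z sits at t = 2, the front surface.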
An Energy Conservative Ray-Tracing Method With a Time Interpolation of the Force Field
Yao, Jin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
2015-02-10
A new algorithm that constructs a continuous force field interpolated in time is proposed for resolving existing difficulties in numerical methods for ray-tracing. This new method has improved accuracy while retaining the same degree of algebraic complexity as Kaiser's method.
Simplification of vector ray tracing by the groove function.
Hu, Zhongwen; Liu, Zuping; Wang, Qiuping
2005-01-01
Tracing rays through arbitrary diffraction gratings (including holographic gratings of the second generation fabricated on a curved substrate) by the vector form is somewhat complicated. Vector ray tracing utilizes the local groove density, the calculation of which highly depends on how the grooves are generated. Characterizing a grating by its groove function, available for almost arbitrary gratings, is much simpler than doing so by its groove density, essentially being a vector. Applying the concept of Riemann geometry, we give an expression of the groove density by the groove function. The groove function description of a grating can thus be incorporated into vector ray tracing, which is beneficial especially at the design stage. A unified explicit grating ray-tracing formalism is given as well.
GRay: a Massively Parallel GPU-Based Code for Ray Tracing in Relativistic Spacetimes
Chan, Chi-kwan; Ozel, Feryal
2013-01-01
We introduce GRay, a massively parallel integrator designed to trace the trajectories of billions of photons in a curved spacetime. This GPU-based integrator employs the stream processing paradigm, is implemented in CUDA C/C++, and runs on nVidia graphics cards. The peak performance of GRay using single precision floating-point arithmetic on a single GPU exceeds 300 GFLOP (or 1 nanosecond per photon per time step). For a realistic problem, where the peak performance cannot be reached, GRay is two orders of magnitude faster than existing CPU-based ray tracing codes. This performance enhancement allows more effective searches of large parameter spaces when comparing theoretical predictions of images, spectra, and lightcurves from the vicinities of compact objects to observations. GRay can also perform on-the-fly ray tracing within general relativistic magnetohydrodynamic algorithms that simulate accretion flows around compact objects. Making use of this algorithm, we calculate the properties of the shadows of K...
Multiple Encryption-based Algorithm of Agricultural Product Trace Code
2012-01-01
To establish a sound traceability system for agricultural products and guarantee their security, an algorithm is proposed to encrypt the trace code of agricultural products. The original trace code consists of 34 digits indicating such information as place of origin, name of product, date of production, and authentication. An area code is used to indicate enterprise information. The encryption algorithm is designed because of the increasing code length: coding techniques such as base conversion and section division are applied for the encrypted conversion of the place-of-origin code and the production-date code; moreover, the section identification code and authentication code are permutated and combined to produce a check code. Through multiple encryption and code-length compression, the 34 digits are compressed to 20 while complete coding information is preserved; the shorter code length and stronger encryption enable the public to obtain information about agricultural products without consulting a professional database.
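The base-conversion stage of such code compression is easy to sketch. The base-62 alphabet and the fixed 34-digit field width below are assumptions made for illustration (the paper's scheme additionally layers section permutation and a check code on top of the conversion); base 62 brings a 34-digit decimal code down to at most 19 characters, in line with the roughly 20-character result reported.

```python
ALPHABET = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz"

def compress(decimal_code):
    """Re-encode a fixed-width decimal digit string in base 62.
    The alphabet is an illustrative assumption, not the paper's."""
    n = int(decimal_code)
    out = []
    while n:
        n, r = divmod(n, 62)
        out.append(ALPHABET[r])
    return "".join(reversed(out)) or ALPHABET[0]

def decompress(code, width):
    """Invert compress(); width restores the fixed decimal field,
    including any leading zeros lost in the integer round trip."""
    n = 0
    for ch in code:
        n = n * 62 + ALPHABET.index(ch)
    return str(n).zfill(width)
```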
IONORT: IONOsphere Ray-Tracing - Ray-tracing program in ionospheric magnetoplasma
Bianchi, Cesidio; Azzarone, Adriano
2010-01-01
The application package "IONORT" for the calculation of ray-tracing can be used by users of the Windows operating system. It is a program whose user interface is created in MATLAB. In fact, the program launches an executable that integrates the system of differential equations, written in Fortran, and imports the output into the MATLAB program, which generates graphics and other information on the ray. This work is inspired mainly by the program of Jones and Stephenson, widespread in the scientific community interested in radio propagation via the ionosphere. The original program is written in FORTRAN 77 for a CDC-3800 mainframe. The code itself, as well as being very elegant, is highly efficient and provides the basis for many programs now in use, mainly in the Coordinate Registration (CR) of Over The Horizon (OTH) radar. The input and output of this program require devices that have not been in use for several decades, and there are no compilers that accept instructions written for that type of mainframe. For ...
A Highly-Accurate and Fast Ray Tracing System for HF and UHF Simulations
Jones, J. C.; Richards, G. P.
2016-12-01
Accurate and fast ray tracing is critical for radiowave propagation tools and applications. A ray tracer needs to be accurate to reduce the accumulated errors that come from the myriad of models (ionospheric electron density, magnetic field, ion density, neutral molecule density, absorption, land surface, ocean surface, and potentially others) required for accurate simulation. A ray tracer must also be fast to make applications practical. Here we introduce NINJART (NINJART Is Not Just Another Ray Tracer), a highly accurate and fast ray tracing system. NINJART consists of an embarrassingly parallel algorithm rigorously solving the 3-D Haselgrove equations with a Runge-Kutta adaptive-step quadrature rule to accurately trace high frequency to ultra-high frequency radiowaves. It is capable of a wide range of propagation modes, from multiple ground hops to vertical and near-vertical incidence rays, chordal modes, and other esoteric paths. It is capable of using a variety of ionospheric models, including operational data-assimilative or empirical models, depending on the needs of the user. It can forward and backward ray trace, calculate time of flight, find the focus factor for signals near the skip zone, and calculate the angle of arrival from a known transmitter to a known receiver location. Additionally, NINJART uses magnetic field data from various models, including the International Geomagnetic Reference Field, to reduce the inaccuracies introduced by the simple dipole model commonly used by other ray tracers in calculating the effects of magneto-ionic splitting, thereby allowing accurate traces of both the ordinary and extraordinary mode rays. The NINJART algorithm is a heterogeneous system utilizing the CUDA programming language to take advantage of the computing power of graphical processing units. This allows tracing of thousands of rays concurrently. NINJART achieves additional processing savings, without sacrificing accuracy, by use of an adaptive
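The Runge-Kutta adaptive-step integration that Haselgrove-equation tracers rely on can be illustrated with a generic step-doubling controller: one full RK4 step is compared against two half steps, and the step shrinks or grows accordingly. The function names and tolerances here are invented for the sketch and are not taken from NINJART:

```python
import math

def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + 0.5 * h, y + 0.5 * h * k1)
    k3 = f(t + 0.5 * h, y + 0.5 * h * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def integrate_adaptive(f, t0, y0, t_end, h=0.1, tol=1e-10):
    """Step-doubling error control: the full-step/half-step discrepancy
    estimates the local error; halve h on failure, grow it when the
    error is comfortably below the tolerance."""
    t, y = t0, y0
    while t_end - t > 1e-12:
        h = min(h, t_end - t)
        y_full = rk4_step(f, t, y, h)
        y_half = rk4_step(f, t + 0.5 * h, rk4_step(f, t, y, 0.5 * h), 0.5 * h)
        if abs(y_half - y_full) > tol:
            h *= 0.5                      # reject and retry with smaller step
            continue
        t, y = t + h, y_half              # accept the more accurate estimate
        if abs(y_half - y_full) < tol / 32.0:
            h *= 2.0                      # error margin is large: grow step
    return y
```

A real ionospheric tracer integrates a six-component Haselgrove state (position and wave-normal) with a vector norm in place of abs(), but the control logic is the same.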
Masmoudi, Nabil
2014-01-01
We present an approximate, but efficient and sufficiently accurate P-wave ray tracing and dynamic ray tracing procedure for 3D inhomogeneous, weakly orthorhombic media with varying orientation of symmetry planes. In contrast to commonly used approaches, the orthorhombic symmetry is preserved at any point of the model. The model is described by six weak-anisotropy parameters and three Euler angles, which may vary arbitrarily, but smoothly, throughout the model. We use the procedure for the calculation of rays and corresponding two-point traveltimes in a VSP experiment in a part of the BP benchmark model generalized to orthorhombic symmetry.
Implementation of Refined Ray Tracing inside a Space Module
Balamati Choudhury
2012-08-01
Modern space modules are susceptible to EM radiation from both external and internal sources within the space module. Since the EM waves for various operations are frequently in the high-frequency domain, asymptotic ray-theoretic methods are often the most optimal choice for deterministic EM field analysis. In this work, surface modeling of a typical manned space module is done by hybridizing a finite segment of a right circular cylinder and a general paraboloid of revolution (GPOR) frustum. A transmitting source is placed inside the space module and test rays are launched from the transmitter. The rays are allowed to propagate inside the cavity. Unlike available ray-tracing packages, which use numerical search methods, a quasi-analytical ray-propagation model is developed to obtain the ray-path details inside the cavity; it involves ray launching, ray bunching, and an adaptive cube for ray reception.
Ray Tracing RF Field Prediction: An Unforgiving Validation
E. M. Vitucci
2015-01-01
The prediction of RF coverage in urban environments is now commonly considered a solved problem with tens of models proposed in the literature showing good performance against measurements. Among these, ray tracing is regarded as one of the most accurate ones available. In the present work, however, we show that a great deal of work is still needed to make ray tracing really unleash its potential in practical use. A very extensive validation of a state-of-the-art 3D ray tracing model is carried out through comparison with measurements in one of the most challenging environments: the city of San Francisco. Although the comparison is based on RF cellular coverage at 850 and 1900 MHz, a widely studied territory, very relevant sources of error and inaccuracy are identified in several cases along with possible solutions.
GPU-based Ray Tracing of Dynamic Scenes
Christopher Lux
2010-08-01
Interactive ray tracing of non-trivial scenes is just becoming feasible on a single graphics processing unit (GPU). Recent work in this area focuses on building effective acceleration structures that work well under the constraints of current GPUs. Most approaches are targeted at static scenes and only allow navigation in the virtual scene. So far, support for dynamic scenes has not been considered for GPU implementations. We have developed a GPU-based ray tracing system for dynamic scenes consisting of a set of individual objects. Each object may independently move around, but its geometry and topology are static.
An active set algorithm for tracing parametrized optima
Rakowska, J.; Haftka, R. T.; Watson, L. T.
1991-01-01
Optimization problems often depend on parameters that define constraints or objective functions. It is often necessary to know the effect of a change in a parameter on the optimum solution. An algorithm is presented here for tracking paths of optimal solutions of inequality constrained nonlinear programming problems as a function of a parameter. The proposed algorithm employs homotopy zero-curve tracing techniques to track segments where the set of active constraints is unchanged. The transition between segments is handled by considering all possible sets of active constraints and eliminating nonoptimal ones based on the signs of the Lagrange multipliers and the derivatives of the optimal solutions with respect to the parameter. A spring-mass problem is used to illustrate all possible kinds of transition events, and the algorithm is applied to a well-known ten-bar truss structural optimization problem.
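The multiplier-sign test that governs transitions between active sets can be seen in a one-dimensional toy problem (illustrative only, not the paper's homotopy tracker): minimize (x - p)^2 subject to x >= 0.

```python
def solve_parametrized(p):
    """Minimize (x - p)**2 subject to x >= 0, tracking the active set.
    Returns (x_opt, multiplier, active)."""
    # try the unconstrained stationary point first
    if p >= 0:
        return p, 0.0, False          # constraint inactive, multiplier zero
    # otherwise the constraint x >= 0 must be active at the optimum
    lam = -2.0 * p                    # from stationarity 2*(x - p) - lam = 0 at x = 0
    return 0.0, lam, True

# sweep the parameter: the active set changes at p = 0, detected by the
# Lagrange multiplier becoming positive
for p in (-1.0, -0.5, 0.5, 1.0):
    x, lam, active = solve_parametrized(p)
```

As p crosses zero, the constraint switches from inactive to active; the Lagrange multiplier -2p changes sign exactly at the transition, which is the kind of signal a tracking algorithm uses to decide when to drop or add constraints.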
Statistical Inverse Ray Tracing for Image-Based 3D Modeling.
Liu, Shubao; Cooper, David B
2014-10-01
This paper proposes a new formulation of, and solution to, image-based 3D modeling (aka "multi-view stereo") based on generative statistical modeling and inference. The proposed approach, named statistical inverse ray tracing, models and estimates the occlusion relationship accurately by optimizing a physically sound image generation model based on volumetric ray tracing. Together with geometric priors, these are combined into a Bayesian formulation known as a Markov random field (MRF) model. This MRF model differs from typical MRFs used in image analysis in that the ray clique, which models the ray-tracing process, consists of thousands of random variables instead of two to dozens. To handle the computational challenges associated with the large clique size, an algorithm with linear computational complexity is developed by exploiting the recursive chain structure of the ray clique through dynamic programming. We further demonstrate the benefit of exact modeling and accurate estimation of the occlusion relationship by evaluating the proposed algorithm on several challenging data sets.
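The linear-time recursion over a ray clique can be sketched with standard front-to-back volumetric compositing, which shares the same chain structure (a simplified stand-in for the paper's dynamic program; names are illustrative):

```python
def composite_ray(occ, col):
    """Front-to-back volumetric compositing along one ray in O(n):
    each voxel contributes its colour weighted by its opacity and by the
    probability that no earlier voxel along the ray occluded it."""
    radiance, transmittance = 0.0, 1.0
    for o, c in zip(occ, col):
        radiance += transmittance * o * c   # expected contribution of this voxel
        transmittance *= (1.0 - o)          # running product: light that survives
    return radiance

# a fully opaque first voxel hides everything behind it
r = composite_ray([1.0, 0.5], [2.0, 9.0])   # -> 2.0
```

The naive evaluation of every voxel's visibility costs O(n^2) per ray; carrying the running transmittance forward reduces it to a single O(n) sweep, which is the essence of exploiting the recursive chain structure.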
A Rotor Tip Vortex Tracing Algorithm for Image Post-Processing
Overmeyer, Austin D.
2015-01-01
A neurite tracing algorithm, originally developed for medical image processing, was used to trace the location of the rotor tip vortex in density gradient flow visualization images. The tracing algorithm was applied to several representative test images to form case studies. The accuracy of the tracing algorithm was compared to two current methods: a manual point-and-click method and a cross-correlation template method. It is shown that the neurite tracing algorithm can reduce the post-processing time to trace the vortex by a factor of 10 to 15 without compromising the accuracy of the tip vortex location compared to other methods presented in the literature.
Parallel Ray Tracing Using the Message Passing Interface
2007-09-01
…efficiency of 97.9% and a normalized ray-tracing rate of 6.95 × 10^6 ray · surfaces/(s · processor) in a system with 22 planar surfaces, two paraboloid reflectors, and one hyperboloid refractor. The need for load-balancing software was obviated by the use of a… specified for each type of optical surface (planar, spherical, paraboloid, hyperboloid, aspheric) and whether it applies for reflection or refraction. The
Ray tracing and refraction in the modified US1976 atmosphere
van der Werf, SY
2003-01-01
A new and flexible ray-tracing procedure for calculating astronomical refraction is outlined and applied to the US1976 standard atmosphere. This atmosphere is generalized to allow for a free choice of the temperature and pressure at sea level, and in this form it has been named the modified US1976 atmosphere.
Simplifying numerical ray tracing for characterization of optical systems.
Gagnon, Yakir Luc; Speiser, Daniel I; Johnsen, Sönke
2014-07-20
Ray tracing, a computational method for tracing the trajectories of rays of light through matter, is often used to characterize mechanical or biological visual systems with aberrations that are larger than the effect of diffraction inherent in the system. For example, ray tracing may be used to calculate geometric point spread functions (PSFs), which describe the image of a point source after it passes through an optical system. Calculating a geometric PSF is useful because it gives an estimate of the detail and quality of the image formed by a given optical system. However, when using ray tracing to calculate a PSF, the accuracy of the estimated PSF directly depends on the number of discrete rays used in the calculation; higher accuracies may require more computational power. Furthermore, adding optical components to a modeled system will increase its complexity and require critical modifications so that the model will describe the system correctly, sometimes necessitating a completely new model. Here, we address these challenges by developing a method that represents rays of light as a continuous function that depends on the light's initial direction. By utilizing Chebyshev approximations (via the chebfun toolbox in MATLAB) for the implementation of this method, we greatly simplified the calculations for the location and direction of the rays. This method provides high precision and fast calculation speeds that allow the characterization of any symmetrical optical system (with a centered point source) in an analytical-like manner. Next, we demonstrate our methods by showing how they can easily calculate PSFs for complicated optical systems that contain multiple refractive and/or reflective interfaces.
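The idea of replacing discrete rays with a continuous function of the initial direction can be mimicked with NumPy's Chebyshev module (a rough analogue of the chebfun-based method; the interface, interval and degree below are invented for illustration):

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Snell's law at a flat interface (n1 -> n2): the refracted angle as a
# continuous function of the incidence angle, instead of discrete rays
n1, n2 = 1.0, 1.5
refract = lambda th: np.arcsin(n1 / n2 * np.sin(th))

# fit a Chebyshev series on the incidence range [0, 0.8] rad
deg = 12
theta = C.chebpts1(64) * 0.4 + 0.4          # Chebyshev nodes mapped to [0, 0.8]
coef = C.chebfit((theta - 0.4) / 0.4, refract(theta), deg)

def refract_approx(th):
    """Evaluate the continuous ray map anywhere in [0, 0.8] rad."""
    return C.chebval((th - 0.4) / 0.4, coef)
```

Once the series is fitted, the refracted direction is available at any incidence angle without tracing a new discrete ray, which is the property the authors exploit for fast, quasi-analytical PSF calculation.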
Trace chemical characterization using monochromatic X-ray undulator radiation
Eba; Numako; Iihara; Sakurai
2000-06-01
An efficient Johansson-type X-ray fluorescence spectrometer has been developed for advanced X-ray spectroscopic analysis with third-generation synchrotron radiation. Kα and Kβ X-ray fluorescence spectra for trace metals have been collected by a Ge(220) analyzing crystal with a Rowland radius of 150 mm, under monochromatic X-ray excitation at the undulator beamline at SPring-8. The energy resolution is approximately 10 eV for most of the K lines of 3d transition metals. In light of the greatly improved efficiency, as well as the excellent signal-to-background ratio, the relative and absolute detection limits achieved are 1 ppm and 1.2 ng of copper in a carbon matrix, respectively. The energy resolution of the present spectrometer permits the observation of some chemical effects in Kβ spectra. It has been demonstrated that the changes in Kβ5 and Kβ'' intensity for iron and cobalt compounds can be used for the analysis of chemical states. Resonant X-ray fluorescence spectra are another important application of monochromatic excitation. In view of trace chemical characterization, the present spectrometer can be a good alternative to a conventional Si(Li) detector system when combined with highly brilliant X-rays.
McXtrace: A modern ray-tracing package for X-ray instrumentation
Bergbäck Knudsen, Erik; Prodi, A.; Willendrup, Peter Kjær
2011-01-01
We present the developments of the McXtrace project, a free, open source software package based on Monte Carlo ray tracing for simulations and optimisation of complete X-ray instruments. The methodology of building a simulation is presented through an example beamline, namely Beamline 811 at MAX-...
3D ultrasonic ray tracing in AutoCAD®
Reilly, D.; Leggat, P.; McNab, A.
2001-04-01
To assist with the design and validation of testing procedures for NDT, add-on modules have been developed for AutoCAD® 2000. One of the modules computes and displays ultrasonic 3D ray tracing. Another determines paths between two points, for instance a probe and a target or two probes. The third module displays phased array operational modes and calculates element delays for phased array operation. The modules can be applied to simple or complex solid model components.
Tropospheric Refraction Modeling Using Ray-Tracing and Parabolic Equation
P. Pechac
2005-12-01
Refraction phenomena that occur in the lower atmosphere significantly influence the performance of wireless communication systems. This paper provides an overview of corresponding computational methods. Basic properties of the lower atmosphere are mentioned. Practical guidelines for radiowave propagation modeling in the lower atmosphere using ray-tracing and parabolic equation methods are given. In addition, a calculation of angle-of-arrival spectra is introduced for multipath propagation simulations.
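A standard back-of-the-envelope relation behind such refraction modeling is the effective Earth-radius factor; the constants below (6371 km radius, a -39 N-units/km standard gradient) are textbook values, not taken from the paper:

```python
# Effective Earth-radius factor k from the refractivity gradient:
# a ray with curvature -dn/dh over a sphere of radius a behaves like a
# straight ray over a sphere of radius k*a, with k = 1 / (1 + a * dn/dh).
EARTH_RADIUS_M = 6371e3

def k_factor(dN_dh_per_km):
    """dN/dh given in N-units per km, where N = (n - 1) * 1e6."""
    dn_dh = dN_dh_per_km * 1e-6 / 1e3     # convert to 1/m
    return 1.0 / (1.0 + EARTH_RADIUS_M * dn_dh)

k = k_factor(-39.0)   # standard atmosphere: k comes out close to 4/3
```

The familiar 4/3-Earth model used in simple coverage tools is exactly this relation evaluated at the standard gradient; ray-tracing and parabolic equation methods are needed when the gradient varies with height.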
Ray Tracing Modelling of Reflector for Vertical Bifacial Panel
Jakobsen, Michael Linde; Thorsteinsson, Sune; Poulsen, Peter Behrensdorff
2016-01-01
Bifacial solar panels have recently become a new attractive building block for PV systems. In this work we propose a reflector system for a vertical bifacial panel, and use ray tracing modelling to model the performance. Particularly, we investigate the impact of the reflector volume being filled...... with a refractive medium, and shows the refractive medium improves the reflector performance since it directs almost all the light incident on the incoming plane into the PV panel....
The Search for Efficiency in Arboreal Ray Tracing Applications
van Leeuwen, M.; Disney, M.; Chen, J. M.; Gomez-Dans, J.; Kelbe, D.; van Aardt, J. A.; Lewis, P.
2016-12-01
Forest structure significantly impacts a range of abiotic conditions, including humidity and the radiation regime, all of which affect the rate of net and gross primary productivity. Current forest productivity models typically consider abstract media to represent the transfer of radiation within the canopy. Examples include the representation of forest structure via a layered canopy model, where leaf area and inclination angles are stratified with canopy depth, or as turbid media where leaves are randomly distributed within space or within confined geometric solids such as blocks, spheres or cones. While these abstract models are known to produce accurate estimates of primary productivity at the stand level, their limited geometric resolution restricts applicability at fine spatial scales, such as the cell, leaf or shoot levels, thereby not addressing the full potential of assimilation of data from laboratory and field measurements with that of remote sensing technology. Recent research efforts have explored the use of laser scanning to capture detailed tree morphology at millimeter accuracy. These data can subsequently be used to combine ray tracing with primary productivity models, providing an ability to explore trade-offs among different morphological traits or to assimilate data across spatial scales, spanning the leaf to the stand level. Ray tracing has the major advantage of allowing the most accurate structural description of the canopy, and can directly exploit new 3D structural measurements, e.g., from laser scanning. However, the biggest limitation of ray tracing models is their high computational cost, which currently limits their use for large-scale applications. In this talk, we explore ways to more efficiently exploit ray tracing simulations and capture this information in a readily computable form for future evaluation, thus potentially enabling large-scale first-principles forest growth modelling applications.
MCViNE -- An object oriented Monte Carlo neutron ray tracing simulation package
Lin, Jiao Y Y; Granroth, Garrett E; Abernathy, Douglas L; Lumsden, Mark D; Winn, Barry; Aczel, Adam A; Aivazis, Michael; Fultz, Brent
2015-01-01
MCViNE (Monte-Carlo VIrtual Neutron Experiment) is a versatile Monte Carlo (MC) neutron ray-tracing program that provides researchers with tools for performing computer modeling and simulations that mirror real neutron scattering experiments. By adopting modern software engineering practices such as using composite and visitor design patterns for representing and accessing neutron scatterers, and using recursive algorithms for multiple scattering, MCViNE is flexible enough to handle sophisticated neutron scattering problems including, for example, neutron detection by complex detector systems, and single and multiple scattering events in a variety of samples and sample environments. In addition, MCViNE can take advantage of simulation components in linear-chain-based MC ray tracing packages widely used in instrument design and optimization, as well as NumPy-based components that make prototypes useful and easy to develop. These developments have enabled us to carry out detailed simulations of neutron scatteri...
Numerical simulation and comparison of nonlinear self-focusing based on iteration and ray tracing
Li, Xiaotong; Chen, Hao; Wang, Weiwei; Ruan, Wangchao; Zhang, Luwei; Cen, Zhaofeng
2017-05-01
Self-focusing is observed in nonlinear materials owing to the interaction between laser and matter as a laser beam propagates. Numerical simulation strategies such as the beam propagation method (BPM), based on the nonlinear Schrödinger equation, and ray tracing, based on Fermat's principle, have been applied to simulate the self-focusing process. In this paper we present an iterative nonlinear ray tracing method in which the nonlinear material is likewise cut into many slices, as in the existing approaches, but instead of the paraxial approximation and split-step Fourier transform, a large number of sampled real rays are traced step by step through the system, with the refractive index and laser intensity updated by iteration. In this process a smoothing treatment is employed to generate a laser density distribution at each slice to decrease the error caused by under-sampling. The characteristic of this method is that the nonlinear refractive indices of the points on the current slice are calculated by iteration, so as to solve the problem of unknown parameters in the material caused by the mutual dependence between laser intensity and nonlinear refractive index. Compared with the beam propagation method, this algorithm is more suitable for engineering application, with lower time complexity, and is capable of numerically simulating the self-focusing process in systems that include both linear and nonlinear optical media. If the sampled rays are traced with their complex amplitudes and light paths or phases, it will be possible to simulate the superposition effects of different beams. At the end of the paper, the advantages and disadvantages of this algorithm are discussed.
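The slice-by-slice bookkeeping, with the index rebuilt from a smoothed ray density at each step, can be caricatured in a few lines. This is a toy paraxial sketch under invented parameters, not the authors' algorithm; a single predictor-corrector pass stands in for their full per-slice iteration:

```python
import numpy as np

def kernel_intensity_grad(xs, q, n2, s=0.3, h=1e-4):
    """Transverse gradient of n2 * I(q), with I a Gaussian-kernel density
    built from the ray positions xs (the 'smoothing treatment')."""
    I = lambda p: np.exp(-(p[:, None] - xs[None, :])**2 / (2*s*s)).sum(axis=1)
    return n2 * (I(q + h) - I(q - h)) / (2.0 * h)

def trace_selffocus(x, n0=1.5, n2=0.05, dz=0.01, nsteps=200):
    """Toy slice-by-slice Kerr trace with one predictor-corrector pass per
    slice to resolve the mutual dependence of n on the ray intensity."""
    u = np.zeros_like(x)                              # paraxial ray slopes
    for _ in range(nsteps):
        g0 = kernel_intensity_grad(x, x, n2)          # gradient from current rays
        x_pred = x + dz * (u + dz * g0 / n0)          # predictor step
        g1 = kernel_intensity_grad(x_pred, x_pred, n2)  # gradient after moving
        u = u + dz * 0.5 * (g0 + g1) / n0             # corrector (averaged)
        x = x + dz * u
    return x

x0 = np.linspace(-1.0, 1.0, 21)
xf = trace_selffocus(x0.copy())
```

With n2 > 0 the kernel-density peak at the beam centre creates an inward index gradient, so the ray bundle narrows as it propagates, which is the self-focusing behaviour being simulated.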
Distance measurement based on light field geometry and ray tracing.
Chen, Yanqin; Jin, Xin; Dai, Qionghai
2017-01-09
In this paper, we propose a geometric optical model to measure the distances of object planes in a light field image. The proposed geometric optical model is composed of two sub-models based on ray tracing: an object space model and an image space model. The two theoretical sub-models are derived for on-axis point light sources. In the object space model, light rays propagate into the main lens and refract inside it following the refraction theorem. In the image space model, light rays exit from emission positions on the main lens and subsequently impinge on the image sensor with different imaging diameters. The relationships between the imaging diameters of objects and their corresponding emission positions on the main lens are investigated using refocusing and the similar-triangle principle. By combining the two sub-models and tracing light rays back to the object space, the relationships between objects' imaging diameters and the corresponding distances of object planes are derived. The performance of the proposed geometric optical model is compared with existing approaches using different configurations of hand-held plenoptic 1.0 cameras, and real experiments are conducted using a preliminary imaging system. Results demonstrate that the proposed model outperforms existing approaches in terms of accuracy and exhibits good performance over a general imaging range.
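The similar-triangle step at the heart of such an image space model can be reproduced with a thin-lens toy; the numbers and the one-branch inversion below are illustrative assumptions, not the paper's calibration:

```python
def blur_diameter(a, f, A, b):
    """Diameter of the blur spot on a sensor at distance b behind a thin
    lens of focal length f and aperture A, for a point source at distance a.
    Pure similar-triangle geometry on the converging cone of rays."""
    v = 1.0 / (1.0/f - 1.0/a)        # thin-lens image distance
    return A * abs(b - v) / v

def distance_from_blur(D, f, A, b, far_side=True):
    """Invert the similar-triangle relation: recover the object distance
    from a measured blur diameter (two branches; pick one)."""
    v = b / (1.0 + D/A) if far_side else b / (1.0 - D/A)
    return 1.0 / (1.0/f - 1.0/v)

f, A, b = 0.05, 0.02, 0.055          # 50 mm lens, 20 mm aperture, sensor at 55 mm
a_true = 1.0                          # object 1 m away
D = blur_diameter(a_true, f, A, b)
# the sensor sits beyond the focus in this configuration, so far_side=True
a_est = distance_from_blur(D, f, A, b, far_side=True)
```

Tracing the cone of rays back through the lens turns a measured spot diameter into an object-plane distance, which is the basic inversion the light-field model performs per refocused plane.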
Weiland, C.M. [Univ. of California, Santa Barbara, CA (United States); Steck, L.K. [Los Alamos National Lab., NM (United States); Dawson, P.B. [Geological Survey, Menlo Park, CA (United States)] [and others
1995-10-10
The authors explore the impact of three-dimensional minimum travel time ray tracing on nonlinear teleseismic inversion. This problem has particular significance when trying to image strongly contrasting low-velocity bodies, such as magma chambers, because strongly refracted and/or diffracted rays may precede the direct P wave arrival traditionally used in straight-ray seismic tomography. They use a simplex-based ray tracer to compute the three-dimensional, minimum travel time ray paths and employ an iterative technique to cope with nonlinearity. Results from synthetic data show that their algorithm results in better model reconstructions compared with traditional straight-ray inversions. The authors reexamine the teleseismic data collected at Long Valley caldera by the U.S. Geological Survey. The most prominent feature of their result is a 25-30% low-velocity zone centered at 11.5 km depth beneath the northwestern quadrant of the caldera. Beneath this, at a depth of 24.5 km, is a more diffuse 15% low-velocity zone. In general, the low velocities tend to deepen to the south and east. The authors interpret the shallow feature to be the residual Long Valley caldera magma chamber, while the deeper feature may represent basaltic magmas ponded in the midcrust. The deeper position of the prominent low-velocity region in comparison to earlier tomographic images is a result of using three-dimensional rays rather than straight rays in the ray tracing. The magnitude of the low-velocity anomaly is a factor of approximately 3 times larger than in earlier models from linear arrival time inversions and is consistent with models based on observations of ray bending at sites within the caldera. These results imply the presence of anywhere from 7 to 100% partial melt beneath the caldera. 40 refs., 1 fig., 1 tab.
Ray tracing and ECRH absorption modeling in the HSX stellarator
Weir, G. M.; Likin, K. M.; Marushchenko, N. B.; Turkin, Y.
2015-09-01
To increase flexibility in ECRH experiments on the helically symmetric experiment (HSX), a second gyrotron and transmission line have been installed. The second antenna includes a steerable mirror for off-axis heating, and the launched power may be modulated for use in heat pulse propagation experiments. The extraordinary wave at the second harmonic of the electron gyrofrequency or the ordinary wave at the fundamental resonance are used for plasma start-up and heating on HSX. The TRAVIS (tracing visualized) ray tracing code (Marushchenko et al 2007 Plasma Fusion Res. 2 S1129) is used to estimate single-pass absorption and to model multi-pass wave damping in the three-dimensional HSX geometry. The single-pass absorption of the ordinary wave at the fundamental resonance is calculated to be as high as 30%, while measurements of the total absorption indicate that 45% of the launched power is absorbed. A multi-pass ray tracing model correctly predicts the experimental absorption and indicates that the launched power is absorbed within the plasma core (r/a ≤ 0.2).
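The gap between 30% single-pass and 45% total absorption is consistent with a simple geometric-series multi-pass picture; the closed form and the inferred wall reflectivity below are back-of-the-envelope assumptions, not the paper's 3D ray model:

```python
def multipass_absorbed(a_single, r_wall, n_passes=None):
    """Total absorbed fraction when a beam with single-pass absorption
    a_single is recycled by walls of effective reflectivity r_wall.
    Infinite-pass closed form: a / (1 - (1 - a) * r)."""
    if n_passes is None:
        return a_single / (1.0 - (1.0 - a_single) * r_wall)
    total, power = 0.0, 1.0
    for _ in range(n_passes):
        total += power * a_single            # absorbed on this pass
        power *= (1.0 - a_single) * r_wall   # power surviving to the next pass
    return total

# single-pass O-mode absorption of 30%: the wall reflectivity needed to
# reach the measured 45% total follows from inverting the closed form
a = 0.30
r = (1.0 - a / 0.45) / (1.0 - a)
```

Inverting a / (1 - (1 - a) r) = 0.45 with a = 0.3 gives r of roughly 0.48, i.e. about half the unabsorbed power would need to be recycled by the vessel to match the measurement in this crude model.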
GRay: A Massively Parallel GPU-based Code for Ray Tracing in Relativistic Spacetimes
Chan, Chi-kwan; Psaltis, Dimitrios; Özel, Feryal
2013-11-01
We introduce GRay, a massively parallel integrator designed to trace the trajectories of billions of photons in a curved spacetime. This graphics-processing-unit (GPU)-based integrator employs the stream processing paradigm, is implemented in CUDA C/C++, and runs on nVidia graphics cards. The peak performance of GRay using single-precision floating-point arithmetic on a single GPU exceeds 300 GFLOP (or 1 ns per photon per time step). For a realistic problem, where the peak performance cannot be reached, GRay is two orders of magnitude faster than existing central-processing-unit-based ray-tracing codes. This performance enhancement allows more effective searches of large parameter spaces when comparing theoretical predictions of images, spectra, and light curves from the vicinities of compact objects to observations. GRay can also perform on-the-fly ray tracing within general relativistic magnetohydrodynamic algorithms that simulate accretion flows around compact objects. Making use of this algorithm, we calculate the properties of the shadows of Kerr black holes and the photon rings that surround them. We also provide accurate fitting formulae of their dependencies on black hole spin and observer inclination, which can be used to interpret upcoming observations of the black holes at the center of the Milky Way, as well as M87, with the Event Horizon Telescope.
Dynamic ray tracing and its application in triangulated media
Rueger, A.
1993-07-01
Hale and Cohen (1991) developed software to generate two-dimensional computer models of complex geology. Their method uses a triangulation technique designed to support efficient and accurate computation of seismic wavefields for models of the earth's interior. Subsequently, Hale (1991) used this triangulation approach to perform dynamic ray tracing and create synthetic seismograms based on the method of Gaussian beams. Here, I extend this methodology to allow an increased variety of ray-theoretical experiments. Specifically, the developed program GBmod (Gaussian Beam MODeling) can produce arbitrary multiple sequences and incorporate attenuation and density variations. In addition, I have added an option to perform Fresnel-volume ray tracing (Cerveny and Soares, 1992). Corrections for reflection and transmission losses at interfaces, and for two-and-one-half-dimensional (2.5-D) spreading, are included. However, despite these enhancements, difficulties remain in attempts to compute accurate synthetic seismograms if strong lateral velocity inhomogeneities are present. Here, these problems are discussed and, to a certain extent, reduced. I provide example computations of high-frequency seismograms based on the method of Gaussian beams to exhibit the advantages and disadvantages of the proposed modeling method and illustrate new features for both surface and vertical seismic profiling (VSP) acquisition geometries.
Microseismic network design assessment based on 3D ray tracing
Näsholm, Sven Peter; Wuestefeld, Andreas; Lubrano-Lavadera, Paul; Lang, Dominik; Kaschwich, Tina; Oye, Volker
2016-04-01
There is increasing demand on the versatility of microseismic monitoring networks. In early projects, being able to locate any triggers was considered a success. These early successes led to a better understanding of how to extract value from microseismic results. Today operators, regulators, and service providers work closely together in order to find the optimum network design to meet various requirements. In the current study we demonstrate an integrated and streamlined network capability assessment approach. It is intended for use during the microseismic network design process prior to installation. The assessments are derived from 3D ray tracing between a grid of event points and the sensors. Three aspects are discussed: 1) magnitude of completeness, or detection limit; 2) event location accuracy; and 3) ground-motion hazard. The network capability parameters 1) and 2) are estimated at all hypothetical event locations and are presented in the form of maps for a given seismic sensor coordinate scenario. In addition, the ray tracing traveltimes permit estimation of the point-spread functions (PSFs) at the event grid points. PSFs are useful in assessing the resolution and focusing capability of the network for stacking-based event location and imaging methods. We estimate the performance for a hypothetical network case with 11 sensors. We consider the well-documented region around the San Andreas Fault Observatory at Depth (SAFOD) located north of Parkfield, California. The ray tracing is done through a detailed velocity model which covers a 26.2 by 21.2 km wide area around the SAFOD drill site with a resolution of 200 m for both the P- and S-wave velocities. Systematic network capability assessment for different sensor site scenarios prior to installation facilitates finding a final design which meets the survey objectives.
Photorealistic ray tracing to visualize automobile side mirror reflective scenes.
Lee, Hocheol; Kim, Kyuman; Lee, Gang; Lee, Sungkoo; Kim, Jingu
2014-10-20
We describe an interactive visualization procedure for determining the optimal surface of a special automobile side mirror, thereby removing the blind spot, without the need for feedback from the error-prone manufacturing process. If the horizontally progressive curvature distributions are set to the semi-mathematical expression for a free-form surface, the surface point set can then be derived through numerical integration. This is then converted to a NURBS surface while retaining the surface curvature. Then, reflective scenes from the driving environment can be virtually realized using photorealistic ray tracing, in order to evaluate how these reflected images would appear to drivers.
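Deriving a surface point set from a prescribed curvature distribution by numerical integration can be illustrated in 2D (a toy analogue under invented parameters, not the authors' free-form mirror surface):

```python
import math

def curve_from_curvature(kappa, s_max, n=20000):
    """Reconstruct a plane curve from its curvature profile kappa(s) by
    integrating x' = cos(theta), y' = sin(theta), theta' = kappa(s) with
    the midpoint rule, starting at the origin heading along +x."""
    ds = s_max / n
    x = y = theta = 0.0
    for i in range(n):
        s_mid = (i + 0.5) * ds
        theta_mid = theta + 0.5 * ds * kappa(s_mid)   # heading at mid-step
        x += ds * math.cos(theta_mid)
        y += ds * math.sin(theta_mid)
        theta += ds * kappa(s_mid)
    return x, y, theta

# constant curvature 1 over arclength pi traces a unit semicircle:
# the curve ends at (0, 2) heading in the -x direction
x, y, th = curve_from_curvature(lambda s: 1.0, math.pi)
```

Prescribing the curvature profile and integrating, rather than guessing surface points directly, is what lets a designer control how the reflected image is progressively compressed across the mirror.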
Adaptive image ray-tracing for astrophysical simulations
Parkin, E R
2010-01-01
A technique is presented for producing synthetic images from numerical simulations whereby the image resolution is adapted around prominent features. In so doing, adaptive image ray-tracing (AIR) improves the efficiency of a calculation by focusing computational effort where it is needed most. The results of test calculations show that a speed-up by a factor of approximately 4 or more, and a commensurate reduction in the number of pixels required in the final image, can be achieved compared to an equivalent calculation with a fixed-resolution image.
The ray-tracing mapping operator in an asymmetric atmosphere
Anonymous
2008-01-01
In a spherically symmetric atmosphere, the refractive index profile is retrieved from bending angle measurements through the Abel integral transform. As horizontal refractivity inhomogeneity becomes significant in the moist lower atmosphere, the error in the refractivity profile obtained from Abel inversion reaches about 10%. One way to avoid this error is to directly assimilate bending angle profiles into numerical weather models. This paper discusses the 2D ray-tracing mapping operator for the bending angle in an asymmetric atmosphere. Through simulated computations, the retrieval error of the refractivity under horizontal inhomogeneity is assessed. The step length of the fourth-order Runge-Kutta method is also tested.
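The Abel transform that underlies bending-angle retrieval can be checked numerically; the substitution below removes the inverse-square-root singularity at the lower limit (a generic sketch, using a Gaussian test profile whose transform is known in closed form, not the paper's operator):

```python
import numpy as np

def abel_forward(f, y, t_max=8.0, n=4000):
    """Forward Abel transform F(y) = 2 * int_y^inf f(r) r / sqrt(r^2 - y^2) dr.
    The substitution t = sqrt(r^2 - y^2) removes the integrable singularity
    at the lower limit, the same trick used in bending-angle quadratures."""
    t = np.linspace(0.0, t_max, n)
    vals = f(np.sqrt(y*y + t*t))
    dt = t[1] - t[0]
    # composite trapezoid rule on the regularized integrand
    return 2.0 * dt * (0.5*vals[0] + vals[1:-1].sum() + 0.5*vals[-1])

# check against the analytic pair: Abel{exp(-r^2)}(y) = sqrt(pi) * exp(-y^2)
y = 0.5
F = abel_forward(lambda r: np.exp(-r*r), y)
```

The inverse transform, which recovers the refractivity profile from bending angles, has the same singularity structure and is handled with the same change of variables.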
Ray tracing study for non-imaging daylight collectors
Wittkopf, Stephen [Solar Energy Research Institute of Singapore (SERIS), National University of Singapore (NUS), 7 Engineering Drive 1, Block E3A, 06-01, Singapore 117574 (Singapore); Solar Energy and Building Physics Laboratory (LESO), Swiss Federal Institute of Technology Lausanne (EPFL) (Switzerland); Oliver Grobe, Lars; Geisler-Moroder, David [Solar Energy Research Institute of Singapore (SERIS), National University of Singapore (NUS), 7 Engineering Drive 1, Block E3A, 06-01, Singapore 117574 (Singapore); Compagnon, Raphael [College of Engineering and Architecture of Fribourg (EIA-FR), University of Applied Sciences of Western Switzerland (HES-SO) (Switzerland); Kaempf, Jerome; Linhart, Friedrich; Scartezzini, Jean-Louis [Solar Energy and Building Physics Laboratory (LESO), Swiss Federal Institute of Technology Lausanne (EPFL) (Switzerland)
2010-06-15
This paper presents a novel method to study how well non-imaging daylight collectors pipe diffuse daylight into long horizontal funnels for illuminating deep buildings. Forward ray tracing is used to derive luminous intensity distributions curves (LIDC) of such collectors centered in an arc-shaped light source representing daylight. New photometric characteristics such as 2D flux, angular spread and horizontal offset are introduced as a function of such LIDC. They are applied for quantifying and thus comparing different collector contours. (author)
Ray-tracing optical modeling of negative dysphotopsia
Hong, Xin; Liu, Yueai; Karakelle, Mutlu; Masket, Samuel; Fram, Nicole R.
2011-12-01
Negative dysphotopsia is a relatively common photic phenomenon that may occur after implantation of an intraocular lens. The etiology of negative dysphotopsia is not fully understood. In this investigation, optical modeling was developed using nonsequential-component Zemax ray-tracing technology to simulate photic phenomena experienced by the human eye. The simulation investigated the effects of pupil size, capsulorrhexis size, and bag diffusiveness. Results demonstrated the optical basis of negative dysphotopsia. We found that photic structures were mainly influenced by critical factors such as the capsulorrhexis size and the optical diffusiveness of the capsular bag. The simulations suggested the hypothesis that the anterior capsulorrhexis interacting with intraocular lens could induce negative dysphotopsia.
Ray-tracing software comparison for linear focusing solar collectors
Osório, Tiago; Horta, Pedro; Larcher, Marco; Pujol-Nadal, Ramón; Hertel, Julian; van Rooyen, De Wet; Heimsath, Anna; Schneider, Simon; Benitez, Daniel; Frein, Antoine; Denarie, Alice
2016-05-01
Ray-tracing software tools have been widely used in the optical design of solar concentrating collectors. In spite of the ability of these tools to assess the geometrical and material aspects impacting the optical performance of concentrators, their use in combination with experimental measurements in the framework of collector testing procedures has not, to date, been implemented in any of the current solar collector testing standards. In the latest revision of ISO 9806 an effort was made to include linear focusing concentrating collectors, but some practical and theoretical difficulties emerged. A ray-tracing analysis could provide important contributions to overcoming these issues, complementing the experimental results obtained through thermal testing and allowing more thorough testing outputs with lower experimental requirements. In order to evaluate the different available software tools, a comparison study was conducted. Taking the Parabolic Trough Collector and the Linear Fresnel Reflector Collector as representative line-focus technologies, two exemplary cases with predefined conditions - geometry, sun model and material properties - were simulated with the different software tools. This work was carried out within IEA/SHC Task 49 "Solar Heat Integration in Industrial Processes".
Zhang, Dong; Zhang, Ting-Ting; Zhang, Xiao-Lei; Yang, Yan; Hu, Ying; Qin, Qian-Qing
2013-05-01
We present a new method of three-dimensional (3-D) seismic ray tracing, based on an improvement to the linear traveltime interpolation (LTI) ray tracing algorithm. This new technique involves two separate steps. The first involves a forward calculation based on the LTI method and the dynamic successive partitioning scheme, which is applied to calculate traveltimes on cell boundaries and assumes a wavefront that expands from the source to all grid nodes in the computational domain. We locate several dynamic successive partition points on a cell's surface, the traveltimes of which can be calculated by linear interpolation between the vertices of the cell's boundary. The second is a backward step that uses Fermat's principle and the fact that the ray path is always perpendicular to the wavefront and follows the negative traveltime gradient. In this process, the first-arriving ray path can be traced from the receiver to the source along the negative traveltime gradient, which can be calculated by reconstructing the continuous traveltime field with cubic B-spline interpolation. This new 3-D ray tracing method is compared with the LTI method and the shortest path method (SPM) through a number of numerical experiments. These comparisons show obvious improvements to computed traveltimes and ray paths, both in precision and computational efficiency.
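The backward step above, tracing the first-arrival ray from receiver to source down the negative traveltime gradient, lends itself to a compact sketch. The fragment below is illustrative only: it assumes a precomputed traveltime grid `T` with unit spacing and replaces the paper's cubic B-spline reconstruction with a crude nearest-node gradient lookup; `trace_ray` and all parameter names are hypothetical.

```python
import numpy as np

def trace_ray(T, receiver, source, h=0.5, tol=1.0, max_steps=10000):
    """March from the receiver toward the source along the negative
    traveltime gradient. The nearest-node gradient lookup is a simple
    stand-in for the paper's cubic B-spline interpolation."""
    gy, gx = np.gradient(T)          # grid gradient, unit spacing assumed
    source = np.asarray(source, float)
    p = np.asarray(receiver, float)
    path = [p.copy()]
    for _ in range(max_steps):
        i = int(round(min(max(p[0], 0), T.shape[0] - 1)))
        j = int(round(min(max(p[1], 0), T.shape[1] - 1)))
        g = np.array([gy[i, j], gx[i, j]])
        norm = np.linalg.norm(g)
        if norm == 0.0:              # flat spot: cannot continue
            break
        p = p - h * g / norm         # step down the traveltime gradient
        path.append(p.copy())
        if np.linalg.norm(p - source) < tol:
            break
    return np.array(path)

# homogeneous medium: first-arrival traveltime is just distance from source
src = np.array([10.0, 12.0])
yy, xx = np.mgrid[0:64, 0:64]
T = np.hypot(yy - src[0], xx - src[1])
ray = trace_ray(T, receiver=(55.0, 50.0), source=src)
```

In a homogeneous medium the traced path is a straight line back to the source; heterogeneous traveltime fields simply bend the descent direction cell by cell.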
The Verification and Validation of the Ray-tracing of Bag of Triangles (BoTs)
2015-02-01
The Verification and Validation of the Ray-tracing of Bag of Triangles (BoTs), by Charith Ranawake. ARL-CR-0761, February 2015.
OSPRay - A CPU Ray Tracing Framework for Scientific Visualization.
Wald, I; Johnson, G P; Amstutz, J; Brownlee, C; Knoll, A; Jeffers, J; Gunther, J; Navratil, P
2017-01-01
Scientific data is continually increasing in complexity, variety and size, making efficient visualization and specifically rendering an ongoing challenge. Traditional rasterization-based visualization approaches encounter performance and quality limitations, particularly in HPC environments without dedicated rendering hardware. In this paper, we present OSPRay, a turn-key CPU ray tracing framework oriented towards production-use scientific visualization which can utilize varying SIMD widths and multiple device backends found across diverse HPC resources. This framework provides a high-quality, efficient CPU-based solution for typical visualization workloads, which has already been integrated into several prevalent visualization packages. We show that this system delivers the performance, high-level API simplicity, and modular device support needed to provide a compelling new rendering framework for implementing efficient scientific visualization workflows.
Ionospheric Plasma Drift Analysis Technique Based On Ray Tracing
Ari, Gizem; Toker, Cenk
2016-07-01
Ionospheric drift measurements provide important information about the variability in the ionosphere, which can be used to quantify ionospheric disturbances caused by natural phenomena such as solar, geomagnetic, gravitational and seismic activities. One of the prominent ways of measuring drift relies on instrument-based measurements, e.g. using an ionosonde. The drift estimation of an ionosonde depends on measuring the Doppler shift of the received signal, where the main cause of the Doppler shift is the change in the length of the propagation path of the signal between the transmitter and the receiver. Unfortunately, ionosondes are expensive devices and their installation and maintenance require special care. Furthermore, the ionosonde network over the world, or even Europe, is not dense enough to obtain a global or continental drift map. In order to overcome these difficulties, we propose a technique to perform ionospheric drift estimation based on ray tracing. First, a two-dimensional TEC map is constructed by using the IONOLAB-MAP tool, which spatially interpolates the VTEC estimates obtained from the EUREF CORS network. Next, a three-dimensional electron density profile is generated by inputting the TEC estimates to the IRI-2015 model. Eventually, a close-to-real electron density profile is obtained in which ray tracing can be performed. These profiles can be constructed periodically with a period as low as 30 seconds. By processing two consecutive snapshots together and calculating the propagation paths, we estimate the drift over any coordinate of interest. We test our technique by comparing the results to the drift measurements taken at the DPS ionosonde at Pruhonice, Czech Republic. This study is supported by TUBITAK 115E915 and joint TUBITAK 114E092 and AS CR14/001 projects.
RAY-RAMSES: a code for ray tracing on the fly in N-body simulations
Barreira, Alexandre; Bose, Sownak; Li, Baojiu
2016-01-01
We present a ray tracing code to compute integrated cosmological observables on the fly in AMR N-body simulations. Unlike conventional ray tracing techniques, our code takes full advantage of the time and spatial resolution attained by the N-body simulation by computing the integrals along the line of sight on a cell-by-cell basis through the AMR simulation grid. Moreover, since it runs on the fly in the N-body run, our code can produce maps of the desired observables without storing large (or any) amounts of data for post-processing. We implemented our routines in the RAMSES N-body code and tested the implementation using an example of weak lensing simulation. We analyse basic statistics of lensing convergence maps and find good agreement with semi-analytical methods. The ray tracing methodology presented here can be used in several cosmological analyses, such as Sunyaev-Zel'dovich and integrated Sachs-Wolfe effect studies, as well as modified gravity. Our code can also be used in cross-checks of the more conv...
GENERALIZED CONJUGATE-GRADIENT ALGORITHM AND ITS APPLICATIONS TO SEISMIC TRACE INVERSION
Anonymous
1999-01-01
A novel generalized conjugate-gradient algorithm for the complicated equations of seismic trace inverse problems, based on the classical conjugate-gradient algorithm, has been put forward to improve the stability of seismic trace inversion and to reduce the computation and memory resources needed. The algorithm offers high accuracy, fast operation and good resistance to ill-conditioning. In addition, by analysing the sensitivity matrix for the specific problem of seismic trace inversion, a new recursive algorithm that needs no sensitivity matrix is developed, greatly saving memory. Furthermore, in the new algorithm, the sensitivity-matrix operations can be converted into convolution and correlation operations, so that the whole recursion is implemented entirely by vector operations, which greatly speeds up the computation.
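As background for the generalized variant, the classical conjugate-gradient iteration that the paper builds on can be stated in a few lines. This is a textbook sketch for symmetric positive definite systems, not the paper's sensitivity-free recursion:

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=1000):
    """Classical CG for A x = b with A symmetric positive definite."""
    x = np.zeros(b.size) if x0 is None else x0.astype(float)
    r = b - A @ x                 # residual
    p = r.copy()                  # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)     # optimal step along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p # conjugate update of the direction
        rs = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

For an n-dimensional SPD system, exact arithmetic terminates in at most n iterations, which is what makes CG attractive for large inverse problems.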
MCViNE - An object oriented Monte Carlo neutron ray tracing simulation package
Lin, Jiao Y. Y.; Smith, Hillary L.; Granroth, Garrett E.; Abernathy, Douglas L.; Lumsden, Mark D.; Winn, Barry; Aczel, Adam A.; Aivazis, Michael; Fultz, Brent
2016-02-01
MCViNE (Monte-Carlo VIrtual Neutron Experiment) is an open-source Monte Carlo (MC) neutron ray-tracing software for performing computer modeling and simulations that mirror real neutron scattering experiments. We exploited the close similarity between how instrument components are designed and operated and how such components can be modeled in software. For example we used object oriented programming concepts for representing neutron scatterers and detector systems, and recursive algorithms for implementing multiple scattering. Combining these features together in MCViNE allows one to handle sophisticated neutron scattering problems in modern instruments, including, for example, neutron detection by complex detector systems, and single and multiple scattering events in a variety of samples and sample environments. In addition, MCViNE can use simulation components from linear-chain-based MC ray tracing packages which facilitates porting instrument models from those codes. Furthermore it allows for components written solely in Python, which expedites prototyping of new components. These developments have enabled detailed simulations of neutron scattering experiments, with non-trivial samples, for time-of-flight inelastic instruments at the Spallation Neutron Source. Examples of such simulations for powder and single-crystal samples with various scattering kernels, including kernels for phonon and magnon scattering, are presented. With simulations that closely reproduce experimental results, scattering mechanisms can be turned on and off to determine how they contribute to the measured scattering intensities, improving our understanding of the underlying physics.
Baltser, Jana; Bergbäck Knudsen, Erik; Vickery, Anette
2011-01-01
of X-ray beamline designs for particular user experiments. In this work we used the newly developed McXtrace ray-tracing package and the SRW wave-optics code to simulate the beam propagation of X-ray undulator radiation through such a "transfocator" as implemented at ID- 11 at ESRF. By applying two...
Testing the validity of the ray-tracing code GYOTO
Grould, Marion; Perrin, Guy
2016-01-01
In the next few years, the near-infrared interferometer GRAVITY will be able to observe the Galactic center. Astrometric data will be obtained with an anticipated accuracy of 10 μas. To analyze these future data, we have developed a code called GYOTO to compute orbits and images. We want to assess the validity and accuracy of GYOTO in a variety of contexts, in particular for stellar astrometry in the Galactic center. Furthermore, we want to tackle and complete a study made on the astrometric displacements that are due to lensing effects of a star of the central parsec with GYOTO. We first validate GYOTO in the weak-deflection limit (WDL) by studying primary caustics and primary critical curves obtained for a Kerr black hole. We compare GYOTO results to available analytical approximations and estimate GYOTO errors using an intrinsic estimator. In the strong-deflection limit (SDL), we choose to compare null geodesics computed by GYOTO and the ray-tracing code named Geokerr. Finally, we use GYOTO to estimate...
Fast Ray Tracing of Lunar Digital Elevation Models
McClanahan, Timothy P.; Evans, L. G.; Starr, R. D.; Mitrofanov, I.
2009-01-01
Ray-tracing (RT) of lunar digital elevation models (DEMs) is performed to virtually derive the degree of radiation incident on terrain as a function of time, orbital and ephemeris constraints [1-4]. This is an integral modeling process in lunar polar research and exploration, owing to the present paucity of terrain information at the poles and to mission planning activities for the anticipated spring 2009 launch of the Lunar Reconnaissance Orbiter (LRO). As part of the Lunar Exploration Neutron Detector (LEND) and Lunar Crater Observation and Sensing Satellite (LCROSS) preparations, RT methods are used to estimate the critical conditions presented by the combined effects of high latitude, terrain and the moon's low obliquity [5-7]. These factors yield low incident solar illumination and subsequently extreme thermal and radiation conditions. The presented research uses RT methods both for radiation transport modeling in space- and regolith-related research and to derive permanently shadowed regions (PSRs) in high-latitude topographic minima, e.g. craters. These regions are of scientific and human-exploration interest due to the near-constant low temperatures in PSRs, inferred to be < 100 K. Hydrogen is thought to have accumulated in PSRs through the combined effects of periodic cometary bombardment and/or solar wind processes, and the extreme cold, which minimizes hydrogen sublimation [8-9]. RT methods are also of use in optimizing surface positions for future illumination-dependent surface resources, e.g. power and communications equipment.
Virtual Ray Tracing as a Conceptual Tool for Image Formation in Mirrors and Lenses
Heikkinen, Lasse; Savinainen, Antti; Saarelainen, Markku
2016-12-01
The ray tracing method is widely used in teaching geometrical optics at the upper secondary and university levels. However, using simple and straightforward examples may lead to a situation in which students use the model of ray tracing too narrowly. Previous studies show that students seem to use the ray tracing method too concretely instead of as a conceptual model. This suggests that introductory physics students need to understand the nature of the ray model more profoundly. In this paper, we show how a virtual ray tracing model can be used as a tool for image formation in more complex and unconventional cases. We believe that this tool has potential in helping students to better appreciate the nature of the ray model.
An Algorithm for Static Tracing of Message Passing Interface Programs Using Data Flow Analysis
Alaa I. Elnashar
2014-12-01
Message Passing Interface (MPI) is a well-known paradigm that is widely used in coding explicit parallel programs. MPI programs exchange data among parallel processes using communication routines. A program's execution trace depends on the way its processes communicate with each other. For the same program, many process transition states may appear due to the nondeterministic features of parallel execution. In this paper we present a new algorithm that statically generates the execution trace of a given MPI program using data flow analysis. The performance of the proposed algorithm is evaluated and compared with that of two heuristic techniques that use random and genetic algorithm approaches to generate trace sequences. The results show that the proposed algorithm scales well with program size and avoids the process state explosion problem from which the other techniques suffer.
MC ray-tracing optimization of lobster-eye focusing devices with RESTRAX
Saroun, Jan [Nuclear Physics Institute, ASCR, 25068 Rez (Czech Republic)]. E-mail: saroun@ujf.cas.cz; Kulda, Jiri [Institut Laue-Langevin, 6 rue Jules Horowitz, BP 156, 38042 Grenoble Cedex 9 (France)
2006-11-15
The enhanced functionalities of the latest version of the RESTRAX software, providing a high-speed Monte Carlo (MC) ray-tracing code to represent a virtual three-axis neutron spectrometer, include representation of parabolic and elliptic guide profiles and facilities for numerical optimization of parameter values, characterizing the instrument components. As examples, we present simulations of a doubly focusing monochromator in combination with cold neutron guides and lobster-eye supermirror devices, concentrating a monochromatic beam to small sample volumes. A Levenberg-Marquardt minimization algorithm is used to optimize simultaneously several parameters of the monochromator and lobster-eye guides. We compare the performance of optimized configurations in terms of monochromatic neutron flux and energy spread and demonstrate the effect of lobster-eye optics on beam transformations in real and momentum subspaces.
Heat-Flux Analysis of Solar Furnace Using the Monte Carlo Ray-Tracing Method
Lee, Hyun Jin; Kim, Jong Kyu; Lee, Sang Nam; Kang, Yong Heack [Korea Institute of Energy Research, Daejeon (Korea, Republic of)
2011-10-15
An understanding of the concentrated solar flux is critical for the analysis and design of solar-energy-utilization systems. The current work focuses on the development of an algorithm that uses the Monte Carlo ray-tracing method with excellent flexibility and expandability; the method considers both solar limb darkening and the surface slope error of reflectors in analyzing the solar flux. A comparison of the modeling results with measurements at the solar furnace of the Korea Institute of Energy Research (KIER) shows good agreement within a measurement uncertainty of 10%. The model evaluates the concentration performance of the KIER solar furnace with a tracking accuracy of 2 mrad and a maximum attainable concentration ratio of 4400 suns. Flux variations according to measurement position and flux distributions depending on acceptance angles provide detailed information for the design of chemical reactors or secondary concentrators.
Enzo+Moray: Radiation Hydrodynamics Adaptive Mesh Refinement Simulations with Adaptive Ray Tracing
Wise, John H
2010-01-01
We describe a photon-conserving radiative transfer algorithm, using a spatially-adaptive ray tracing scheme, and its parallel implementation into the adaptive mesh refinement (AMR) cosmological hydrodynamics code, Enzo. By coupling the solver with the energy equation and non-equilibrium chemistry network, our radiation hydrodynamics framework can be utilised to study a broad range of astrophysical problems, such as stellar and black hole (BH) feedback. Inaccuracies can arise from large timesteps and poor sampling, therefore we devised an adaptive time-stepping scheme and a fast approximation of the optically-thin radiation field with multiple sources. We test the method with several radiative transfer and radiation hydrodynamics tests that are given in Iliev et al. (2006, 2009). We further test our method with more dynamical situations, for example, the propagation of an ionisation front through a Rayleigh-Taylor instability, time-varying luminosities, and collimated radiation. The test suite also includes an...
Desnijder, Karel; Hanselaer, Peter; Meuret, Youri
2016-04-01
A key requirement for obtaining a uniform luminance in a side-lit LED backlight is an optimised spatial pattern of the structures on the light guide that extract the light. The generation of such a scatter pattern is usually performed with an iterative approach. In each iteration, the luminance distribution of the backlight with a particular scatter pattern is analysed. This is typically done with a brute-force ray-tracing algorithm, although that approach results in a time-consuming optimisation process. In this study, the Adding-Doubling method is explored as an alternative way of evaluating the luminance of a backlight. Due to the similarities between light propagating in a backlight with extraction structures and light scattering in a cloud of scatterers, the Adding-Doubling method, which is used to model the latter, can also be used to model the light distribution in a backlight. The backlight problem is translated to a form upon which the Adding-Doubling method is directly applicable. The luminance calculated with the Adding-Doubling method for a simple uniform extraction pattern matches the luminance generated by a commercial ray tracer very well. Although successful, no clear computational advantage over ray tracers is realised. However, the treatment of light propagation in a light guide as used in the Adding-Doubling method also makes it possible to enhance the efficiency of brute-force ray-tracing algorithms. The performance of this enhanced ray-tracing approach for the simulation of backlights is also evaluated against a typical brute-force ray-tracing approach.
Ray tracing based path-length calculations for polarized light tomographic imaging
Manjappa, Rakesh; Kanhirodan, Rajan
2015-09-01
A ray tracing based path length calculation is investigated for polarized light transport in a pixel space. Tomographic imaging using polarized light transport is promising for applications in optical projection tomography of small animal imaging and turbid media with low scattering. Polarized light transport through a medium can have complex effects due to interactions such as optical rotation of linearly polarized light, birefringence, diattenuation and interior refraction. Here we investigate the effects of refraction of polarized light in a non-scattering medium. This step is used to obtain the initial absorption estimate. This estimate can be used as a prior in a Monte Carlo (MC) program that simulates the transport of polarized light through a scattering medium, to assist in faster convergence of the final estimate. The reflectances for p-polarized (parallel) and s-polarized (perpendicular) light are different, and hence there is a difference in the intensities that reach the detector end. The algorithm computes the length of the ray in each pixel along the refracted path, and this is used to build the weight matrix. This weight matrix with corrected ray path lengths, together with the resultant intensity reaching the detector for each ray, is used in the algebraic reconstruction technique (ART). The proposed method is tested with numerical phantoms for various noise levels. The refraction errors due to regions of different refractive index are discussed, and the difference in intensities with polarization is considered. The improvements in reconstruction using the correction so applied are presented. This is achieved by tracking the path of the ray as well as its intensity as it traverses the medium.
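The weight-matrix construction, accumulating the length a ray spends in each pixel, can be sketched with a Siddon-style traversal of a straight ray segment. This is an illustrative fragment, not the paper's implementation: refraction would be handled upstream by splitting the ray at interfaces, and the function name and grid conventions are hypothetical.

```python
import numpy as np

def pixel_path_lengths(p0, p1, n):
    """Length of the straight segment p0 -> p1 inside each cell of an
    n x n grid of unit pixels. Points are (x, y) pairs."""
    p0 = np.asarray(p0, float)
    p1 = np.asarray(p1, float)
    d = p1 - p0
    L = np.linalg.norm(d)
    ts = [0.0, 1.0]                       # parametric entry/exit of segment
    for axis in (0, 1):                   # crossings with x- and y-borders
        if d[axis] != 0.0:
            for k in range(n + 1):
                t = (k - p0[axis]) / d[axis]
                if 0.0 < t < 1.0:
                    ts.append(t)
    ts = np.unique(ts)
    W = np.zeros((n, n))                  # one weight-matrix row, as a grid
    for a, b in zip(ts[:-1], ts[1:]):
        mid = p0 + 0.5 * (a + b) * d      # segment midpoint locates pixel
        i, j = int(mid[1]), int(mid[0])   # row = y, column = x
        if 0 <= i < n and 0 <= j < n:
            W[i, j] = (b - a) * L
    return W

# horizontal ray across a 4 x 4 grid: one unit of length per pixel in row 0
W = pixel_path_lengths((0.0, 0.5), (4.0, 0.5), 4)
```

Flattening `W` for each ray and stacking the results gives the system matrix that ART iterates over.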
Ray Solomonoff, founding father of algorithmic information theory [Obituary]
P.M.B. Vitanyi
2010-01-01
Ray J. Solomonoff died on December 7, 2009, in Cambridge, Massachusetts, of complications of a stroke caused by an aneurism in his head. Ray was the first inventor of Algorithmic Information Theory which deals with the shortest effective description length of objects and is commonly designated by th
Fast Contour-Tracing Algorithm Based on a Pixel-Following Method for Image Sensors.
Seo, Jonghoon; Chae, Seungho; Shim, Jinwook; Kim, Dongchul; Cheong, Cheolho; Han, Tack-Don
2016-03-09
Contour pixels distinguish objects from the background. Tracing and extracting contour pixels are widely used for smart/wearable image sensor devices, because these are simple and useful for detecting objects. In this paper, we present a novel contour-tracing algorithm for fast and accurate contour following. The proposed algorithm classifies the type of contour pixel, based on its local pattern. Then, it traces the next contour using the previous pixel's type. Therefore, it can classify the type of contour pixels as a straight line, inner corner, outer corner and inner-outer corner, and it can extract pixels of a specific contour type. Moreover, it can trace contour pixels rapidly because it can determine the local minimal path using the contour case. In addition, the proposed algorithm is capable of compressing the data of contour pixels using the representative points and inner-outer corner points, and it can accurately restore the contour image from the data. To compare the performance of the proposed algorithm to that of conventional techniques, we measure their processing time and accuracy. In the experimental results, the proposed algorithm shows better performance compared to the others. Furthermore, it can provide the compressed data of contour pixels and restore them accurately, including the inner-outer corner, which cannot be restored using conventional algorithms.
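For context, the conventional baseline that such pattern-classifying tracers are compared against is the Moore-neighbour contour follower, which fits in a few lines. This sketch is generic and hypothetical, not the proposed algorithm, and its simple stopping rule can terminate early on some pathological shapes:

```python
import numpy as np

# 8-neighbour offsets in clockwise order, starting east
DIRS = [(0, 1), (1, 1), (1, 0), (1, -1), (0, -1), (-1, -1), (-1, 0), (-1, 1)]

def trace_contour(img):
    """Follow the outer contour of the single foreground blob in a
    binary image, returning the list of contour pixels (row, col)."""
    start = tuple(np.argwhere(img)[0])   # topmost-leftmost foreground pixel
    contour = [start]
    p, d = start, 0
    while True:
        for k in range(8):               # clockwise scan of the 8 neighbours,
            nd = (d + 7 + k) % 8         # resuming one step back from last move
            q = (p[0] + DIRS[nd][0], p[1] + DIRS[nd][1])
            if 0 <= q[0] < img.shape[0] and 0 <= q[1] < img.shape[1] and img[q]:
                p, d = q, nd
                break
        if p == start and len(contour) > 1:
            break                        # back at the start pixel: done
        contour.append(p)
        if len(contour) > img.size:      # safety stop for degenerate inputs
            break
    return contour

img = np.zeros((5, 5), dtype=bool)
img[1:4, 1:4] = True                     # a 3 x 3 foreground square
contour = trace_contour(img)
```

The paper's contribution lies in replacing this neighbour scan with a lookup keyed on the previous pixel's classified type, which shortens the search at each step.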
Application of ray-traced tropospheric slant delays to geodetic VLBI analysis
Hofmeister, Armin; Böhm, Johannes
2017-02-01
The correction of tropospheric influences via so-called path delays is critical for the analysis of observations from space geodetic techniques like the very long baseline interferometry (VLBI). In standard VLBI analysis, the a priori slant path delays are determined using the concept of zenith delays, mapping functions and gradients. The a priori use of ray-traced delays, i.e., tropospheric slant path delays determined with the technique of ray-tracing through the meteorological data of numerical weather models (NWM), serves as an alternative way of correcting the influences of the troposphere on the VLBI observations within the analysis. In the presented research, the application of ray-traced delays to the VLBI analysis of sessions in a time span of 16.5 years is investigated. Ray-traced delays have been determined with program RADIATE (see Hofmeister in Ph.D. thesis, Department of Geodesy and Geophysics, Faculty of Mathematics and Geoinformation, Technische Universität Wien. http://resolver.obvsg.at/urn:nbn:at:at-ubtuw:1-3444, 2016) utilizing meteorological data provided by NWM of the European Centre for Medium-Range Weather Forecasts (ECMWF). In comparison with a standard VLBI analysis, which includes the tropospheric gradient estimation, the application of the ray-traced delays to an analysis, which uses the same parameterization except for the a priori slant path delay handling and the used wet mapping factors for the zenith wet delay (ZWD) estimation, improves the baseline length repeatability (BLR) at 55.9% of the baselines at sub-mm level. If no tropospheric gradients are estimated within the compared analyses, 90.6% of all baselines benefit from the application of the ray-traced delays, which leads to an average improvement of the BLR of 1 mm. The effects of the ray-traced delays on the terrestrial reference frame are also investigated. A separate assessment of the RADIATE ray-traced delays is carried out by comparison to the ray-traced delays from the
Farace, Paolo; Righetto, Roberto; Deffet, Sylvain; Meijers, Arturs; Vander Stappen, Francois
2016-01-01
Purpose: To introduce a fast ray-tracing algorithm in pencil proton radiography (PR) with a multilayer ionization chamber (MLIC) for in vivo range error mapping. Methods: Pencil-beam PR was obtained by delivering spots uniformly positioned in a square (45×45 mm² field of view) of 9×9 spots capable
Advancing x-ray scattering metrology using inverse genetic algorithms
Hannon, Adam F.; Sunday, Daniel F.; Windover, Donald; Joseph Kline, R.
2016-07-01
We compare the speed and effectiveness of two genetic optimization algorithms to the results of statistical sampling via a Markov chain Monte Carlo algorithm to find which is the most robust method for determining real-space structure in periodic gratings measured using critical dimension small-angle x-ray scattering. Both a covariance matrix adaptation evolutionary strategy and differential evolution algorithm are implemented and compared using various objective functions. The algorithms and objective functions are used to minimize differences between diffraction simulations and measured diffraction data. These simulations are parameterized with an electron density model known to roughly correspond to the real-space structure of our nanogratings. The study shows that for x-ray scattering data, the covariance matrix adaptation coupled with a mean-absolute error log objective function is the most efficient combination of algorithm and goodness of fit criterion for finding structures with little foreknowledge about the underlying fine scale structure features of the nanograting.
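Of the optimizers compared above, differential evolution is the simplest to sketch. The fragment below is a bare-bones DE/rand/1/bin minimizer run on a toy objective; it omits the CD-SAXS diffraction simulation and the log-based goodness-of-fit criteria entirely, and all names are illustrative:

```python
import numpy as np

def differential_evolution(f, bounds, pop=20, F=0.8, CR=0.9, gens=200, seed=0):
    """Minimal DE/rand/1/bin minimizer of f over box bounds."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, float).T
    d = lo.size
    X = rng.uniform(lo, hi, (pop, d))            # initial population
    fx = np.array([f(x) for x in X])
    for _ in range(gens):
        for i in range(pop):
            # three distinct donors, none equal to the target index
            a, b, c = X[rng.choice([k for k in range(pop) if k != i],
                                   3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            cross = rng.random(d) < CR           # binomial crossover mask
            cross[rng.integers(d)] = True        # guarantee one mutant gene
            trial = np.where(cross, mutant, X[i])
            ft = f(trial)
            if ft <= fx[i]:                      # greedy selection
                X[i], fx[i] = trial, ft
    return X[np.argmin(fx)], fx.min()

# toy objective: a 2-D sphere function in place of the scattering residual
best, val = differential_evolution(lambda x: float((x ** 2).sum()),
                                   [(-5.0, 5.0), (-5.0, 5.0)])
```

In the study's setting, `f` would wrap the diffraction simulation and return the chosen objective (e.g. mean-absolute error of log intensities) against the measured data.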
Allocation of fixed transmission cost based on power flow tracing algorithm
ZHANG Qian; YU Ji-hui
2005-01-01
In the electricity market, charging based on the traditional spot electricity price often results in payment imbalances across the electric network and hinders the development of the power system, so it is necessary to modify the spot price. The key to the modification lies in how to calculate the fixed unit transmission cost of each node, that is, how to allocate the fixed transmission cost to users. To solve this problem, we develop a power flow tracing algorithm to modify the spot price. After studying the fundamental principle of power flow tracing, we put forward a path-searching method based on graph theory and apply it to the downstream tracing algorithm and the upstream tracing algorithm according to the proportional distribution principle. Furthermore, to improve the computational efficiency of the algorithm, we introduce the branch expunction method to optimize the node order. By using the result of power flow tracing to obtain the fixed node transmission cost and introducing it to modify the spot price, we obtain a composite price. The application to a 5-bus system proves the algorithm feasible.
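The proportional distribution principle that underlies such downstream/upstream tracing can be illustrated at a single node: each outflow is assumed to draw from the node's inflows in proportion to each source's share of the total through-flow. A minimal sketch (function and variable names are ours, not the paper's):

```python
def proportional_share(inflows, outflows):
    """Split each outflow among the inflow sources in proportion to
    each source's share of the node's total through-flow (the
    proportional sharing assumption; single-node illustration only)."""
    total = sum(inflows.values())
    shares = {}
    for line, p_out in outflows.items():
        shares[line] = {src: p_out * p_in / total
                        for src, p_in in inflows.items()}
    return shares

# Node fed 60 MW traced to generator G1 and 40 MW traced to G2,
# delivering 70 MW on line C and 30 MW on line D.
shares = proportional_share({"G1": 60.0, "G2": 40.0},
                            {"C": 70.0, "D": 30.0})
```

Under this assumption line C carries 70 × 60/100 = 42 MW attributable to G1, which is the quantity a fixed-cost allocation would then price per source.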
A novel gridding algorithm to create regional trace gas maps from satellite observations
G. Kuhlmann
2014-02-01
Full Text Available The recent increase in spatial resolution for satellite instruments has made it feasible to study distributions of trace gas column densities on a regional scale. For this application a new gridding algorithm was developed to map measurements from the instrument's frame of reference (level 2) onto a longitude–latitude grid (level 3). The algorithm is designed for the Ozone Monitoring Instrument (OMI) and can easily be employed for similar instruments – for example, the upcoming TROPOspheric Monitoring Instrument (TROPOMI). Trace gas distributions are reconstructed by a continuous parabolic spline surface. The algorithm explicitly considers the spatially varying sensitivity of the sensor resulting from the instrument function. At the swath edge, the inverse problem of computing the spline coefficients is very sensitive to measurement errors and is regularised by a second-order difference matrix. Since this regularisation corresponds to the penalty term for smoothing splines, it similarly attenuates the effect of measurement noise over the entire swath width. Monte Carlo simulations are conducted to study the performance of the algorithm for different distributions of trace gas column densities. The optimal weight of the penalty term is found to be proportional to the measurement uncertainty and the width of the instrument function. A comparison with an established gridding algorithm shows improved performance for small to moderate measurement errors due to better parametrisation of the distribution. The resulting maps are smoother and extreme values are more accurately reconstructed. The performance improvement is further illustrated with high-resolution distributions obtained from a regional chemistry model. The new algorithm is applied to tropospheric NO2 column densities measured by OMI. Examples of regional NO2 maps are shown for densely populated areas in China, Europe and the United States of America. This work demonstrates that the newly
A novel gridding algorithm to create regional trace gas maps from satellite observations
G. Kuhlmann
2013-08-01
Full Text Available The recent increase in spatial resolution of satellite instruments has made it feasible to study distributions of trace gas column densities on a regional scale. For this application a new gridding algorithm was developed to map measurements from the instrument's frame of reference (Level 2) onto a longitude-latitude grid (Level 3). The algorithm is designed for the Ozone Monitoring Instrument (OMI) and can be employed easily for similar instruments, for example the upcoming TROPOspheric Monitoring Instrument (TROPOMI). Trace gas distributions are reconstructed by a continuous parabolic spline surface. The algorithm explicitly considers the spatially varying sensitivity of the sensor resulting from the instrument function. At the swath edge, the inverse problem of computing the spline coefficients is very sensitive to measurement errors and is regularised by a second-order difference matrix. Since this regularisation corresponds to the penalty term for smoothing splines, it similarly attenuates the effect of measurement noise over the entire swath width. Monte Carlo simulations are conducted to study the performance of the algorithm for different distributions of trace gas column densities. The optimal weight of the penalty term is found to be proportional to the measurement uncertainty and the width of the instrument function. A comparison with an established gridding algorithm shows improved performance for small to moderate measurement errors due to better parametrization of the distribution. The resulting maps are smoother and extreme values are more accurately reconstructed. The performance improvement is further illustrated with high-resolution distributions obtained from a regional chemistry model. The new algorithm is applied to tropospheric NO2 column densities measured by OMI. Examples of regional NO2 maps are shown for densely populated areas in China, Europe and the United States of America. This work demonstrates that the newly developed
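The second-order difference penalty described above is the discrete analogue of the smoothing-spline penalty: it damps curvature in the reconstruction in proportion to a weight lam. A one-dimensional sketch of this kind of regularisation, solved here by plain gradient descent (all names and parameter values are illustrative, not the paper's):

```python
def smooth_second_diff(y, lam=1.0, iters=2000, lr=0.02):
    """Penalised least squares: minimise
        sum_i (x_i - y_i)^2 + lam * sum_i (x_{i-1} - 2 x_i + x_{i+1})^2
    by gradient descent. The second term is the discrete second-order
    difference (smoothing-spline) penalty; illustrative only."""
    n = len(y)
    x = list(y)
    for _ in range(iters):
        # gradient of the data-fidelity term
        g = [2.0 * (x[i] - y[i]) for i in range(n)]
        # add the gradient of the second-difference penalty
        for i in range(1, n - 1):
            d = x[i - 1] - 2.0 * x[i] + x[i + 1]
            g[i - 1] += 2.0 * lam * d
            g[i] -= 4.0 * lam * d
            g[i + 1] += 2.0 * lam * d
        x = [x[i] - lr * g[i] for i in range(n)]
    return x

# A noisy zigzag is pulled toward a smooth curve.
y = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 0.0]
x = smooth_second_diff(y)
```

The weight lam plays the role of the penalty weight the abstract discusses: larger values trade fidelity to the noisy measurements for smoothness.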
Design of indoor WLANs: Combination of a ray-tracing tool with the BPSO method
Moreno Delgado, José; Domingo Gracia, Marta; Valle López, Luis; Pérez López, Jesús Ramón; Torres Jiménez, Rafael Pedro; Basterrechea Verdeja, José
2015-01-01
This paper presents an approach that combines a ray tracing tool with a binary version of the particle swarm optimization method (BPSO) for the design of infrastructure mode indoor wireless local area networks (WLAN). The approach uses the power levels of a set of candidate access point (AP) locations obtained with the ray tracing tool at a mesh of potential receiver locations or test points to allow the BPSO optimizer to carry out the design of the WLAN. For this purpose, several restriction...
KARAT-LAMBDA - frequency dependent ray-traced troposphere delays for space applications
Hobiger, Thomas; Baron, Philippe
2014-05-01
Space-geodetic microwave techniques work under the assumption that the only dispersive, i.e. frequency dependent, delay contribution is caused by the ionosphere. In general, the refractivity, even for the troposphere, is a complex quantity which can be denoted as N = N0 + (N'(f) + i N''(f)), where N0 is a frequency independent term and N'(f) and N''(f) represent the complex frequency dependence. The imaginary part can be used to derive the loss of energy (absorption), and the real part can be assigned to the changes in the propagation velocity (refraction) and thus describes the delay of an electromagnetic wave which propagates through that medium. Although the frequency dependent delay contribution appears to be small, one has to consider that signals propagate through a few kilometers of troposphere at high elevations and up to hundreds of kilometers at low elevations. Therefore, the Kashima Ray-Tracing package (Hobiger et al., 2008) has been modified (and named KARAT-LAMBDA) to enable the consideration of a frequency dependent refractivity. Using this tool, it was studied whether and to what extent future space geodetic instruments are affected by dispersive troposphere delays. Moreover, a semi-empirical correction model for the microwave link of the Atomic Clock Ensemble in Space (ACES) has been developed, based on ray-tracing calculations with KARAT-LAMBDA. The proposed model (Hobiger et al., 2013) has been tested with simulated ISS overflights at different potential ACES ground station sites, and it could be demonstrated that this model is capable of removing biases and elevation dependent features caused by the dispersive troposphere delay difference between the up-link and down-link. References: T. Hobiger, R. Ichikawa, T. Kondo, and Y. Koyama (2008), Fast and accurate ray-tracing algorithms for real-time space geodetic applications using numerical weather models, Journal of Geophysical Research, vol. 113, iss. D203027, pp. 1-14. T. Hobiger, D
Ray-Tracing studies in a perturbed atmosphere I- The initial value problem
Tannous, C
2001-01-01
We report the development of a new ray-tracing simulation tool with the potential to fully characterize a radio link through the accurate study of the propagation path of the signal from the transmitting to the receiving antennas across a perturbed atmosphere. The ray-tracing equations are solved, with controlled accuracy, in three dimensions (3D), and the propagation characteristics are obtained using various refractive index models. The launching of the rays, the atmospheric medium and its disturbances are characterized in 3D. The novelty of the approach stems from the use of special numerical techniques for dealing with so-called stiff differential equations, without which no solution of the ray-tracing equations is possible. Starting with a given launching angle, the solution consists of the ray trajectory, the propagation time information at each point of the path, the beam spreading, the transmitted (resp. received) power taking account of the radiation pattern and orientation of the antennas and ...
Refined ray tracing inside single- and double-curvatured concave surfaces
Choudhury, Balamati
2016-01-01
This book describes the ray tracing effects inside different quadric surfaces. Analytical surface modeling is an a priori requirement for electromagnetic (EM) analysis over aerospace platforms. Although numerically-specified surfaces and even non-uniform rational basis spline (NURBS) surfaces can be used for modeling such surfaces, for most practical EM applications it is sufficient to model them as quadric surface patches and hybrids thereof. It is therefore apparent that a vast majority of aerospace bodies can be conveniently modeled as combinations of simpler quadric surfaces, i.e. hybrids of quadric cylinders and quadric surfaces of revolution. Hence the analysis of geometric ray tracing inside such surfaces is a prerequisite to analyzing the RF build-up. The book covers the ray tracing effects inside different quadric surfaces such as the right circular cylinder, the general paraboloid of revolution (GPOR), and GPOR frustums of different shaping parameters, together with the corresponding visualization of the ray-path details. Finally ray tracin...
ENZO+MORAY: radiation hydrodynamics adaptive mesh refinement simulations with adaptive ray tracing
Wise, John H.; Abel, Tom
2011-07-01
We describe a photon-conserving radiative transfer algorithm, using a spatially-adaptive ray-tracing scheme, and its parallel implementation into the adaptive mesh refinement cosmological hydrodynamics code ENZO. By coupling the solver with the energy equation and non-equilibrium chemistry network, our radiation hydrodynamics framework can be utilized to study a broad range of astrophysical problems, such as stellar and black hole feedback. Inaccuracies can arise from large time-steps and poor sampling; therefore, we devised an adaptive time-stepping scheme and a fast approximation of the optically-thin radiation field with multiple sources. We test the method with several radiative transfer and radiation hydrodynamics tests that are given in Iliev et al. We further test our method with more dynamical situations, for example, the propagation of an ionization front through a Rayleigh-Taylor instability, time-varying luminosities and collimated radiation. The test suite also includes an expanding H II region in a magnetized medium, utilizing the newly implemented magnetohydrodynamics module in ENZO. This method linearly scales with the number of point sources and number of grid cells. Our implementation is scalable to 512 processors on distributed memory machines and can include the radiation pressure and secondary ionizations from X-ray radiation. It is included in the newest public release of ENZO.
Kizhner, Semion; Hunter, Stanley D.; Hanu, Andrei R.; Sheets, Teresa B.
2016-01-01
Richard O. Duda and Peter E. Hart of Stanford Research Institute in [1] described the recurring problem in computer image processing of detecting straight lines in digitized images. The problem is to detect the presence of groups of collinear or almost collinear figure points. It is clear that the problem can be solved to any desired degree of accuracy by testing the lines formed by all pairs of points. However, the computation required for an image of n = N×M points is approximately proportional to n², i.e. O(n²), becoming prohibitive for large images or when the data processing cadence time is in milliseconds. Rosenfeld in [2] described an ingenious method due to Hough [3] for replacing the original problem of finding collinear points by a mathematically equivalent problem of finding concurrent lines. This method involves transforming each of the figure points into a straight line in a parameter space. Hough chose to use the familiar slope-intercept parameters, and thus his parameter space was the two-dimensional slope-intercept plane. A parallel Hough transform running on multi-core processors was elaborated in [4]. There are many other proposed methods of solving similar problems, such as the sampling-up-the-ramp algorithm (SUTR) [5] and algorithms involving artificial swarm intelligence techniques [6]. However, all state-of-the-art algorithms lack real-time performance. Namely, they are slow for large images that require a performance cadence of a few dozen milliseconds (50 ms). This problem arises in spaceflight applications such as near real-time analysis of gamma-ray measurements contaminated by an overwhelming amount of cosmic ray (CR) traces. Future spaceflight instruments such as the Advanced Energetic Pair Telescope (AdEPT) [7-9] for cosmic gamma-ray survey employ large detector readout planes registering multitudes of cosmic ray interference events and sparse science gamma-ray event traces' projections. The AdEPT science of interest is in the
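The point-to-line voting scheme described above can be sketched with the now-standard angle-radius (theta, rho) parameterisation rather than Hough's original slope-intercept one: each figure point votes for every line through it, and collinear points concentrate their votes in one accumulator cell. A toy implementation, not the real-time algorithm the text calls for:

```python
import math

def hough_peak(points, n_theta=180, rho_res=1.0):
    """Vote in (theta, rho) space using x*cos(theta) + y*sin(theta) = rho
    and return the best-supported line. Toy dict-based accumulator,
    O(len(points) * n_theta) work."""
    acc = {}
    for x, y in points:
        for t in range(n_theta):
            theta = math.pi * t / n_theta
            rho_bin = round((x * math.cos(theta) + y * math.sin(theta)) / rho_res)
            key = (t, rho_bin)
            acc[key] = acc.get(key, 0) + 1
    (t_best, rho_bin_best), votes = max(acc.items(), key=lambda kv: kv[1])
    return math.pi * t_best / n_theta, rho_bin_best * rho_res, votes

# Ten points on the line y = x (theta = 3*pi/4, rho = 0) plus one outlier:
# the accumulator peak recovers the line despite the outlier.
pts = [(float(i), float(i)) for i in range(10)] + [(0.0, 5.0)]
theta, rho, votes = hough_peak(pts)
```

A real detector would use a dense 2D array instead of a dict and non-maximum suppression over the accumulator, but the voting structure is the same.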
X-ray Pulsar Navigation Algorithms and Testbed for SEXTANT
Winternitz, Luke M. B.; Hasouneh, Monther A.; Mitchell, Jason W.; Valdez, Jennifer E.; Price, Samuel R.; Semper, Sean R.; Yu, Wayne H.; Ray, Paul S.; Wood, Kent S.; Arzoumanian, Zaven; Grendreau, Keith C.
2015-01-01
The Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) is a NASA-funded technology demonstration. SEXTANT will, for the first time, demonstrate real-time, on-board X-ray Pulsar-based Navigation (XNAV), a significant milestone in the quest to establish a GPS-like navigation capability available throughout our Solar System and beyond. This paper describes the basic design of the SEXTANT system with a focus on core models and algorithms, and the design and continued development of the GSFC X-ray Navigation Laboratory Testbed (GXLT) with its dynamic pulsar emulation capability. We also present early results from GXLT modeling of the combined NICER X-ray timing instrument hardware and SEXTANT flight software algorithms.
CUDA-Accelerated Geodesic Ray-Tracing for Fiber Tracking.
van Aart, Evert; Sepasian, Neda; Jalba, Andrei; Vilanova, Anna
2011-01-01
Diffusion Tensor Imaging (DTI) makes it possible to noninvasively measure the diffusion of water in fibrous tissue. By reconstructing the fibers from DTI data using a fiber-tracking algorithm, we can deduce the structure of the tissue. In this paper, we outline an approach to accelerating such a fiber-tracking algorithm using a Graphics Processing Unit (GPU). This algorithm, which is based on the calculation of geodesics, has shown promising results for both synthetic and real data, but is limited in its applicability by its high computational requirements. We present a solution which uses the parallelism offered by modern GPUs, in combination with the CUDA platform by NVIDIA, to significantly reduce the execution time of the fiber-tracking algorithm. Compared to a multithreaded CPU implementation of the same algorithm, our GPU mapping achieves a speedup factor of up to 40 times.
The Gaussian Laser Angular Distribution in HYDRA's 3D Laser Ray Trace Package
Sepke, Scott M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
2017-04-10
In this note, the angular distribution of rays launched by the 3D LZR ray trace package is derived for Gaussian beams (npower = 2) with bm model = ±3. Beams with bm model = +3 have a nearly flat distribution, and beams with bm model = -3 have a nearly linear distribution when the spot size is large compared to the wavelength.
Skew ray tracing in a step-index optical fiber using Geometric Algebra
Ang, Angeleene; McNamara, Daniel J
2015-01-01
We used Geometric Algebra to compute the paths of skew rays in a cylindrical, step-index multimode optical fiber. To do this, we used the vector addition form for the law of propagation, the exponential of an imaginary vector form for the law of refraction, and the juxtaposed vector product form for the law of reflection. In particular, the exponential forms of the vector rotations enable us to take advantage of the addition or subtraction of exponential arguments of two rotated vectors in the derivation of the ray tracing invariants in cylindrical and spherical coordinates. We showed that the light rays inside the optical fiber trace a polygonal helical path characterized by three invariants that relate successive reflections inside the fiber: the ray path distance, the difference in axial distances, and the difference in the azimuthal angles. We also rederived the known generalized formula for the numerical aperture for skew rays, which simplifies to the standard form for meridional rays.
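The azimuthal-step invariant mentioned above can be checked numerically with ordinary vector algebra instead of Geometric Algebra: reflecting the ray direction about the wall normal at each bounce, the azimuthal angle between successive wall hits stays constant. A minimal sketch for an ideal cylindrical fiber (our own toy code, not the authors' formulation):

```python
import math

def reflect_in_fiber(p, d, radius, n_bounce):
    """Trace a skew ray inside an ideal cylinder (axis = z), reflecting
    at the wall via d' = d - 2 (d.n) n. Returns the azimuthal angles of
    successive wall hits; their differences should be an invariant."""
    hits = []
    for _ in range(n_bounce):
        # intersect (px + t dx)^2 + (py + t dy)^2 = radius^2, take t > 0
        a = d[0] ** 2 + d[1] ** 2
        b = 2.0 * (p[0] * d[0] + p[1] * d[1])
        c = p[0] ** 2 + p[1] ** 2 - radius ** 2
        t = (-b + math.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
        p = [p[i] + t * d[i] for i in range(3)]
        hits.append(math.atan2(p[1], p[0]))
        n = [p[0] / radius, p[1] / radius, 0.0]  # outward wall normal
        dn = d[0] * n[0] + d[1] * n[1]           # normal has no z part
        d = [d[i] - 2.0 * dn * n[i] for i in range(3)]
    return hits

# A skew launch: the projection onto the cross-section is a chord
# polygon, so the azimuthal step repeats exactly.
hits = reflect_in_fiber([0.5, 0.0, 0.0], [0.1, 0.6, 0.7], 1.0, 6)
```

Because the normal is purely radial, the axial direction component is preserved at every bounce, which is why the axial-distance difference per bounce is also invariant.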
X-ray simulation algorithms used in ISP
Sullivan, John P. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2016-07-29
ISP is a simulation code which is sometimes used in the USNDS program. ISP is maintained by Sandia National Lab. However, the X-ray simulation algorithm used by ISP was written by scientists at LANL – mainly by Ed Fenimore, with some contributions from John Sullivan, George Neuschaefer and probably others. In an email to John Sullivan on July 25, 2016, Jill Rivera, ISP project lead, said "ISP uses the function xdosemeters_sim from the xgen library." This is a Fortran subroutine which is also used to simulate the X-ray response in consim (a descendant of xgen). Therefore, no separate documentation of the X-ray simulation algorithms in ISP has been written – the documentation for the consim simulation can be used.
Khare, Kshitij; 10.1214/11-AOS916
2012-01-01
The data augmentation (DA) algorithm is a widely used Markov chain Monte Carlo algorithm that is easy to implement but often suffers from slow convergence. The sandwich algorithm is an alternative that can converge much faster while requiring roughly the same computational effort per iteration. Theoretically, the sandwich algorithm always converges at least as fast as the corresponding DA algorithm in the sense that $\\Vert {K^*}\\Vert \\le \\Vert {K}\\Vert$, where $K$ and $K^*$ are the Markov operators associated with the DA and sandwich algorithms, respectively, and $\\Vert\\cdot\\Vert$ denotes operator norm. In this paper, a substantial refinement of this operator norm inequality is developed. In particular, under regularity conditions implying that $K$ is a trace-class operator, it is shown that $K^*$ is also a positive, trace-class operator, and that the spectrum of $K^*$ dominates that of $K$ in the sense that the ordered elements of the former are all less than or equal to the corresponding elements of the lat...
SolTrace: A Ray-Tracing Code for Complex Solar Optical Systems
Wendelin, Tim [National Renewable Energy Lab. (NREL), Golden, CO (United States); Dobos, Aron [National Renewable Energy Lab. (NREL), Golden, CO (United States); Lewandowski, Allan [Allan Lewandowski Solar Consulting LLC, Evergreen, CO (United States)
2013-10-01
SolTrace is an optical simulation tool designed to model optical systems used in concentrating solar power (CSP) applications. The code was first written in early 2003, but has seen significant modifications and changes since its inception, including conversion from a Pascal-based software development platform to C++. SolTrace is unique in that it can model virtually any optical system utilizing the sun as the source. It has been made available for free and as such is in use worldwide by industry, universities, and research laboratories. The fundamental design of the code is discussed, including enhancements and improvements over the earlier version. Comparisons are made with other optical modeling tools, both non-commercial and commercial in nature. Finally, modeled results are shown for some typical CSP systems and, in one case, compared to measured optical data.
F2-Ray: A new algorithm for efficient transport of ionizing radiation
Mao, Yi; Zhang, J.; Wandelt, B. D.; Shapiro, P. R.; Iliev, I. T.
2014-04-01
We present a new algorithm for the 3D transport of ionizing radiation, called F2-Ray (Fast Fourier Ray-tracing method). The transfer of ionizing radiation with a long mean free path in diffuse intergalactic gas poses a special challenge to standard numerical methods, which transport the radiation in position space. Standard methods usually trace each individual ray until it is fully absorbed by the intervening gas. If the mean free path is long, the computational cost and memory load are likely to be prohibitive. We have developed an algorithm that overcomes these limitations and is, therefore, significantly more efficient. The method calculates the transfer of radiation collectively, using the Fast Fourier Transform to convert radiation between position and Fourier space, so the computational cost will not increase with the number of ionizing sources. The method also automatically combines parallel rays with the same frequency at the same grid cell, thereby minimizing the memory requirement. The method is explicitly photon-conserving, i.e. the depletion of ionizing photons is guaranteed to equal the photoionizations they caused, and explicitly obeys the periodic boundary condition, i.e. the escape of ionizing photons from one side of a simulation volume is guaranteed to be compensated by emitting the same amount of photons into the volume through the opposite side. Together, these features make it possible to numerically simulate the transfer of ionizing photons more efficiently than previous methods. Since ionizing radiation such as X-rays is responsible for heating the intergalactic gas when the first stars and quasars form at high redshifts, our method can be applied to simulate the thermal distribution, in addition to cosmic reionization, in a three-dimensional inhomogeneous cosmological density field.
Optimal Search Mechanism Analysis of Light Ray Optimization Algorithm
Jihong SHEN; Jialian LI; Bin WEI
2012-01-01
Based on Fermat's principle and the automatic optimization mechanism in the propagation process of light, an optimal searching algorithm named light ray optimization is presented, where the laws of refraction and reflection of light rays are integrated into the searching process of the optimization. In this algorithm, coordinate space is assumed to be a space filled with media of different refractivities, the space is divided by grids, and the searching path is taken to be the propagation path of light rays. With the law of refraction, the search direction is deflected toward the direction that decreases the value of the objective function. With the law of reflection, the search direction is changed, which allows the search to continue when it cannot proceed by refraction. Only the function values of the objective problems are used and there are no artificial rules in light ray optimization, so it is simple and easy to realize. Theoretical analysis and the results of numerical experiments show that the algorithm is feasible and effective.
Optimizing detector geometry for trace element mapping by X-ray fluorescence.
Sun, Yue; Gleber, Sophie-Charlotte; Jacobsen, Chris; Kirz, Janos; Vogt, Stefan
2015-05-01
Trace metals play critical roles in a variety of systems, ranging from cells to photovoltaics. X-Ray Fluorescence (XRF) microscopy using X-ray excitation provides one of the highest sensitivities available for imaging the distribution of trace metals at sub-100 nm resolution. With the growing availability and increasing performance of synchrotron light source based instruments and X-ray nanofocusing optics, and with improvements in energy-dispersive XRF detectors, what are the factors that limit trace element detectability? To address this question, we describe an analytical model for the total signal incident on XRF detectors with various geometries, including the spectral response of energy dispersive detectors. This model agrees well with experimentally recorded X-ray fluorescence spectra, and involves much shorter calculation times than with Monte Carlo simulations. With such a model, one can estimate the signal when a trace element is illuminated with an X-ray beam, and when just the surrounding non-fluorescent material is illuminated. From this signal difference, a contrast parameter can be calculated and this can in turn be used to calculate the signal-to-noise ratio (S/N) for detecting a certain elemental concentration. We apply this model to the detection of trace amounts of zinc in biological materials, and to the detection of small quantities of arsenic in semiconductors. We conclude that increased detector collection solid angle is (nearly) always advantageous even when considering the scattered signal. However, given the choice between a smaller detector at 90° to the beam versus a larger detector at 180° (in a backscatter-like geometry), the 90° detector is better for trace element detection in thick samples, while the larger detector in 180° geometry is better suited to trace element detection in thin samples.
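The conclusion about collection solid angle can be seen in a toy Poisson-counting model: if both the fluorescence signal and the background scale with the detector's fractional solid angle, then S/N grows as the square root of that solid angle, so a larger detector (nearly) always helps. A simplified sketch (this is not the paper's analytical detector model; the rates and names below are invented for illustration):

```python
import math

def detection_snr(sig_rate, bkg_rate, t, solid_angle_frac):
    """Signal-to-noise for a trace-element fluorescence peak in a toy
    model: counts scale with the detector's fractional solid angle, and
    the noise is Poisson on (signal + background) counts."""
    s = sig_rate * t * solid_angle_frac   # signal counts
    b = bkg_rate * t * solid_angle_frac   # scattered/background counts
    return s / math.sqrt(s + b)

# Doubling the collection solid angle improves S/N by sqrt(2),
# even though the scattered background doubles as well.
snr_small = detection_snr(10.0, 100.0, 1.0, 0.1)
snr_large = detection_snr(10.0, 100.0, 1.0, 0.2)
```

The paper's actual model additionally accounts for detector geometry (90 degrees vs 180 degrees), sample thickness and the spectral response, which is where the thin-sample vs thick-sample distinction comes from.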
Ray tracing optical analysis of offset solar collector for Space Station solar dynamic system
Jefferies, Kent S.
1988-01-01
OFFSET, a detailed ray tracing computer code, was developed at NASA Lewis Research Center to model the offset solar collector for the Space Station solar dynamic electric power system. This model traces rays from 50 points on the face of the sun to 10 points on each of the 456 collector facets. The triangular facets are modeled with spherical, parabolic, or toroidal reflective surface contour and surface slope errors. The rays are then traced through the receiver aperture to the walls of the receiver. Images of the collector and of the sun within the receiver produced by this code provide insight into the collector receiver interface. Flux distribution on the receiver walls, plotted by this code, is improved by a combination of changes to aperture location and receiver tilt angle. Power loss by spillage at the receiver aperture is computed and is considerably reduced by using toroidal facets.
Robust Image Denoising using a Virtual Flash Image for Monte Carlo Ray Tracing
Moon, Bochang; Jun, Jong Yun; Lee, JongHyeob
2013-01-01
We propose an efficient and robust image-space denoising method for noisy images generated by Monte Carlo ray tracing methods. Our method is based on two new concepts: virtual flash images and homogeneous pixels. Inspired by recent developments in flash photography, virtual flash images emulate ... values. While denoising each pixel, we consider only homogeneous pixels, i.e., pixels that are statistically equivalent to each other. This makes it possible to define a stochastic error bound of our method, and this bound goes to zero as the number of ray samples goes to infinity, irrespective of denoising ... parameters. To highlight the benefits of our method, we apply our method to two Monte Carlo ray tracing methods, photon mapping and path tracing, with various input scenes. We demonstrate that using virtual flash images and homogeneous pixels with a standard denoising method outperforms state-of-the-art ...
Reflection formulae for ray tracing in uniaxial anisotropic media using Huygens's principle.
Alemán-Castañeda, Luis A; Rosete-Aguilar, Martha
2016-11-01
Ray tracing in uniaxial anisotropic materials is important because they are widely used for instrumentation, liquid-crystal displays, laser cavities, and quantum experiments. There are previous works regarding ray tracing refraction and reflection formulae using the common electromagnetic theory approach, but only the refraction formulae have been deduced using Huygens's principle. In this paper we obtain the reflection expressions using this unconventional approach with a specific coordinate system in which both refraction and reflection formulae are simplified as well as their deduction. We compute some numerical examples to compare them with the common expressions obtained using electromagnetic theory.
A boundary integral formalism for stochastic ray tracing in billiards
Chappell, David J. [School of Science and Technology, Nottingham Trent University, Clifton Campus, Nottingham NG11 8NS (United Kingdom); Tanner, Gregor [School of Mathematical Sciences, University of Nottingham, University Park, Nottingham NG7 2RD (United Kingdom)
2014-12-15
Determining the flow of rays or non-interacting particles driven by a force or velocity field is fundamental to modelling many physical processes. These include particle flows arising in fluid mechanics and ray flows arising in the geometrical optics limit of linear wave equations. In many practical applications, the driving field is not known exactly and the dynamics are determined only up to a degree of uncertainty. This paper presents a boundary integral framework for propagating flows including uncertainties, which is shown to systematically interpolate between a deterministic and a completely random description of the trajectory propagation. A simple but efficient discretisation approach is applied to model uncertain billiard dynamics in an integrable rectangular domain.
Magnetospheric Whistler Mode Ray Tracing with the Inclusion of Finite Electron and Ion Temperature
Maxworth, A. S.; Golkowski, M.
2015-12-01
Ray tracing is an important technique for the study of whistler mode wave propagation in the Earth's magnetosphere. In numerical ray tracing, the trajectory of a wave packet is calculated at each point in space by solving the Haselgrove equations, assuming a smooth, loss-less medium with no mode coupling. Previous work on ray tracing has assumed a cold plasma environment with negligible electron and ion temperatures. In this work we present magnetospheric whistler mode wave ray tracing results with the inclusion of finite ion and electron temperature. The inclusion of finite temperature effects makes the fourth-order dispersion relation become sixth order. We compare our results with the work done by previous researchers for cold plasma environments, using two near-Earth space models (NGO and GCPM). Inclusion of finite temperature closes the otherwise open refractive index surface near the lower hybrid resonance frequency and affects the magnetospheric reflection of whistler waves. We also assess the main changes in the ray trajectory and the implications for cyclotron resonance wave-particle interactions, including energetic particle precipitation.
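The structure of such numerical ray tracing can be sketched, in a greatly simplified isotropic cold-medium setting, by integrating the geometric-optics ray equations dr/ds = p/n, dp/ds = grad n with classical RK4. The real Haselgrove system works in a magnetised plasma with an anisotropic, frequency-dependent refractive index; everything below, including the index profile, is an illustrative stand-in:

```python
def trace_ray(r0, p0, n_func, grad_n, ds, steps):
    """Integrate the geometric-optics ray equations
        dr/ds = p / n,    dp/ds = grad n    (p = n * unit tangent)
    with classical fourth-order Runge-Kutta. Isotropic toy analogue
    of Haselgrove-style ray tracing, 2D for brevity."""
    def deriv(state):
        x, y, px, py = state
        n = n_func(x, y)
        gx, gy = grad_n(x, y)
        return (px / n, py / n, gx, gy)

    state = (r0[0], r0[1], p0[0], p0[1])
    for _ in range(steps):
        k1 = deriv(state)
        k2 = deriv(tuple(s + 0.5 * ds * k for s, k in zip(state, k1)))
        k3 = deriv(tuple(s + 0.5 * ds * k for s, k in zip(state, k2)))
        k4 = deriv(tuple(s + ds * k for s, k in zip(state, k3)))
        state = tuple(s + ds / 6.0 * (a + 2 * b + 2 * c + d)
                      for s, a, b, c, d in zip(state, k1, k2, k3, k4))
    return state

# Toy medium whose refractive index increases upward: a ray launched
# horizontally bends toward the higher-index region (upward).
n = lambda x, y: 1.0 + 0.1 * y
gn = lambda x, y: (0.0, 0.1)
x, y, px, py = trace_ray((0.0, 0.0), (1.0, 0.0), n, gn, 0.01, 1000)
```

In a magnetospheric code the scalar n would be replaced by the anisotropic dispersion relation, and the right-hand side by the full Haselgrove equations, but the RK4 stepping loop is the same.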
Fast and robust ray casting algorithms for virtual X-ray imaging
Freud, N.; Duvauchelle, P.; Létang, J. M.; Babot, D.
2006-07-01
Deterministic calculations based on ray casting techniques are known as a powerful alternative to the Monte Carlo approach to simulate X- or γ-ray imaging modalities (e.g. digital radiography and computed tomography), whenever computation time is a critical issue. One of the key components, from the viewpoint of computing resource expense, is the algorithm which determines the path length travelled by each ray through complex 3D objects. This issue has given rise to intensive research in the field of 3D rendering (in the visible light domain) during the last decades. The present work proposes algorithmic solutions adapted from state-of-the-art computer graphics to carry out ray casting in X-ray imaging configurations. This work provides an algorithmic basis to simulate direct transmission of X-rays, as well as scattering and secondary emission of radiation. Emphasis is laid on the speed and robustness issues. Computation times are given in a typical case of radiography simulation.
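A central primitive in such path-length algorithms is the ray/box intersection. The sketch below is a generic illustration of the classic "slab" method from computer graphics, not the authors' implementation; the function name and the axis-aligned-box simplification are assumptions for the example.

```python
def ray_box_path_length(origin, direction, box_min, box_max):
    """Length of the segment of a ray inside an axis-aligned box (slab method).

    Returns 0.0 if the ray misses the box. `direction` need not be normalised;
    the result is a geometric length, so the direction is normalised here.
    Rays starting inside the box return the distance to the exit point.
    """
    norm = sum(c * c for c in direction) ** 0.5
    d = [c / norm for c in direction]
    t_near, t_far = float("-inf"), float("inf")
    for o, dc, lo, hi in zip(origin, d, box_min, box_max):
        if abs(dc) < 1e-12:              # ray parallel to this pair of slabs
            if o < lo or o > hi:
                return 0.0               # outside the slab: cannot hit the box
        else:
            t0, t1 = (lo - o) / dc, (hi - o) / dc
            if t0 > t1:
                t0, t1 = t1, t0
            t_near, t_far = max(t_near, t0), min(t_far, t1)
    if t_far < max(t_near, 0.0):
        return 0.0                       # slab intervals do not overlap: miss
    return t_far - max(t_near, 0.0)
```

For a voxelised object, summing this quantity over the voxels a ray touches yields the total path length that attenuation calculations need.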
Fast and robust ray casting algorithms for virtual X-ray imaging
Freud, N. [CNDRI, Laboratory of Nondestructive Testing Using Ionizing Radiations, INSA-Lyon Scientific and Technical University, Bat. Antoine de Saint-Exupery, 20, Avenue Albert Einstein, 69621 Villeurbanne Cedex (France)]. E-mail: Nicolas.Freud@insa-lyon.fr; Duvauchelle, P. [CNDRI, Laboratory of Nondestructive Testing Using Ionizing Radiations, INSA-Lyon Scientific and Technical University, Bat. Antoine de Saint-Exupery, 20, Avenue Albert Einstein, 69621 Villeurbanne Cedex (France); Letang, J.M. [CNDRI, Laboratory of Nondestructive Testing Using Ionizing Radiations, INSA-Lyon Scientific and Technical University, Bat. Antoine de Saint-Exupery, 20, Avenue Albert Einstein, 69621 Villeurbanne Cedex (France); Babot, D. [CNDRI, Laboratory of Nondestructive Testing Using Ionizing Radiations, INSA-Lyon Scientific and Technical University, Bat. Antoine de Saint-Exupery, 20, Avenue Albert Einstein, 69621 Villeurbanne Cedex (France)
2006-07-15
Deterministic calculations based on ray casting techniques are known as a powerful alternative to the Monte Carlo approach to simulate X- or γ-ray imaging modalities (e.g. digital radiography and computed tomography), whenever computation time is a critical issue. One of the key components, from the viewpoint of computing resource expense, is the algorithm which determines the path length travelled by each ray through complex 3D objects. This issue has given rise to intensive research in the field of 3D rendering (in the visible light domain) during the last decades. The present work proposes algorithmic solutions adapted from state-of-the-art computer graphics to carry out ray casting in X-ray imaging configurations. This work provides an algorithmic basis to simulate direct transmission of X-rays, as well as scattering and secondary emission of radiation. Emphasis is laid on the speed and robustness issues. Computation times are given in a typical case of radiography simulation.
Odyssey: Ray tracing and radiative transfer in Kerr spacetime
Pu, Hung-Yi; Yun, Kiyun; Younsi, Ziri; Yoon, Suk-Jin
2016-01-01
Odyssey is a GPU-based General Relativistic Radiative Transfer (GRRT) code for computing images and/or spectra in the Kerr metric, which describes the spacetime around a rotating black hole. Odyssey is implemented in CUDA C/C++. For flexibility, the C++ namespace structure is used to separate different tasks; the two default tasks presented in the source code are the redshift of a Keplerian disk and the image of a Keplerian rotating shell at 340 GHz. Odyssey_Edu, an educational software package built on Odyssey for visualizing ray trajectories in the Kerr spacetime, is also available.
Ray-tracing simulations of spherical Johann diffraction spectrometer for in-beam X-ray experiments
Jagodziński, P., E-mail: jagodzin@tu.kielce.pl [Department of Physics, Kielce University of Technology, Tysiaclecia PP 7, 25-314 Kielce (Poland); Pajek, M.; Banaś, D. [Institute of Physics, Jan Kochanowski University, Świȩtokrzyska 15, 25-406 Kielce (Poland); Beyer, H.F. [GSI Helmholtzzentrum für Schwerionenforschung, D-64291 Darmstadt (Germany); Trassinelli, M. [Institut des NanoSciences de Paris, Université Pierre et Marie Curie, 4 Place Jussieu, 75015 Paris (France); Stoehlker, Th. [GSI Helmholtzzentrum für Schwerionenforschung, D-64291 Darmstadt (Germany); Helmholtz-Insitut Jena, D-07743 Jena (Germany); Institut für Optic und Quantenelektronik, Friedrich-Schiller-Universität Jena, D-07743 Jena (Germany)
2014-07-01
The results of Monte Carlo ray-tracing simulations for a Johann-type Bragg spectrometer with a spherically curved crystal, designed to detect X-rays from a fast-moving source, are reported. These calculations were performed to optimize the X-ray spectrometer to be used at the gas target installed at an ion storage ring for high-resolution X-ray experiments. In particular, the two-dimensional distributions of detected photons were studied using the Monte Carlo method for both stationary and moving X-ray sources, taking into account a detailed description of the X-ray source and of X-ray diffraction on the crystal, as well as the role of the Doppler effect in in-beam experiments. The origin of the asymmetry of the observed X-ray profiles is discussed in detail, and a procedure to derive precise (sub-eV) X-ray transition energies from such asymmetric profiles is proposed. The results are important for the investigation of the 1s2p ³P₂ → 1s2s ³S₁ intrashell transition in excited He-like uranium ions in in-beam X-ray experiments.
Seismic ray-tracing calculation based on parabolic travel-time interpolation
周竹生; 张赛民; 陈灵君
2004-01-01
A new seismic ray-tracing method is put forward based on the parabolic travel-time interpolation (PTI) method, which is more accurate than the linear travel-time interpolation (LTI) method. Both PTI and LTI are used to compute seismic travel times and ray paths in a 2-D grid-cell model. First, some basic concepts are introduced: the calculations of travel time and ray path are carried out only at cell boundaries, so the ray path is always straight within a cell of uniform velocity. Both PTI and LTI proceed in two steps: step 1 computes travel times and step 2 traces the ray path. The derivation of the LTI formulas is then described; because refracted waves are present in the shot cell, a dedicated formula for the shot cell is also derived. Finally, the PTI method is presented. Its calculation is more complex than that of LTI, but its error is bounded. Results on numerical models show that PTI traces ray paths more accurately and efficiently than LTI does.
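The LTI step that the abstract contrasts with PTI can be illustrated with a minimal sketch, not the authors' code: travel time varies linearly between two nodes of a cell edge, and the arrival time at a point in the adjacent cell is minimised over the crossing position. The function name and the brute-force scan (the paper uses closed-form minimisation) are illustrative assumptions.

```python
import math

def lti_travel_time(a, b, t_a, t_b, c, velocity, samples=10001):
    """Linear travel-time interpolation (LTI) across one cell edge.

    a, b   : endpoints of the edge, with known travel times t_a, t_b
    c      : receiving point inside the adjacent cell of uniform velocity
    The edge travel time is interpolated linearly, the in-cell ray is
    straight, and the total time is minimised over the crossing position.
    """
    best = float("inf")
    for i in range(samples):
        x = i / (samples - 1)                     # fractional position on edge
        px = a[0] + x * (b[0] - a[0])
        py = a[1] + x * (b[1] - a[1])
        t_edge = t_a + x * (t_b - t_a)            # linearly interpolated time
        dist = math.hypot(c[0] - px, c[1] - py)   # straight segment in the cell
        best = min(best, t_edge + dist / velocity)
    return best
```

With equal node times the minimum sits at the perpendicular foot of the receiving point; a large time at one node pushes the crossing point toward the other node, exactly the behaviour the interpolation formulas capture analytically.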
A boundary integral formalism for stochastic ray tracing in billiards
Chappell, David J
2014-01-01
Determining the flow of rays or particles driven by a force or velocity field is fundamental to modelling many physical processes, including weather forecasting and the simulation of molecular dynamics. High frequency wave energy distributions can also be approximated using flow or transport equations. Applications arise in underwater and room acoustics, vibro-acoustics, seismology, electromagnetics, quantum mechanics and in producing computer generated imagery. In many practical applications, the driving field is not known exactly and the dynamics are determined only up to a degree of uncertainty. This paper presents a boundary integral framework for propagating flows including uncertainties, which is shown to systematically interpolate between a deterministic and a completely random description of the trajectory propagation. A simple but efficient discretisation approach is applied to model uncertain billiard dynamics in an integrable rectangular domain.
A Ray-tracing Method to Analyzing Modulated Planar Fabry-Perot Antennas
Hougs, Mikkel Dahl; Kim, Oleksiy S.; Breinbjerg, Olav
2015-01-01
A new approach for fast modelling of Fabry-Perot antennas with modulated partially reflective surfaces (PRS) using ray-tracing is proposed. For validation of the method, a configuration is introduced which consists of a cavity with a modulated PRS, fed internally by a magnetic dipole. The PRS con...
Emulating Ray-Tracing Channels in Multi-probe Anechoic Chamber Setups for Virtual Drive Testing
Fan, Wei; Llorente, Ines Carton; Kyösti, Pekka
2016-01-01
This paper discusses virtual drive testing (VDT) for multiple-input multiple-output (MIMO) capable terminals in multi-probe anechoic chamber (MPAC) setups. We propose to perform VDT, via reproducing ray tracing (RT) simulated channels with the field synthesis technique. Simulation results demonst...
A Ray-tracing Method to Analyzing Modulated Planar Fabry-Perot Antennas
Hougs, Mikkel Dahl; Kim, Oleksiy S.; Breinbjerg, Olav
2015-01-01
A new approach for fast modelling of Fabry-Perot antennas with modulated partially reflective surfaces (PRS) using ray-tracing is proposed. For validation of the method, a configuration is introduced which consists of a cavity with a modulated PRS, fed internally by a magnetic dipole. The PRS...
Ray-tracing simulations of liquid-crystal gradient-index lenses for three-dimensional displays
Sluijter, M.; Herzog, A.; De Boer, D.K.G.; Krijn, M.P.C.M.; Urbach, P.H.
2009-01-01
For the first time, to our knowledge, we report ray-tracing simulations of an advanced liquid-crystal gradient-index lens structure for application in switchable two-dimensional/three-dimensional (3D) autostereoscopic displays. We present ray-tracing simulations of the angular-dependent lens action.
Invisibility cloaking via non-smooth transformation optics and ray tracing
Crosskey, Miles M., E-mail: mmc31@duke.ed [Mathematics Department, Duke University, Box 90320, Durham, NC 27708-0320 (United States); Nixon, Andrew T., E-mail: andrew_nixon@brown.ed [Division of Applied Mathematics, Brown University, 182 George Street, Providence, RI 02912 (United States); Schick, Leland M., E-mail: lschick@math.arizona.ed [Department of Mathematics, University of Arizona, 617 N. Santa Rita Ave., P.O. Box 210089, Tucson, AZ 85721-0089 (United States); Kovacic, Gregor, E-mail: kovacg@rpi.ed [Mathematical Sciences Department, Rensselaer Polytechnic Institute, 110 8th Street, Troy, NY 12180 (United States)
2011-05-02
We present examples of theoretically-predicted invisibility cloaks with shapes other than spheres and cylinders, including cones and ellipsoids, as well as shapes spliced from parts of these simpler shapes. In addition, we present an example explicitly displaying the non-uniqueness of invisibility cloaks of the same shape. We depict rays propagating through these example cloaks using ray tracing for geometric optics. - Highlights: Theoretically-predicted conical and ellipsoidal invisibility cloaks. Non-smooth cloaks spliced from parts of simpler shapes. Example displaying non-uniqueness of invisibility cloaks of the same shape. Rays propagating through example cloaks depicted using geometric optics.
Junjie Ma
2015-05-01
Total-reflection X-ray fluorescence (TXRF) has achieved remarkable success with the advantages of simultaneous multi-element analysis capability, decreased background noise, no matrix effects, wide dynamic range, ease of operation, and potential of trace analysis. Simultaneous quantitative online analysis of trace heavy metals is urgently required by dynamic environmental monitoring and management, and TXRF has potential in this application domain. However, it calls for an online analysis scheme based on TXRF as well as a robust and rapid quantification method, which have not been well explored yet. Besides, spectral overlapping and background effects may lead to loss of accuracy or even faulty results during practical quantitative TXRF analysis. This paper proposes an intelligent, multi-element quantification method according to the established online TXRF analysis platform. In the intelligent quantification method, collected characteristic curves of all existing elements and a pre-estimated background curve in the whole spectrum scope are used to approximate the measured spectrum. A novel hybrid algorithm, PSO-RBFN-SA, is designed to solve the curve-fitting problem, with offline global optimization and fast online computing. Experimental results verify that simultaneous quantification of trace heavy metals, including Cr, Mn, Fe, Co, Ni, Cu and Zn, is realized on the online TXRF analysis platform, and both high measurement precision and computational efficiency are obtained.
Ma, Junjie; Wang, Yeyao; Yang, Qi; Liu, Yubing; Shi, Ping
2015-05-06
Total-reflection X-ray fluorescence (TXRF) has achieved remarkable success with the advantages of simultaneous multi-element analysis capability, decreased background noise, no matrix effects, wide dynamic range, ease of operation, and potential of trace analysis. Simultaneous quantitative online analysis of trace heavy metals is urgently required by dynamic environmental monitoring and management, and TXRF has potential in this application domain. However, it calls for an online analysis scheme based on TXRF as well as a robust and rapid quantification method, which have not been well explored yet. Besides, spectral overlapping and background effects may lead to loss of accuracy or even faulty results during practical quantitative TXRF analysis. This paper proposes an intelligent, multi-element quantification method according to the established online TXRF analysis platform. In the intelligent quantification method, collected characteristic curves of all existing elements and a pre-estimated background curve in the whole spectrum scope are used to approximate the measured spectrum. A novel hybrid algorithm, PSO-RBFN-SA, is designed to solve the curve-fitting problem, with offline global optimization and fast online computing. Experimental results verify that simultaneous quantification of trace heavy metals, including Cr, Mn, Fe, Co, Ni, Cu and Zn, is realized on the online TXRF analysis platform, and both high measurement precision and computational efficiency are obtained.
Solar Proton Transport Within an ICRU Sphere Surrounded by a Complex Shield: Ray-trace Geometry
Slaba, Tony C.; Wilson, John W.; Badavi, Francis F.; Reddell, Brandon D.; Bahadori, Amir A.
2015-01-01
A computationally efficient 3DHZETRN code with enhanced neutron and light ion (Z is less than or equal to 2) propagation was recently developed for complex, inhomogeneous shield geometry described by combinatorial objects. Comparisons were made between 3DHZETRN results and Monte Carlo (MC) simulations at locations within the combinatorial geometry, and it was shown that 3DHZETRN agrees with the MC codes to the extent they agree with each other. In the present report, the 3DHZETRN code is extended to enable analysis in ray-trace geometry. This latest extension enables the code to be used within current engineering design practices utilizing fully detailed vehicle and habitat geometries. Through convergence testing, it is shown that fidelity in an actual shield geometry can be maintained in the discrete ray-trace description by systematically increasing the number of discrete rays used. It is also shown that this fidelity is carried into transport procedures and resulting exposure quantities without sacrificing computational efficiency.
ACCELERATION RENDERING METHOD ON RAY TRACING WITH ANGLE COMPARISON AND DISTANCE COMPARISON
Liliana Liliana
2007-01-01
In computer graphics applications, ray tracing is often used to produce realistic images because it models not only local illumination but also global illumination. Local illumination accounts only for ambient, diffuse, and specular effects, i.e. light coming directly from the lamp(s), whereas global illumination also accounts for mirroring and transparency, i.e. light arriving via other objects. The objects usually modelled are primitive objects and mesh objects. The advantage of mesh modelling is that it can represent varied, interesting, life-like shapes; a mesh contains many primitives such as triangles or, more rarely, quadrilaterals. The problem with mesh objects is long rendering time, because every ray must be checked against a large number of the mesh's triangles, and rays spawned by other objects add further checks, increasing rendering time further. To solve this problem, this research develops new methods, angle comparison and distance comparison, that make rendering of mesh objects faster by reducing the number of ray checks: rays predicted not to intersect the mesh are never tested against it. With angle comparison, a small comparison angle makes rendering fast, but if the triangles are large, some of them may be corrupted (wrongly culled); a larger comparison angle avoids mesh corruption but makes rendering slower than without comparison. With distance comparison, rendering time is lower than without comparison and no triangles are corrupted.
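The paper's angle-comparison test itself is not reproduced here, but the general idea of rejecting triangles with a cheap angular check before running the exact intersection test can be sketched with the simplest such check, back-face culling. All names below are illustrative, and back-face culling is a hypothetical stand-in for the authors' method.

```python
def sub(a, b):
    return [a[i] - b[i] for i in range(3)]

def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def ray_triangle(orig, d, v0, v1, v2, eps=1e-9):
    """Moeller-Trumbore ray/triangle test; returns hit distance t or None."""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    pvec = cross(d, e2)
    det = dot(e1, pvec)
    if abs(det) < eps:
        return None                       # ray parallel to triangle plane
    inv = 1.0 / det
    tvec = sub(orig, v0)
    u = dot(tvec, pvec) * inv
    if u < 0.0 or u > 1.0:
        return None
    qvec = cross(tvec, e1)
    v = dot(d, qvec) * inv
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, qvec) * inv
    return t if t > eps else None

def trace_with_culling(orig, d, triangles):
    """Closest hit over a triangle list, skipping back-facing triangles:
    a cheap dot-product (i.e. angle) test runs before the exact test."""
    best, skipped = None, 0
    for v0, v1, v2 in triangles:
        n = cross(sub(v1, v0), sub(v2, v0))
        if dot(n, d) >= 0.0:              # normal not facing the ray: skip
            skipped += 1
            continue
        t = ray_triangle(orig, d, v0, v1, v2)
        if t is not None and (best is None or t < best):
            best = t
    return best, skipped
```

The counter of skipped triangles makes the saving visible: every skipped triangle avoided the full Moeller-Trumbore arithmetic.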
H. Vazquez-Leal
2014-01-01
We present a homotopy continuation method (HCM) for finding multiple operating points of nonlinear circuits composed of devices modelled by piecewise linear (PWL) representations. We propose an adaptation of the modified spheres path tracking algorithm to trace the homotopy trajectories of PWL circuits. In order to assess the benefits of this proposal, four nonlinear circuits composed of piecewise linear modelled devices are analysed to determine their multiple operating points. The results show that HCM can find multiple solutions within a single homotopy trajectory. Furthermore, we take advantage of the fact that homotopy trajectories are PWL curves to replace the multidimensional interpolation and fine-tuning stages of the path tracking algorithm with a simple and highly accurate procedure based on the parametric straight-line equation.
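The continuation idea behind HCM can be illustrated with a tiny generic sketch, a Newton homotopy with a natural-parameter predictor-corrector. This is not the modified-spheres tracker of the paper (which handles turning points and PWL trajectories); the test function and all names are assumptions for the example.

```python
def newton_homotopy(f, df, x0, steps=50, tol=1e-12):
    """Trace H(x, lam) = f(x) - (1 - lam) * f(x0) = 0 from lam = 0,
    where x0 is a root by construction, to lam = 1, where H reduces to f."""
    f0, x = f(x0), float(x0)
    for k in range(1, steps + 1):
        lam = k / steps                    # step the homotopy parameter
        for _ in range(50):                # Newton corrector at this lam
            h = f(x) - (1.0 - lam) * f0
            x -= h / df(x)
            if abs(h) < tol:
                break
    return x

def f(x):
    return x ** 3 - 3.0 * x + 1.0          # three real roots

def df(x):
    return 3.0 * x ** 2 - 3.0

root = newton_homotopy(f, df, 2.0)          # follows the branch starting at x0 = 2
```

Because each Newton solve is warm-started from the previous step, the corrector stays on one branch; finding the other operating points of f would require further start points or an arclength tracker that rounds turning points, which is where the paper's contribution lies.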
A Formal Algorithm for Routing Traces on a Printed Circuit Board
Hedgley, David R., Jr.
1996-01-01
This paper addresses the classical problem of printed circuit board routing: that is, the problem of automatic routing by a computer other than by brute force, which causes the execution time to grow exponentially as a function of the complexity. Most of the present solutions are either inexpensive but not efficient and fast, or efficient and fast but very costly. Many solutions are proprietary, so not much is written or known about the actual algorithms upon which they are based. This paper presents a formal algorithm for routing traces on a printed circuit board. The solution presented is very fast and efficient, and for the first time speaks to the question eloquently by way of symbolic statements.
TIM, a ray-tracing program for METATOY research and its dissemination
Lambert, Dean; Hamilton, Alasdair C.; Constable, George; Snehanshu, Harsh; Talati, Sharvil; Courtial, Johannes
2012-03-01
TIM (The Interactive METATOY) is a ray-tracing program specifically tailored towards our research in METATOYs, which are optical components that appear to be able to create wave-optically forbidden light-ray fields. For this reason, TIM possesses features not found in other ray-tracing programs. TIM can either be used interactively or by modifying the openly available source code; in both cases, it can easily be run as an applet embedded in a web page. Here we describe the basic structure of TIM's source code and how to extend it, and we give examples of how we have used TIM in our own research. Program summary. Program title: TIM Catalogue identifier: AEKY_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEKY_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU General Public License No. of lines in distributed program, including test data, etc.: 124 478 No. of bytes in distributed program, including test data, etc.: 4 120 052 Distribution format: tar.gz Programming language: Java Computer: Any computer capable of running the Java Virtual Machine (JVM) 1.6 Operating system: Any; developed under Mac OS X Version 10.6 RAM: Typically 145 MB (interactive version running under Mac OS X Version 10.6) Classification: 14, 18 External routines: JAMA [1] (source code included) Nature of problem: Visualisation of scenes that include scene objects that create wave-optically forbidden light-ray fields. Solution method: Ray tracing. Unusual features: Specifically designed to visualise wave-optically forbidden light-ray fields; can visualise ray trajectories; can visualise geometric optic transformations; can create anaglyphs (for viewing with coloured "3D glasses") and random-dot autostereograms of the scene; integrable into web pages. Running time: Problem-dependent; typically seconds for a simple scene.
Herlocker, J. A.; Jiang, J.; Garcia, K. J.
2008-08-01
Common digital display systems have evolved into sophisticated optical devices. The rapid market growth in liquid crystal displays makes the simulation of full systems attractive, promoting virtual prototyping with decreased development times and improved manufacturability. Realistic simulation using commercial non-sequential ray tracing tools has been instrumental in this process, but the need to accurately model polarization devices has become critical in many designs. As display systems seek more efficient use of light and more accurate color representation, the proper simulation of polarization devices with large acceptance angles is essential. This paper examines non-uniform polarization effects in the simulation of modern display devices using realistic polarizer and retarder models in the ASAP® non-sequential ray-tracing environment.
Stress optical path difference analysis of off-axis lens ray trace footprint
Hsu, Ming-Ying; Chan, Chia-Yen; Lin, Wei-Cheng; Wu, Kun-Huan; Chen, Chih-Wen; Chan, Shenq-Tsong; Huang, Ting-Ming
2013-06-01
Mechanical and thermal stress on a lens changes the refractive index of the glass, making the index different for light polarized parallel and perpendicular to the stress direction. These refractive-index changes introduce an optical path difference (OPD). This study applies the finite element method (FEM) and optical ray tracing to calculate the stress-induced OPD of off-axis rays. The stress distribution of the optical system is obtained from finite element simulation, and the stress tensor is rotated into the coordinates of the optical path direction. The stress is then weighted along each optical ray path and the OPD is summed over the path. The Z-direction stress OPD can be fitted with Zernike polynomials and separated into sag difference and rigid-body motion; the fitting results can be used to evaluate the effect of stress on the optical component.
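The separation of rigid-body motion from an OPD map can be sketched with the lowest-order part of such a fit: least-squares removal of piston and tip/tilt. This is a minimal illustration, not the paper's full Zernike decomposition (which includes higher-order modes), and it assumes a sample grid symmetric about the optical axis so the three modes decouple.

```python
def remove_rigid_body(points, opd):
    """Least-squares removal of piston + tip/tilt from sampled OPD values.

    points : list of (x, y) sample coordinates, symmetric about the axis
             (so sums of x, y and x*y vanish and the modes fit independently)
    opd    : OPD value at each sample
    Returns the residual OPD with the rigid-body terms removed.
    """
    n = len(opd)
    piston = sum(opd) / n                                   # mean OPD
    sxx = sum(x * x for x, _ in points)
    syy = sum(y * y for _, y in points)
    tip = sum(x * w for (x, _), w in zip(points, opd)) / sxx    # x-slope
    tilt = sum(y * w for (_, y), w in zip(points, opd)) / syy   # y-slope
    return [w - piston - tip * x - tilt * y
            for (x, y), w in zip(points, opd)]
```

After this step, what remains in the residual is the part of the stress OPD that actually degrades the wavefront, which is what the higher-order Zernike terms then describe.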
Mathematic models for a ray tracing method and its applications in wireless optical communications.
Zhang, Minglun; Zhang, Yangan; Yuan, Xueguang; Zhang, Jinnan
2010-08-16
This paper presents a new ray tracing method, which contains a whole set of mathematic models, and its validity is verified by simulations. In addition, both theoretical analysis and simulation results show that the computational complexity of the method is much lower than that of previous ones. Therefore, the method can be used to rapidly calculate the impulse response of wireless optical channels for complicated systems.
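Independent of the paper's specific models, the final step of any such method, turning traced rays into a channel impulse response, can be sketched by binning each ray's power at its propagation delay. The names, the 1/L² spreading gains, and the single-bounce image-method example are assumptions for illustration.

```python
C = 3.0e8  # speed of light in m/s

def impulse_response(paths, bin_width=1e-9, n_bins=64):
    """Accumulate traced-ray contributions into a discrete impulse response.

    paths : iterable of (path_length_m, power_gain) for each traced ray
    h[k]  : total received power arriving in delay bin k (bin_width seconds)
    """
    h = [0.0] * n_bins
    for length, gain in paths:
        k = round(length / C / bin_width)   # delay bin of this ray
        if k < n_bins:
            h[k] += gain
    return h

# Image-method example: a 3 m line-of-sight path plus one 5 m wall bounce
# with reflectivity 0.8, using a simple 1/L^2 spreading model for the gains.
h = impulse_response([(3.0, 1.0 / 3.0 ** 2), (5.0, 0.8 / 5.0 ** 2)])
```

The LOS ray lands around the 10 ns bin and the bounce around 17 ns; with many rays from a full tracer, the same accumulation yields the smooth multipath response used for system simulation.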
Modeling pyramidal sensors in ray-tracing software by a suitable user-defined surface
Antichi, Jacopo; Munari, Matteo; Magrin, Demetrio; Riccardi, Armando
2016-04-01
Following the unprecedented results in terms of performance delivered by the first light adaptive optics system at the Large Binocular Telescope, there has been a widespread and increasing interest in the pyramid wavefront sensor (PWFS), which is the key component, together with the adaptive secondary mirror, of the adaptive optics (AO) module. Currently, there is no straightforward way to model a PWFS in standard sequential ray-tracing software. Common modeling strategies tend to be user-specific and, in general, are unsatisfactory for general applications. To address this problem, we have developed an approach to PWFS modeling based on a user-defined surface (UDS), whose properties reside in a specific code written in the C language, for the ray-tracing software ZEMAX™. With our approach, the pyramid optical component is implemented as a standard surface in ZEMAX™, exploiting its dynamic link library (DLL) conversion and thus greatly simplifying ray tracing and analysis. We have utilized the pyramid UDS DLL surface, referred to as PAM2R ("pyramidal acronyms may be too risky"), in order to design the current PWFS-based AO system for the Giant Magellan Telescope, evaluating tolerances, with particular attention to the angular sensitivities, by means of sequential ray-tracing tools only, thus verifying PAM2R's reliability and robustness. This work indicates that PAM2R makes the design of a PWFS as simple as that of other standard optical components. This is particularly suitable with the advent of the extremely large telescope era, for which complexity is definitely one of the main challenges.
A rapid and accurate two-point ray tracing method in horizontally layered velocity model
TIAN Yue; CHEN Xiao-fei
2005-01-01
A rapid and accurate method for two-point ray tracing in a horizontally layered velocity model is presented in this paper. Numerical experiments show that this method provides stable and rapid convergence with high accuracy, regardless of the 1-D velocity structure, takeoff angle and epicentral distance. This two-point ray tracing method is compared with the pseudobending technique and the method advanced by Kim and Baag (2002). It turns out that the method in this paper is much more efficient and accurate than the pseudobending technique, but is only applicable to 1-D velocity models. Kim's method is equivalent to ours for cases without large takeoff angles, but it fails to work when the takeoff angle is close to 90°. On the other hand, the method presented in this paper is applicable to any takeoff angle with rapid and accurate convergence. Therefore, this method is a good choice for two-point ray tracing problems in horizontally layered velocity models and is efficient enough to be applied to a wide range of seismic problems.
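The setting of the problem can be illustrated with a basic shooting-method sketch for a stack of horizontal layers: Snell's law keeps the ray parameter p = sin(θ)/v constant, and bisection on p finds the takeoff angle reaching a target offset. This is a generic textbook construction under the stated geometry, not the authors' (faster) algorithm, and the function names are illustrative.

```python
import math

def offset_and_time(p, layers):
    """Horizontal offset and travel time of a down-going ray with ray
    parameter p through layers = [(thickness, velocity), ...]
    (Snell's law: p = sin(theta_i) / v_i is constant across layers)."""
    x = t = 0.0
    for h, v in layers:
        s = p * v                        # sin(theta) in this layer
        c = math.sqrt(1.0 - s * s)       # cos(theta)
        x += h * s / c                   # horizontal step: h * tan(theta)
        t += h / (v * c)                 # time along the slanted segment
    return x, t

def shoot(layers, target_x):
    """Bisection on p so the ray emerges at offset target_x. The offset is
    monotonically increasing in p, so bisection converges reliably; note
    that it slows near grazing takeoff, the regime the paper targets."""
    lo, hi = 0.0, (1.0 - 1e-12) / max(v for _, v in layers)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if offset_and_time(mid, layers)[0] < target_x:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For a single layer of unit thickness and unit velocity, an offset equal to the thickness corresponds to a 45° ray, so the solver should return p = sin 45° and a travel time of √2.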
Ray-tracing and physical-optics analysis of the aperture efficiency in a radio telescope.
Olmi, Luca; Bolli, Pietro
2007-07-01
The performance of telescope systems working at microwave or visible-IR wavelengths is typically described in terms of different parameters according to the wavelength range. Most commercial ray-tracing packages have been specifically designed for use with visible-IR systems and thus, though very flexible and sophisticated, do not provide the appropriate parameters to fully describe microwave antennas and to compare with specifications. We demonstrate that the Strehl ratio is equal to the phase efficiency when the apodization factor is taken into account. The phase efficiency is the most critical contribution to the aperture efficiency of an antenna and the most difficult parameter to optimize during the telescope design. The equivalence between the Strehl ratio and the phase efficiency gives the designer/user of the telescope the opportunity to use the faster commercial ray-tracing software to optimize the design. We also discuss the results of several tests performed to check the validity of this relationship that we carried out using a ray-tracing software, ZEMAX, and a full Physical Optics software, GRASP9.3, applied to three different telescope designs that span a factor of approximately 10 in terms of D/lambda. The maximum measured discrepancy between phase efficiency and Strehl ratio varies between approximately 0.4% and 1.9% up to an offset angle of >40 beams, depending on the optical configuration, but it is always less than 0.5% where the Strehl ratio is >0.95.
Ray-tracing critical-angle transmission gratings for the X-ray Surveyor and Explorer-size missions
Günther, Hans M.; Bautz, Marshall W.; Heilmann, Ralf K.; Huenemoerder, David P.; Marshall, Herman L.; Nowak, Michael A.; Schulz, Norbert S.
2016-07-01
We study a critical angle transmission (CAT) grating spectrograph that delivers a spectral resolution significantly above that of any X-ray spectrograph ever flown. This new technology will allow us to resolve kinematic components in absorption and emission lines of galactic and extragalactic matter down to unprecedented dispersion levels. We perform ray-trace simulations to characterize the performance of the spectrograph in the context of an X-ray Surveyor or Arcus-like layout (two mission concepts currently under study). Our newly developed ray-trace code is a tool suite to simulate the performance of X-ray observatories. The simulator code is written in Python, because the use of a high-level scripting language allows modifications of the simulated instrument design in very few lines of code. This is especially important in the early phase of mission development, when the performances of different configurations are contrasted. To reduce the run time and allow for simulations of a few million photons in a few minutes on a desktop computer, the simulator code uses tabulated input (from theoretical models or laboratory measurements of samples) for grating efficiencies and mirror reflectivities. We find that the grating facet alignment tolerances to maintain at least 90% of the resolving power that the spectrometer has with perfect alignment are (i) translation parallel to the optical axis below 0.5 mm, (ii) rotation around the optical axis or the groove direction below a few arcminutes, and (iii) constancy of the grating period to 1 part in 10^5. Translations along and rotations around the remaining axes can be significantly larger than this without impacting the performance.
M. J. M. Penning de Vries
2015-09-01
Detecting the optical properties of aerosols using passive satellite-borne measurements alone is a difficult task due to the broadband effect of aerosols on the measured spectra and the influences of surface and cloud reflection. We present another approach to determining aerosol type, namely by studying the relationship of aerosol optical depth (AOD) with trace gas abundance, aerosol absorption, and mean aerosol size. Our new Global Aerosol Classification Algorithm, GACA, examines relationships between aerosol properties (AOD and extinction Ångström exponent from the Moderate Resolution Imaging Spectroradiometer, MODIS; UV Aerosol Index from the second Global Ozone Monitoring Experiment, GOME-2) and trace gas column densities (NO2, HCHO, and SO2 from GOME-2, and CO from MOPITT, the Measurements of Pollution in the Troposphere instrument) on a monthly mean basis. First, aerosol types are separated based on size (Ångström exponent) and absorption (UV Aerosol Index); then the dominating sources are identified based on mean trace gas columns and their correlation with AOD. In this way, global maps of dominant aerosol type and main source type are constructed for each season and compared with maps of aerosol composition from the global MACC (Monitoring Atmospheric Composition and Climate) model. Although GACA cannot correctly characterize transported or mixed aerosols, GACA and MACC show good agreement regarding the global seasonal cycle, particularly for urban/industrial aerosols. The seasonal cycles of both aerosol type and source are also studied in more detail for selected 5° × 5° regions. Again, good agreement between GACA and MACC is found for all regions, but some systematic differences become apparent: the variability of aerosol composition (yearly and/or seasonal) is often not well captured by MACC, the amount of mineral dust outside of the dust belt appears to be overestimated, and the abundance of secondary organic aerosols is underestimated in
Efficient x-ray image enhancement algorithm using image fusion.
Shen, Kuan; Wen, Yumei; Cai, Yufang
2009-01-01
Multiresolution analysis (MRA) plays an important role in image and signal processing, as it can extract information at different scales. Image fusion is the process of combining two or more images into a single image that extracts features from the source images and provides more information than any one of them. The research presented in this article is aimed at the development of an automated image enhancement system for digital radiography (DR) images, which can clearly display all the defects in one image without introducing blocking artifacts. Based on the characteristics of the collected radiographic signals, the proposed scheme maps subsections of the signal to the 0-255 gray scale to form several gray images, which are then fused into a new enhanced image. This article focuses on comparing the discriminating power of several multiresolution image decomposition methods, using the contrast pyramid, wavelet, and ridgelet transforms respectively. The algorithms are extensively tested and the results are compared with standard image enhancement algorithms. Tests indicate that the fused images present a more detailed representation of the x-ray image. Detection, recognition, and search tasks may therefore benefit from this.
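The subsection-to-gray-scale mapping and fusion scheme described above can be sketched as follows. This is a minimal illustration: a naive maximum-contrast fusion rule stands in for the pyramid/wavelet/ridgelet decompositions compared in the paper, and the window bounds and array sizes are hypothetical.

```python
import numpy as np

def window_to_gray(signal, lo, hi):
    """Map the sub-range [lo, hi] of a radiographic signal to 0-255 gray levels."""
    clipped = np.clip(signal, lo, hi)
    return ((clipped - lo) / (hi - lo) * 255.0).astype(np.uint8)

def fuse_max_contrast(images):
    """Naive pixel-wise fusion: keep the value with the largest deviation from
    its image's mean. A stand-in for the multiresolution fusion in the paper."""
    stack = np.stack([img.astype(np.float64) for img in images])
    contrast = np.abs(stack - stack.mean(axis=(1, 2), keepdims=True))
    idx = contrast.argmax(axis=0)
    return np.take_along_axis(stack, idx[None], axis=0)[0].astype(np.uint8)

raw = np.linspace(0.0, 4095.0, 16).reshape(4, 4)       # synthetic 12-bit signal
imgs = [window_to_gray(raw, 0, 2048), window_to_gray(raw, 2048, 4095)]
fused = fuse_max_contrast(imgs)
```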
Ray tracing simulation of aero-optical effect using multiple gradient index layer
Yang, Seul Ki; Seong, Sehyun; Ryu, Dongok; Kim, Sug-Whan; Kwon, Hyeuknam; Jin, Sang-Hun; Jeong, Ho; Kong, Hyun Bae; Lim, Jae Wan; Choi, Jong Hwa
2016-10-01
We present a new ray tracing simulation of the aero-optical effect through the anisotropic inhomogeneous media of the supersonic flow field surrounding a projectile. The new method uses multiple gradient-index (GRIN) layers to construct the anisotropic inhomogeneous media for ray tracing simulation. The cone-shaped projectile studied has a 19° semi-vertical angle; a sapphire window is parallel to the cone surface; and the optical system of the projectile was modeled via paraxial optics and an infrared image detector. The steady-state computational fluid dynamics (CFD) solutions covered Mach numbers 4 and 6, an altitude of 25 km, and a 0° angle of attack (AoA). The refractive index of the flow field, obtained from the CFD analysis via the Gladstone-Dale relation, was discretized into equally spaced layers parallel to the projectile's window. Each layer was modeled as a 2D polynomial by fitting the refractive index distribution. The light source generated a set of 3,228 rays for lines of sight (LOS) varying from 10° to 40°. The ray tracing simulation applied Snell's law in 3D to compute the paths of skew rays in the GRIN layers. The results show that the optical path difference (OPD) and boresight error (BSE) decrease exponentially as the LOS increases. The variation of the refractive index decreases as the speed of the flow field increases; the OPD and its rate of decay at Mach 6 are somewhat larger than at Mach 4. Compared with the ray equation method, at Mach 4 and 10° LOS, the new method shows good agreement, with a 0.33% relative root-mean-square (RMS) OPD difference and a 0.22% relative BSE difference. Moreover, the simulation with the new method was more than 20,000 times faster than the conventional ray equation method. The technical details of the new method and simulation are presented together with results and implications.
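The vector form of Snell's law used to refract skew rays at each GRIN layer interface can be sketched as below; the function name and the normal-incidence example are illustrative, not taken from the paper.

```python
import numpy as np

def refract(d, n, n1, n2):
    """Vector Snell's law in 3D. d: incident ray direction, n: surface
    normal, n1 -> n2: refractive indices across the interface.
    Returns the refracted unit direction, or None on total internal
    reflection."""
    d = d / np.linalg.norm(d)
    n = n / np.linalg.norm(n)
    cos_i = -np.dot(d, n)
    if cos_i < 0.0:                 # normal given on the far side: flip it
        n, cos_i = -n, -cos_i
    eta = n1 / n2
    k = 1.0 - eta**2 * (1.0 - cos_i**2)
    if k < 0.0:                     # total internal reflection
        return None
    return eta * d + (eta * cos_i - np.sqrt(k)) * n

# Normal incidence leaves the direction unchanged
straight = refract(np.array([0.0, 0.0, 1.0]), np.array([0.0, 0.0, -1.0]), 1.0, 1.5)
```

For an oblique ray the transverse component of the returned direction equals sin of the refracted angle, so the function reproduces the scalar law sin θ₂ = (n1/n2) sin θ₁.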
Comparing FDTD and Ray-Tracing Models in Numerical Simulation of HgCdTe LWIR Photodetectors
Vallone, Marco; Goano, Michele; Bertazzi, Francesco; Ghione, Giovanni; Schirmacher, Wilhelm; Hanna, Stefan; Figgemeier, Heinrich
2016-09-01
We present a simulation study of HgCdTe-based long-wavelength infrared detectors, focusing on methodological comparisons between the finite-difference time-domain (FDTD) and ray-tracing optical models. We performed three-dimensional simulations to determine the absorbed photon density distributions and the corresponding photocurrent and quantum efficiency spectra of isolated n-on-p uniform-composition pixels, systematically comparing the results obtained with FDTD and ray tracing. Since ray tracing is a classical optics approach, unable to describe interference effects, its applicability has been found to be strongly wavelength dependent, especially when reflections from metallic layers are relevant. Interesting cavity effects around the material cutoff wavelength are described, and the cases where ray tracing can be considered a viable approximation are discussed.
Ray-tracing for coordinate knowledge in the JWST Integrated Science Instrument Module
Sabatke, Derek; Rohrbach, Scott; Kubalak, David
2014-01-01
Optical alignment and testing of the Integrated Science Instrument Module of the James Webb Space Telescope is underway. We describe the Optical Telescope Element Simulator used to feed the science instruments with point images of precisely known location and chief ray pointing, at appropriate wavelengths and flux levels, in vacuum and at operating temperature. The simulator's capabilities include a number of devices for in situ monitoring of source flux, wavefront error, pupil illumination, image position and chief ray angle. Taken together, these functions become a fascinating example of how the first order properties and constructs of an optical design (coordinate systems, image surface and pupil location) acquire measurable meaning in a real system. We illustrate these functions with experimental data, and describe the ray tracing system used to provide both pointing control during operation and analysis support subsequently. Prescription management takes the form of optimization and fitting. Our core too...
Maliage, M
2012-05-01
The purpose of this paper is to validate SolTrace for concentrating solar investigations at the CSIR by means of a test case: the comparison of the flux distribution in the focal spot of a 1.25 m² target-aligned heliostat predicted by the ray tracing...
Development of a total reflection X-ray fluorescence spectrometer for ultra-trace element analysis
M K Tiwari; B Gowrishankar; V K Raghuvanshi; R V Nandedkar; K J S Sawhney
2002-10-01
A simple and fairly inexpensive total reflection X-ray fluorescence (TXRF) spectrometer has been designed, constructed and realized. The spectrometer is capable of ultra-trace multielement analysis and can also perform surface characterization of thin films. The TXRF setup comprises an X-ray generator, a slit-collimator arrangement, a monochromator/cutoff stage, a sample reflector stage and an X-ray detection system. The glancing angle of incidence on the two reflectors is set using a sine-bar mechanism that enables precise angle adjustments. An energy dispersive detector and a GM counter are employed for measuring the fluorescence intensities and the direct X-ray beam intensity, respectively. A Cu-target X-ray generator with its line-focus window is used as the excitation source. The spectrometer is quite portable owing to its compact design and the use of a Peltier-cooled solid state detector for energy dispersive detection. Alignment and characterization of the TXRF system have been performed, and the minimum detection limits for various elements have been determined to be in the range of 100 pg to 5 ng even at low X-ray generator powers of 30 kV, 5 mA. The capability of the developed TXRF system for thin film characterization is also demonstrated.
A comprehensive ray tracing study on the impact of solar reflections from glass curtain walls.
Wong, Justin S J
2016-01-01
To facilitate the investigation of the impact of solar reflection from the façades of skyscrapers on the surrounding environment, a comprehensive ray tracing model has been developed using the International Commerce Centre (ICC) in Hong Kong as an example. Taking into account the actual physical dimensions of buildings and meteorological data, the model simulates and traces the paths of solar reflections from the ICC to the surrounding buildings, assessing the impact in terms of hit locations, light intensity and the hit time on each day throughout the year. Our analyses show that various design and architectural features of the ICC have amplified the intensity of reflected solar rays and increased the hit rates of surrounding buildings. These factors include the high reflectivity of the glass panels, their upward tilting angles, the concave profile of the 'Dragon Tail' (glass panels near the base), the particular location and orientation of the ICC, as well as its immense height and large reflective surfaces. The simulation results allow us to accurately map the date and time when the ray projections occur on each of the target buildings, rendering important information such as the number of converging (overlapping) projections and the actual light intensity hitting each of the buildings at any given time. Comparisons with other skyscrapers such as Taipei 101 in Taiwan and 2-IFC (International Finance Centre) Hong Kong are made. Remedial actions for the ICC and preventive measures are also discussed.
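The specular reflection at the heart of such a façade model reduces to the standard vector formula r = d − 2(d·n)n. The sketch below, with a hypothetical 5° upward panel tilt and a made-up coordinate frame, illustrates how upward-tilted glass steers reflected sunlight higher than a vertical panel would.

```python
import numpy as np

def reflect(d, n):
    """Specular reflection of ray direction d off a surface with unit normal n."""
    d = d / np.linalg.norm(d)
    n = n / np.linalg.norm(n)
    return d - 2.0 * np.dot(d, n) * n

# Hypothetical geometry: x points away from the facade, z points up.
tilt = np.radians(5.0)                                  # assumed upward panel tilt
normal = np.array([np.cos(tilt), 0.0, np.sin(tilt)])    # tilted facade normal
incoming = np.array([-1.0, 0.0, -1.0]) / np.sqrt(2.0)   # sunlight descending at 45 deg
outgoing = reflect(incoming, normal)
# Compared with a vertical panel, the tilt raises the reflected ray,
# extending the reach of the glare onto surrounding buildings.
```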
FRESCO+: an improved O2 A-band cloud retrieval algorithm for tropospheric trace gas retrievals
M. van Roozendael
2008-11-01
The FRESCO (Fast Retrieval Scheme for Clouds from the Oxygen A-band) algorithm has been used to retrieve cloud information from measurements of the O2 A-band around 760 nm by GOME, SCIAMACHY and GOME-2. The cloud parameters retrieved by FRESCO are the effective cloud fraction and cloud pressure, which are used for cloud correction in the retrieval of trace gases like O3 and NO2. To improve the cloud pressure retrieval for partly cloudy scenes, single Rayleigh scattering has been included in an improved version of the algorithm, called FRESCO+. We compared FRESCO+ and FRESCO effective cloud fractions and cloud pressures using simulated spectra and one month of GOME measured spectra. As expected, FRESCO+ gives more reliable cloud pressures over partly cloudy pixels. Simulations and comparisons with ground-based radar/lidar measurements of clouds show that the FRESCO+ cloud pressure is about the optical midlevel of the cloud. Globally averaged, the FRESCO+ cloud pressure is about 50 hPa higher than the FRESCO cloud pressure, while the FRESCO+ effective cloud fraction is about 0.01 larger. The effect of FRESCO+ cloud parameters on O3 and NO2 vertical column density (VCD) retrievals is studied using SCIAMACHY data and ground-based DOAS measurements. We find that the FRESCO+ algorithm has a significant effect on tropospheric NO2 retrievals but a minor effect on total O3 retrievals. The retrieved SCIAMACHY tropospheric NO2 VCDs using FRESCO+ cloud parameters (v1.1) are lower than the tropospheric NO2 VCDs which used FRESCO cloud parameters (v1.04), in particular over heavily polluted areas with low clouds. The difference between SCIAMACHY tropospheric NO2 VCDs v1.1 and ground-based MAXDOAS measurements performed in Cabauw, The Netherlands, during the DANDELIONS campaign is about −2.12×1014 molec cm−2.
Spin tracking simulations in AGS based on ray-tracing methods - bare lattice, no snakes -
Meot, F.; Ahrens, L.; Gleen, J.; Huang, H.; Luccio, A.; MacKay, W. W.; Roser, T.; Tsoupas, N.
2009-09-01
This Note reports on the first simulations of spin dynamics in the AGS using the ray-tracing code Zgoubi. It includes lattice analysis, comparisons with MAD, DA tracking, numerical calculation of depolarizing resonance strengths and comparisons with analytical models, etc. It also includes details on the setting-up of Zgoubi input data files and on the various numerical methods of concern in and available from Zgoubi. Simulations of the crossing and neighboring of spin resonances in the AGS ring, bare lattice, without snakes, have been performed in order to assess the capabilities of Zgoubi in this matter, and are reported here. This yields a rather long document. The two main reasons for that are, on the one hand, the desire for an extended investigation of the energy span, and on the other hand, a thorough comparison of Zgoubi results with analytical models such as the 'thin lens' approximation, the weak resonance approximation, and the static case. Section 2 details the working hypotheses: AGS lattice data, formulae used for deriving various resonance-related quantities from the ray-tracing based 'numerical experiments', etc. Section 3 gives inventories of the intrinsic and imperfection resonances together with, in a number of cases, the strengths derived from the ray tracing. Section 4 gives the details of the numerical simulations of resonance crossing, including the behavior of various quantities (closed orbit, synchrotron motion, etc.) aimed at checking that the conditions of particle and spin motion are correct. In a similar manner, Section 5 gives the details of the numerical simulations of spin motion in the static case: fixed energy in the neighborhood of the resonance. In Section 6, weak resonances are explored, and Zgoubi results are compared with the Fresnel integrals model. Section 7 shows the computation of the n vector in the AGS lattice and tunings considered. Many details on the numerical conditions such as data files etc. are given in the
X-rays across the galaxy population I: tracing the main sequence of star formation
Aird, J; Georgakakis, A
2016-01-01
We use deep Chandra imaging to measure the distribution of X-ray luminosities (L_X) for samples of star-forming galaxies as a function of stellar mass and redshift, using a Bayesian method to push below the nominal X-ray detection limits. Our luminosity distributions all show narrow peaks at L_X < 10^{42} erg/s that we associate with star formation, as opposed to AGN that are traced by a broad tail to higher L_X. Tracking the luminosity of these peaks as a function of stellar mass reveals an "X-ray main sequence" with a constant slope ~0.63 +/- 0.03 over 8.5 < log M*/Msun < 11.5 and 0.1 < z < 4, with a normalization that increases with redshift as (1+z)^{3.79+/-0.12}. We also compare the peak X-ray luminosities with UV-to-IR tracers of star formation rates (SFRs) to calibrate the scaling between L_X and SFR. We find that L_X ∝ SFR^{0.83} × (1+z)^{1.3}, where the redshift evolution and non-linearity likely reflect changes in high-mass X-ray binary populations of star-forming galaxies. Usin...
Infrasonic ray tracing applied to mesoscale atmospheric structures: refraction by hurricanes.
Bedard, Alfred J; Jones, R Michael
2013-11-01
A ray-tracing program is used to estimate the refraction of infrasound by the temperature structure of the atmosphere and by hurricanes represented by a Rankine-combined vortex wind plus a temperature perturbation. Refraction by the hurricane winds is significant, giving rise to regions of focusing, defocusing, and virtual sources. The refraction of infrasound by the temperature anomaly associated with a hurricane is small, probably no larger than that from uncertainties in the wind field. The results are pertinent to interpreting ocean wave generated infrasound in the vicinities of tropical cyclones.
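The Rankine combined vortex wind model mentioned above has a simple closed form: solid-body rotation inside the radius of maximum wind and 1/r decay outside it. The sketch below uses hypothetical hurricane parameters, not values from the paper.

```python
import numpy as np

def rankine_wind(r, r_max, v_max):
    """Tangential wind of a Rankine combined vortex: solid-body rotation
    (v ~ r) inside r_max, potential-vortex decay (v ~ 1/r) outside."""
    r = np.asarray(r, dtype=float)
    return np.where(r <= r_max, v_max * r / r_max,
                    v_max * r_max / np.maximum(r, 1e-12))

# Hypothetical storm: 50 m/s peak winds at a 40 km radius of maximum wind
speeds = rankine_wind([0.0, 20e3, 40e3, 80e3], r_max=40e3, v_max=50.0)
# -> [0, 25, 50, 25] m/s
```

In an infrasound ray tracer, this wind field enters the effective sound speed along the ray direction, which is what produces the focusing and defocusing regions reported in the abstract.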
Shi, Guangyuan; Li, Song; Huang, Ke; Li, Zile; Zheng, Guoxing
2016-10-01
We have developed a new numerical ray-tracing approach for computing the LIDAR signal power function, in which the light round-trip propagation is analyzed by geometrical optics and a simple experiment is employed to acquire the laser intensity distribution. It is relatively more accurate and flexible than previous methods. We discuss in particular the relationship between the inclined angle and the dynamic range of the detector output signal in a biaxial LIDAR system. Results indicate that an appropriate negative angle can compress the signal dynamic range. This technique has been successfully validated by comparison with real measurements.
X-Ray fluorescence analysis of trace elements in fruit juice
Bao, Sheng-Xiang; Wang, Zhi-Hong; Liu, Jing-Song
1999-12-01
X-Ray fluorescence spectrometry is applied to the determination of trace elements in fruit juice characterized by a high content of sugar and other soluble solid substances. Samples are prepared by evaporation, carbonization and pressing into discs. The synthesis of standards is described in detail. All element concentrations are directly estimated from linear calibration curves obtained without any matrix correction. The results of the analysis are in good agreement with those given by inductively coupled plasma-atomic emission spectrometry and atomic absorption spectrometry techniques.
Mitsuishi, I.; Ezoe, Y.; Ogawa, T.; Sato, M.; Nakamura, K.; Numazawa, M.; Takeuchi, K.; Ohashi, T.; Ishikawa, K.; Mitsuda, K.
2016-01-01
To investigate a feasibility for in situ X-ray imaging spectrometer JUXTA (Jupiter X-ray Telescope Array) onboard a Japanese Jupiter exploration mission, we demonstrated the ideal performances, i.e., angular resolution, effective area and grasp, of our original, conically-approximated Wolter type-I MEMS-processed optics, by extending the previous ray-tracing simulator. The novel simulator enables us to study both on- and off-axis responses for our optics with two-stage optical configurations for the first time. The on-axis angular resolution is restricted to ∼ 13 μm corresponding to ∼ 10 arcsec on the detector plane without considering the diffraction effect and dominated by the diffraction effect below ∼ 1 keV (e.g., 13 arcsec at 1 keV). Si optics can achieve effective area of >700 mm2 and grasp of >1600 mm2 deg2 at our interesting energy of 600 eV. Larger effective area and grasp can be attained by employing Ni as a substrate material or Ir as a reflecting surface material. However, other factors produced in the fabrication processes such as the waviness on the mirror surface and the deformation error cause the significant performance degradation. Thus, we concluded that MEMS-processed optics can satisfy all the requirements of JUXTA only if the manufacturing accuracy can be controlled.
Zijffers, J.F.; Janssen, M.G.J.; Tramper, J.; Wijffels, R.H.; Salim, S.
2008-01-01
The Green Solar Collector (GSC), a photobioreactor designed for area efficient outdoor cultivation of microalgae uses Fresnel lenses and light guides to focus, transport and distribute direct light into the algae suspension. Calculating the path of rays of light, so-called ray tracing, is used to de
Modeling of 3D In—Building Propagation by Ray Tracing Technique
GongKe; XuRui
1995-01-01
The modeling of in-building propagation is of great importance for the planning of indoor wireless networks. To model the transmission system comprising a transmitter, a receiver and different kinds of obstacles, the ray tracing technique is used: the transmitter is taken as a source launching radio rays in different directions, some of which reach the receiver through different paths with different path losses and delays; adding them together gives the field strength at the receiving point. Based on this model, computer simulation is carried out to predict the propagation loss and delay spread, and it is shown that the simulation agrees well with experiments.
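The final step of adding ray contributions at the receiving point can be sketched as coherent summation of complex path amplitudes, with each ray's phase set by its path length. The amplitudes, path lengths, and the 900 MHz carrier below are illustrative only.

```python
import numpy as np

def received_field(paths, wavelength):
    """Coherently sum ray contributions at the receiving point.
    Each path is (amplitude_at_receiver, path_length_m); the phase of each
    ray follows from its path length, so rays add constructively or
    destructively, producing multipath fading."""
    k = 2.0 * np.pi / wavelength
    return sum(a * np.exp(-1j * k * length) for a, length in paths)

# Hypothetical two-ray indoor link at 900 MHz
lam = 3e8 / 900e6
direct = (1.0, 30.0)          # line-of-sight ray
reflected = (0.4, 31.1)       # longer, attenuated wall-reflected ray
E = received_field([direct, reflected], lam)
power_db = 20.0 * np.log10(abs(E))   # field strength relative to unit amplitude
```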
Novel applications of the x-ray tracing software package McXtrace
Bergbäck Knudsen, Erik; Nielsen, Martin Meedom; Haldrup, Kristoffer
2014-01-01
We will present examples of applying the X-ray tracing software package McXtrace to different kinds of X-ray scattering experiments. In particular we will be focusing on time-resolved experiments. Simulations of full scale experiments are particularly useful for this kind, especially when … some of the issues encountered. Generally more than one or all of these effects are present at once. Simulations can in these cases be used to identify distinct footprints of such distortions and thus give the experimenter a means of deconvoluting them from the signal. We will present a study … of this kind along with the newest developments of the McXtrace software package.
Ray-tracing analysis of crosstalk in multi-core polymer optical fibers.
Berganza, Amaia; Aldabaldetreku, Gotzon; Zubia, Joseba; Durana, Gaizka
2010-10-11
The aim of this paper is to present a new ray-tracing model which describes the propagation of light in multi-core polymer optical fibers (MCPOFs), taking into account the crosstalk among their cores. The new model overcomes many of the limitations of previous approaches allowing us to simulate MCPOFs of arbitrary designs. Additionally, it provides us with the output ray distribution at the end of the fiber, making it possible to calculate useful parameters related to the fiber performance such as the Near-Field Pattern, the Far-Field Pattern or the bandwidth. We also present experimental measurements in order to validate the computational model and we analyze the importance of crosstalk in different MCPOF configurations.
Photorealistic ray tracing of free-space invisibility cloaks made of uniaxial dielectrics
Halimeh, Jad C
2012-01-01
The design rules of transformation optics generally lead to spatially inhomogeneous and anisotropic impedance-matched magneto-dielectric material distributions for, e.g., free-space invisibility cloaks. Recently, simplified anisotropic non-magnetic free-space cloaks made of a locally uniaxial dielectric material (calcite) have been realized experimentally. In a two-dimensional setting and for in-plane polarized light propagating in this plane, the cloaking performance can still be perfect for light rays. However, for general views in three dimensions, various imperfections are expected. In this paper, we study two different purely dielectric uniaxial cylindrical free-space cloaks. For one, the optic axis is along the radial direction, for the other one it is along the azimuthal direction. The azimuthal uniaxial cloak has not been suggested previously to the best of our knowledge. We visualize the cloaking performance of both by calculating photorealistic images rendered by ray tracing. Following and complemen...
Tracing X-rays through an L-shaped laterally graded multilayer mirror: a synchrotron application.
Honnicke, Marcelo Goncalves; Huang, Xianrong; Keister, Jeffrey W; Kodituwakku, Chaminda Nalaka; Cai, Yong Q
2010-05-01
A theoretical model to trace X-rays through an L-shaped (nested or Montel Kirkpatrick-Baez mirrors) laterally graded multilayer mirror to be used in a synchrotron application is presented. The model includes source parameters (size and divergence), mirror figure (parabolic and elliptic), multilayer parameters (reflectivity, which depends on layer material, thickness and number of layers) and figure errors (slope error, roughness, layer thickness fluctuation Δd/d and imperfection in the corners). The model was implemented through MATLAB/OCTAVE scripts, and was employed to study the performance of a multilayer mirror designed for the analyzer system of an ultrahigh-resolution inelastic X-ray scattering spectrometer at National Synchrotron Light Source II. The results are presented and discussed.
Ben Zakour, Sihem; Taleb, Hassen
2016-06-01
Endpoint detection (EPD) is very important for determining whether a plasma etching process is proceeding correctly, and it is a crucial part of delivering repeatable results on every wafer. The endpoint is reached when the film to be etched has been completely removed. To ensure the desired device performance of the produced integrated circuit, many sensors are used to detect the endpoint: optical, electrical, acoustical/vibrational, thermal, and frictional. Except for the optical sensor, however, these show weaknesses due to environmental conditions that affect the accuracy of endpoint detection. Unfortunately, the exposed area of the film to be etched is sometimes very low, yielding a weak signal and showing the incapacity of the traditional endpoint detection method to determine the completion of the etch process. This work provides a means to improve endpoint detection sensitivity by collecting a large set of full spectral data containing 1201 spectra for each run; a new, simple algorithm is then proposed to select the important endpoint traces, named shift endpoint trace selection (SETS). A sensitivity analysis of the linear methods principal component analysis (PCA) and factor analysis (FA), and of the nonlinear method wavelet analysis (WA) for both approximation and details, is then carried out to compare the performances of the methods mentioned above. The signal-to-noise ratio (SNR) is computed based not only on the main etch (ME) period but also on the over etch (OE) period. Moreover, a statistic previously unused for EPD, the coefficient of variation (CV), is proposed to reach the endpoint in the plasma etch process.
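A minimal version of projecting the full-spectrum data onto a principal component and computing the coefficient of variation might look like the following. The SETS algorithm itself is not reproduced here; the 100-step run with a step at index 60 is purely synthetic, and only the 1201-wavelength dimension matches the abstract.

```python
import numpy as np

def first_pc_trace(spectra):
    """Project time-resolved optical emission spectra onto their first
    principal component. spectra: (n_times, n_wavelengths) array.
    The resulting 1-D trace steps when the emission lines change."""
    X = spectra - spectra.mean(axis=0)
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return X @ vt[0]

def coefficient_of_variation(trace):
    """CV = std / mean, the endpoint statistic proposed in the paper."""
    return np.std(trace) / np.mean(trace)

# Synthetic run: 100 time steps x 1201 wavelengths, endpoint at index 60
rng = np.random.default_rng(0)
spectra = rng.normal(0.0, 0.01, (100, 1201))
spectra[60:, 300:320] += 1.0            # emission lines change at the endpoint
trace = first_pc_trace(spectra)
jump = abs(trace[:60].mean() - trace[60:].mean())   # large jump marks the endpoint
cv = coefficient_of_variation(spectra[:, 310])      # CV of a single-wavelength trace
```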
Integrated ray tracing simulation of spectral bio-signatures from full 3D earth model
Ryu, Dongok; Seong, Sehyun; Lee, Jae-Min; Hong, Jinsuk; Jeong, Soomin; Jeong, Yukyeong; Kim, Sug-Whan
2009-08-01
Accurate identification and understanding of spectral bio-signatures from possible extra terrestrial planets have received an ever increasing attention from both astronomy and space science communities in recent years. In pursuance of this subject, one of the most important scientific breakthroughs would be to obtain the detailed understanding on spectral biosignatures of the Earth, as it serves as a reference datum for accurate interpretation of collapsed (in temporal and spatial domains) information from the spectral measurement using TPF instruments. We report a new Integrated Ray Tracing (IRT) model capable of computing various spectral bio-signatures as they are observed from the Earth surface. The model includes the Sun, the full 3-D Earth, and an optical instrument, all combined into single ray tracing environment in real scale. In particular, the full 3-D Earth surface is constructed from high resolution coastal line data and defined with realistic reflectance and BSDF characteristics depending on wavelength, vegetation types and their distributions. We first examined the model validity by confirming the imaging and radiometric performance of the AmonRa visible channel camera, simulating the Earth observation from the L1 halo orbit. We then computed disk averaged spectra, light curves and NDVI indexes, leading to the construction of the observed disk averaged spectra at the AmonRa instrument detector plane. The model, computational procedure and the simulation results are presented. The future plan for the detailed spectral signature simulation runs for various input conditions including seasonal vegetation changes and variable cloud covers is discussed.
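The NDVI index computed in the simulation has the standard definition (NIR − red)/(NIR + red); the reflectance values below are typical textbook figures, not outputs of the IRT model.

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from near-infrared and red
    reflectances: vegetation reflects strongly in NIR and absorbs red."""
    return (nir - red) / (nir + red)

# Typical reflectance values (illustrative only)
vegetation = ndvi(0.50, 0.08)   # dense vegetation: NDVI well above 0.5
ocean = ndvi(0.02, 0.04)        # water: negative NDVI
```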
Fast ray-tracing of human eye optics on Graphics Processing Units.
Wei, Qi; Patkar, Saket; Pai, Dinesh K
2014-05-01
We present a new technique for simulating retinal image formation by tracing a large number of rays from objects in three dimensions as they pass through the optical apparatus of the eye to the retina. Simulating human optics is useful for understanding basic questions of vision science and for studying vision defects and their corrections. Because of the complexity of computing such simulations accurately, most previous efforts used simplified analytical models of the normal eye. This makes them less effective in modeling vision disorders associated with abnormal shapes of the ocular structures, which are hard to represent precisely with analytical surfaces. We have developed a computer simulator that can simulate ocular structures of arbitrary shapes, for instance represented by polygon meshes. Topographic and geometric measurements of the cornea, lens, and retina from keratometer or medical imaging data can be integrated for individualized examination. We utilize parallel processing on modern Graphics Processing Units (GPUs) to efficiently compute retinal images by tracing millions of rays. A stable retinal image can be generated within minutes. We simulated depth-of-field, accommodation, chromatic aberrations, as well as astigmatism and its correction. We also show application of the technique to patient-specific vision correction by incorporating geometric models of the orbit reconstructed from clinical medical images.
Monte Carlo tolerancing tool using nonsequential ray tracing on a computer cluster
Reimer, Christopher
2010-08-01
The development of a flexible tolerancing tool for illumination systems based on Matlab® and Zemax® is described in this paper. Two computationally intensive techniques are combined: Monte Carlo tolerancing and non-sequential ray tracing. Implementation of the tool on a computer cluster allows for relatively rapid tolerancing. This paper explores the tool structure, describing how the task of tolerancing is split between Zemax and Matlab. An equation is derived that determines the number of simulated ray traces needed to accurately resolve illumination uniformity. Two examples of tolerancing illuminators are given. The first is a projection system consisting of a pico-DLP, a light pipe, a TIR prism and the critical illumination relay optics. The second is a wide-band, high performance Köhler illuminator, which includes a modified molded LED as the light source. As high performance illumination systems evolve, the practice of applying standard workshop tolerances to these systems may need to be re-examined.
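The ray-count equation itself is not given in the abstract, but a standard Poisson-statistics estimate (relative noise per detector bin ≈ 1/√N_bin) yields a sketch like the following; this is an assumption for illustration, not necessarily the paper's derived equation.

```python
import math

def rays_needed(n_bins, rel_error):
    """Rays required so each detector bin's Monte Carlo noise, which scales
    as 1/sqrt(N_bin), stays below rel_error, assuming roughly uniform
    illumination across the bins."""
    per_bin = math.ceil(1.0 / rel_error ** 2)
    return n_bins * per_bin

# Resolving 1% uniformity variations over a 100-bin detector:
# 10,000 rays per bin, 1,000,000 rays total
total = rays_needed(n_bins=100, rel_error=0.01)
```

The quadratic growth in 1/rel_error is what makes cluster-scale ray tracing attractive for tight uniformity tolerances.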
Kjartansson, Einar; Bjarnason, Ingi Th.
2017-04-01
Tools for ray tracing through one-dimensional earth models consisting of layers with constant velocity gradients, with values continuous across layer boundaries, have been developed. They are used to investigate the stability and robustness of earthquake locations and velocity determinations in the South Iceland Lowlands (SIL), a transform seismic zone. These tools will also be used to invert for velocity functions for different regions and time periods, by inverting simultaneously for micro-earthquake source parameters and P and S velocities. An increase of the velocity gradient with depth will cause rays with different take-off angles to cross, which can result in focusing and triplication when velocity is plotted versus time. It is therefore important to constrain the velocity solutions to avoid this. Large changes in gradient between adjacent layers cause variability of ray density and geometrical spreading, particularly for rays that turn just below the boundaries. This may create artificial clustering in the depth distribution of micro-earthquake source solutions. Resampling of the velocity functions using cubic spline interpolation can be used to reduce these effects. The software is open source and can be accessed at https://github.com/4dseismic
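Ray tracing through a layer with a constant velocity gradient has closed-form offset and travel-time expressions, since the ray path is a circular arc. The sketch below assumes a down-going ray with ray parameter p > 0 that does not turn inside the layer; the crustal velocities are illustrative, not taken from the SIL model.

```python
import math

def trace_layer(p, v_top, v_bot, thickness):
    """Horizontal offset and travel time of a down-going ray crossing one
    layer with a constant velocity gradient (circular-arc ray path).
    p: ray parameter sin(theta)/v, assumed > 0. Returns (dx, dt), or None
    if the ray turns inside the layer (p * v_bot >= 1)."""
    if p * v_bot >= 1.0:
        return None
    b = (v_bot - v_top) / thickness            # velocity gradient
    c1 = math.sqrt(1.0 - (p * v_top) ** 2)     # cos(theta) at the top
    c2 = math.sqrt(1.0 - (p * v_bot) ** 2)     # cos(theta) at the bottom
    if abs(b) < 1e-12:                         # uniform layer: straight ray
        return thickness * p * v_top / c1, thickness / (v_top * c1)
    dx = (c1 - c2) / (p * b)
    dt = math.log(v_bot * (1.0 + c1) / (v_top * (1.0 + c2))) / b
    return dx, dt

# Near-vertical ray through a 5 km crustal layer, velocity 5.0 -> 6.0 km/s
dx, dt = trace_layer(p=0.05, v_top=5.0, v_bot=6.0, thickness=5.0)
```

Summing dx and dt over the stack of layers gives the travel-time curve; rays crossing (triplication) show up when dx is non-monotonic in p.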
Identification of gravity wave sources using reverse ray tracing over Indian region
M. Pramitha
2014-07-01
The reverse ray tracing method is implemented for the first time in the Indian region for identification of the sources and propagation characteristics of gravity waves observed using airglow emissions from Gadanki (13.5° N, 79.2° E) and Hyderabad (17.5° N, 78.5° E). Wave amplitudes are also traced back for these wave events by including both radiative and diffusive damping. Background temperature and wind data obtained from the MSISE-90 and HWM-07 models, respectively, are used for the ray tracing, and the suitability of these models for the Gadanki region is tested. Further, a climatological model of the background atmosphere for the Gadanki region has been developed using nearly 30 years of observations from a variety of ground-based (MST radar, radiosonde, MF radar), rocket-, and satellite-borne measurements. For real-time atmospheric inputs, ERA-Interim products are utilized. By this reverse ray method, the source locations for nine wave events could be identified in the upper troposphere, whereas for five other events the waves seem to have been ducted in the mesosphere itself. Uncertainty in locating the terminal points in the horizontal direction is estimated to be within 50–100 km for the Gadanki wave events and 150–300 km for the Hyderabad wave events; this uncertainty arises mainly from not considering the day-to-day variability in tidal amplitudes. As no convection is noticed in or around the terminal points, convection is unlikely to be the source. Interestingly, large (~9 m s−1 km−1) vertical shear in the horizontal wind is noted near the ray terminal points (at 10–12 km altitude) and is identified as the source generating the nine wave events. Conditions prevailing at the terminal points for each of the 14 events are also provided. These events provide leads to a greater understanding of tropical lower and upper atmospheric coupling through gravity waves.
Bending Ray-Tracing Based on Simulated Annealing Method
周竹生; 谢金伟
2011-01-01
This paper proposes a new ray-tracing method based on the concept of simulated annealing. The new method not only resolves the traditional ray-tracing methods' over-dependence on pre-established initial ray paths, but also ensures the construction of satisfactory ray paths, and the calculation of the associated traveltimes, between fixed sources and receivers even for highly complicated velocity-field models; ray paths whose traveltimes approach the global minimum are successfully found. Furthermore, the algorithm can also calculate ray paths of locally minimal traveltime, and such paths can easily be constrained by instructing rays to pass through fixed points. The feasibility and stability of the method have been demonstrated by trial results on theoretical models.
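As a toy illustration of the idea (not the paper's implementation), a minimal simulated-annealing path search between a fixed source and receiver: interior node depths of a piecewise-linear ray are randomly perturbed and accepted by the Metropolis criterion under a cooling temperature, driving the path toward a low-traveltime geometry. The velocity model and all parameter values are illustrative:

```python
import math
import random

def traveltime(path, velocity):
    """Traveltime along a piecewise-linear ray path [(x, z), ...]:
    sum of segment length / velocity at the segment midpoint."""
    t = 0.0
    for (x0, z0), (x1, z1) in zip(path, path[1:]):
        seg = math.hypot(x1 - x0, z1 - z0)
        t += seg / velocity((x0 + x1) / 2.0, (z0 + z1) / 2.0)
    return t

def anneal_ray(src, rec, velocity, n_nodes=9, steps=4000, t0=0.1, seed=1):
    """Simulated-annealing search for a low-traveltime bent ray between the
    fixed endpoints src and rec: interior node depths are randomly perturbed
    and accepted by the Metropolis criterion under geometric cooling.
    Depths are clamped to [0, 5] to keep the toy velocity positive."""
    rng = random.Random(seed)
    xs = [src[0] + (rec[0] - src[0]) * i / (n_nodes + 1) for i in range(n_nodes + 2)]
    zs = [src[1] + (rec[1] - src[1]) * i / (n_nodes + 1) for i in range(n_nodes + 2)]
    cur = best = traveltime(list(zip(xs, zs)), velocity)
    temp = t0
    for _ in range(steps):
        k = rng.randrange(1, n_nodes + 1)               # pick an interior node
        old = zs[k]
        zs[k] = min(5.0, max(0.0, old + rng.uniform(-0.2, 0.2)))
        new = traveltime(list(zip(xs, zs)), velocity)
        if new < cur or rng.random() < math.exp((cur - new) / temp):
            cur = new                                   # accept the move
            best = min(best, cur)
        else:
            zs[k] = old                                 # reject: restore depth
        temp *= 0.999                                   # geometric cooling
    return best, list(zip(xs, zs))

# Velocity increasing with depth z: the optimal ray bows downward into the
# faster medium, so its traveltime beats the straight surface path.
v = lambda x, z: 2.0 + 0.8 * z
t_bent, path = anneal_ray((0.0, 0.0), (10.0, 0.0), v)
t_straight = traveltime([(0.0, 0.0), (10.0, 0.0)], v)
```

Constraining a ray to pass through a fixed point, as the paper describes, amounts to pinning one of the interior nodes instead of perturbing it.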
Okumura, Akira; Rulten, Cameron
2016-01-01
We have developed a non-sequential ray-tracing simulation library, ROOT-Based Simulator for Ray Tracing (ROBAST), which is aimed to be widely used in optical simulations of cosmic-ray (CR) and gamma-ray telescopes. The library is written in C++ and fully utilizes the geometry library of the ROOT framework. Despite the importance of optics simulations in CR experiments, no open-source software for ray-tracing simulations that can be widely used in the community has existed. To reduce the duplicated effort of different research groups each developing their own ray-tracing simulators, we have successfully used ROBAST for many years to perform optics simulations for the Cherenkov Telescope Array (CTA). Among the six proposed telescope designs for CTA, ROBAST is currently used for three telescopes: a Schwarzschild-Couder (SC) medium-sized telescope, one of the SC small-sized telescopes, and a large-sized telescope (LST). ROBAST is also used for the simulation and development of hexagonal light concentrators proposed for the LST focal plane.
Shah, R.G.; Salafia, C.M.; Girardi, T.; Conrad, L.; Keaty, K.
2015-01-01
Variability in placental chorionic surface vessel networks (PCSVNs) may mark developmental and functional changes in fetal health. Here we report a protocol of manually tracing PCSVNs from digital 2D images of post-delivery placentas and its validation by a shape matching method to compare the similarity between paint-injected and unmanipulated (uninjected and deflated vessels) tracings of PCSVNs. We show that tracings of unmanipulated vessels produce networks that are very comparable to the networks obtained by tracing paint-injected PCSVNs. We suggest that manual tracings of unmanipulated PCSVNs can extract features of PCSVN growth and structure that may impact fetal wellbeing. PMID:26100723
GPU-based four-dimensional general-relativistic ray tracing
Kuchelmeister, Daniel; Müller, Thomas; Ament, Marco; Wunner, Günter; Weiskopf, Daniel
2012-10-01
This paper presents a new general-relativistic ray tracer that enables image synthesis on an interactive basis by exploiting the performance of graphics processing units (GPUs). The application is capable of visualizing the distortion of the stellar background as well as trajectories of moving astronomical objects orbiting a compact mass. Its source code includes metric definitions for the Schwarzschild and Kerr spacetimes that can be easily extended to other metric definitions, relying on its object-oriented design. The basic functionality features a scene description interface based on the scripting language Lua, real-time image output, and the ability to edit almost every parameter at runtime. The ray tracing code itself is implemented for parallel execution on the GPU using NVidia's Compute Unified Device Architecture (CUDA), which leads to a performance improvement of an order of magnitude compared to a single CPU and makes the application competitive with small CPU cluster architectures.
Program summary
Program title: GpuRay4D
Catalog identifier: AEMV_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEMV_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 73649
No. of bytes in distributed program, including test data, etc.: 1334251
Distribution format: tar.gz
Programming language: C++, CUDA
Computer: Linux platforms with an NVidia CUDA-enabled GPU (Compute Capability 1.3 or higher), C++ compiler, NVCC (the CUDA compiler driver)
Operating system: Linux
RAM: 2 GB
Classification: 1.5
External routines: OpenGL Utility Toolkit development files, NVidia CUDA Toolkit 3.2, Lua 5.2
Nature of problem: Ray tracing in four-dimensional Lorentzian spacetimes
Solution method: Numerical integration of light rays, GPU-based parallel programming using CUDA, 3D
X-rays across the galaxy population - I. Tracing the main sequence of star formation
Aird, J.; Coil, A. L.; Georgakakis, A.
2017-03-01
We use deep Chandra imaging to measure the distribution of X-ray luminosities (LX) for samples of star-forming galaxies as a function of stellar mass and redshift, using a Bayesian method to push below the nominal X-ray detection limits. Our luminosity distributions all show narrow peaks at LX ≲ 10^42 erg s^-1 that we associate with star formation, as opposed to AGN that are traced by a broad tail to higher LX. Tracking the luminosity of these peaks as a function of stellar mass reveals an 'X-ray main sequence' with a constant slope ≈0.63 ± 0.03 over 8.5 ≲ log(M*/M⊙) ≲ 11.5 and 0.1 ≲ z ≲ 4, with a normalization that increases with redshift as (1 + z)^(3.79 ± 0.12). We also compare the peak X-ray luminosities with UV-to-IR tracers of star formation rates (SFRs) to calibrate the scaling between LX and SFR. We find that LX ∝ SFR^0.83 × (1 + z)^1.3, where the redshift evolution and non-linearity likely reflect changes in high-mass X-ray binary populations of star-forming galaxies. Using galaxies with a broader range of SFR, we also constrain a stellar-mass-dependent contribution to LX, likely related to low-mass X-ray binaries. Using this calibration, we convert our X-ray main sequence to SFRs and measure a star-forming main sequence with a constant slope ≈0.76 ± 0.06 and a normalization that evolves with redshift as (1 + z)^(2.95 ± 0.33). Based on the X-ray emission, there is no evidence for a break in the main sequence at high stellar masses, although we cannot rule out a turnover given the uncertainties in the scaling of LX to SFR.
Determination of minor and trace elements in kidney stones by x-ray fluorescence analysis
Srivastava, Anjali; Heisinger, Brianne J.; Sinha, Vaibhav; Lee, Hyong-Koo; Liu, Xin; Qu, Mingliang; Duan, Xinhui; Leng, Shuai; McCollough, Cynthia H.
2014-03-01
The determination of the accurate material composition of a kidney stone is crucial for understanding the formation of the stone as well as for preventive therapeutic strategies. Radiation-probing instrumental activation analysis techniques are excellent tools for identification of the materials present in kidney stones. In particular, x-ray fluorescence (XRF) can be very useful for the determination of minor and trace materials. The x-ray fluorescence measurements were performed at the Radiation Measurements and Spectroscopy Laboratory (RMSL) of the Department of Nuclear Engineering of Missouri University of Science and Technology, and the kidney stones were acquired from the Mayo Clinic, Rochester, Minnesota. Experimental studies in conjunction with analytical techniques were used to determine the exact composition of the kidney stones. A new type of experimental set-up was developed and utilized for the XRF analysis. The correlations of applied radiation source intensity, emission of the x-ray spectrum from the elements involved, and absorption coefficient characteristics were analyzed. To verify the experimental results against analytical calculations, several sets of kidney stones were analyzed using the XRF technique. The elements identified by this technique are silver (Ag), arsenic (As), bromine (Br), chromium (Cr), copper (Cu), gallium (Ga), germanium (Ge), molybdenum (Mo), niobium (Nb), rubidium (Rb), selenium (Se), strontium (Sr), yttrium (Y), and zirconium (Zr). This paper presents a new approach for the accurate determination of the material composition of kidney stones using the XRF instrumental activation analysis technique.
Yang, Yufei; Yan, Changxiang
2016-02-20
The polarization properties of a two-axis periscopic optical scanner constituted by a pair of rotating planar mirrors have been studied by using the three-dimensional polarization ray-tracing matrix method. The separate and cumulative matrices that define the transformation of the polarization state are obtained and expressed in terms of the rotation angles of two mirrors. The variations of diattenuation and retardance are investigated and graphically shown as functions of the rotation angles. On this basis, a further investigation about the cumulative polarization aberrations of three different metal-coated periscopic scanners is accomplished. Finally, the output polarization states of the three metal-coated scanners are calculated with the input beam of the arbitrary polarization states, and the results show that aluminum film is more appropriate than gold film or silver film for the polarization-maintaining periscopic scanner.
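The diattenuation referred to above follows from the singular values of the polarization ray-tracing matrix: it is the normalized difference of the maximum and minimum intensity transmittances. A minimal sketch of that definition, shown here on a 2×2 Jones matrix for brevity rather than the full 3×3 formalism the paper uses:

```python
import numpy as np

def diattenuation(J):
    """Diattenuation of a polarization element from the singular values of
    its Jones (or transverse-restricted polarization ray-tracing) matrix:
    D = (Tmax - Tmin) / (Tmax + Tmin), with intensity transmittances T = s**2."""
    s = np.linalg.svd(J, compute_uv=False)   # singular values, descending
    t_max, t_min = s[0] ** 2, s[-1] ** 2
    return (t_max - t_min) / (t_max + t_min)

# A partial polarizer transmitting field amplitudes 1.0 and 0.5:
# D = (1 - 0.25) / (1 + 0.25) = 0.6
J = np.diag([1.0, 0.5])
D = diattenuation(J)
```

In the 3×3 method, the singular value associated with the propagation direction must be excluded before applying the same formula, which is one reason the full treatment in the paper is more involved.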
Oprea, Cristiana; Gustova, Marina V; Oprea, Ioan A; Buzguta, Violeta L
2014-01-01
X-ray fluorescence spectrometry (XRFS) was used as a multielement method to evaluate individual whole human teeth or tooth tissues for their amounts of trace elements. Measurements were carried out on human enamel, dentine, and dental cementum, and some differences in tooth matrix composition were noted. In addition, the elemental concentrations determined in teeth from subjects of different ages, nutritional states, professions, and gender, living under various environmental conditions and dietary habits, were compared by multivariate statistical analysis (MVSA) methods. Factor analysis established that the inorganic components of human teeth varied consistently with their source in the tissue, with higher levels in tissue from females than from males, and higher in incisors than in molars.
Simulation of radiation damping in rings, using stepwise ray-tracing methods
Méot, F.
2015-06-01
The ray-tracing code Zgoubi computes particle trajectories in arbitrary magnetic and/or electric field maps or analytical field models. It includes a built-in fitting procedure, spin tracking, and many Monte Carlo processes. The accuracy of the integration method makes it an efficient tool for multi-turn tracking in periodic machines. Energy loss by synchrotron radiation, based on Monte Carlo techniques, was introduced in Zgoubi in the early 2000s for studies regarding the linear collider beam delivery system. However, only recently has this Monte Carlo tool been used for systematic beam dynamics and spin diffusion studies in rings, including the eRHIC electron-ion collider project at Brookhaven National Laboratory. Some beam dynamics aspects of this recent use of Zgoubi's capabilities, including considerations of accuracy as well as further benchmarking in the presence of synchrotron radiation in rings, are reported here.
Shaofei Xie
2012-02-01
Based on the theory of stochastic resonance, an adaptive single-well stochastic resonance (ASSR) coupled with a genetic algorithm was developed to enhance the signal-to-noise ratio of weak chromatographic signals. In conventional stochastic resonance algorithms, two or more parameters need to be optimized, and the proper parameter values are obtained by an exhaustive search within a given range. In the developed ASSR, the optimization of the system parameter is simplified and automatically implemented. The ASSR was applied to the trace analysis of clenbuterol in human urine, where it helped to significantly improve the limit of detection and limit of quantification of clenbuterol. The good linearity, precision, and accuracy of the proposed method indicate that it could be an effective tool for trace analysis and for improving the detection sensitivity of current detectors.
Tracing the Lowest Propeller Line in Magellanic High-mass X-Ray Binaries
Christodoulou, Dimitris M.; Laycock, Silas G. T.; Yang, Jun; Fingerman, Samuel
2016-09-01
We have combined the published observations of high-mass X-ray binary (HMXB) pulsars in the Magellanic Clouds with a new processing of the complete archival data sets from the XMM-Newton and Chandra observatories in an attempt to trace the lowest propeller line below which accretion to polar caps is inhibited by the centrifugal force and the pulsations from the most weakly magnetized pulsars cease. Previously published data reveal that some of the faster-spinning pulsars with spin periods of P_S < 12 s, detected at relatively low X-ray luminosities L_X, appear to define such a line in the P_S-L_X diagram, characterized by a magnetic moment of μ = 3 × 10^29 G cm^3. This value implies the presence of surface magnetic fields of B ≥ 3 × 10^11 G in the compact objects of this class. Only a few quiescent HMXBs are found below the propeller line: LXP4.40 and SXP4.78, for which XMM-Newton and Chandra null detections, respectively, placed firm upper limits on their X-ray fluxes in deep quiescence; and A0538-66, for which many sub-Eddington detections have never measured any pulsations. On the other hand, the data from the XMM-Newton and Chandra archives show clearly that, during routine observation cycles, several sources have been detected below the propeller line in extremely faint, nonpulsating states that can be understood as the result of weak magnetospheric emission when accretion to the poles is centrifugally stalled or severely diminished. We also examine the anomalous X-ray pulsar CXOU J010043.1-721134 that was reported in HMXB surveys. Its pulsations and location near and above the propeller line indicate that this pulsar could be accreting from a fossil disk.
Akberov, R F; Gorshkov, A N
1997-01-01
The X-ray endoscopic semiotics of precancerous gastric mucosal changes (epithelial dysplasia, intestinal epithelial rearrangement) was examined based on the results of 1574 gastric examinations. A diagnostic algorithm of radiation studies was developed for the diagnosis of the above pathology.
Huang, Rong; Limburg, Karin; Rohtla, Mehis
2017-05-01
X-ray fluorescence computed tomography is often used to measure trace element distributions within low-Z samples, using algorithms capable of X-ray absorption correction when sample self-absorption is not negligible. Its reconstruction is more complicated than that of transmission tomography, and it is therefore not widely used. We describe in this paper a very practical iterative method that uses widely available transmission tomography reconstruction software for fluorescence tomography. With this method, sample self-absorption can be corrected not only for the absorption within the measured layer but also for the absorption by material beyond that layer. By combining tomography with analysis for scanning X-ray fluorescence microscopy, absolute concentrations of trace elements can be obtained. By using widely shared software, we not only minimized the coding and took advantage of the computing efficiency of the fast Fourier transform in transmission tomography software, but also gained access to the well-developed data processing tools that come with well-known and reliable software packages. The convergence of the iterations was also carefully studied for fluorescence of different attenuation lengths. As an example, fish eye lenses can provide valuable information about fish life history and endured environmental conditions. Given the lens's spherical shape, and sometimes the short sample-to-detector distance needed for detecting low-concentration trace elements, its tomography data are affected by absorption related to material beyond the measured layer, but they can be reconstructed well with our method. Fish eye lens tomography results are compared with sliced-lens 2D fluorescence mapping with good agreement, with tomography providing better spatial resolution.
Cheng-Tang Pan
2014-01-01
Traditional surgical shadowless halogen lamps are generally designed as projection type with many light bulbs, which can produce not only mercury pollution but also heat radiation, both serious problems for the patient. This study utilized Runge-Kutta methods and mathematical algorithms to design and optimize a freeform lens, and an LED (light-emitting diode) was adopted to replace the traditional halogen lamp. A uniform lens was designed and fabricated based on energy conservation. First, the light field of the LED is concentrated through the freeform lens to improve the optical efficiency. Second, three-shell elliptic curves are applied to the reflective surgical shadowless lamp, where only a few LED chips are needed. Light rays emitted from different directions to the target plane achieve the goal of shadowlessness. In this study, the LED's luminous flux is 1,895 lm. The shadow dilution on the target plane is 54%. Ec (central illuminance) is 114,900 lux, and the d50/d10 is 57%, which exceeds the regulation by 7%, whereas the power consumption is only 20 W. The reflective surgical shadowless lamp can save more than 50% of the energy compared with the traditional projective one.
Okumura, Akira; Noda, Koji; Rulten, Cameron
2016-03-01
We have developed a non-sequential ray-tracing simulation library, ROOT-Based Simulator for Ray Tracing (ROBAST), which is aimed to be widely used in optical simulations of cosmic-ray (CR) and gamma-ray telescopes. The library is written in C++ and fully utilizes the geometry library of the ROOT framework. Despite the importance of optics simulations in CR experiments, no open-source software for ray-tracing simulations that can be widely used in the community has existed. To reduce the duplicated effort of different research groups each developing their own ray-tracing simulators, we have successfully used ROBAST for many years to perform optics simulations for the Cherenkov Telescope Array (CTA). Among the six proposed telescope designs for CTA, ROBAST is currently used for three telescopes: a Schwarzschild-Couder (SC) medium-sized telescope, one of the SC small-sized telescopes, and a large-sized telescope (LST). ROBAST is also used for the simulation and development of hexagonal light concentrators proposed for the LST focal plane. Making full use of the ROOT geometry library with additional ROBAST classes, we are able to build the complex optics geometries typically used in CR experiments and ground-based gamma-ray telescopes. We introduce ROBAST and its features developed for CR experiments, and show several successful applications for CTA.
Three-dimensional ray tracing for refractive correction of human eye ametropies
Jimenez-Hernandez, J. A.; Diaz-Gonzalez, G.; Trujillo-Romero, F.; Iturbe-Castillo, M. D.; Juarez-Salazar, R.; Santiago-Alvarado, A.
2016-09-01
Ametropies of the human eye are refractive defects hampering the correct imaging on the retina. The most common ways to correct them are spectacles, contact lenses, and modern methods such as laser surgery. In any case, it is very important to identify the grade of ametropia in order to design the optimum corrective action. In the case of laser surgery, it is necessary to define a new shape of the cornea in order to obtain the desired refractive correction. Therefore, a computational tool is required to calculate the focal length of the eye's optical system versus variations in its geometrical parameters. Additionally, a clear and understandable visualization of the evaluation process is desirable. In this work, a model of the human eye based on geometrical optics principles is presented. Simulations of light rays coming from a point source at six meters from the cornea are shown. We perform ray tracing in three dimensions in order to visualize the focusing regions and estimate the power of the optical system. The common parameters of ametropies can be easily modified and analyzed in the simulation through an intuitive graphical user interface.
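The core operation of such a tracer is refraction in vector form. A minimal sketch: vector Snell's law applied at a single spherical surface with illustrative reduced-eye-style values (not parameters from the paper), with the traced focus checked against the paraxial formula f' = n2·R/(n2 − n1):

```python
import numpy as np

def refract(d, n, n1, n2):
    """Vector form of Snell's law: refract unit direction d at a surface
    with unit normal n facing the incident ray (n . d < 0), index n1 -> n2.
    Assumes no total internal reflection."""
    mu = n1 / n2
    cos_i = -np.dot(n, d)
    cos_t = np.sqrt(1.0 - mu ** 2 * (1.0 - cos_i ** 2))
    return mu * d + (mu * cos_i - cos_t) * n

# Single spherical refracting surface, reduced-eye style: vertex at the
# origin, centre of curvature on the +z axis, air (n1) to ocular medium (n2).
n1, n2, R = 1.0, 4.0 / 3.0, 5.55e-3       # illustrative values, in metres
h = 0.2e-3                                # ray height above the optical axis
d = np.array([0.0, 0.0, 1.0])             # ray parallel to the optical axis
z_hit = R - np.sqrt(R ** 2 - h ** 2)      # sag of the surface at height h
p = np.array([h, 0.0, z_hit])             # intersection point on the sphere
centre = np.array([0.0, 0.0, R])
normal = (p - centre) / R                 # unit normal facing the incident ray
t = refract(d, normal, n1, n2)
# March the refracted ray back to the axis (x = 0) to locate the focus
z_focus = p[2] - p[0] * t[2] / t[0]
f_paraxial = n2 * R / (n2 - n1)           # paraxial prediction (~22.2 mm)
```

Repeating this for a fan of ray heights reveals the spherical aberration of the surface, i.e. exactly the focusing-region visualization the paper describes; modeling an ametropia amounts to varying R, the indices, or the axial length.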
Kashima RAy-Tracing Service (KARATS) for high accurate GNSS positioning
Ichikawa, R.; Hobiger, T.; Hasegawa, S.; Tsutsumi, M.; Koyama, Y.; Kondo, T.
2010-12-01
Radio signal delays associated with the neutral atmosphere are one of the major error sources of space geodesy techniques such as GPS, GLONASS, GALILEO, VLBI, and InSAR measurements. We have developed a state-of-the-art tool to estimate atmospheric path delays by ray-tracing through JMA meso-scale analysis (MANAL) data. The tools, which we have named 'KAshima RAytracing Tools (KARAT)', are capable of calculating total slant delays and ray-bending angles considering real atmospheric phenomena. Numerical weather models such as the MANAL data have undergone a significant improvement in accuracy and spatial resolution, which makes it feasible to utilize them for the correction of atmospheric excess path delays. In previous studies evaluating KARAT performance, the KARAT solutions were slightly better than solutions using VMF1 and GMF with a linear gradient model for horizontal and height positions. Based on these results, we started the web-based online service 'KAshima RAytracing Service (KARATS)', providing atmospheric delay corrections of RINEX files, on January 27th, 2010. KARATS receives users' RINEX data via a web site (http://vps.nict.go.jp/karats/index.html) and processes the data files using KARAT to reduce atmospheric slant delays. The reduced RINEX files are archived in a specific directory for each user on the KARATS server. Once the processing is finished, the archive information is sent privately via email to each user. Users who want to process a large amount of data files can prepare their own server to archive them; KARATS can fetch these files from the user's server using GNU wget and perform the ray-traced corrections. We will present a brief status of KARATS and summarize the first experiences gained after this service went operational in December 2009. In addition, we will also demonstrate the newest KARAT performance based on the 5 km MANAL data, which has been operational since April 7th, 2009, and an outlook on
A unified algorithm for target detection and tracing based on data of array sensors
WANG Zhong; CHEN Fuhu
2008-01-01
A unified method for target detection and tracing based on data from array sensors is presented in order to improve the detection and tracking of weak targets with low signal-to-noise ratio. Assuming that the multiple targets are uncorrelated with each other and that the number of targets is known a priori, the status of the targets can be estimated with the maximum a-posteriori (MAP) method directly from the sensor data. The proposed method differs from the classical method in that it can detect and track targets simultaneously by adding the targets' signal energy information to their direction-of-arrival (DOA) information. Simulation and sea-trial results show that the detection and tracing capabilities for weak targets can be improved, and that the wrong-tracing and missed-tracing problems that the classical tracing method exhibits with crossing targets can be resolved by the proposed method.
Advanced signal separation and recovery algorithms for digital x-ray spectroscopy
Mahmoud, Imbaby I.; El Tokhy, Mohamed S.
2015-02-01
X-ray spectroscopy is widely used for in-situ sample analysis. Therefore, spectrum drawing and assessment of x-ray spectroscopy with high accuracy is the main scope of this paper. A lithium-drifted silicon Si(Li) detector cooled with liquid nitrogen is used for signal extraction. The resolution of the ADC is 12 bits and its sampling rate is 5 MHz. Different algorithms are implemented, running on a personal computer with an Intel Core i5-3470 CPU at 3.20 GHz. These algorithms cover signal preprocessing, signal separation and recovery, and spectrum drawing. Moreover, statistical measurements are used for the evaluation of these algorithms. Signal preprocessing based on DC-offset correction and signal de-noising is performed. DC-offset correction was done using the minimum value of the radiation signal, while signal de-noising was implemented using a fourth-order finite impulse response (FIR) filter, a linear-phase least-squares FIR filter, complex wavelet transforms (CWT), and Kalman filter methods. We noticed that the Kalman filter achieves a larger peak signal-to-noise ratio (PSNR) and lower error than the other methods, whereas the CWT takes much longer to execute. Moreover, three different algorithms that allow correction of x-ray signal overlapping are presented: a 1D non-derivative peak search algorithm, a second-derivative peak search algorithm, and an extrema algorithm. Additionally, the effect of the signal separation and recovery algorithms on spectrum drawing is measured, and a comparison between these algorithms is introduced. The obtained results confirm that the second-derivative peak search algorithm as well as the extrema algorithm have very small error in comparison with the 1D non-derivative peak search algorithm. However, the second-derivative peak search algorithm takes much longer to execute. Therefore, the extrema algorithm introduces better results than the other algorithms. It has the advantage of recovering and
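The second-derivative peak search exploits the fact that peak centres survive as separate minima of the second derivative even when the peaks overlap so strongly that their sum shows only a single hump. A hedged sketch of that idea on synthetic data (not the paper's implementation):

```python
import numpy as np

def second_derivative_peaks(y, threshold):
    """Return indices of local minima of the numerical second derivative of y
    that fall below `threshold` (a negative value). Overlapped peaks that
    merge into one hump in y often remain separated as two d2-minima."""
    d2 = np.gradient(np.gradient(y))
    return [i for i in range(1, len(y) - 1)
            if d2[i] < threshold and d2[i] <= d2[i - 1] and d2[i] < d2[i + 1]]

# Two overlapping Gaussian lines (centres 48 and 54, sigma 3) whose sum has a
# single maximum between them, yet the second derivative resolves both.
x = np.arange(0.0, 100.0, 0.1)
gauss = lambda c: np.exp(-0.5 * ((x - c) / 3.0) ** 2)
y = gauss(48.0) + gauss(54.0)
peaks = [x[i] for i in second_derivative_peaks(y, threshold=-1e-4)]
```

On noisy spectra the signal must be smoothed (or the derivative taken with a Savitzky-Golay-type filter) first, since differentiation amplifies noise; that trade-off is presumably why the paper compares this approach against the non-derivative and extrema algorithms.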
Use of a ray-based reconstruction algorithm to accurately quantify preclinical microSPECT images
Bert Vandeghinste; Roel Van Holen; Christian Vanhove; Filip De Vos; Stefaan Vandenberghe; Steven Staelens
2014-01-01
This work aimed to measure the in vivo quantification errors obtained when ray-based iterative reconstruction is used in micro single-photon emission computed tomography (microSPECT). This was investigated with an extensive phantom-based evaluation and two typical in vivo studies using (99m)Tc and In-111, measured on a commercially available cadmium zinc telluride (CZT)-based small-animal scanner. Iterative reconstruction was implemented on the GPU using ray tracing, including (1) scatter correcti...
Accounting for partiality in serial crystallography using ray-tracing principles.
Kroon-Batenburg, Loes M J; Schreurs, Antoine M M; Ravelli, Raimond B G; Gros, Piet
2015-09-01
Serial crystallography generates `still' diffraction data sets that are composed of single diffraction images obtained from a large number of crystals arbitrarily oriented in the X-ray beam. Estimation of the reflection partialities, which accounts for the expected observed fractions of diffraction intensities, has so far been problematic. In this paper, a method is derived for modelling the partialities by making use of the ray-tracing diffraction-integration method EVAL. The method estimates partialities based on crystal mosaicity, beam divergence, wavelength dispersion, crystal size and the interference function, accounting for crystallite size. It is shown that modelling of each reflection by a distribution of interference-function weighted rays yields a `still' Lorentz factor. Still data are compared with a conventional rotation data set collected from a single lysozyme crystal. Overall, the presented still integration method improves the data quality markedly. The R factor of the still data compared with the rotation data decreases from 26% using a Monte Carlo approach to 12% after applying the Lorentz correction, to 5.3% when estimating partialities by EVAL and finally to 4.7% after post-refinement. The merging R(int) factor of the still data improves from 105% to 56% but remains high. This suggests that the accuracy of the model parameters could be further improved. However, with a multiplicity of around 40 and an R(int) of ∼50% the merged still data approximate the quality of the rotation data. The presented integration method suitably accounts for the partiality of the observed intensities in still diffraction data, which is a critical step to improve data quality in serial crystallography.
The Use of Pro/Engineer CAD Software and Fishbowl Tool Kit in Ray-tracing Analysis
Nounu, Hatem N.; Kim, Myung-Hee Y.; Ponomarev, Artem L.; Cucinotta, Francis A.
2009-01-01
This document is designed as a manual for a user who wants to operate Pro/ENGINEER (ProE) Wildfire 3.0 with the NASA Space Radiation Program's (SRP) custom-designed toolkit, called 'Fishbowl', for the ray tracing of complex spacecraft geometries given by a ProE CAD model. The analysis of spacecraft geometry through ray tracing is a vital part of the calculation of health risks from space radiation. Space radiation poses severe risks of cancer, degenerative diseases and acute radiation sickness during long-term exploration missions, and shielding optimization is an important component in the application of radiation risk models. Ray tracing is a technique in which 3-dimensional (3D) vehicle geometry can be represented as the input for the space radiation transport code and subsequent risk calculations. In ray tracing, a certain number of rays (on the order of 1000) are used to calculate the equivalent thickness, say of aluminum, of the spacecraft geometry seen at a point of interest called the dose point. The rays originate at the dose point and terminate at a homogeneously distributed set of points lying on a sphere that circumscribes the spacecraft and has its center at the dose point. The distance a ray traverses in each material is converted to an aluminum or other user-selected equivalent thickness. Then all equivalent thicknesses are summed up for each ray. Since each ray points in a direction, the aluminum equivalent of each ray represents the shielding that the geometry provides to the dose point from that particular direction. This manual first lists contact information for help in installing ProE and Fishbowl, in addition to notes on platform support and system requirements. Second, the document shows the user how to use the software to ray trace a ProE-designed 3D assembly, and it will later serve as a reference for troubleshooting. The user is assumed to have previous knowledge of ProE and CAD modeling.
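A toy sketch of the ray-tracing conversion described above, using a trivially simple geometry (a single spherical wall centred on the dose point) so the expected result is known. Direction sampling here uses a golden-angle (Fibonacci) spiral, one common way to distribute points homogeneously on a sphere; the manual does not specify SRP's sampling method, and the material values are illustrative:

```python
import math

def fibonacci_directions(n):
    """n roughly uniform unit vectors on the sphere via a golden-angle spiral."""
    golden = math.pi * (3.0 - math.sqrt(5.0))
    dirs = []
    for i in range(n):
        z = 1.0 - 2.0 * (i + 0.5) / n
        r = math.sqrt(1.0 - z * z)
        dirs.append((r * math.cos(golden * i), r * math.sin(golden * i), z))
    return dirs

def aluminum_equivalent(path_lengths_cm, densities, rho_al=2.70):
    """Convert the per-material path lengths (cm) crossed by one ray into an
    aluminum-equivalent thickness (cm) by matching areal density (g/cm^2)."""
    return sum(L * rho / rho_al for L, rho in zip(path_lengths_cm, densities))

# Toy geometry: the dose point sits at the centre of a thin spherical wall of
# 0.3 cm polyethylene (rho ~ 0.95 g/cm^3), so every ray crosses the same
# 0.3 cm of material regardless of its direction.
thicknesses = [aluminum_equivalent([0.3], [0.95]) for _ in fibonacci_directions(1000)]
```

For a real CAD model, the per-ray path lengths come from intersecting each direction with the geometry; the resulting distribution of aluminum equivalents over all directions is exactly the shielding function the transport code consumes.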
The Super Gaussian Laser Intensity Profile in HYDRA's 3D Laser Ray Trace Package
Sepke, Scott M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
2017-01-05
In this note, the laser focal-plane intensity profile for a beam modeled using the 3D ray trace package in HYDRA is determined. First, the analytical model is developed, followed by a practical numerical model for evaluating the resulting computationally intensive normalization factor for all possible input parameters.
User and programmers guide to the neutron ray-tracing package McStas, version 1.2
Nielsen, K.; Lefmann, K.
2000-01-01
The software package McStas is a tool for writing Monte Carlo ray-tracing simulations of neutron scattering instruments with very high complexity and precision. The simulations can compute all aspects of the performance of instruments and can thus be used to optimize the use of existing equipment...
The forms of trace metals in an Illinois basin coal by x-ray absorption fine structure spectroscopy
Chou, I.-Ming; Bruinius, J.A.; Lytle, J.M.; Ruch, R.R.; Huggins, Frank E.; Huffman, G.P.; Ho, K.K.
1997-01-01
Utilities burning Illinois coals currently do not consider trace elements in their flue gas emissions. After the US EPA completes an investigation on trace elements, however, this may change and flue gas emission standards may be established. The mode of occurrence of a trace element may determine its cleanability and flue gas emission potential. X-ray Absorption Fine Structure (XAFS) is a spectroscopic technique that can differentiate the mode of occurrence of an element, even at the low concentrations at which trace elements are found in coal. This is principally accomplished by comparing the XAFS spectra of a coal to a database of reference sample spectra. This study evaluated the technique as a potential tool to examine six trace elements in an Illinois #6 coal. For the elements As and Zn, the present database provides a definitive interpretation of their mode of occurrence. For the elements Ti, V, Cr, and Mn, the database of XAFS spectra of trace elements in coal was still too limited to allow a definitive interpretation. The data obtained on these elements, however, were sufficient to rule out several of the mineralogical possibilities that have been suggested previously. The results indicate that XAFS is a promising technique for the study of trace elements in coal.
Total reflection X-ray fluorescence analysis of trace-elements in candies marketed in Mexico
Martinez, T., E-mail: tmc@servidor.unam.m [Facultad de Quimica, Departamento de Quimica Inorganica y Nuclear. Universidad Nacional Autonoma de Mexico, Mexico D.F. 04510 (Mexico); Lartigue, J. [Facultad de Quimica, Departamento de Quimica Inorganica y Nuclear. Universidad Nacional Autonoma de Mexico, Mexico, D.F. 04510 (Mexico); Zarazua, G.; Avila-Perez, P. [National Institute of Nuclear Research. Ocoyoacac, Edo. de Mexico, 05045 (Mexico); Navarrete, M. [Facultad de Quimica, Departamento de Quimica Inorganica y Nuclear. Universidad Nacional Autonoma de Mexico, Mexico, D.F. 04510 (Mexico); Tejeda, S. [National Institute of Nuclear Research. Ocoyoacac, Edo. de Mexico, 05045 (Mexico)
2010-06-15
Trace metal concentrations in food are significant for nutrition, due either to their essential nature or to their toxicity. Sweets, including chewing gum and candies, are not exactly a food, but they are habitually consumed by children, the age group most vulnerable to any kind of metal contamination in the food chain. The presence of relatively high concentrations of heavy metals such as lead elicits concern, since children are highly susceptible to heavy metal poisoning. Trace metal concentrations were determined for six different flavours of a Mexican candy by means of Total Reflection X-ray Fluorescence Spectrometry. Triplicate samples of the various candy flavours (strawberry, pineapple, lemon, blackberry, orange and chilli) were digested in 8 mL of a mix of supra-pure HNO{sub 3} and H{sub 2}O{sub 2} (6 mL:2 mL) in a MARS-X microwave oven. Results show the presence of essential and toxic elements such as Ti, Cr, Mn, Fe, Ni, Cu, Zn, Br, Rb, Sr, and Pb. All metal concentrations were higher and significantly different ({alpha} = 0.05) in chilli candy compared to the other candy flavours. Lead concentration fluctuated in the range of 0.102 to 0.342 {mu}g g{sup -1}. A discussion of consumption risk and the concentrations allowed by Mexican and international norms is presented. As part of the Quality Control Program, a NIST 'Citrus Leaves' standard and a blank were treated in the same way.
Identification of Gravity wave Sources over Tropical Latitudes Using Reverse Ray Tracing technique
Venkat Ratnam, Madineni; Pramitha, M.
2016-07-01
Sources and propagation characteristics of high-frequency gravity waves (GWs) observed in the mesosphere using airglow emissions from Gadanki (13.5°N, 79.2°E) and Hyderabad (17.5°N, 78.5°E) are investigated using reverse ray tracing. Wave amplitudes are also traced back, including both radiative and diffusive damping. For this, a climatological model of the background atmosphere for the Gadanki region has been developed using nearly 30 years of observations available from a variety of ground-based (MST radar, radiosondes, MF radar) and rocket- and satellite-borne measurements. With the reverse ray-tracing method, the source locations for wave events could be identified to be in the upper troposphere. Uncertainty in locating the terminal points of wave events in the horizontal direction is estimated to be within 50-100 km and 150-300 km for the Gadanki and Hyderabad wave events, respectively. This uncertainty arises mainly from not considering the day-to-day variability in the tidal amplitudes. Interestingly, large (~9 m s-1 km-1) vertical shears in the horizontal wind are noticed near the ray terminal points (at 10-12 km altitude) and are thus identified as the source generating the observed high-phase-speed, high-frequency GWs. We also tried to identify the sources of the GWs observed during the Indo-French campaign conducted in May 2014. The uniqueness of the present study lies in using near-real-time background atmosphere data from simultaneous radiosonde and meteor radar observations covering both the source and the propagation/dissipation regions of the GWs. When we searched for sources near the terminal points, deep convection was found to be a source for these events. We also tried to identify the sources of inertia-gravity waves (IGWs) observed in the troposphere and lower stratosphere during different seasons using long-term (2006-2014) high-resolution radiosonde observations. In general, 50% of the waves observed over this location have convection as
Magnetospherically reflected chorus waves revealed by ray tracing with CLUSTER data
M. Parrot
This paper is related to the propagation characteristics of a chorus emission recorded simultaneously by the 4 satellites of the CLUSTER mission on 29 October 2001 between 01:00 and 05:00 UT. During this day, the spacecraft SC1, SC2, and SC4 are relatively close to each other, but SC3 has been delayed by half an hour. We use the data recorded aboard CLUSTER by the STAFF spectrum analyser. This instrument provides the cross-spectral matrix of three magnetic and two electric field components. Dedicated software processes this spectral matrix in order to determine the wave normal directions relative to the Earth's magnetic field. This calculation is done for the 4 satellites at different times and different frequencies and allows us to check the directions of these waves. Measurements around the magnetic equator show that the parallel component of the Poynting vector changes its sign when the satellites cross the equator region. It indicates that the chorus waves propagate away from this region, which is considered the source area of these emissions. This is valid for the most intense waves observed on the magnetic and electric power spectrograms. But it is also observed on SC1, SC2, and SC4 that lower-intensity waves propagate toward the equator simultaneously with the SC3 intense chorus waves propagating away from the equator. Both waves are at the same frequency. Using the wave normal directions of these waves, a ray tracing study shows that the waves observed by SC1, SC2, and SC4 cross the equatorial plane at the same location as the waves observed by SC3. SC3, which is 30 minutes late, observes the waves that originate first from the equator; meanwhile, SC1, SC2, and SC4 observe the same waves after they have suffered a Lower Hybrid Resonance (LHR) reflection at low altitudes (based on the ray tracing analysis) and now return to the equator at a different location with a lower intensity. A similar phenomenon is observed when all SC are on the other side
Gok, Gokhan; Mosna, Zbysek; Arikan, Feza; Arikan, Orhan; Erdem, Esra
2016-07-01
Ionospheric observation is essentially accomplished by specialized radar systems called ionosondes. The time delay between the transmitted and received signals versus frequency is measured by the ionosondes, and the received signals are processed to generate ionogram plots, which show the time delay or reflection height of signals with respect to transmitted frequency. The critical frequencies of ionospheric layers and the virtual heights, which provide useful information about ionospheric structure, can be extracted from ionograms. Ionograms also indicate the amount of variability or disturbances in the ionosphere. With special inversion algorithms and tomographic methods, electron density profiles can also be estimated from the ionograms. Although structural pictures of the ionosphere in the vertical direction can be observed from ionosonde measurements, some errors may arise due to inaccuracies in signal propagation, modeling, data processing and tomographic reconstruction algorithms. Recently the IONOLAB group (www.ionolab.org) developed a new algorithm for effective and accurate extraction of ionospheric parameters and reconstruction of the electron density profile from ionograms. The electron density reconstruction algorithm applies advanced optimization techniques to calculate the parameters of any existing analytical function which defines electron density with respect to height using ionogram measurement data. The process of reconstructing electron density with respect to height is known as ionogram scaling or true height analysis. The IONOLAB-RAY algorithm is a tool to investigate the propagation path and parameters of HF waves in the ionosphere. The algorithm models the wave propagation using a ray representation under the geometrical optics approximation. In the algorithm, the structural ionospheric characteristics are represented as realistically as possible, including anisotropy, inhomogeneity and time dependence in a 3-D voxel structure. The algorithm is also used
Eccentric small-zone ray tracing wavefront aberrometry for refraction in keratoconus.
Fredriksson, Anneli; Behndig, Anders
2016-11-01
To compare objective refraction using small-zone eccentric laser ray tracing (LRT) wavefront aberrometry to standard autorefraction in keratoconus (KC), and to assess whether the visual acuities achieved with these refractions differ from corresponding values in healthy eyes. Twenty-nine eyes of 29 patients with KC and 29 eyes of 29 healthy controls were included in this prospective unmasked case-control study. The uncorrected (UCVA) and spectacle-corrected (SCVA) Early Treatment Diabetic Retinopathy Study (ETDRS) visual acuities based on refractions derived from LRT in the central and four eccentric zones were compared to those achieved with standard autorefraction. The spherical equivalent (M) and two astigmatic power vectors (C0 and C45) were calculated for all refractions. Pentacam HR® was used to generate keratometry readings of the corresponding zones. In KC, the refraction from the upper nasal zone rendered a higher SCVA than the standard autorefraction more often than in the controls (p refractions rendered similar SCVAs in KC. Pentacam HR® showed higher keratometry readings infero-temporally, but also lower readings supero-nasally, compared to controls. In KC, eccentric LRT measurements gave better SCVA than standard autorefraction more often than in healthy eyes. Eccentric LRT may become a valuable tool in the demanding task of subjective refraction in KC. © 2016 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
Stevens, John Colby [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). The Joint Center for Artificial Photosynthesis; Univ. of California, Berkeley, CA (United States). Dept. of Mechanical Engineering
2012-12-01
Ray tracing was used to perform optical optimization of arrays of photovoltaic microrods and explore the interaction between light and bubbles of oxygen gas on the surface of the microrods. The incident angle of light was varied over a wide range. The percent of incident light absorbed by the microrods and reflected by the bubbles was computed over this range. It was found that, for the 10 μm diameter, 100 μm tall SrTiO_{3} microrods simulated in the model, the optimal center-to-center spacing was 14 μm for a square grid. This geometry produced 75% average and 90% maximum absorbance. For a triangular grid using the same microrods, the optimal center-to-center spacing was 14 μm. This geometry produced 67% average and 85% maximum absorbance. For a randomly laid out grid of 5 μm diameter, 100 μm tall SrTiO_{3} microrods with an average center-to-center spacing of 20 μm, the average absorption was 23% and the maximum absorption was 43%. For a 50% areal coverage fraction of bubbles on the absorber surface, between 2%-20% of the incident light energy was reflected away from the rods by the bubbles, depending upon incident angle and bubble morphology.
Yang, Que; Wang, Shanshan; Wang, Kai; Zhang, Chunyu; Zhang, Lu; Meng, Qingyu; Zhu, Qiudong
2015-08-01
For normal eyes without a history of any ocular surgery, traditional equations for calculating intraocular lens (IOL) power, such as SRK/T, Holladay, Haigis and SRK-II, are all relatively accurate. However, for eyes that have undergone refractive surgery such as LASIK, or eyes diagnosed with keratoconus, these equations may produce significant postoperative refractive error, which may cause poor satisfaction after cataract surgery. Although some methods have been proposed to solve this problem, such as the Haigis-L equation[1] or using preoperative data (data before LASIK) to estimate the K value[2], no precise equations are available for these eyes. Here, we introduce a novel intraocular lens power estimation method based on accurate ray tracing with the optical design software ZEMAX. Instead of using a traditional regression formula, we adopted the measured corneal elevation distribution, central corneal thickness, anterior chamber depth, axial length, and estimated effective lens plane as the input parameters. The calculated intraocular lens power for a patient with keratoconus and for another post-LASIK patient agreed very well with their visual outcomes after cataract surgery.
Sakurai, K; Inoue, K; Yagi, N
2001-01-01
The downsizing of a Johansson-type X-ray fluorescence (XRF) spectrometer has been examined as a way of enhancing detection efficiency with a tolerable loss of energy resolution. A compact spectrometer equipped with a Ge(2 2 0) analyzing crystal with a Rowland radius of 120 mm has been tested with a highly brilliant helical undulator source at BL40XU, SPring-8. The energy resolution obtained for cobalt K alpha sub 1 (6930.32 eV) was 8.8 eV, which is 10-20 times better than that obtained using a Si(Li) detector, and effectively improved the signal-to-background ratio for XRF spectra. The combination of the present spectrometer and a third generation synchrotron source could provide new opportunities for trace analytical applications, which have been difficult so far by conventional synchrotron XRF experiments based on a Si(Li) detector system. The detection limit obtained for solid bulk samples has reached a level of several tens of ppb.
Liang, Yicheng; Peng, Hao
2015-02-07
Depth-of-interaction (DOI) poses a major challenge for a PET system to achieve uniform spatial resolution across the field-of-view, particularly for small-animal and organ-dedicated PET systems. In this work, we implemented an analytical method to model the system matrix for resolution recovery, which was then incorporated in PET image reconstruction on a graphics processing unit platform, due to its parallel processing capacity. The method utilizes the concepts of virtual DOI layers and multi-ray tracing to calculate the coincidence detection response function for a given line-of-response. The accuracy of the proposed method was validated for a small-bore PET insert to be used for simultaneous PET/MR breast imaging. In addition, performance comparisons were studied among the following three cases: 1) no physical DOI and no resolution modeling; 2) two physical DOI layers and no resolution modeling; and 3) no physical DOI design but with a different number of virtual DOI layers. The image quality was quantitatively evaluated in terms of spatial resolution (full-width-at-half-maximum and position offset), contrast recovery coefficient and noise. The results indicate that the proposed method has the potential to be used as an alternative to other physical DOI designs and achieve comparable imaging performance, while reducing detector/system design cost and complexity.
Woei Leow, Shin; Corrado, Carley; Osborn, Melissa; Isaacson, Michael; Alers, Glenn; Carter, Sue A.
2013-06-01
Luminescent solar concentrators (LSC) collect ambient light from a broad range of angles and concentrate the captured light onto photovoltaic (PV) cells. LSCs with front-facing cells collect direct and indirect sunlight, ensuring a gain factor greater than one. The flexible placement and percentage coverage of PV cells on the LSC panel allow layout adjustments to be made in order to balance re-absorption losses and the level of light concentration desired. A weighted Monte Carlo ray tracing program was developed to study the transport of photons and loss mechanisms in the LSC to aid in design optimization. The program imports the measured absorption/emission spectra of an organic luminescent dye (LR305), the transmission coefficient, and the refractive index of acrylic as parameters that describe the system. Simulations suggest that for LR305, 8-10 cm of luminescent material surrounding the PV cell yields the highest increase in power gain per unit area of LSC added, thereby determining the ideal spacing between PV cells in the panel. For rectangular PV cells, the results indicate that for each centimeter of PV cell width, an additional increase of 0.15 mm in the waveguide thickness is required to efficiently transport the photons collected by the LSC to the PV cell with minimal loss.
A model of polarized-beam AGS in the ray-tracing code Zgoubi
Meot, F. [Brookhaven National Lab. (BNL), Upton, NY (United States); Ahrens, L. [Brookhaven National Lab. (BNL), Upton, NY (United States); Brown, K. [Brookhaven National Lab. (BNL), Upton, NY (United States); Dutheil, Y. [Brookhaven National Lab. (BNL), Upton, NY (United States); Glenn, J. [Brookhaven National Lab. (BNL), Upton, NY (United States); Huang, H. [Brookhaven National Lab. (BNL), Upton, NY (United States); Roser, T. [Brookhaven National Lab. (BNL), Upton, NY (United States); Shoefer, V. [Brookhaven National Lab. (BNL), Upton, NY (United States); Tsoupas, N. [Brookhaven National Lab. (BNL), Upton, NY (United States)
2016-07-12
A model of the Alternating Gradient Synchrotron, based on the AGS snapramps, has been developed in the stepwise ray-tracing code Zgoubi. It has been used over the past 5 years in a number of accelerator studies aimed at enhancing RHIC proton beam polarization. It is also used to study and optimize proton and Helion beam polarization in view of future RHIC and eRHIC programs. The AGS model in Zgoubi is operational on-line via three different applications, 'ZgoubiFromSnaprampCmd', 'AgsZgoubiModel' and 'AgsModelViewer', with the latter two essentially interfaces to the former, which is the actual model 'engine'. All three commands are available from the controls system application launcher in the AGS 'StartUp' menu, or from eponymous commands on shell terminals. The main aspects of the model and of its operation are presented in this technical note; brief excerpts from various studies performed so far are given for illustration, and the means and methods entering ZgoubiFromSnaprampCmd are developed further in the appendix.
Leow, Shin Woei; Corrado, Carley; Osborn, Melissa; Carter, Sue A.
2013-09-01
Luminescent solar concentrators (LSCs) have the ability to receive light from a wide range of angles, concentrating the captured light onto small photoactive areas. This enables greater incorporation of LSCs into building designs as windows, skylights and wall claddings, in addition to rooftop installations of current solar panels. Using relatively cheap luminescent dyes and acrylic waveguides to effect light concentration onto smaller photovoltaic (PV) cell areas, there is potential for this technology to approach grid price parity. We employ a panel design in which the front-facing PV cells collect both direct and concentrated light, ensuring a gain factor greater than one. This also allows for flexibility in determining the placement and percentage coverage of PV cells during the design process, to balance reabsorption losses against the power output and level of light concentration desired. To aid in design optimization, a Monte Carlo ray tracing program was developed to study the transport of photons and loss mechanisms in LSC panels. The program imports measured absorption/emission spectra and transmission coefficients as simulation parameters, with interactions of photons in the panel determined by comparing calculated probabilities with random number generators. LSC panels with multiple dyes or layers can also be simulated. Analysis of the results reveals optimal panel dimensions and PV cell layouts for maximum power output for a given dye concentration, absorption/emission spectrum and quantum efficiency.
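The core Monte Carlo step these LSC abstracts describe — comparing calculated interaction probabilities with random numbers — can be sketched as a toy 1-D photon walk. The absorption probability, quantum yield, and panel length below are made-up illustrative values, not measured dye parameters:

```python
import random

def trace_photon(absorb_prob_per_cm, quantum_yield, panel_length_cm, rng):
    """Follow one photon along a 1-D waveguide in 1 cm steps.

    At each step the photon may be absorbed by the dye (probability
    `absorb_prob_per_cm`); an absorbed photon is re-emitted with
    probability `quantum_yield`, otherwise it is lost non-radiatively.
    Returns True if the photon reaches the PV edge of the panel.
    """
    x = 0.0
    while x < panel_length_cm:
        if rng.random() < absorb_prob_per_cm:      # dye absorption event
            if rng.random() >= quantum_yield:       # non-radiative loss
                return False
        x += 1.0
    return True

# Estimate collection efficiency over many photons (seeded for repeatability):
n = 10000
reached = sum(trace_photon(0.05, 0.95, 10.0, random.Random(i)) for i in range(n))
collection_efficiency = reached / n
```

A real LSC simulation would additionally track 3-D directions, total internal reflection, wavelength-dependent re-emission spectra, and edge losses; the comparison of a probability against `rng.random()` is the shared building block.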
Using Stochastic Ray Tracing to Simulate a Dense Time Series of Gross Primary Productivity
Martin van Leeuwen
2015-12-01
Eddy-covariance carbon dioxide flux measurement is an established method to estimate primary productivity at the forest stand level (typically 10 ha). To validate eddy-covariance estimates, researchers rely on extensive time-series analysis and an assessment of flux contributions made by various ecosystem components at spatial scales much finer than the eddy-covariance footprint. Scaling these contributions to the stand level requires a consideration of the heterogeneity in the canopy radiation field. This paper presents a stochastic ray tracing approach to predict the probabilities of light absorption from over a thousand hemispherical directions by thousands of individual scene elements. Once a look-up table of absorption probabilities is computed, dynamic illumination conditions can be simulated in a computationally realistic time, from which stand-level gross primary productivity can be obtained by integrating photosynthetic assimilation over the scene. We demonstrate the method by inverting a leaf-level photosynthesis model with eddy-covariance and meteorological data. Optimized leaf photosynthesis parameters and canopy structure were able to explain 75% of the variation in eddy-covariance gross primary productivity estimates, and commonly used parameters, including photosynthetic capacity and quantum yield, fell within reported ranges. Remaining challenges are discussed, including the need to address the distribution of radiation within shoots and needles.
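The look-up-table idea in this abstract — precompute each scene element's absorption probability per hemispherical direction, then combine it with the current sky irradiance and integrate a photosynthesis model over the scene — can be sketched as follows. The table values, the rectangular-hyperbola light response, and its parameters are all hypothetical placeholders:

```python
# Hypothetical look-up table: absorption probability of each scene element
# for each hemispherical direction (rows: elements, columns: directions).
absorb_prob = [
    [0.10, 0.05, 0.02],
    [0.04, 0.08, 0.06],
]

def rect_hyperbola(q, alpha=0.05, amax=10.0):
    """Toy leaf light-response curve (assumed parameters, arbitrary units)."""
    return (alpha * q * amax) / (alpha * q + amax) if q > 0 else 0.0

def stand_gpp(irradiance_per_direction, light_response):
    """Combine precomputed per-direction absorption probabilities with the
    current directional irradiance, then sum photosynthetic assimilation
    over all scene elements to get a stand-level GPP value."""
    gpp = 0.0
    for element in absorb_prob:
        absorbed = sum(p * e for p, e in zip(element, irradiance_per_direction))
        gpp += light_response(absorbed)
    return gpp

# New illumination conditions only require re-weighting the table:
total = stand_gpp([800.0, 400.0, 200.0], rect_hyperbola)
```

Because the expensive ray tracing is done once when building the table, simulating a dense time series reduces to this cheap weighted sum per time step.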
Zhu, Yang; Zhang, Xin; Liu, Tao; Wu, Yanxiong; Shi, Guangwei; Wang, Lingjie
2015-07-01
A long-wave infrared imaging system operated for space exploration of faint targets is highly sensitive to stray radiation. We present an integrated suppression process for internal and external stray radiation. A compact, re-imaging LWIR catadioptric telescope is designed as a practical example, and internal and external stray radiation is analyzed for this telescope. The detector is cryogenically cooled, with 100% cold shield efficiency at the Lyot stop. A non-sequential ray tracing technique is applied to investigate how the stray radiation propagates inside the optical system. Simulation and optimization are carried out during the initial design stage to avoid the subversive defect of stray radiation overwhelming the target signal. A quantitative analysis of the stray radiation irradiance emitted by the lenses and internal structures is presented in detail. The optical elements, which operate at room temperature due to the limitations of weight and size, turn out to be the significant stray radiation sources. We propose a method combining infrared material selection and optical form optimization to reduce the internal stray radiation of the lenses. We design and optimize mechanical structures to achieve a further attenuation of the internal stray radiation power. The point source transmittance (PST) is calculated to assess the external radiation which comes from sources outside the field of view. The ghosting of a bright target due to residual reflection from the optical coatings is simulated. The results show that the performance of stray radiation suppression is dramatically improved by iterative optimization and modification of the optomechanical configurations.
Trace metal content in aspirin and women's cosmetics via proton induced x-ray emission (PIXE)
Hichwa, B.P.; Pun, D.D.; Wang, D.
1981-04-01
A multielemental analysis to determine the trace metal content of generic and name-brand aspirins and name-brand lipsticks was done via proton induced x-ray (PIXE) measurements. The Hope College PIXE system is described as well as the target preparation methods. The trace metal content of twelve brands of aspirin and aspirin substitutes and fourteen brands of lipstick are reported. Detection limits for most elements are in the range of 100 parts per billion (ppb) to 10 parts per million (ppm).
陈春英; 章佩群; 柴之芳; 李光城; 黄宇营
2000-01-01
A method using synchrotron radiation X-ray fluorescence analysis to determine trace elements in situ during the protein electrophoretic separation process was established. The distribution of elements in protein bands for a human liver cytosolic sample separated by SDS-PAGE was analyzed along the polyacrylamide gel. The results showed that the protein fraction of peak III in cytosol was mainly composed of metal-ion Zn-associated proteins, in agreement with that given by atomic absorption spectrometry. This demonstrates the feasibility of this novel technique for in situ analysis of trace elements in protein bands.
Sun, Pengfei; Sun, Changku; Li, Wenqiang; Wang, Peng
2015-01-01
Pose estimation aims at measuring the position and orientation of a calibrated camera using known image features. The pinhole model is the dominant camera model in this field. However, the imaging precision of this model is not accurate enough for an advanced pose estimation algorithm. In this paper, a new camera model, called incident ray tracking model, is introduced. More importantly, an advanced pose estimation algorithm based on the perspective ray in the new camera model, is proposed. The perspective ray, determined by two positioning points, is an abstract mathematical equivalent of the incident ray. In the proposed pose estimation algorithm, called perspective-ray-based scaled orthographic projection with iteration (PRSOI), an approximate ray-based projection is calculated by a linear system and refined by iteration. Experiments on the PRSOI have been conducted, and the results demonstrate that it is of high accuracy in the six degrees of freedom (DOF) motion. And it outperforms three other state-of-the-art algorithms in terms of accuracy during the contrast experiment.
Component-level test of molded freeform optics for LED beam shaping using experimental ray tracing
Gutierrez, Gustavo; Hilbig, David; Fleischmann, Friedrich; Henning, Thomas
2017-06-01
Due to the high demand for LED light sources, the need to modify their radiation pattern to meet specific application requirements has also increased. This is mostly achieved using molded secondary optics, which are composed of a combination of several aspherical and freeform surfaces. Unfortunately, the manufacturers of these secondary optics only provide output information at the system level, making it impossible to independently characterize the secondary optic in order to determine the sources of erroneous results. For this reason, it is necessary to perform a component-level verification leading to validation of the correctness of the produced secondary optic independently of the light source. To understand why traditional inspection methods fail, it is necessary to take into account that not only do errors due to irregularities on the lens surface, such as pores, glass indentations or scratches, affect the performance of the lens, but differences in refractive index also appear after compression during the fabrication process. These internal alterations are generally produced during the cooling stage, and their effect on the performance of the lens cannot be measured using tactile techniques. Additionally, the small size of the lens and the freeform characteristics of its surface introduce further difficulties in performing its validation. In this work, the component-level test is done by obtaining the ray mapping function (RMF), which describes the deflection of the light beam as a function of the input angle. To obtain the RMF, first a collimated light source is held fixed and the lens is rotated. Thus, a virtual point source is created, and subsequently, by using experimental ray tracing, it is possible to determine the ray slopes, which are used to retrieve the RMF. Under the assumption that the optical system under analysis is lossless, and considering the principle of energy conservation, it is possible under specific conditions to use this new
E. Achmad
2006-12-01
Gravity wave signatures were extracted from OH airglow observations using all-sky CCD imagers at four different stations: Cachoeira Paulista (CP) (22.7° S, 45° W) and São João do Cariri (7.4° S, 36.5° W), Brazil; Tanjungsari (TJS) (6.9° S, 107.9° E), Indonesia; and Shigaraki (34.9° N, 136° E), Japan. The gravity wave parameters are used as input to a reverse ray tracing model to study the vertical propagation trajectories of the gravity waves and to estimate the wave source region. Gravity waves observed near the equator showed a shorter period and a larger phase velocity than those observed at low-middle latitudes. The waves ray-traced down into the troposphere showed the largest horizontal wavelengths and phase speeds. The ray tracing results also showed that at CP, Cariri, and Shigaraki the majority of the ray paths stopped in the mesosphere due to the condition m² → ∞ (m being the vertical wavenumber), which suggests the presence of ducted waves and/or waves generated in situ. In the troposphere, the possible gravity wave sources are related to meteorological front activity and cloud convection at CP, while at Cariri and TJS tropical cloud convection near the equator is the most probable gravity wave source. The tropospheric jet stream and orography are thought to be the major sources responsible for the waves observed at Shigaraki.
Yuan, Cadmus C. A.
2015-12-01
Optical ray tracing models have applied the Beer-Lambert method to single-luminescence-material systems to model the white-light pattern from a blue LED source. This paper extends this algorithm to a mixed multiple-luminescence-material system by introducing the equivalent excitation and emission spectra of the individual luminescence materials. The quantum efficiencies of the individual materials and the self-absorption of the multiple-luminescence-material system are considered as well. With this combination, researchers are able to model the luminescence characteristics of LED chip-scale packaging (CSP), which provides simple process steps and freedom in the geometrical dimensions of the luminescence material. The method is first validated against experimental results; afterward, a further parametric investigation is conducted.
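The Beer-Lambert attenuation at the heart of such luminescence ray tracing can be sketched in a few lines. The absorption coefficients and path lengths below are hypothetical illustrations, not values from the paper:

```python
import math

def transmit(i0, segments):
    """Attenuate a ray's radiant power through a stack of materials.

    segments: list of (absorption_coefficient_per_mm, path_length_mm) pairs.
    Beer-Lambert: I = I0 * exp(-sum(mu_i * L_i)).
    """
    optical_depth = sum(mu * length for mu, length in segments)
    return i0 * math.exp(-optical_depth)

# Hypothetical coefficients for a blue pump ray crossing two phosphor layers.
power = transmit(1.0, [(0.5, 1.0), (0.2, 2.0)])  # exp(-0.9) ≈ 0.4066
```

In a full model, the power removed from the pump ray in each layer would be re-emitted with that material's equivalent emission spectrum and quantum efficiency.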
Fitting heavy-tailed HTTP traces with the new stratified EM-algorithm
Sadre, R.; Haverkort, Boudewijn R.H.M.
A typical step in the model-based evaluation of communication systems is to fit measured data to analytically tractable distributions. Due to the increased speed of today's networks, even basic measurements, such as logging the requests at a Web server, can quickly generate large data traces with
Bailey, M J; Morgan, R M; Comini, P; Calusi, S; Bull, P A
2012-03-06
The independent verification in a forensics context of quartz grain morphological typing by scanning electron microscopy was demonstrated using particle-induced X-ray emission (PIXE) and particle-induced γ-ray emission (PIGE). Surface texture analysis by electron microscopy and high-sensitivity trace element mapping by PIXE and PIGE are independent analytical techniques for identifying the provenance of quartz in sediment samples in forensic investigations. Trace element profiling of the quartz grain matrix separately from the quartz grain inclusions served to differentiate grains of different provenance and indeed went some way toward discriminating between different quartz grain types identified in a single sample of one known forensic provenance. These results confirm the feasibility of independently verifying the provenance of critical samples from forensic cases.
An exponential modeling algorithm for protein structure completion by X-ray crystallography.
Shneerson, V L; Wild, D L; Saldin, D K
2001-03-01
An exponential modeling algorithm is developed for protein structure completion by X-ray crystallography and tested on experimental data from a 59-residue protein. An initial noisy difference Fourier map of missing residues of up to half of the protein is transformed by the algorithm into one that allows easy identification of the continuous tube of electron density associated with that polypeptide chain. The method incorporates the paradigm of phase hypothesis generation and cross validation within an automated scheme.
A dynamic material discrimination algorithm for dual MV energy X-ray digital radiography.
Li, Liang; Li, Ruizhe; Zhang, Siyuan; Zhao, Tiao; Chen, Zhiqiang
2016-08-01
Dual-energy X-ray radiography has become a well-established technique in medical, industrial, and security applications because of its material or tissue discrimination capability. The main difficulty of this technique is the material overlapping problem: when two or more materials lie along the X-ray beam path, discrimination performance degrades. In order to solve this problem, a new dynamic material discrimination algorithm is proposed for dual-energy X-ray digital radiography, which can also be extended to multi-energy X-ray situations. The algorithm has three steps: α-curve-based pre-classification, decomposition of overlapped materials, and final material recognition. The key to the algorithm is establishing a dual-energy radiograph database of both pure basis materials and pairwise combinations of them. Using the pre-classification results, the original dual-energy projections of overlapped materials can be dynamically decomposed into two sets of dual-energy radiographs, one for each pure material. Thus, more accurate discrimination results can be provided even in the presence of the overlapping problem. Both numerical and experimental results are presented that demonstrate the validity and effectiveness of the algorithm.
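In the simplest log-domain model, decomposing two overlapped basis materials reduces to solving a 2x2 linear system per ray. The sketch below is a generic illustration of that idea, not the paper's database-driven algorithm; the attenuation coefficients are made up:

```python
def decompose_dual_energy(p_low, p_high, mu):
    """Solve the 2x2 basis-material system for path thicknesses t1, t2.

    mu = ((mu1_low, mu2_low), (mu1_high, mu2_high)): hypothetical linear
    attenuation coefficients of the two basis materials at each energy.
    Log-domain projections obey p_E = mu1_E*t1 + mu2_E*t2.
    """
    (a, b), (c, d) = mu
    det = a * d - b * c  # assumed non-singular: materials differ between energies
    t1 = (p_low * d - b * p_high) / det
    t2 = (a * p_high - p_low * c) / det
    return t1, t2

# Hypothetical coefficients (mm^-1); projections synthesized from t1=2, t2=3 mm.
mu = ((0.2, 0.5), (0.15, 0.3))
t1, t2 = decompose_dual_energy(1.9, 1.2, mu)  # ≈ (2.0, 3.0)
```

The paper's dynamic decomposition plays the analogous role after pre-classification has narrowed down which material pair is present.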
Cheng, Ruida; Jackson, Jennifer N.; McCreedy, Evan S.; Gandler, William; Eijkenboom, J. J. F. A.; van Middelkoop, M.; McAuliffe, Matthew J.; Sheehan, Frances T.
2016-03-01
The paper presents an automatic segmentation methodology for the patellar bone, based on 3D gradient recalled echo and gradient recalled echo with fat suppression magnetic resonance images. Constricted search space outlines are incorporated into recursive ray tracing to segment the outer cortical bone. A statistical analysis based on the dependence of information in adjacent slices is used to limit the search in each image to a band between an outer and an inner search region. A section-based recursive ray-tracing mechanism is used to skip inner noise regions and detect the edge boundary. The proposed method achieves higher segmentation accuracy (0.23 mm) than the current state-of-the-art methods, with an average Dice similarity coefficient of 96.0% (SD 1.3%) agreement between the auto-segmentation and ground-truth surfaces.
Weeratunga, S K
2008-11-06
Ares and Kull are mature code frameworks that support ALE hydrodynamics for a variety of HEDP applications at LLNL, using two widely different meshing approaches. While Ares is based on a 2-D/3-D block-structured mesh data base, Kull is designed to support unstructured, arbitrary polygonal/polyhedral meshes. In addition, both frameworks are capable of running applications on large, distributed-memory parallel machines. Currently, both these frameworks separately support assorted collections of physics packages related to HEDP, including one for the energy deposition by laser/ion-beam ray tracing. This study analyzes the options available for developing a common laser/ion-beam ray tracing package that can be easily shared between these two code frameworks and concludes with a set of recommendations for its development.
Fokker-Planck/Ray Tracing for Electron Bernstein and Fast Wave Modeling in Support of NSTX
Harvey, R. W. [CompX, Del Mar, CA (United States)
2009-11-12
This DOE grant supported fusion energy research, a potential long-term solution to the world's energy needs. Magnetic fusion, exemplified by the confinement of very hot ionized gases, i.e., plasmas, in donut-shaped tokamak vessels, is a leading approach to this energy source. Thus far, a mixture of hydrogen isotopes has produced tens of megawatts of fusion power for seconds in a tokamak reactor at Princeton Plasma Physics Laboratory in New Jersey. The research grant under consideration, ER54684, uses computer models to aid in understanding and projecting the efficacy of heating and current drive sources in the National Spherical Torus Experiment, a tokamak variant, at PPPL. The NSTX experiment explores the physics of very tight aspect ratio, almost spherical tokamaks, aiming at producing steady-state fusion plasmas. Current drive is an integral part of the steady-state concept, maintaining the magnetic geometry in the steady-state tokamak. CompX further developed and applied models for radiofrequency (rf) heating and current drive for application to NSTX. These models build on a 30-year development of rf ray tracing (the all-frequencies GENRAY code) and higher-dimensional Fokker-Planck rf-collisional modeling (the 3D collisional-quasilinear CQL3D code) at CompX. Two mainline current-drive rf modes are proposed for injection into NSTX: (1) the electron Bernstein wave (EBW) and (2) the high harmonic fast wave (HHFW) mode. Both of these current drive systems provide a means for the rf to access the especially high-density plasma--termed high-beta plasma--relative to the strength of the required magnetic fields. The CompX studies entailed detailed modeling of the EBW to calculate the efficiency of the current drive system and to determine its range of flexibility for driving current at spatial locations across the plasma cross-section. The ray tracing showed penetration into the NSTX bulk plasma and relatively efficient current drive, but a limited ability to produce current over
He, Wenjun; Fu, Yuegang; Zheng, Yang; Zhang, Lei; Wang, Jiake; Liu, Zhiying; Zheng, Jianping
2013-07-01
The output polarization states of corner cubes (with both uncoated and metal-coated surfaces) for an input beam of arbitrary polarization state and arbitrary tilt angle to the cube have been analyzed using the three-dimensional polarization ray-tracing matrix method. The diattenuation and retardance of the corner-cube retroreflector (CCR) are calculated for all six ray paths, and their dependence on the tilt angle and the tilt orientation angle is shown. When the tilt angle is large, a hollow metal-coated CCR is more appropriate than a solid metal-coated CCR when the polarization state of the output beam must be controlled.
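The diattenuation figure used in such polarization ray-tracing analyses can be computed from the singular values of a ray path's polarization matrix. Below is a minimal sketch on the 2x2 transverse subspace (the 3x3 polarization ray-tracing matrix reduces to this form once the propagation direction is factored out); the partial-polarizer example is hypothetical:

```python
import numpy as np

def diattenuation(jones):
    """Diattenuation of a ray path from the singular values of its 2x2
    transverse polarization matrix: D = (s_max^2 - s_min^2)/(s_max^2 + s_min^2),
    i.e. the normalized difference of the extreme intensity transmittances."""
    s = np.linalg.svd(jones, compute_uv=False)  # sorted s[0] >= s[1] >= 0
    return (s[0] ** 2 - s[1] ** 2) / (s[0] ** 2 + s[1] ** 2)

# A partial polarizer passing 100% of x and 60% (amplitude) of y:
D = diattenuation(np.diag([1.0, 0.6]))  # (1 - 0.36)/(1 + 0.36) ≈ 0.4706
```

For a full CCR analysis, the matrix would be the ordered product of the Fresnel (or coated-mirror) matrices for the three reflections along each of the six ray paths.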
Jefferies, K.
1994-01-01
OFFSET is a ray tracing computer code for optical analysis of a solar collector. The code models the flux distributions within the receiver cavity produced by reflections from the solar collector. It was developed to model the offset solar collector of the solar dynamic electric power system being developed for Space Station Freedom. OFFSET has been used to improve the understanding of the collector-receiver interface and to guide the efforts of NASA contractors also researching the optical components of the power system. The collector for Space Station Freedom consists of 19 hexagonal panels each containing 24 triangular, reflective facets. Current research is geared toward optimizing flux distribution inside the receiver via changes in collector design and receiver orientation. OFFSET offers many options for experimenting with the design of the system. The offset parabolic collector model configuration is determined by an input file of facet corner coordinates. The user may choose other configurations by changing this file, but to simulate collectors that have other than 19 groups of 24 triangular facets would require modification of the FORTRAN code. Each of the roughly 500 facets in the assembled collector may be independently aimed to smooth out, or tailor, the flux distribution on the receiver's wall. OFFSET simulates the effects of design changes such as in receiver aperture location, tilt angle, and collector facet contour. Unique features of OFFSET include: 1) equations developed to pseudo-randomly select ray originating sources on the Sun which appear evenly distributed and include solar limb darkening; 2) Cone-optics technique used to add surface specular error to the ray originating sources to determine the apparent ray sources of the reflected sun; 3) choice of facet reflective surface contour -- spherical, ideal parabolic, or toroidal; 4) Gaussian distributions of radial and tangential components of surface slope error added to the surface normals at
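OFFSET's first unique feature, pseudo-random solar ray origins that include limb darkening, can be illustrated with a simple rejection sampler. The linear limb-darkening law and the coefficient u = 0.6 are assumptions for illustration, not OFFSET's actual model:

```python
import math, random

def sample_sun_point(u=0.6, rng=random):
    """Rejection-sample a point on the unit solar disk whose surface
    brightness follows a linear limb-darkening law I(r) = 1 - u*(1 - mu),
    with mu = sqrt(1 - r^2). u = 0.6 is a rough visible-band value (assumed)."""
    while True:
        x = rng.uniform(-1.0, 1.0)
        y = rng.uniform(-1.0, 1.0)
        r2 = x * x + y * y
        if r2 > 1.0:
            continue  # outside the disk
        mu = math.sqrt(1.0 - r2)
        if rng.random() < 1.0 - u * (1.0 - mu):  # brightness-weighted accept
            return x, y
```

Accepted points cluster toward the disk center, mimicking the darker solar limb; each accepted point would then be perturbed by the cone-optics surface-error model to form a reflected ray.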
3-D TECATE/BREW: Thermal, stress, and birefringent ray-tracing codes for solid-state laser design
Gelinas, R. J.; Doss, S. K.; Nelson, R. G.
1994-07-01
This report describes the physics, code formulations, and numerics that are used in the TECATE (totally Eulerian code for anisotropic thermo-elasticity) and BREW (birefringent ray-tracing of electromagnetic waves) codes for laser design. These codes resolve thermal, stress, and birefringent optical effects in 3-D stationary solid-state systems. This suite of three constituent codes is a package referred to as LASRPAK.
Sheil, Conor; Goncharov, Alexander V.
2013-05-01
A physical model eye was constructed to test the quality of ophthalmic instruments. The accuracy and precision of two commercially available instruments were analysed. For these instruments, a particular model eye was obtained which mimicked the physical properties that would usually be measured, e.g. corneal topography or optical path within the human eye. The model eye was designed using relatively simple optical components (e.g. plano-convex lenses) separated by appropriate intraocular distances taken from the literature. The dimensions of the model eye were known a priori: the lenses used in the construction of the model eye were characterised according to values given in the manufacturers' data sheets and also through measurement using an interferometer. The distances between the lens surfaces were calculated using the interferometric data with reverse ray-tracing. Optical paths were calculated as the product of refractive index and axial distance. The errors inherent in measuring these ocular parameters by different ophthalmic instruments can be considered as producing an erroneous value for the overall refractive power of the eye. The latter is a useful metric for comparing various ophthalmic devices where the direct comparison of quality is not possible or is not practical. For example, a 1% error in anterior corneal radius of curvature will have a more detrimental effect than the same error in posterior corneal radius, due to the relative differences in refractive indices at those surface boundaries. To quantify the error in ocular refractive power, a generic eye model was created in ZEMAX optical design software. The parametric errors were then used to compute the overall error in predicting ocular refractive power, thus highlighting the relative importance of individual errors. This work will help in future determination of acceptable levels of metrological errors in ocular instrumentation.
Exploring Design Tradeoffs Of A Distributed Algorithm For Cosmic Ray Event Detection
Yousaf, Suhail; van Steen, Maarten; Voulgaris, Spyros; Kelley, John L
2012-01-01
Many sensor networks, including large particle detector arrays measuring high-energy cosmic-ray air showers, traditionally rely on centralised trigger algorithms to find spatial and temporal coincidences of individual nodes. Such schemes suffer from scalability problems, especially if the nodes communicate wirelessly or have bandwidth limitations. However, nodes which instead communicate with each other can, in principle, use a distributed algorithm to find coincident events themselves without communication with a central node. We present such an algorithm and consider various design tradeoffs involved, in the context of a potential trigger for the Auger Engineering Radio Array (AERA).
Shi, Shengxian; Ding, Junfei; New, T. H.; Soria, Julio
2017-07-01
This paper presents a dense ray tracing reconstruction technique for single light-field camera-based particle image velocimetry. The new approach pre-determines the location of a particle through inverse dense ray tracing and reconstructs the voxel value using the multiplicative algebraic reconstruction technique (MART). Simulation studies were undertaken to identify the effects of iteration number, relaxation factor, particle density, voxel-pixel ratio, and velocity gradient on the performance of the proposed dense ray tracing-based MART method (DRT-MART). The results demonstrate that the DRT-MART method achieves higher reconstruction resolution at significantly better computational efficiency than the MART method (4-50 times faster). Both the DRT-MART and MART approaches were applied to measure the velocity field of a low-speed jet flow, which revealed that, for the same computational cost, the DRT-MART method accurately resolves the jet velocity field with improved precision, especially for the velocity component along the depth direction.
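The MART update underlying both methods multiplicatively rescales voxel intensities until the reprojection matches the recorded pixel intensities. A toy sketch with two voxels, two pixels, and made-up weights — an illustration of plain MART, not the paper's DRT-MART implementation:

```python
import numpy as np

def mart(W, p, n_iter=200, mu=0.5):
    """Multiplicative ART: for each pixel j, rescale every voxel k by
    (p[j] / reprojection_j) ** (mu * W[j, k]), so the reprojection W @ E
    is driven toward the measured pixel intensities p."""
    E = np.ones(W.shape[1])  # start from a uniform positive field
    for _ in range(n_iter):
        for j in range(W.shape[0]):
            proj = W[j] @ E
            if proj > 0:
                E *= (p[j] / proj) ** (mu * W[j])
    return E

W = np.array([[1.0, 0.0], [0.5, 0.5]])  # toy 2-pixel / 2-voxel weights
p = W @ np.array([2.0, 4.0])            # synthetic measurements
E = mart(W, p)                          # converges toward [2, 4]
```

DRT-MART's gain comes from restricting this update to voxels that inverse dense ray tracing has flagged as likely particle locations.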
Detecting Road Intersections from GPS Traces Using Longest Common Subsequence Algorithm
Xingzhe Xie
2016-12-01
Intersections are important components of road networks and are critical to both route planning and path optimization. Most existing methods define intersections as locations where road users change their moving direction and identify intersections from GPS traces by analyzing the road users' turning behavior. However, these methods struggle to find an appropriate threshold for the change in moving direction, leading to true intersections going undetected or spurious intersections being falsely detected. In this paper, intersections are defined as locations that connect three or more road segments in different directions. We propose to detect intersections under this definition by finding the common sub-tracks of the GPS traces. We first detect the Longest Common Subsequences (LCSS) between each pair of GPS traces using a dynamic programming approach. Second, we partition the longest nonconsecutive subsequences into consecutive sub-tracks. The starting and ending points of the common sub-tracks are collected as connecting points. Finally, intersections are detected from the connecting points through Kernel Density Estimation (KDE). Experimental results show that our proposed method outperforms the turning-point-based methods in terms of F-score.
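The LCSS step can be implemented with the standard dynamic programming recurrence, matching GPS points that fall within a distance threshold. The traces and the threshold of 10 below are illustrative assumptions, not the paper's data:

```python
def lcss_length(a, b, eps=10.0):
    """Length of the Longest Common Subsequence between two GPS traces,
    matching points whose Euclidean distance is below eps (threshold is
    an assumption; tune per dataset and coordinate units)."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            (ax, ay), (bx, by) = a[i - 1], b[j - 1]
            if ((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5 < eps:
                dp[i][j] = dp[i - 1][j - 1] + 1  # points match: extend
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]

trace1 = [(0, 0), (5, 0), (10, 0), (50, 50)]
trace2 = [(1, 1), (6, 1), (11, 1), (200, 0)]
common = lcss_length(trace1, trace2)  # first three point pairs match: 3
```

Backtracking through `dp` (not shown) recovers the matched point pairs, which is what the paper partitions into consecutive common sub-tracks.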
Trace Driven Cache Attack on LBlock Algorithm
朱嘉良; 韦永壮
2015-01-01
As a new lightweight block cipher, LBlock has received much attention owing to its excellent performance on hardware and software platforms. Current security evaluations of LBlock rely heavily on traditional mathematical attacks. The Cache attack is a type of side-channel attack that has been shown to pose a practical threat to the secure implementation of cipher algorithms. Among Cache attacks, trace-driven Cache attacks have the advantages of requiring fewer samples and offering higher analysis efficiency. Based on the structure of the cipher algorithm and the properties of its key schedule, this paper proposes a trace-driven Cache attack on the LBlock algorithm. The attack recovers the secret key by capturing the side-channel information leaked during Cache accesses. Analysis shows that the attack requires a data complexity of about 106 chosen plaintexts and a time complexity of about 27.71 offline encryption operations. Compared with the proposed side-channel cube attack on LBlock and the trace-driven Cache attack on DES (which also has a Feistel structure), this attack is more effective.
Fogle, M; Daly, B; Evans, M; Justiniano, E L; Kovacs, C J; Shinpaugh, J L; Toburen, L H
2001-11-01
Although altered levels of circulating essential trace elements are known to accompany malignant disease, the lack of sensitivity of conventional detection methods has generally limited their study to clinical conditions involving extensive disease (i.e., significant tumor burden). As such, the application of altered trace element levels as potential prognostic guides or as response indicators subsequent to treatment has been of limited use. During this study, proton-induced X-ray emission spectroscopy was evaluated as a tool to determine trace element imbalances in a murine tumor model. Using plasma from C57BL/6 mice bearing the syngeneic Lewis lung carcinoma (LLCa), levels of Fe, Cu, and Zn, as well as changes in the Cu/Zn ratio, were measured in animals carrying an increasing primary tumor burden. The plasma levels of Fe, Cu, and Zn were found to decrease significantly 7 d following implantation of LLCa cells, with no significant change observed in the Cu/Zn ratio. By d 21, however, an increase in the Cu/Zn ratio was found to accompany increased growth of the LLCa tumor; the plasma levels of Cu had returned to normal, whereas both the Fe and Zn plasma levels remained lowered. Collectively, the results suggest that although a net change in individual plasma trace element concentrations might not be accurately associated with tumor growth, a clear relationship was established between the Cu/Zn ratio and tumor size.
Heidt, Alexander M; Spangenberg, Dirk-Mathys; Brügmann, Michael; Rohwer, Erich G; Feurer, Thomas
2016-11-01
We demonstrate that time-domain ptychography, a recently introduced iterative ultrafast pulse retrieval algorithm, has properties well suited for the reconstruction of complex light pulses with large time-bandwidth products from a cross-correlation frequency-resolved optical gating (XFROG) measurement. It achieves temporal resolution on the scale of a single optical cycle using long probe pulses and low sampling rates. In comparison to existing algorithms, ptychography minimizes the data to be recorded and processed, and significantly reduces the computational time of the reconstruction. Experimentally, we measure the temporal waveform of an octave-spanning, 3.5 ps long, supercontinuum pulse generated in photonic crystal fiber, resolving features as short as 5.7 fs with sub-fs resolution and 30 dB dynamic range using 100 fs probe pulses and similarly large delay steps.
Wen-An Yang
2016-01-01
The rolling element bearing is a core component of many systems such as aircraft, trains, ships, and machine tools, and its failure can lead to reduced capability, downtime, and even catastrophic breakdowns. Due to misoperation, manufacturing deficiencies, or the lack of monitoring and maintenance, it is often found to be the most unreliable component within these systems. Therefore, effective and efficient fault diagnosis of rolling element bearings has an important role in ensuring the continued safe and reliable operation of their host systems. This study presents a trace ratio criterion-based kernel discriminant analysis (TR-KDA) for fault diagnosis of rolling element bearings. The binary immune genetic algorithm (BIGA) is employed to solve the trace ratio problem in TR-KDA. Numerical results obtained using extensive simulation indicate that the proposed TR-KDA using BIGA (called TR-KDA-BIGA) can effectively and efficiently classify different classes of rolling element bearing data, while also providing a real-time visualization capability that is very useful for practitioners monitoring the health status of rolling element bearings. Empirical comparisons show that the proposed TR-KDA-BIGA performs better than existing methods in classifying different classes of rolling element bearing data. The proposed TR-KDA-BIGA may be a promising tool for fault diagnosis of rolling element bearings.
The Ray Tracing Analytical Solution within the RAMOD framework. The case of a Gaia-like observer
Crosta, Mariateresa; de Felice, Fernando; Lattanzi, Mario Gilberto
2015-01-01
This paper presents the analytical solution of the inverse ray tracing problem for photons emitted by a star and collected by an observer located in the gravitational field of the Solar System. This solution has been conceived to suit the accuracy achievable by the ESA Gaia satellite (launched on 19 December 2013), consistent with the measurement protocol in General Relativity adopted within the RAMOD framework. The aim of this study is to provide a general relativistic tool for the science exploitation of such a revolutionary mission, whose main goal is to trace star directions back from within our local curved space-time, thereby providing a three-dimensional map of our Galaxy. The results are useful for a thorough comparison and cross-checking validation of what already exists in the field of relativistic astrometry. Moreover, the analytical solutions presented here can be extended to model other measurements that require the same order of accuracy expected for Gaia.
Relaxed Linearized Algorithms for Faster X-Ray CT Image Reconstruction.
Nien, Hung; Fessler, Jeffrey A
2016-04-01
Statistical image reconstruction (SIR) methods are studied extensively for X-ray computed tomography (CT) due to the potential of acquiring CT scans with reduced X-ray dose while maintaining image quality. However, the longer reconstruction time of SIR methods hinders their use in X-ray CT in practice. To accelerate statistical methods, many optimization techniques have been investigated. Over-relaxation is a common technique to speed up convergence of iterative algorithms. For instance, using a relaxation parameter that is close to two in alternating direction method of multipliers (ADMM) has been shown to speed up convergence significantly. This paper proposes a relaxed linearized augmented Lagrangian (AL) method that shows theoretical faster convergence rate with over-relaxation and applies the proposed relaxed linearized AL method to X-ray CT image reconstruction problems. Experimental results with both simulated and real CT scan data show that the proposed relaxed algorithm (with ordered-subsets [OS] acceleration) is about twice as fast as the existing unrelaxed fast algorithms, with negligible computation and memory overhead.
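The effect of over-relaxation is easiest to see on a toy consensus problem. The sketch below runs ADMM with relaxation parameter alpha on a separable lasso instance whose solution is known in closed form; it illustrates the relaxation step only, not the paper's linearized AL method for CT:

```python
import numpy as np

def soft(v, k):
    """Soft-thresholding, the proximal operator of k*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def lasso_admm(b, lam, rho=1.0, alpha=1.8, n_iter=100):
    """ADMM for min 0.5*||x - b||^2 + lam*||z||_1 s.t. x = z, with
    over-relaxation parameter alpha (alpha = 1 is plain ADMM; values
    near 2, e.g. 1.8, typically speed up convergence)."""
    x = np.zeros_like(b)
    z = np.zeros_like(b)
    u = np.zeros_like(b)  # scaled dual variable
    for _ in range(n_iter):
        x = (b + rho * (z - u)) / (1.0 + rho)
        x_hat = alpha * x + (1.0 - alpha) * z  # over-relaxation step
        z = soft(x_hat + u, lam / rho)
        u = u + x_hat - z
    return z

b = np.array([3.0, -0.5, 1.2])
z = lasso_admm(b, lam=1.0)  # closed form: soft(b, 1) = [2, 0, 0.2]
```

Because this instance is separable, the closed-form solution soft(b, lam) makes the convergence easy to verify; in the CT setting the x-update is replaced by a linearized, ordered-subsets data-fit step.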
Kolski, Jeffrey S. [Los Alamos National Laboratory; Barlow, David B. [Los Alamos National Laboratory; Macek, Robert J. [Los Alamos National Laboratory; McCrady, Rodney C. [Los Alamos National Laboratory
2011-01-01
Particle ray tracing through simulated 3D magnetic fields was executed to investigate the effective quadrupole strength of the edge focusing of the rectangular bending magnets in the Los Alamos Proton Storage Ring (PSR). The particle rays receive a kick in the edge field of the rectangular dipole. A focal length may be calculated from the particle tracking and related to the fringe field integral (FINT) model parameter. This tech note introduces the baseline lattice model of the PSR and motivates the need for an improvement in the baseline model's vertical tune prediction, which differs from measurement by 0.05. An improved model of the PSR is created by modifying the fringe field integral parameter to those suggested by the ray tracing investigation. This improved model is then verified against measurement at the nominal PSR operating set point and at set points far from the nominal operating conditions. Lastly, Linear Optics from Closed Orbits (LOCO) is employed in an orbit-response-matrix method for model improvement to verify the quadrupole strengths of the improved model.
Algorithms and interface for ocean acoustic ray-tracing (Developed in MATLAB)
Murty, T.V.R.; Rao, M.M.M.; Prakash, S.S.; Chandramouli, P.; Murthy, K.S.R.
elements. We also ignore viscosity since fluid motions are small. Now the equations governing this flow field are the continuity equation (Ramana Murty & Mahadevan, 1994)

∂ρ/∂t + ∇·(ρV) = 0    (1)

and the momentum equation

ρ dV/dt = −∇P + ρg k̂    (2)

which give the linearized form of the governing equations as

∂ρ₁/∂t + ρ₀ ∇·V₁ = 0    (3)

and

ρ₀ ∂V₁/∂t = −∇P₁    (4)

Eliminating V₁ between these equations, we get

∂²ρ₁/∂t² = ∇²P₁    (5)

The frequency of acoustic oscillations is quite high...
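A minimal numerical companion to such an ocean acoustic ray tracer integrates the stratified-medium ray equations and checks conservation of the Snell invariant cos θ / c. The linear sound-speed profile below is an assumed toy profile, and this sketch is in Python rather than the toolbox's MATLAB:

```python
import math

def trace_ray(theta0_deg, z0, c, dcdz, ds=1.0, n_steps=5000):
    """Trace an acoustic ray through a stratified ocean c(z) by integrating
    dx/ds = cos(theta), dz/ds = sin(theta),
    dtheta/ds = -(cos(theta)/c) * dc/dz
    with midpoint (RK2) steps; theta is measured from the horizontal."""
    x, z = 0.0, z0
    th = math.radians(theta0_deg)
    path = [(x, z)]
    for _ in range(n_steps):
        # midpoint estimates of depth and grazing angle
        zm = z + 0.5 * ds * math.sin(th)
        thm = th - 0.5 * ds * math.cos(th) / c(z) * dcdz(z)
        x += ds * math.cos(thm)
        z += ds * math.sin(thm)
        th -= ds * math.cos(thm) / c(zm) * dcdz(zm)
        path.append((x, z))
    return path, th, z

# Assumed toy profile: c = 1500 m/s at the surface, gradient 0.017 s^-1.
c = lambda z: 1500.0 + 0.017 * z
dcdz = lambda z: 0.017
path, th_end, z_end = trace_ray(5.0, 100.0, c, dcdz)

# Snell's invariant cos(theta)/c should be conserved along the ray:
inv0 = math.cos(math.radians(5.0)) / c(100.0)
inv1 = math.cos(th_end) / c(z_end)
```

With an increasing c(z), the downgoing ray refracts back toward the surface; the conserved invariant is the stratified-medium form of Snell's law used throughout ocean acoustic ray theory.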
Chorus wave-normal statistics in the Earth's radiation belts from ray tracing technique
H. Breuillard
2012-08-01
Discrete ELF/VLF (Extremely Low Frequency/Very Low Frequency) chorus emissions are among the most intense electromagnetic plasma waves observed in the radiation belts and in the outer terrestrial magnetosphere. These waves play a crucial role in the dynamics of the radiation belts and are responsible for the loss and the acceleration of energetic electrons. The objective of our study is to reconstruct the realistic distribution of chorus wave-normals in the radiation belts for all magnetic latitudes. To achieve this aim, data from the electric and magnetic field measurements onboard the Cluster satellites are used to determine the wave-vector distribution of the chorus signal around the equatorial region. The propagation of such a wave packet is then modeled using a three-dimensional ray tracing technique, which employs K. Rönnmark's WHAMP to solve the hot plasma dispersion relation along the wave packet trajectory. The observed chorus wave distributions close to the wave source are first fitted to form the initial conditions, which are then propagated numerically through the inner magnetosphere in the frame of the WKB approximation. The ray tracing technique allows one to reconstruct wave packet properties (electric and magnetic fields, width of the wave packet in k-space, etc.) along the propagation path. The calculations show the spatial spreading of the signal energy due to propagation in the inhomogeneous and anisotropic magnetized plasma. Comparison of the wave-normal distributions obtained from the ray tracing technique with Cluster observations up to 40° latitude demonstrates the reliability of our approach and of the applied numerical schemes.
Djibrilla Saley, A.; Jardani, A.; Soueid Ahmed, A.; Raphael, A.; Dupont, J. P.
2016-11-01
Estimating spatial distributions of the hydraulic conductivity in heterogeneous aquifers has always been an important and challenging task in hydrology. Generally, the hydraulic conductivity field is determined from hydraulic head or pressure measurements. In the present study, we propose to use temperature data as source of information for characterizing the spatial distributions of the hydraulic conductivity field. In this way, we performed a laboratory sandbox experiment with the aim of imaging the heterogeneities of the hydraulic conductivity field from thermal monitoring. During the laboratory experiment, we injected a hot water pulse, which induces a heat plume motion into the sandbox. The induced plume was followed by a set of thermocouples placed in the sandbox. After the temperature data acquisition, we performed a hydraulic tomography using the stochastic Hybrid Monte Carlo approach, also called the Hamiltonian Monte Carlo (HMC) algorithm to invert the temperature data. This algorithm is based on a combination of the Metropolis Monte Carlo method and the Hamiltonian dynamics approach. The parameterization of the inverse problem was done with the Karhunen-Loève (KL) expansion to reduce the dimensionality of the unknown parameters. Our approach has provided successful reconstruction of the hydraulic conductivity field with low computational effort.
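The HMC kernel at the core of such an inversion alternates leapfrog integration of Hamiltonian dynamics with a Metropolis check on the total energy. A 1-D sketch on a standard normal target — a stand-in for the posterior over Karhunen-Loève coefficients, not the paper's sandbox inversion:

```python
import math, random

def hmc_sample(logp, logp_grad, x0, n_samples=3000, eps=0.2, n_leap=10, seed=1):
    """1-D Hamiltonian Monte Carlo: leapfrog integration of Hamiltonian
    dynamics, followed by a Metropolis accept/reject on H = U(x) + K(p)."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        p = rng.gauss(0.0, 1.0)                  # resample momentum
        x_new, p_new = x, p
        p_new += 0.5 * eps * logp_grad(x_new)    # initial half momentum step
        for step in range(n_leap):
            x_new += eps * p_new                 # full position step
            if step < n_leap - 1:
                p_new += eps * logp_grad(x_new)  # full momentum step
        p_new += 0.5 * eps * logp_grad(x_new)    # final half momentum step
        h_old = -logp(x) + 0.5 * p * p
        h_new = -logp(x_new) + 0.5 * p_new * p_new
        if rng.random() < math.exp(min(0.0, h_old - h_new)):
            x = x_new                            # Metropolis accept
        samples.append(x)
    return samples

# Toy target: standard normal, logp(x) = -x^2/2. Step size and trajectory
# length are illustrative tuning choices.
samples = hmc_sample(lambda x: -0.5 * x * x, lambda x: -x, 0.0)
```

In the paper's setting, x would be the vector of KL-expansion coefficients and logp would combine the temperature-data misfit with the prior, so the gradient requires a forward heat-transport solve.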
Kohei Arai
2013-01-01
Monte Carlo Ray Tracing (MCRT)-based sensitivity analysis of the geophysical parameters of the atmosphere and the ocean affecting Top-of-the-Atmosphere (TOA) radiance in the visible to near-infrared wavelength region is conducted. The results confirm that the influence of the atmosphere is greater than that of the ocean: scattering and absorption by aerosol particles and molecules in the atmosphere are the major contributors, followed by water vapor and ozone, while scattering by suspended solids is the dominant contributor among the ocean parameters.
Cervera, M. A.; Harris, T. J.
2014-01-01
The Defence Science and Technology Organisation (DSTO) has initiated an experimental program, the Spatial Ionospheric Correlation Experiment, utilizing state-of-the-art DSTO-designed high frequency digital receivers. This program seeks to understand ionospheric disturbances at a range of scales. We employ a 3-D magnetoionic Hamiltonian ray tracing engine, developed by DSTO, to (1) model the various disturbance features observed on both the O and X polarization modes in our QVI data and (2) understand how they are produced. The ionospheric disturbances which produce the observed features were modeled by perturbing the ionosphere with atmospheric gravity waves.
Pujol Nadal, Ramon; Martínez Moll, Víctor
2013-10-20
Fixed-mirror solar concentrators (FMSCs) use a static reflector and a moving receiver. They are easily installable on building roofs. However, for high-concentration factors, several flat mirrors would be needed. If curved mirrors are used instead, high-concentration levels can be achieved, and such a solar concentrator is called a curved-slats fixed-mirror solar concentrator (CSFMSC), on which little information is available. Herein, a methodology is proposed to characterize the CSFMSC using 3D ray-tracing tools. The CSFMSC shows better optical characteristics than the FMSC, as it needs fewer reflector segments for achieving the same concentration and optical efficiency.
Imaging Algorithms for Cosmic Ray Muon Radiography Detection of Nuclear Materials
LIU Yuanyuan; CHEN Zhiqiang; ZHAO Ziran; ZHANG Li; WANG Zhentian
2009-01-01
Cosmic ray muon radiography, which has good penetration ability and is sensitive to high-Z materials, is an effective method to detect shielded nuclear materials. This paper summarizes methods developed to process muon radiography at Tsinghua University. The methods include detector data correction, acceleration of the reconstruction algorithms (maximum likelihood scattering, MLS, and maximum likelihood scattering and displacement, MLSD), and the modification of the normalized mean absolute distance measure (NMADM) into a picture comparison binarization method (PCBM), which is more suitable for cosmic ray muon radiographs. Simulations demonstrate that all these methods give excellent results, so that cosmic muon radiography can become more widely used.
A Fingerprint Ridge Tracing Algorithm Based on Minutiae
彭玲; 李敏敏
2012-01-01
Fingerprint attributes are important features of a fingerprint, and ridge tracing is a prerequisite for fingerprint attribute extraction. A ridge tracing algorithm is proposed based on fingerprint minutiae such as ending and bifurcation points. The algorithm operates on the thinned fingerprint image: tracing starts from the bifurcations or endings and searches for the next pixel along the ridge until all ridges of the fingerprint image have been traversed. Experimental results show that the algorithm achieves good ridge tracing performance on thinned fingerprint images.
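The tracing loop described in the abstract can be sketched on a toy thinned (one-pixel-wide) ridge image: start at an ending point and repeatedly step to the unvisited 8-neighbour that lies on the ridge. The bitmap, start point, and neighbour ordering below are illustrative assumptions, not the authors' implementation.

```python
# Toy thinned fingerprint: 1 = ridge pixel, 0 = background.
RIDGE = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 1, 0],
    [0, 0, 0, 1, 0],
    [0, 0, 0, 0, 0],
]

NEIGHBOURS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
              (0, 1), (1, -1), (1, 0), (1, 1)]

def trace_ridge(img, start):
    """Follow a ridge from an ending point, marking pixels as visited."""
    visited = {start}
    path = [start]
    r, c = start
    while True:
        nxt = None
        for dr, dc in NEIGHBOURS:
            rr, cc = r + dr, c + dc
            if (0 <= rr < len(img) and 0 <= cc < len(img[0])
                    and img[rr][cc] == 1 and (rr, cc) not in visited):
                nxt = (rr, cc)
                break
        if nxt is None:          # reached the other ending (or a dead end)
            return path
        visited.add(nxt)
        path.append(nxt)
        r, c = nxt

path = trace_ridge(RIDGE, (1, 1))
```

A full implementation would first detect endings and bifurcations (pixels with one or three ridge neighbours, respectively) and branch at bifurcations; the sketch handles only a simple ridge segment.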
Ray-tracing formulas for refraction and internal reflection in uniaxial crystals.
Beyerle, G; McDermid, I S
1998-12-01
Formulas for the calculation of the direction cosines of refracted and internally reflected rays in anisotropic uniaxial crystals are presented. The method is based on a transformation to a nonorthonormal coordinate system in which the normal surface associated with the extraordinary ray is of spherical shape. A numerical example for the case of refraction and internal reflection in calcite is given.
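For the isotropic limit, the standard vector form of Snell's law illustrates the kind of direction-cosine computation that the paper generalises to uniaxial crystals. The function below is a textbook sketch, not the paper's anisotropic formulas; the surface normal convention (pointing against the incident ray) is an assumption of the example.

```python
import math

def refract(d, n, n1, n2):
    """Refract unit ray direction d at a surface with unit normal n
    (pointing against the incident ray), from index n1 into index n2.
    Returns the refracted unit direction, or None on total internal reflection."""
    r = n1 / n2
    cos_i = -sum(di * ni for di, ni in zip(d, n))
    sin2_t = r * r * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        return None                      # total internal reflection
    cos_t = math.sqrt(1.0 - sin2_t)
    return tuple(r * di + (r * cos_i - cos_t) * ni for di, ni in zip(d, n))

# 45-degree incidence from air (n = 1.0) into glass (n = 1.5).
s = math.sqrt(0.5)
t = refract((s, -s, 0.0), (0.0, 1.0, 0.0), 1.0, 1.5)
sin_t = t[0]          # x-component equals sin(theta_t) for this geometry
```

In the anisotropic case treated in the paper, the extraordinary-ray computation additionally requires the coordinate transformation that makes the extraordinary normal surface spherical.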
Solis, C.; Issac O, K. [Instituto de Fisica, Departamento de Fisica Experimental, UNAM, Apartado Postal 20-364, 01000 Mexico D. F. (Mexico); Martinez, A.; Lavoisier, E.; Martinez, M. A. [Instituto de Investigaciones en Materiales, UNAM, Ciudad Universitaria, 04510 Mexico D. F. (Mexico)
2008-02-15
The growing urban and tourist activity on the Mexican Caribbean coast has resulted in an increase of chemical substances, metals in particular, discharged into the coastal waters. In order to achieve adequate management and conservation of these marine ecosystems, it is necessary to perform an inventory of the actual conditions that reflect the vulnerability and the level of damage. Sea-grasses are considered good biological indicators of heavy metal contamination in marine systems. The goal of this preliminary work is to evaluate the concentrations of trace metals such as Cr, Mn, Fe, Co, Cu, Zn, and Pb in Thalassia testudinum, a very common sea-grass in the Mexican Caribbean Sea. Samples were collected from several locations on the coasts of the Yucatan Peninsula: Holbox, Blanquizal and Punta Allen, areas virtually uninfluenced by anthropogenic activities. Trace elements in different plant parts were determined by particle induced X-ray emission (PIXE). This is a very suitable technique since it offers fast, accurate and multi-element analysis. Also, the analysis by PIXE can be performed directly on powdered leaves without laborious sample preparation. The trace metal concentrations determined in sea-grasses growing in the Caribbean generally fall in the range of the lowest values reported for sea-grasses from the Gulf of Mexico. The results indicate that the studied areas do not present contamination by heavy metals. (Author)
Zeng, Guo-Qiang; Luo, Yao-Yao; Ge, Liang-Quan; Zhang, Qing-Xian; Gu, Yi; Cheng, Feng
2014-02-01
In energy-dispersive X-ray fluorescence spectrum analysis, scintillation detectors such as NaI(Tl) typically have a low energy resolution of around 8%. This low resolution causes problems in spectral data analysis, especially under high-background, low-count conditions: the ability to strip overlapped spectra is very limited, and the more the peaks overlap, the harder they are to separate, so qualitative and quantitative analysis cannot be carried out because the peak positions and peak areas cannot be recognized. Based on the genetic algorithm and the immune algorithm, we build a new hybrid population-based algorithm that uses the Euclidean distance as the measure of evolution and the maximum relative error as the iteration criterion, and apply it to overlapped spectrum analysis. Gaussian functions are used to simulate spectra with different degrees of overlap, and the algorithm is applied to overlapped peak separation and full-spectrum fitting. The peak position deviation is within ±3 channels and the peak area deviation is no more than 5%, which demonstrates that this method is effective for energy-dispersive X-ray fluorescence overlapped spectrum analysis.
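As a generic illustration of the evolutionary approach (not the authors' hybrid genetic-immune algorithm), the sketch below recovers the position and area of a single Gaussian peak from a synthetic spectrum with a minimal elitist genetic algorithm. The population size, mutation scales, and fitness definition are assumptions for the example.

```python
import math
import random

rng = random.Random(0)

def gaussian(x, centre, area, sigma=3.0):
    """Gaussian peak with a given centre and total area (counts)."""
    return area / (sigma * math.sqrt(2 * math.pi)) * math.exp(-0.5 * ((x - centre) / sigma) ** 2)

# Synthetic spectrum: one peak at channel 50 with area 1000.
channels = list(range(100))
spectrum = [gaussian(x, 50.0, 1000.0) for x in channels]

def fitness(ind):
    """Negative sum of squared residuals: higher is better."""
    centre, area = ind
    return -sum((gaussian(x, centre, area) - y) ** 2 for x, y in zip(channels, spectrum))

# Minimal elitist GA: keep the 10 best, refill by mutating random elites.
pop = [(rng.uniform(30, 70), rng.uniform(500, 1500)) for _ in range(40)]
for _ in range(60):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]
    pop = elite + [(c + rng.gauss(0, 1.0), a + rng.gauss(0, 20.0))
                   for c, a in (rng.choice(elite) for _ in range(30))]
best_centre, best_area = max(pop, key=fitness)
```

Separating several overlapped peaks works the same way with a longer parameter vector (one centre/area pair per peak) and the same residual-based fitness.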
An FBP image reconstruction algorithm for x-ray differential phase contrast CT
Qi, Zhihua; Chen, Guang-Hong
2008-03-01
Most recently, a novel data acquisition method has been proposed and experimentally implemented for x-ray differential phase contrast computed tomography (DPC-CT), in which a conventional x-ray tube and a Talbot-Lau type interferometer were utilized in data acquisition. The divergent nature of the data acquisition system requires a divergent-beam image reconstruction algorithm for DPC-CT. This paper focuses on addressing this image reconstruction issue. We developed a filtered backprojection algorithm to directly reconstruct the DPC-CT images from acquired projection data. The developed algorithm allows one to directly reconstruct the decrement of the real part of the refractive index from the measured data. In order to accurately reconstruct an image, the data need to be acquired over an angular range of at least 180° plus the fan angle. In contrast to parallel-beam data acquisition and reconstruction methods, a 180° rotation of the data acquisition system does not provide sufficient data for an accurate reconstruction of the entire field of view. Numerical simulations have been conducted to validate the image reconstruction algorithm.
Preliminary analysis using multi-atlas labeling algorithms for tracing longitudinal change.
Kim, Regina E Y; Lourens, Spencer; Long, Jeffrey D; Paulsen, Jane S; Johnson, Hans J
2015-01-01
Multicenter longitudinal neuroimaging has great potential to provide efficient and consistent biomarkers for research of neurodegenerative diseases and aging. In rare disease studies it is of primary importance to have a reliable tool that performs consistently for data from many different collection sites to increase study power. A multi-atlas labeling algorithm is a powerful brain image segmentation approach that is becoming increasingly popular in image processing. The present study examined the performance of multi-atlas labeling tools for subcortical identification using two types of in-vivo image database: Traveling Human Phantom (THP) and PREDICT-HD. We compared the accuracy (Dice Similarity Coefficient, DSC, and intraclass correlation, ICC), multicenter reliability (Coefficient of Variance, CV), and longitudinal reliability (volume trajectory smoothness and Akaike Information Criterion, AIC) of three automated segmentation approaches: two multi-atlas labeling tools, MABMIS and MALF, and a machine-learning-based tool, BRAINSCut. In general, MALF showed the best performance (higher DSC and ICC; lower CV and AIC; smoother trajectory) with a couple of exceptions. First, the results for the accumbens, where BRAINSCut showed higher reliability, were still premature to discuss at their reliability levels since their validity is still in doubt (DSC < 0.7, ICC < 0.7). For the caudate, BRAINSCut presented slightly better accuracy while MALF showed a significantly smoother longitudinal trajectory. We discuss advantages and limitations of these performance variations and conclude that improved segmentation quality can be achieved using multi-atlas labeling methods. While multi-atlas labeling methods are likely to help improve overall segmentation quality, caution has to be taken when one chooses an approach, as our results suggest that segmentation outcome can vary depending on research interest.
卢江波; 方志
2016-01-01
The backward tracing method of the shortest-path ray tracing algorithm with dynamic networks can solve the instability problem in the backward tracing procedure of the linear traveltime interpolation (LTI) algorithm, but its computational efficiency is low. This study presents an improved backward tracing method. By exploiting the location information of the secondary sources of the nodes and the laws of wave propagation, a large amount of redundant computation in the backward tracing step of the dynamic-network tracing algorithm is eliminated. Numerical examples show that the improved method has much higher computational efficiency: it is several to tens of times faster than the original backward tracing method of the dynamic-network algorithm. When the improved backward tracing method is incorporated into the improved shortest-path ray tracing algorithm with dynamic networks, the overall computational efficiency of that algorithm roughly doubles.
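The forward/backward structure of shortest-path ray tracing can be illustrated with a plain Dijkstra pass on a small graph of nodes weighted by travel time, followed by backward tracing of the ray through the predecessor table. The toy graph below is an assumption for the example, not the paper's dynamic-network scheme.

```python
import heapq

def shortest_path_trace(graph, source, receiver):
    """Forward pass: Dijkstra travel times. Backward pass: walk predecessors."""
    times = {source: 0.0}
    pred = {}
    heap = [(0.0, source)]
    while heap:
        t, u = heapq.heappop(heap)
        if t > times.get(u, float("inf")):
            continue                      # stale heap entry
        for v, w in graph[u]:
            if t + w < times.get(v, float("inf")):
                times[v] = t + w
                pred[v] = u
                heapq.heappush(heap, (t + w, v))
    # Backward tracing: follow predecessors from the receiver to the source.
    ray = [receiver]
    while ray[-1] != source:
        ray.append(pred[ray[-1]])
    return times[receiver], ray[::-1]

# Toy travel-time graph (node: [(neighbour, travel time), ...]).
graph = {
    "S": [("A", 1.0), ("B", 4.0)],
    "A": [("B", 2.0), ("R", 5.0)],
    "B": [("R", 1.0)],
    "R": [],
}
t, ray = shortest_path_trace(graph, "S", "R")
```

Storing the predecessor during the forward sweep is what makes the backward pass cheap; the paper's improvement further prunes which secondary sources need to be examined during that backward walk.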
Comparison of a 3-D GPU-Assisted Maxwell Code and Ray Tracing for Reflectometry on ITER
Gady, Sarah; Kubota, Shigeyuki; Johnson, Irena
2015-11-01
Electromagnetic wave propagation and scattering in magnetized plasmas are important diagnostics for high temperature plasmas. 1-D and 2-D full-wave codes are standard tools for measurements of the electron density profile and fluctuations; however, ray tracing results have shown that beam propagation in tokamak plasmas is inherently a 3-D problem. The GPU-Assisted Maxwell Code utilizes the FDTD (Finite-Difference Time-Domain) method for solving the Maxwell equations with the cold plasma approximation in a 3-D geometry. Parallel processing with GPGPU (General-Purpose computing on Graphics Processing Units) is used to accelerate the computation. Previously, we reported on initial comparisons of the code results to 1-D numerical and analytical solutions, where the size of the computational grid was limited by the on-board memory of the GPU. In the current study, this limitation is overcome by using domain decomposition and an additional GPU. As a practical application, this code is used to study the current design of the ITER Low Field Side Reflectometer (LSFR) for the Equatorial Port Plug 11 (EPP11). A detailed examination of Gaussian beam propagation in the ITER edge plasma will be presented, as well as comparisons with ray tracing. This work was made possible by funding from the Department of Energy for the Summer Undergraduate Laboratory Internship (SULI) program. This work is supported by the US DOE Contract No.DE-AC02-09CH11466 and DE-FG02-99-ER54527.
Preliminary Analysis Using Multi-atlas Labeling Algorithms for Tracing Longitudinal Change
Eun Young Kim
2015-07-01
Multicenter longitudinal neuroimaging has great potential to provide efficient and consistent biomarkers for research of neurodegenerative diseases and aging. In rare disease studies it is of primary importance to have a reliable tool that performs consistently for data from many different collection sites to increase study power. A multi-atlas labeling algorithm is a powerful brain image segmentation approach that is becoming increasingly popular in image processing. The present study examined the performance of multi-atlas labeling tools for subcortical identification using two types of in-vivo image database: Traveling Human Phantom and PREDICT-HD. We compared the accuracy (Dice Similarity Coefficient, DSC, and intraclass correlation, ICC), multicenter reliability (Coefficient of Variance, CV), and longitudinal reliability (volume trajectory smoothness and Akaike Information Criterion, AIC) of three automated segmentation approaches: two multi-atlas labeling tools, MABMIS and MALF, and a machine-learning-based tool, BRAINSCut. In general, MALF showed the best performance (higher DSC and ICC; lower CV and AIC; smoother trajectory) with a couple of exceptions. First, the results for the accumbens, where BRAINSCut showed higher reliability, were still premature to discuss at their reliability levels since their validity is still in doubt (DSC < 0.7, ICC < 0.7). For the caudate, BRAINSCut presented slightly better accuracy while MALF showed a significantly smoother longitudinal trajectory. We discuss advantages and limitations of these performance variations and conclude that improved segmentation quality can be achieved using multi-atlas labeling methods. While multi-atlas labeling methods are likely to help improve overall segmentation quality, caution has to be taken when one chooses an approach, as our results suggest that segmentation outcome can vary depending on research interest.
Optimizing heliostat positions with local search metaheuristics using a ray tracing optical model
Reinholz, Andreas; Husenbeth, Christof; Schwarzbözl, Peter; Buck, Reiner
2017-06-01
The life cycle costs of solar tower power plants are mainly determined by the investment costs of their construction. A significant part of these investment costs is spent on the heliostat field. Therefore, an optimized placement of the heliostats that attains the maximal annual power production has a direct impact on the ratio of life cycle costs to revenue. We present a two-level local search method implemented in MATLAB utilizing the Monte Carlo ray-tracing software STRAL [1] for the evaluation of the annual power output for a specific weighted annual time scheme. The algorithm was applied to a solar tower power plant (PS10) with 624 heliostats. Compared to the former work of Buck [2], we were able to improve both the runtime of the algorithm and the quality of the output solutions significantly. Using the same environment for both algorithms, we were able to reach Buck's best solution with a speed-up factor of about 20.
Tsujimura, T., Ii; Kubo, S.; Takahashi, H.; Makino, R.; Seki, R.; Yoshimura, Y.; Igami, H.; Shimozuma, T.; Ida, K.; Suzuki, C.; Emoto, M.; Yokoyama, M.; Kobayashi, T.; Moon, C.; Nagaoka, K.; Osakabe, M.; Kobayashi, S.; Ito, S.; Mizuno, Y.; Okada, K.; Ejiri, A.; Mutoh, T.
2015-11-01
The central electron temperature has successfully reached up to 7.5 keV in large helical device (LHD) plasmas with a central high-ion temperature of 5 keV and a central electron density of 1.3× {{10}19} m-3. This result was obtained by heating with a newly-installed 154 GHz gyrotron and also the optimisation of injection geometry in electron cyclotron heating (ECH). The optimisation was carried out by using the ray-tracing code ‘LHDGauss’, which was upgraded to include the rapid post-processing three-dimensional (3D) equilibrium mapping obtained from experiments. For ray-tracing calculations, LHDGauss can automatically read the relevant data registered in the LHD database after a discharge, such as ECH injection settings (e.g. Gaussian beam parameters, target positions, polarisation and ECH power) and Thomson scattering diagnostic data along with the 3D equilibrium mapping data. The equilibrium map of the electron density and temperature profiles are then extrapolated into the region outside the last closed flux surface. Mode purity, or the ratio between the ordinary mode and the extraordinary mode, is obtained by calculating the 1D full-wave equation along the direction of the rays from the antenna to the absorption target point. Using the virtual magnetic flux surfaces, the effects of the modelled density profiles and the magnetic shear at the peripheral region with a given polarisation are taken into account. Power deposition profiles calculated for each Thomson scattering measurement timing are registered in the LHD database. The adjustment of the injection settings for the desired deposition profile from the feedback provided on a shot-by-shot basis resulted in an effective experimental procedure.
Sarmah, Nabin; Richards, Bryce S; Mallick, Tapas K
2011-07-01
We present a detailed design concept and optical performance evaluation of stationary dielectric asymmetric compound parabolic concentrators (DiACPCs) using ray-tracing methods. Three DiACPC designs, DiACPC-55, DiACPC-66, and DiACPC-77, of acceptance half-angles (0° and 55°), (0° and 66°), and (0° and 77°), respectively, are designed in order to optimize the concentrator for building façade photovoltaic applications in northern latitudes (>55 °N). The dielectric concentrator profiles have been realized via truncation of the complete compound parabolic concentrator profiles to achieve a geometric concentration ratio of 2.82. Ray-tracing simulation results show that all rays entering the designed concentrators within the acceptance half-angle range can be collected without escaping from the parabolic sides and aperture. The maximum optical efficiency of the designed concentrators is found to be 83%, which tends to decrease with the increase in incidence angle. The intensity is found to be distributed at the receiver (solar cell) area in an inhomogeneous pattern for a wide range of incident angles of direct solar irradiance with high-intensity peaks at certain points of the receiver. However, peaks become more intense for the irradiation incident close to the extreme acceptance angles, shifting the peaks to the edge of the receiver. Energy flux distribution at the receiver for diffuse radiation is found to be homogeneous within ±12% with an average intensity of 520 W/m².
Evaluation of a new reconstruction algorithm for x-ray phase-contrast imaging
Seifert, Maria; Hauke, Christian; Horn, Florian; Lachner, Sebastian; Ludwig, Veronika; Pelzer, Georg; Rieger, Jens; Schuster, Max; Wandner, Johannes; Wolf, Andreas; Michel, Thilo; Anton, Gisela
2016-04-01
X-ray grating-based phase-contrast imaging might open up entirely new opportunities in medical imaging. However, when transferring the interferometer technique from laboratory setups to conventional imaging systems, the necessary rigidity of the system is difficult to achieve. Vibrations or distortions of the system therefore lead to inaccuracies within the phase-stepping procedure. Given insufficient stability of the phase-step positions, artifacts have so far occurred in phase-contrast images, which lower the image quality. This is a problem with regard to the intended use of phase-contrast imaging in clinical routine, as, for example, tiny structures of the human anatomy cannot be observed. In this contribution we evaluate an algorithm proposed by Vargas et al. [1] and applied to X-ray imaging by Pelzer et al., which enables us to reconstruct a differential phase-contrast image without knowledge of the specific phase-step positions. This method was tested in comparison to the standard reconstruction by Fourier analysis. The quality of the phase-contrast images remains stable even if the phase-step positions are completely unknown and not uniformly distributed. To also obtain attenuation and dark-field images, the proposed algorithm has been combined with a further algorithm of Vargas et al. [3]. Using this algorithm, the phase-step positions can be reconstructed. With the proper phase-step positions it is possible to obtain information about the phase, the amplitude and the offset of the measured data. We evaluated this algorithm with respect to the measurement of thick objects which show high absorbency.
A Fingerprint Ridge Tracing Algorithm Based on the Thinned Fingerprint
彭玲; 李敏敏
2012-01-01
Fingerprint attributes are important features of a fingerprint, and ridge tracing is a prerequisite for extracting them. This paper proposes an algorithm that performs ridge tracing on the thinned fingerprint image. The algorithm starts from ridge endings or bifurcations and then traces along the ridge orientation, point by point, until all ridges of the thinned image have been traversed. Experimental results show that the algorithm performs well on thinned fingerprint images, laying a good foundation for fingerprint attribute extraction.
Haeruddin; Saepuloh, A.; Heriawan, M. N.; Kubo, T.
2016-09-01
Indonesia has about 40% of the world's geothermal energy resources. One area with geothermal energy potential in Indonesia is Wayang Windu, located in West Java Province. A comprehensive understanding of the geothermal system in this area is indispensable for continued development. A geothermal system is generally associated with joints or fractures, which serve as paths for geothermal fluids migrating to the surface. The fluid paths are identified by surface manifestations such as fumaroles and solfataras and by the presence of alteration minerals. Therefore, analyses of linear features related to geological structures are crucial for identifying geothermal potential. Fractures or joints in the form of geological structures are associated with linear features in satellite images. The Segment Tracing Algorithm (STA) was used as the basis for determining the linear features. In this study, we used ALOS PALSAR satellite images in ascending and descending orbit modes. The linear features obtained from the satellite images could be validated by field observations. Based on the application of STA to the ALOS PALSAR data, the general directions of the extracted linear features were WNW-ESE, NNE-SSW and NNW-SSE. These directions are consistent with the general directions of the fault systems in the field. The linear features extracted from the ALOS PALSAR data using STA were very useful for identifying the fractured zones in the geothermal field.
A fast forward algorithm for real-time geosteering of azimuthal gamma-ray logging.
Qin, Zhen; Pan, Heping; Wang, Zhonghao; Wang, Bintao; Huang, Ke; Liu, Shaohua; Li, Gang; Amara Konaté, Ahmed; Fang, Sinan
2017-05-01
Geosteering is an effective method to increase the reservoir drilling rate in horizontal wells. Based on the features of an azimuthal gamma-ray logging tool and the spatial location of strata, a fast forward calculation method for azimuthal gamma-ray logging is derived using the natural gamma-ray distribution equation in the formation. Using this method, the response characteristics of azimuthal gamma-ray logging while drilling are simulated and summarized for layered formation models with different thicknesses and positions. The results indicate that the method is computationally fast, and when the tool nears a boundary, it can be used to identify the boundary and determine the distance from the logging tool to the boundary in time. Additionally, a simple method is proposed to determine the formation parameters of the algorithm in the field from offset-well information. Therefore, the forward method can be used for geosteering in the field. A field example validates that the forward method can determine the distance from the azimuthal gamma-ray logging tool to the boundary for real-time geosteering.
Custo, Graciela [Unidad de Actividad Quimica, Comision Nacional de Energia Atomica, Av. Gral Paz 1499 (B1650KNA) San Martin, Buenos Aires (Argentina); Litter, Marta I. [Unidad de Actividad Quimica, Comision Nacional de Energia Atomica, Av. Gral Paz 1499 (B1650KNA) San Martin, Buenos Aires (Argentina); Escuela de Posgrado, Universidad de General San Martin, San Lorenzo 3391 Villa Ballester, 1653. Prov. de Buenos Aires (Argentina); Rodriguez, Diana [Universidad Nacional de Lujan, Ruta 5 y 7. Prov. de Buenos Aires (Argentina); Vazquez, Cristina [Unidad de Actividad Quimica, Comision Nacional de Energia Atomica, Av. Gral Paz 1499 (B1650KNA) San Martin, Buenos Aires (Argentina) and Laboratorio de Quimica de Sistemas Heterogeneos, Facultad de Ingenieria, Universidad de Buenos Aires, P. Colon 850 (C1063ACU), Buenos Aires (Argentina)]. E-mail: Cristina.Vazquez@cnea.gov.ar
2006-11-15
It is well known that Hg species cause high noxious effects on the health of living organisms even at very low levels (5 μg/L). Quantification of this element is an analytical challenge due to the peculiar physicochemical properties of all Hg species. The regulation of the maximal allowable Hg concentration led to search for sensitive methods for its determination. Total reflection X-ray fluorescence is a proved instrumental analytical tool for the determination of trace elements. In this work, the use of total reflection X-ray fluorescence for Hg quantification is investigated. However, experimental determination by total reflection X-ray fluorescence requires depositing a small volume of sample on the reflector and evaporation of the solvent until dryness to form a thin film. Because of volatilization of several Hg forms, a procedure to capture these volatile species in liquid samples by using complexing agents is proposed. Acetate, oxalic acid, ethylenediaminetetracetic acid and ammonium pyrrolidine-dithiocarbamate were assayed for trapping the analytes into the solution during the preparation of the sample and onto the reflector during total reflection X-ray fluorescence measurements. The proposed method was applied to evaluate Hg concentration during TiO2-heterogeneous photocatalysis, one of the most known advanced oxidation technologies. Advanced oxidation technologies are processes for the treatment of effluents in waters and air that involve the generation of very active oxidative and reductive species. In heterogeneous photocatalysis, Hg is transformed to several species under ultraviolet illumination in the presence of titanium dioxide. Total reflection X-ray fluorescence was demonstrated to be applicable in following the extent of the heterogeneous photocatalysis reaction by determining non-transformed Hg in the remaining solution.
Custo, Graciela; Litter, Marta I.; Rodríguez, Diana; Vázquez, Cristina
2006-11-01
It is well known that Hg species cause high noxious effects on the health of living organisms even at very low levels (5 μg/L). Quantification of this element is an analytical challenge due to the peculiar physicochemical properties of all Hg species. The regulation of the maximal allowable Hg concentration led to search for sensitive methods for its determination. Total reflection X-ray fluorescence is a proved instrumental analytical tool for the determination of trace elements. In this work, the use of total reflection X-ray fluorescence for Hg quantification is investigated. However, experimental determination by total reflection X-ray fluorescence requires depositing a small volume of sample on the reflector and evaporation of the solvent until dryness to form a thin film. Because of volatilization of several Hg forms, a procedure to capture these volatile species in liquid samples by using complexing agents is proposed. Acetate, oxalic acid, ethylenediaminetetracetic acid and ammonium pyrrolidine-dithiocarbamate were assayed for trapping the analytes into the solution during the preparation of the sample and onto the reflector during total reflection X-ray fluorescence measurements. The proposed method was applied to evaluate Hg concentration during TiO2-heterogeneous photocatalysis, one of the most known advanced oxidation technologies. Advanced oxidation technologies are processes for the treatment of effluents in waters and air that involve the generation of very active oxidative and reductive species. In heterogeneous photocatalysis, Hg is transformed to several species under ultraviolet illumination in the presence of titanium dioxide. Total reflection X-ray fluorescence was demonstrated to be applicable in following the extent of the heterogeneous photocatalysis reaction by determining non-transformed Hg in the remaining solution.
Fundamental parameter based quantification algorithm for confocal nano-X-ray fluorescence analysis
Schoonjans, Tom, E-mail: Tom.Schoonjans@UGent.be [X-ray Microspectroscopy and Imaging Research Group (XMI), Department of Analytical Chemistry, Ghent University, Krijgslaan 281 S12, B-9000 Ghent (Belgium); Silversmit, Geert; Vekemans, Bart [X-ray Microspectroscopy and Imaging Research Group (XMI), Department of Analytical Chemistry, Ghent University, Krijgslaan 281 S12, B-9000 Ghent (Belgium); Schmitz, Sylvia [Geosciences Institute/Mineralogy, Goethe University Frankfurt, Altenhoeferallee 1, D-60438 Frankfurt (Germany); Burghammer, Manfred; Riekel, Christian [ESRF, 6 rue Jules Horowitz, BP220, F-38043 Grenoble Cedex (France); Brenker, Frank E. [Geosciences Institute/Mineralogy, Goethe University Frankfurt, Altenhoeferallee 1, D-60438 Frankfurt (Germany); Vincze, Laszlo, E-mail: Laszlo.Vincze@UGent.be [X-ray Microspectroscopy and Imaging Research Group (XMI), Department of Analytical Chemistry, Ghent University, Krijgslaan 281 S12, B-9000 Ghent (Belgium)
2012-01-15
A new method for the quantification of X-ray fluorescence (XRF) was derived based on the fundamental parameter method (FPM). The FPM equations were adapted to accommodate the special case of confocal nano-XRF, i.e. X-ray nano-beam excitation coupled with confocal detection, taking into account the special characteristics of the detector channel polycapillary. A thorough error estimation algorithm based on the Monte Carlo method was applied, producing a detailed analysis of the uncertainties of the quantification results. The new FPM algorithm was applied on confocal nano-XRF data obtained from cometary dust returned by NASA's Stardust mission, recorded at beamline ID13 of the European Synchrotron Radiation Facility. Highlights: • A new method for the quantification of confocal XRF is presented. • The quantification is based on the fundamental parameter method (FPM). • The new FPM algorithm was applied for the analysis of unique cometary dust particles. • The cometary particles were returned by NASA's Stardust mission in 2006. • Error estimation is based on the Monte Carlo method.
Nosikov, I. A.; Klimenko, M. V.; Bessarab, P. F.; Zhbankov, G. A.
2017-07-01
Point-to-point ray tracing is an important problem in many fields of science. While direct variational methods, in which some trajectory is transformed into an optimal one, are routinely used in calculations of pathways of seismic waves, chemical reactions, diffusion processes, etc., this approach is not widely known in ionospheric point-to-point ray tracing. We apply the Nudged Elastic Band (NEB) method to a radio wave propagation problem. In the NEB method, a chain of points that gives a discrete representation of the radio wave ray is adjusted iteratively to an optimal configuration satisfying Fermat's principle, while the endpoints of the trajectory are kept fixed according to the boundary conditions. Transverse displacements define the radio ray trajectory, while springs between the points control their distribution along the ray. The method is applied to a study of point-to-point ionospheric ray tracing, where the propagation medium is obtained with the International Reference Ionosphere model taking into account traveling ionospheric disturbances. A two-dimensional representation of the optical path functional is developed and used to gain insight into the fundamental difference between high and low rays. We conclude that high and low rays are minima and saddle points of the optical path functional, respectively.
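A minimal sketch of the NEB idea for ray tracing, under strong simplifying assumptions (uniform refractive index, plain numerical gradient, fixed-step descent; a real ionospheric model and optimizer would be far more elaborate): interior points of the chain move along the transverse component of the optical-path gradient, springs keep them evenly spaced, and the endpoints stay fixed. In a uniform medium the relaxed chain should approach the straight line predicted by Fermat's principle.

```python
import numpy as np

def refractive_index(p):
    # Uniform medium for this sketch; an ionospheric model would go here.
    return 1.0

def optical_path(chain):
    # Discretized Fermat functional: sum of n(midpoint) * segment length.
    seg = np.diff(chain, axis=0)
    mids = 0.5 * (chain[:-1] + chain[1:])
    n = np.array([refractive_index(m) for m in mids])
    return float(np.sum(n * np.linalg.norm(seg, axis=1)))

def neb_relax(chain, k_spring=1.0, step=0.02, iters=1500, eps=1e-6):
    chain = chain.astype(float).copy()
    for _ in range(iters):
        # Numerical gradient of the optical path at the interior points.
        grad = np.zeros_like(chain)
        for i in range(1, len(chain) - 1):
            for d in range(chain.shape[1]):
                dp = chain.copy(); dp[i, d] += eps
                dm = chain.copy(); dm[i, d] -= eps
                grad[i, d] = (optical_path(dp) - optical_path(dm)) / (2 * eps)
        for i in range(1, len(chain) - 1):
            tau = chain[i + 1] - chain[i - 1]
            tau /= np.linalg.norm(tau)
            # Only the transverse component of the true force moves the ray;
            # a spring force along the tangent keeps the points distributed.
            f_perp = -grad[i] + np.dot(grad[i], tau) * tau
            f_spring = k_spring * (np.linalg.norm(chain[i + 1] - chain[i])
                                   - np.linalg.norm(chain[i] - chain[i - 1]))
            chain[i] += step * (f_perp + f_spring * tau)
    return chain

# Uniform medium: a bent initial chain should relax to a straight ray.
x0 = np.linspace(0.0, 1.0, 7)
init = np.stack([x0, 0.2 * np.sin(np.pi * x0)], axis=1)
relaxed = neb_relax(init)
```

With a spatially varying `refractive_index`, the same loop bends the chain toward refracted ray paths; saddle points (the "high rays" of the abstract) require the climbing-image variant rather than plain descent.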
Improved Seismic Model Initial Value Ray Tracing Method
He Zhong-Yin; Gao Yang
2011-01-01
The initial value ray tracing method is one of the major modern ray tracing methods; it overcomes the poor computational efficiency of two-point ray tracing. Starting from the eikonal equation, the initial value method is improved by replacing the velocity parameters of the model with squared slowness, so that the eikonal equation admits analytic solutions. From this, computational expressions are derived for the reflected and transmitted slowness vectors where a ray meets an interface, along with functional expressions for the reflection and transmission coefficients. Ray tracing through a simple two-layer syncline model and a complex multi-layer salt-dome model shows that, compared with the Runge-Kutta discrete numerical solution, the improved initial value method not only raises tracing efficiency substantially (by roughly a factor of 10) but also extends the range of applicability of the ray method.
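The slowness-vector bookkeeping at an interface that the abstract refers to can be illustrated as follows. This is the generic Snell construction (tangential slowness is continuous across the interface; the normal component is rebuilt from the new medium's slowness), not the paper's specific squared-slowness formulation.

```python
import numpy as np

def transmitted_slowness(p_inc, normal, v2):
    """Snell's law in slowness-vector form at a velocity interface.

    The tangential slowness component is preserved; the normal component
    is rebuilt from the new medium's slowness 1/v2. Returns None for
    post-critical incidence (no transmitted ray)."""
    n = normal / np.linalg.norm(normal)
    p_tan = p_inc - np.dot(p_inc, n) * n
    disc = (1.0 / v2) ** 2 - np.dot(p_tan, p_tan)
    if disc < 0:
        return None  # total internal reflection
    sign = np.sign(np.dot(p_inc, n))  # keep propagating across the interface
    return p_tan + sign * np.sqrt(disc) * n

def reflected_slowness(p_inc, normal):
    # Reflection flips the normal slowness component; |p| is unchanged.
    n = normal / np.linalg.norm(normal)
    return p_inc - 2.0 * np.dot(p_inc, n) * n

# Example: 30-degree incidence from a v=2 medium into a v=3 medium (normal = z).
p_inc = np.array([0.25, 0.0, 0.25 * np.sqrt(3.0)])   # |p| = 1/2 = 1/v1
p_t = transmitted_slowness(p_inc, np.array([0.0, 0.0, 1.0]), 3.0)
p_r = reflected_slowness(p_inc, np.array([0.0, 0.0, 1.0]))
```

The magnitude of `p_t` equals 1/v2 and its tangential part equals that of `p_inc`, which is exactly Snell's law written for slowness vectors.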
A Three-Dimensional Ray-Tracing Study of R-X Mode Waves during High Geomagnetic Activity
XIAO Fu-Liang; CHEN Lun-Jin; ZHENG Hui-Nan; WANG Shui; GUO Jun
2008-01-01
We further present a three-dimensional (3D) ray-tracing study of the propagation characteristics of superluminous R-X mode waves during high geomagnetic activity, following our recent two-dimensional results [J. Geophys. Res. 112 (2007) A10214]. We perform numerical calculations for this mode originating at a specific altitude r = 2.0 RE in the source cavity along a 70° nightside geomagnetic field line. We demonstrate that the ray path of the R-X mode is essentially governed by the azimuthal angle of the wave vector k. Ray paths starting with an azimuthal angle of 180° (in the meridian plane) can reach the lowest latitude, but stay at relatively higher latitudes for azimuthal angles other than 180° (off the meridian plane). The results further support the previous finding that the R-X mode may be physically present in the radiation belts under appropriate conditions.
Xiao, Yi; Tholen, Danny; Zhu, Xin-Guang
2016-11-01
Leaf photosynthesis is determined by biochemical properties and anatomical features. Here we developed a three-dimensional leaf model that can be used to evaluate the internal light environment of a leaf and its implications for whole-leaf electron transport rates (J). This model includes (i) the basic components of a leaf, such as the epidermis, palisade and spongy tissues, as well as the physical dimensions and arrangements of cell walls, vacuoles and chloroplasts; and (ii) an efficient forward ray-tracing algorithm, predicting the internal light environment for light of wavelengths between 400 and 2500 nm. We studied the influence of leaf anatomy and ambient light on internal light conditions and J. The results show that (i) different chloroplasts can experience drastically different light conditions, even when they are located at the same distance from the leaf surface; (ii) bundle sheath extensions, which are strips of parenchyma, collenchyma or sclerenchyma cells connecting the vascular bundles with the epidermis, can influence the photosynthetic light-use efficiency of leaves; and (iii) chloroplast positioning can also influence the light-use efficiency of leaves. Mechanisms underlying leaf internal light heterogeneity and implications of the heterogeneity for photoprotection and for the convexity of the light response curves are discussed.
A comparison of iterative algorithms and a mixed approach for in-line x-ray phase retrieval.
Meng, Fanbo; Zhang, Da; Wu, Xizeng; Liu, Hong
2009-08-15
Previous studies have shown that iterative in-line x-ray phase retrieval algorithms may have higher precision than direct retrieval algorithms. This communication compares three iterative phase retrieval algorithms in terms of accuracy and efficiency using computer simulations. We found that the Fourier-transformation-based algorithm (FT) converges fastest, while the Poisson-solver-based algorithm (PS) has higher precision. The traditional Gerchberg-Saxton algorithm (GS) is very slow and sometimes does not converge in our tests. A mixed FT-PS algorithm is then presented to achieve both high efficiency and high accuracy. The mixed algorithm is tested using simulated images with different noise levels and experimentally obtained images of a piece of chicken breast muscle.
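For reference, the classic Gerchberg-Saxton iteration mentioned above alternates between the object and Fourier planes, enforcing the measured magnitude in each plane while keeping the current phase. Below is a minimal sketch on a synthetic test object; it illustrates the generic GS scheme, not the paper's in-line phase-contrast geometry.

```python
import numpy as np

def gerchberg_saxton(mag_obj, mag_four, iters=200, seed=0):
    """Classic GS iteration: alternately enforce the measured magnitudes
    in the object plane and in the Fourier plane, keeping the phase."""
    rng = np.random.default_rng(seed)
    field = mag_obj * np.exp(1j * rng.uniform(0, 2 * np.pi, mag_obj.shape))
    for _ in range(iters):
        F = np.fft.fft2(field)
        F = mag_four * np.exp(1j * np.angle(F))         # Fourier-plane constraint
        field = np.fft.ifft2(F)
        field = mag_obj * np.exp(1j * np.angle(field))  # object-plane constraint
    return np.angle(field)

# Synthetic test: Gaussian amplitude with a linear phase ramp.
y, x = np.mgrid[:16, :16]
mag_obj = np.exp(-((x - 8.0) ** 2 + (y - 8.0) ** 2) / 16.0)
mag_four = np.abs(np.fft.fft2(mag_obj * np.exp(1j * 0.3 * x)))
phase = gerchberg_saxton(mag_obj, mag_four)
```

The GS error is non-increasing by construction, which explains the slow but steady convergence the abstract reports; the FT and PS algorithms replace this alternating projection with direct linearized solves.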
Beyerlein, Kenneth R.; White, Thomas A.; Yefanov, Oleksandr
2017-01-01
A novel algorithm for indexing multiple crystals in snapshot X-ray diffraction images, especially suited for serial crystallography data, is presented. The algorithm, FELIX, utilizes a generalized parametrization of the Rodrigues-Frank space, in which all crystal systems can be represented without...
Aranda, Pedro R.; Moyano, Susana; Martinez, Luis D.; De Vito, Irma E. [Universidad Nacional de San Luis, Facultad de Quimica, Bioquimica y Farmacia, Area de Quimica Analitica, Instituto de Quimica de San Luis (INQUISAL - CONICET), San Luis (Argentina)
2010-09-15
A new, simple, and selective method is reported for the preconcentration and determination of Cr(VI) in aqueous samples. After adsorption in "batch mode" on Aliquat 336-AC, determinations were made directly on the solid by X-ray fluorescence spectrometry, which has the advantage of not requiring elution of the retained chromium. The enrichment factor was calculated considering that the tablets obtained from 10 mL of Cr(VI) solution (1000 μg L⁻¹) had a final thickness of 0.64 mm and a diameter of 16.7 mm; the volume deposited on the pellet was 0.14 cm³. The preconcentration factor obtained was 71-fold, which is highly satisfactory for chromium trace analysis by XRF. Finally, the method was successfully applied to the determination of Cr(VI) in drinking water samples. (orig.)
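The 71-fold figure follows directly from the quoted geometry: 10 mL of solution is concentrated into a cylindrical pellet of volume π(d/2)²·t ≈ 0.14 cm³. A quick check:

```python
import math

# Values quoted in the abstract above.
solution_volume_cm3 = 10.0    # 10 mL of Cr(VI) solution
thickness_cm = 0.064          # pellet thickness, 0.64 mm
diameter_cm = 1.67            # pellet diameter, 16.7 mm

pellet_volume_cm3 = math.pi * (diameter_cm / 2.0) ** 2 * thickness_cm
factor = solution_volume_cm3 / pellet_volume_cm3
print(f"pellet volume ~ {pellet_volume_cm3:.2f} cm^3, preconcentration ~ {factor:.0f}-fold")
```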
A model of the AGS based on stepwise ray-tracing through the measured field maps of the main magnets
Dutheil Y.; Meot, F.; Tsoupas, N.
2012-05-20
Two-dimensional mid-plane magnetic field maps of two of the main AGS magnets were produced, from Hall probe measurements, for a series of different current settings. The analysis of these data yielded the excitation functions [1] and the harmonic coefficients [2] of the main magnets, which have been used so far in all models of the AGS. The constant increase in computation power now makes it possible to use stepwise ray-tracing directly through these measured field maps within a reasonable computation time. We describe in detail how these field maps have allowed the generation of models of the 6 different types of AGS main magnets, and how they are handled with the Zgoubi ray-tracing code [3]. We give and discuss a number of results regarding both beam and spin dynamics in the AGS, and we provide comparisons with other numerical and analytical modelling methods.
Christophe Lièbe
2010-01-01
This paper presents new software for the design of through-the-wall imaging radars. The first part describes the evolution of a ray tracing simulator, originally designed for propagation of narrowband signals and later extended to ultra-wideband signals. This simulator makes it possible to obtain the temporal channel response to a wideband emitter (3 GHz to 10 GHz). An experimental method is also described to identify the propagation paths. Simulation results are compared to propagation experiments under the same conditions. Different configurations are tested and then discussed. Finally, a configuration of through-the-wall imaging radar is proposed, with different antenna patterns and different targets. The simulated images are helpful for understanding the experimentally obtained images.
He, Wenjun; Fu, Yuegang; Liu, Zhiying; Zhang, Lei; Wang, Jiake; Zheng, Yang; Li, Yahong
2017-03-01
The polarization aberrations of a complex optical system with a multi-element lens have been investigated using a 3D polarization aberration function. The 3D polarization ray-tracing matrix has been combined with the optical path difference to obtain a 3D polarization aberration function, which avoids the need for a complicated phase unwrapping process. The polarization aberrations of a microscope objective have been analyzed, including the distributions of the 3D polarization aberration functions, diattenuation aberration, retardance aberration, and polarization-dependent intensity on the exit pupil. Further, the effects of the field of view and of the coating on the distribution rules of the 3D polarization aberration functions are discussed in detail. Finally, an appropriate field-of-view and wavelength correction of the polarization aberration function is proposed, which optimizes the image quality of a multi-element optical system.
Nugent, Allen H; Bertram, Christopher D
2010-02-01
Prediction of the effects of refractive index (RI) mismatch on laser Doppler anemometer (LDA) measurements within a curvilinear cavity (an artificial ventricle) was achieved by developing a general technique for modelling the paths of the convergent beams of the LDA system using 3D vector geometry. Validated by ray tracing through CAD drawings, the predicted maximum tolerance in RI between the solid model and the working fluid was +/- 0.0005, equivalent to focusing errors commensurate with the geometric and alignment uncertainties associated with the flow model and the LDA arrangement. This technique supports predictions of the effects of refraction within a complex geometry. Where the RI mismatch is unavoidable but known, it is possible not only to calculate the true position of the measuring volume (using the probe location and model geometry), but also to estimate degradation in signal quality arising from differential displacement and refraction of the laser beams.
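The 3D vector-geometry beam modelling described above rests on the vector form of Snell's law; a standard sketch (not the authors' specific code) is:

```python
import numpy as np

def refract(d, n, n1, n2):
    """3D vector form of Snell's law. d: unit incident direction;
    n: unit surface normal pointing back toward the incident medium.
    Returns the unit refracted direction, or None for total internal
    reflection."""
    mu = n1 / n2
    cos_i = -np.dot(d, n)
    sin2_t = mu * mu * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        return None  # total internal reflection
    cos_t = np.sqrt(1.0 - sin2_t)
    return mu * d + (mu * cos_i - cos_t) * n

# 45-degree incidence from air (n=1.0) into a medium with n=1.5.
d_in = np.array([np.sin(np.pi / 4), 0.0, -np.cos(np.pi / 4)])
t = refract(d_in, np.array([0.0, 0.0, 1.0]), 1.0, 1.5)
```

Applying `refract` at each interface along each of the LDA beams is enough to locate the displaced crossing point of the beams, which is how an RI-mismatch tolerance like the quoted ±0.0005 can be turned into a focusing-error estimate.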
Chevallier, P.; Wang, J.; Jehanno, C.; Maurette, M.; Sutton, S. R.
1986-01-01
Synchrotron X-ray fluorescence spectra of unpolished iron and chondritic spheres extracted from sediments collected on the melt zone of the Greenland ice cap allow the analysis of Ni, Cu, Zn, Ga, Ge, Pb, and Se with minimum detection limits on the order of several parts per million. All detected elements are depleted relative to chondritic abundance with the exception of Pb, which shows enrichments up to a factor of 500. An apparent anticorrelation between the Ni-content and trace element concentration was observed in both types of spherules. The fractionation patterns of the iron and chondritic spheres are not complementary and consequently the two iron spheres examined in this study are unlikely to result from ejection of globules of Fe/Ni from parent chondritic micrometeoroids.
Church, Jonathan R.
New condensed matter metrologies are being used to probe ever smaller length scales. In support of the diverse field of materials research, synchrotron-based spectroscopies provide sub-micron spatial resolution and a breadth of photon wavelengths for scientific studies. For electronic materials, the thinnest layers in a complementary metal-oxide-semiconductor (CMOS) device have been reduced to just a few nanometers. This raises concerns about layer uniformity, complete surface coverage, and interfacial quality. Deposition processes like chemical vapor deposition (CVD) and atomic layer deposition (ALD) have been shown to deposit the required high-quality films at the requisite thicknesses. However, new materials beget new chemistries and, unfortunately, unwanted side-reactions and by-products. CVD/ALD tools and chemical precursors provided by our collaborators at Air Liquide utilized these new chemistries, and films were deposited for which novel spectroscopic characterization methods were used. The second portion of the thesis focuses on fading and decomposing paint pigments in iconic artworks. Efforts have been directed towards understanding the micro-environments causing degradation. Hard X-ray photoelectron spectroscopy (HAXPES) and variable kinetic energy X-ray photoelectron spectroscopy (VKE-XPS) are advanced XPS techniques capable of elucidating both chemical environments and electronic band structures in sub-surface regions of electronic materials. HAXPES has been used to study the electronic band structure in a typical CMOS structure; it will be shown that unexpected band alignments are associated with the presence of electronic charges near a buried interface. Additionally, a computational modeling algorithm, Bayes-Sim, was developed to reconstruct compositional depth profiles (CDP) using VKE-XPS data sets; a subset algorithm also reconstructs CDP from angle-resolved XPS data. Reconstructed CDP produced by Bayes-Sim were most strongly correlated to the real
Sangita Dhara; N L Misra
2011-02-01
The applicability of total reflection X-ray fluorescence (TXRF) spectrometry for trace elemental analysis of rainwater samples was studied. The study was used to develop these samples as rainwater standards by the National University of Singapore (NUS). Our laboratory was one of the participants using TXRF for this study. The rainwater sample obtained from NUS was analysed by TXRF, and the trace elements Mn, Fe, Ni, Cu, Zn, V and Pb were determined as required by the NUS. The average precision was found to be within 16%, and the TXRF-determined concentrations of these elements were below 20 μg/l. The average deviation of the TXRF-determined values from the certified values was 20% (excluding the deviations for Fe and V, which were comparatively high). Apart from the above elements, S, K, Ca, Rb, Sr, Ba and Br were also determined by TXRF and were found to be in the range of 0.2 to 191 μg/l. The TXRF-determined values of our laboratory played an important role in the certification of the concentrations of seven elements in this rainwater sample, which was later developed as a rainwater standard.
Dahl, Tais W.; Ruhl, Micha; Hammarlund, Emma U.
2013-01-01
Elevated molybdenum (Mo) contents in organic-rich sediments are indicative of deposition from an anoxic and sulfide-rich (euxinic) water column. This can be used for tracing past euxinic conditions in ancient oceans from sedimentary archives. Conventional analytical detection of elevated molybdenum … with Mo: 47 ppm and NIST-2702 with Mo: 10 ppm) for analytical control. Analytical precision (1 sigma) after 30, 120, and 300 seconds of measuring time was 4, 2, and 1 ppm, with a respective detection limit of 11, 5, 3 ppm (3 sigma, noise level). The data were accurate to within the given precision (1 sigma) after a daily calibration to samples covering a range of Mo concentrations from 0 to >30 ppm. Hand-held XRF equipment also allows Mo measurements directly on fresh rock surfaces, both in the field and under laboratory conditions. Rock samples from a Cambrian drill core closely match ICPMS
Sorting algorithms for single-particle imaging experiments at X-ray free-electron lasers.
Bobkov, S A; Teslyuk, A B; Kurta, R P; Gorobtsov, O Yu; Yefanov, O M; Ilyin, V A; Senin, R A; Vartanyants, I A
2015-11-01
Modern X-ray free-electron lasers (XFELs) operating at high repetition rates produce a tremendous amount of data. It is a great challenge to classify this information and reduce the initial data set to a manageable size for further analysis. Here an approach for classification of diffraction patterns measured in prototypical diffract-and-destroy single-particle imaging experiments at XFELs is presented. It is proposed that the data are classified on the basis of a set of parameters that take into account the underlying diffraction physics and specific relations between the real-space structure of a particle and its reciprocal-space intensity distribution. The approach is demonstrated by applying principal component analysis and support vector machine algorithms to the simulated and measured X-ray data sets.
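The classification pipeline described above (physics-motivated features or raw intensities reduced by principal component analysis, then a support vector machine) can be sketched with scikit-learn on synthetic stand-in "patterns". Everything below (the toy ring generator and its parameter values) is illustrative, not the authors' actual feature set.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(1)

def synth_pattern(size=32, ring_width=4.0):
    # Toy stand-in for a diffraction pattern: a noisy ring whose width
    # plays the role of a particle-dependent parameter.
    yy, xx = np.mgrid[:size, :size] - size / 2.0
    r = np.hypot(xx, yy)
    return (np.exp(-((r - 8.0) ** 2) / ring_width ** 2)
            + 0.1 * rng.standard_normal((size, size)))

X = np.array([synth_pattern(ring_width=w).ravel()
              for w in [3.0] * 100 + [6.0] * 100])
y = np.array([0] * 100 + [1] * 100)

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
clf = make_pipeline(PCA(n_components=5), SVC(kernel="linear"))
clf.fit(Xtr, ytr)
acc = clf.score(Xte, yte)
```

The PCA step is what makes the approach scale to XFEL data volumes: the SVM operates on a handful of components instead of the full pixel vector.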
Algorithms for three-dimensional chemical analysis via multi-energy synchrotron X-ray tomography
Ham, Kyungmin; Butler, Leslie G.
2007-08-01
The conversion of X-ray tomography images into three-dimensional chemical composition requires accurate mass absorption values, high-quality images, and a robust fitting algorithm. The least-squares fits of the images to a three-dimensional chemical composition can proceed with several different options, such as minimal vs. over-determined and/or constrained parameters. This project has investigated the impact of XAFS features and a limited CCD dynamic range. These simulated results are compared to a recent experimental project in which synchrotron X-ray tomography was used to image a polymer blend and, from those images, to calculate three-dimensional chemical composition maps of the two-component flame retardant, a brominated phthalimide dimer, Saytex™ BT-93, and a synergist, antimony(III) oxide (Sb2O3).
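The over-determined least-squares step described above amounts to solving, voxel by voxel, μ(E) = Σ_j (μ/ρ)_j(E)·ρ_j for the partial densities ρ_j from attenuation measured at several energies. A sketch with made-up attenuation coefficients (not tabulated values):

```python
import numpy as np

# Assumed mass attenuation coefficients (cm^2/g) of two components at
# three beam energies; illustrative numbers, not tabulated values.
M = np.array([[4.0, 9.0],
              [2.5, 3.0],
              [1.2, 0.8]])

rho_true = np.array([0.8, 0.3])      # partial densities, g/cm^3
rng = np.random.default_rng(2)
mu_meas = M @ rho_true + 0.005 * rng.standard_normal(3)  # noisy attenuation

# Over-determined least-squares fit: three energies, two unknowns.
rho_fit, residuals, rank, sv = np.linalg.lstsq(M, mu_meas, rcond=None)
```

Using more energies than unknowns (here 3 vs. 2) is what gives the fit robustness against XAFS features and detector dynamic-range limits; constrained variants additionally enforce ρ_j ≥ 0.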
Pingitore, N. E.; Cruz-Jimenez, G.; Price, T. D.
2001-12-01
X-ray absorption spectroscopy (XAS) affords the opportunity to probe the atomic environment of trace elements in human bone. We are using XAS to investigate the mode(s) of incorporation of Sr, Zn, Pb, and Ba in both modern and ancient (and thus possibly altered) human and animal bone. Because burial and diagenesis may add trace elements to bone, we performed XAS analysis on samples of pristine contemporary and ancient, buried human and animal bone. We assume that deposition of these elements during burial occurs by processes distinct from those in vivo, and this will be reflected in their atomic environments. Archaeologists measure strontium in human and animal bone as a guide to diet. Carnivores show lower Sr/Ca ratios than their herbivore prey due to discrimination against Sr relative to Ca up the food chain. In an initial sample suite no difference was observed between modern and buried bone. Analysis of additional buried samples, using a more sensitive detector, revealed significant differences in the distance to the second and third neighbors of the Sr in some of the buried samples. Distances to the first neighbor, oxygen, were similar in all samples. Zinc is also used in paleo-diet studies. Initial x-ray absorption spectroscopy of a limited suite of bones did not reveal any differences between modern and buried samples. This may reflect the limited number of samples examined or the low levels of Zn in typical aqueous solutions in soils. Signals from barium and lead were too low to record useful XAS spectra. Additional samples will be studied for Zn, Ba, and Pb. We conducted our XAS experiments on beam lines 4-1 and 4-3 at the Stanford Synchrotron Radiation Laboratory. Data were collected in the fluorescence mode, using a Lytle detector and appropriate filter, and a solid state, 13-element Ge-detector.
Trace element determination in amniotic fluid by total reflection X-ray fluorescence
Greaves, E.D.; Sajo-Bohus, L.; Castelli, C.; Borgerg, C. [Universidad Simon Bolivar, Caracas (Venezuela); Meitin, J.; Liendo, J.
1995-03-01
A new method is reported for the determination of Fe, Cu, Zn, and Br in amniotic fluid (AF) by Total Reflection X-Ray Fluorescence. The irradiation of AF samples with monochromatic X-Rays reduces the scattering background from the organic matrix and avoids the need for sample digestion. Sample manipulation is reduced to centrifuging and adding cobalt as internal standard. Lower detection limits obtained are 109, 53, 44 and 37 ppb for Fe, Cu, Zn and Br respectively. Measurement precision depends on element concentrations and can be as low as 1.5% SD. Results of the analysis of 34 AF samples from Venezuelan pregnant patients agree with previously reported ranges of Fe, Cu and Zn. Other elements observed but not quantified are Cl, K, Ca in all spectra and Pb and Sr in some of them. (author).
Martinez, T. [National University of Mexico, Faculty of Chemistry, Building D, CU (O4510) Mexico, D.F. Mexico (Mexico)], E-mail: tmc@servidor.unam.mx; Lartigue, J. [National University of Mexico, Faculty of Chemistry, Building D, CU (O4510) Mexico, D.F. Mexico (Mexico); Zarazua, G.; Avila-Perez, P. [National Institute of Nuclear Research. Carr. Mexico-Toluca Km 36.5, (52045) Salazar, Ocoyoacac, Edo. de Mexico (Mexico); Navarrete, M. [National University of Mexico, Faculty of Chemistry, Building D, CU (O4510) Mexico, D.F. Mexico (Mexico); Tejeda, S. [National Institute of Nuclear Research. Carr. Mexico-Toluca Km 36.5, (52045) Salazar, Ocoyoacac, Edo. de Mexico (Mexico)
2008-12-15
Many studies have identified an important number of toxic elements, along with organic carcinogenic molecules and radioactive isotopes, in tobacco. In this work we have analyzed by Total Reflection X-Ray Fluorescence 9 brands of cigarettes manufactured and distributed in the Mexican market. Two National Institute of Standards and Technology standards and a blank were treated identically at the same time. Results show the presence of toxic elements such as Pb and Ni. These results are compared with available data for some foreign brands, and their implications for health are discussed. It can be confirmed that the Total Reflection X-Ray Fluorescence method provides precise (reproducible) and accurate (true) data for the concentrations of 15 elements in tobacco samples.
Zhou Qing-Hua; Shi Jian-Kui; Xiao Fu-Liang
2011-01-01
A three-dimensional ray tracing study of a whistler-mode chorus is conducted for different geomagnetic activities by using a global core plasma density model. For the upper-band chorus, the initial azimuthal wave angle slightly affects the projection of ray trajectories onto the plane (Z, √(x² + y²)), but controls the longitudinal propagation. The trajectory of the upper-band chorus is strongly associated with the plasmapause and the magnetic local time (MLT) of the chorus source region. For high geomagnetic activity, the chorus trajectory moves inward together with the plasmapause. In the bulge region, the plasmapause extends outward, and the chorus trajectory moves outward together with it. For moderate or high geomagnetic activity, the lower-band chorus suffers lower hybrid resonance (LHR) reflection before it reaches the plasmapause, leading to a weak correlation with the geomagnetic activity and magnetic local time of the chorus source region. For low geomagnetic activity, the lower-band chorus may be reflected first at the plasmapause instead of suffering LHR reflection, exhibiting a propagation characteristic similar to that of the upper-band chorus. The results provide new insight into the propagation characteristics of the chorus for different geomagnetic activities and contribute to further understanding of the acceleration of energetic electrons by chorus waves.
Characterizing trace metal impurities in optical waveguide materials using x-ray absorption
Citrin, P.H.; Northrup, P.A.; Atkins, R.M.; Niu, L.; Marcus, M.A.; Jacobson, D.C. [Lucent Technologies, Murray Hill, NJ (United States). Bell Labs.; Glodis, P.F. [Lucent Technologies, Norcross, GA (United States). Bell Labs.
1998-12-31
X-ray absorption measurements are described for identifying metal impurities in silica preforms, the rod-like starting materials from which hair-like optical fibers are drawn. The results demonstrate the effectiveness of this approach as a non-destructive, quantitative, element-selective, position-sensitive, and chemical-state-specific means for characterizing transition metals in the concentration regime of parts per billion.
Alternative methods for ray tracing in uniaxial media. Application to negative refraction
Bellver-Cebreros, Consuelo; Rodriguez-Danta, Marcelo
2007-03-01
In previous papers [C. Bellver-Cebreros, M. Rodriguez-Danta, Eikonal equation, alternative expression of Fresnel's equation and Mohr's construction in optical anisotropic media, Opt. Commun. 189 (2001) 193; C. Bellver-Cebreros, M. Rodriguez-Danta, Internal conical refraction in biaxial media and graphical plane constructions deduced from Mohr's method, Opt. Commun. 212 (2002) 199; C. Bellver-Cebreros, M. Rodriguez-Danta, Refraccion conica externa en medios biaxicos a partir de la construccion de Mohr, Opt. Pura Apl. 36 (2003) 33], the authors developed a method based on the local properties of the dielectric permittivity tensor and on Mohr's plane graphical construction in order to study the behaviour of locally plane light waves in anisotropic media. In this paper, this alternative methodology is compared with the traditional one, emphasizing the simplicity of the former when studying ray propagation through uniaxial media (comparison is possible since, in this case, the traditional construction also becomes plane). An original and simple graphical method is proposed to determine the direction of propagation given by the wave vector from knowledge of the extraordinary ray direction (given by the Poynting vector). Some properties of light rays in these media not described in the literature are obtained. Finally, two applications are considered: a description of optical birefringence under normal incidence and a study of negative refraction in uniaxial media.
Viridi, Sparisoma
2013-01-01
The trace of a ray deviated by a prism, common in TIR (total internal reflection) measurement systems, is sometimes difficult to manage, especially if the prism is an equilateral right-angle prism (ERAP). The point where the ray is reflected inside the right-angle prism also changes as the angle of the incident ray changes. In an ATR (attenuated total reflectance) measurement system, the range of this point determines the sample size. Using JavaScript and HTML5, a model and visualization of ray tracing through an ERAP are performed and reported in this work. Data obtained from this visualization yield empirical relations between the angle of the incident ray source θ_S, the angle of the ray detector arm θ_D, and the angle of the ray detector θ'_D, as functions of the radial position of the ray source R_S, the radial position of the ray detector R_D, the height of the right-angle prism t, and the refractive index of the prism n. Keywords: deviation angle, equilateral right angle prism, total internal reflection, JavaScript, HTML5.
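The deviation of a ray through a prism, central to the visualization above, follows from two applications of Snell's law. A sketch of the generic prism formula (not the paper's ERAP-specific empirical relations):

```python
import math

def prism_deviation(theta1_deg, apex_deg, n):
    """Deviation (degrees) of a ray through a prism of apex angle A and
    index n, via two Snell refractions. Returns None when the ray is
    totally internally reflected at the exit face (the situation
    exploited in TIR/ATR prisms)."""
    t1 = math.radians(theta1_deg)
    A = math.radians(apex_deg)
    t2 = math.asin(math.sin(t1) / n)   # refraction into the glass
    t3 = A - t2                        # internal angle at the exit face
    s4 = n * math.sin(t3)
    if abs(s4) > 1.0:
        return None                    # total internal reflection
    t4 = math.asin(s4)                 # refraction out of the glass
    return math.degrees(t1 + t4) - apex_deg
```

At minimum deviation the passage is symmetric, giving the textbook relation n = sin((A + δ_min)/2) / sin(A/2); for A = 60° and n = 1.5 this predicts δ_min ≈ 37.2° at an incidence angle of about 48.6°, and small incidence angles drive the exit face into total internal reflection.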
New ray-tracing capabilities for the development of silicon pore optics
Vacanti, Giuseppe; Barrière, Nicolas; Chatbi, Abdelhakim; Collon, Maximilien; Günther, Ramses; Yanson, Alexei; Vervest, Mark; Bavdaz, Marcos; Wille, Eric
2015-09-01
The Geant4 based ray-tracer used to support the development of Silicon Pore Optics is being extended to take into account more subtle effects that affect the performance of the optics, like thermo-mechanical stresses and detailed surface metrology. Its performance has also been increased to make it possible to simulate rapidly and in detail the optics of Athena so that various possible configurations can be explored and characterized providing important feedback to the development and system teams. In this paper we report on the state of the development.
Wandzilak, Aleksandra; Czyzycki, Mateusz; Radwanska, Edyta; Adamek, Dariusz; Geraki, Kalotina; Lankosz, Marek
2015-12-01
Neoplastic and healthy brain tissues were analysed to discern the changes in the spatial distribution and overall concentration of elements using micro X-ray fluorescence spectroscopy. High-resolution distribution maps of minor and trace elements such as P, S, Cl, K, Ca, Fe, Cu and Zn made it possible to distinguish between homogeneous cancerous tissue and areas where some structures could be identified, such as blood vessels and calcifications. Concentrations of the elements in the selected homogeneous areas of brain tissue were compared between tumours with various malignancy grades and with the controls. The study showed a decrease in the average concentration of Fe, P, S and Ca in tissues with high grades of malignancy as compared to the control group, whereas the concentration of Zn in these tissues was increased. The changes in the concentration were found to be correlated with the tumour malignancy grade. The efficacy of micro X-ray fluorescence spectroscopy to distinguish between various types of cancer based on the concentrations of studied elements was confirmed by multivariate discriminant analysis. Our analysis showed that the most important elements for tissue classification are Cu, K, Fe, Ca, and Zn. This method made it possible to correctly classify histopathological types in 99.93% of the cases used to build the model and in as much as 99.16% of new cases.
Cashen, M. T.; Koch, P. M.
1997-04-01
In our fast-beam apparatus we have long used [P. Koch and K. van Leeuwen, Phys. Rep. 255, 289 (1995)] an electrostatic filter lens (FL) to selectively transmit energy-labeled signal ions (e.g., H^+ or He^+) whose energy E_B + E_L lies E_L = 40-300 eV above the energy, typically E_B = 14.6 keV, of the much more intense primary ion beam. Based on one originally used [H. Zeman, K. Jost, and S. Gilad, Rev. Sci. Instrum. 42, 485 (1971)] with hundred-eV-energy-range electrons, our 12.8 cm long FL has 21 identical, equally spaced, 0.1 cm thick mumetal disks (11.4 cm OD with 1.91 cm dia. axial hole) electrically biased via resistors so that its near-axis electrostatic field is approximately hyperbolic. We have long noted that the analysis presented in Ref. [3], which ignores focusing effects, fails to explain why our FL has a final cutoff up to five or more times sharper than 'theory'. We present ray tracing results obtained with the computer program SIMION to show that strong focusing and higher operating regions (initially parallel rays crossing the axis more than once) play a very important role in the operation of the FL near cutoff and in sharpening its cutoff. Agreement is good.
Tracing X-ray Binary Population Evolution By Galaxy Dissection: First Results from M51
Lehmer, Bret; Eufrasio, Rafael T.; Markwardt, Larissa; Zezas, Andreas; Basu-Zych, Antara; Fragos, Tassos; Hornschemeier, Ann E.; Kalogera, Vassiliki; Ptak, Andrew; Tzanavaris, Panayiotis; Yukita, Mihoko
2017-01-01
Recently, we have found, in the Chandra Deep Field-South, that the emission from X-ray binary (XRB) populations in galaxies evolves significantly with cosmic time, most likely due to changes in the physical properties of galaxies such as star-formation rate, stellar mass, stellar age, and metallicity. However, it has been challenging to show directly that these same physical properties are connected to XRB populations using data from nearby galaxies. We present a new technique for empirically calibrating how XRB populations evolve following their formation in a variety of environments. We first utilize detailed stellar population synthesis modeling of far-UV to far-IR broadband data of nearby face-on spiral galaxies to construct maps of the star-formation histories on subgalactic scales. Using Chandra data, we then identify the locations of the XRBs within these galaxies and correlate their formation frequencies with local galaxy properties. In this talk, I will show promising first results for the Whirlpool galaxy (M51), and will discuss how expanding our sample to an archival set of 20 face-on spirals will lead to a detailed empirical timeline for how XRBs form and evolve in various environments.
Trace the polymerization induced by gamma-ray irradiated silica particles
Lee, Hoik; Ryu, Jungju; Kim, Myungwoong; Im, Seung Soon; Kim, Ick Soo; Sohn, Daewon
2016-08-01
γ-ray irradiation of inorganic particles is a promising technique for the preparation of organic/inorganic composites, as it offers a number of advantages, such as additive-free polymerization conducted under mild conditions that avoids undesired damage to the organic components of the composites. Herein, we demonstrate in detail a step-wise formation mechanism of an organic/inorganic nanocomposite hydrogel. γ-ray irradiation of silica particles dispersed in water generates peroxide groups on their surfaces, enabling surface-initiated polymerization of acrylic acid from the inorganic material. As a result, poly(acrylic acid) (PAA) covers the silica particles in a core-shell form at the initial stage. Then, PAA-coated silica particles associate with each other by combination of radicals at the chain ends on different particles, leading to micro-gel domains. Finally, the micro-gels further associate with each other to form a 3D network structure. We investigated this mechanism using dynamic light scattering (DLS) and transmission electron microscopy (TEM). Our results strongly suggest that controlling reaction time is critical to achieving a specific, desired organic/inorganic nanocomposite structure among core-shell particles, micro-gels, and the 3D network bulk hydrogel.
Atomic detail visualization of photosynthetic membranes with GPU-accelerated ray tracing
Stone, John E.; Sener, Melih; Vandivort, Kirby L.; Barragan, Angela; Singharoy, Abhishek; Teo, Ivan; Ribeiro, João V.; Isralewitz, Barry; Liu, Bo; Goh, Boon Chong; Phillips, James C.; MacGregor-Chatwin, Craig; Johnson, Matthew P.; Kourkoutis, Lena F.; Hunter, C. Neil; Schulten, Klaus
2016-07-01
The cellular process responsible for providing energy for most life on Earth, namely, photosynthetic light-harvesting, requires the cooperation of hundreds of proteins across an organelle, involving length and time scales spanning several orders of magnitude over quantum and classical regimes. Simulation and visualization of this fundamental energy conversion process pose many unique methodological and computational challenges. We present, in two accompanying movies, light-harvesting in the photosynthetic apparatus found in purple bacteria, the so-called chromatophore. The movies are the culmination of three decades of modeling efforts, featuring the collaboration of theoretical, experimental, and computational scientists. We describe the techniques that were used to build, simulate, analyze, and visualize the structures shown in the movies, and we highlight cases where scientific needs spurred the development of new parallel algorithms that efficiently harness GPU accelerators and petascale computers.
TANGO—New tracking AlGOrithm for gamma-rays
Tashenov, S.; Gerl, J.
2010-10-01
For spectroscopy, polarimetry and imaging purposes a new γ-ray tracking algorithm has been developed featuring identification of Compton escape events. The rejection of these events results in a significant increase of the Peak/Total ratio. The initial photon energy is restored for these events. Although the energy resolution in the spectrum reconstructed from the escape events is lower than the one from the full-energy events, the Monte-Carlo simulations show that the combined spectrum has an increased detector full-energy efficiency of up to 130% compared to its intrinsic full-energy efficiency. The assumed geometrical origin of the photons is verified event-by-event. This enables separation of photons emitted from a target and from background sources. A linear polarization analysis of the γ-lines can be performed. The efficiency of the algorithm and the Peak/Total ratio depending on the detector properties is discussed along with the proposed optimization schemes. The influence of the intrinsic properties of the scattering process like Compton profile and electron recoiling is discussed as well. The described algorithm deals with single photon events with energies of ≈100 keV up to a few MeV.
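The Compton-tracking logic underlying algorithms like TANGO rests on the kinematic relation between scattering angle and energy deposit; a minimal sketch of that textbook formula (illustrative only — the paper's sequencing and escape-identification logic is more involved, and the function names are not from the paper):

```python
import math

def compton_scattered_energy(e_in_kev, theta_rad):
    """Energy (keV) of a photon after Compton scattering through angle theta."""
    mec2 = 511.0  # electron rest energy, keV
    return e_in_kev / (1.0 + (e_in_kev / mec2) * (1.0 - math.cos(theta_rad)))

def energy_deposited(e_in_kev, theta_rad):
    """Energy transferred to the recoil electron at this interaction."""
    return e_in_kev - compton_scattered_energy(e_in_kev, theta_rad)
```

A tracking algorithm compares the angle implied by the measured energy deposits with the geometric angle between interaction points; sequences where the two disagree can be flagged as escape or background events.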
Ménez, Bénédicte; Philippot, Pascal; Bonnin-Mosbah, Michelle; Simionovici, Alexandre; Gibert, François
2002-02-01
A critical problem in conducting quantitative analysis of individual fluid inclusions using the Synchrotron X-Ray Fluorescence (SXRF) technique relates to the standardization and calibration of the X-ray spectra. In this study, different approaches were tested for calibration purposes: (a) the use of chlorine, when Cl content can be estimated either from melting point depressions of undersaturated fluid inclusions or from saturation limits for halite-bearing fluid inclusions; (b) the use of calcium from synthetic fluid inclusions of known CaCl2 content as an external standard. SXRF analysis was performed on individual fluid inclusions from the Chivor and Guali emerald deposits, Colombia. These well-known samples contain a single fluid inclusion population for which detailed crush-leach analyses are available, thus providing a relevant compositional reference frame. Concentration estimates were also compared to Particle Induced X-ray Emission (PIXE) analysis carried out independently on the same fluid inclusions. Results of the calibration tests indicate that major (Cl, K, Ca, Fe, Mn) and trace element (Cu, Zn, As, Br, Rb, Sr, Ba, Pb) concentration estimates can be performed without precise knowledge of the analytical volume and the inclusion's 3D geometry. Although the standard deviation of the SXRF results can be relatively high depending on the calibration mode used, mean concentration estimates for most elements are in good agreement with PIXE and crush-leach analyses. Elemental distributions within single fluid inclusions were also established. The associated correlation diagrams argue for a homogeneous distribution of most elements in the fluid inclusion. In contrast, Br shows a bimodal distribution interpreted to reflect a significant enrichment of the vapor portion of the inclusion fluid.
Determination of trace element levels in leaves of Nerium oleander using X-Ray Fluorescence
Santos, Ramon S.; Sanches, Francis A.C.R.A.; Neves, Arthur O.P.; Oliveira, Luis F.; Oliveira, Davi F.; Anjos, Marcelino J. [Universidade do Estado do Rio de Janeiro (UERJ), Rio de Janeiro, RJ (Brazil). Inst. de Fisica Armando Dias Tavares. Dept. de Fisica Aplicada e Termodinamica
2013-07-01
Environmental pollution from human activity has been one of the greatest concerns of recent years, principally due to rapid urban growth and industrialization. Air pollution can be increased by several different kinds of emissions: urban traffic, industrial activities, fuel burning, the construction/demolition industry, fires and natural phenomena. Many of these emissions travel long distances on convection currents and finally tend to deposit mainly on plant leaves and in the soil; plant leaves thus work as natural samplers of these deposited emissions. In this study Nerium oleander leaves were used to measure environmental pollution levels in different urban sampling regions of the city of Rio de Janeiro/RJ: Andarai, Benfica, Bonsucesso, Caju, Engenho de Dentro, Engenho Novo, Estacio, Grajau, Inhauma, Lins, Maracana, Maria da Graca, Meier, Praca da Bandeira, Riachuelo, Rio Comprido, Sao Cristovao, Tijuca, Vila Isabel and the city Center. Control samples were collected in Campo Grande, near the Parque Nacional da Pedra Branca/RJ (National Park of Pedra Branca/RJ). The leaves were collected from adult plants; after collection the samples were cleaned, dried in a greenhouse, then ground and pressed into pellets. The analyses were performed using energy-dispersive X-ray fluorescence (EDXRF), developed in our own laboratory and based on a Si-PIN detector and a mini X-ray tube. It was possible to detect 16 elements in the analyzed samples, including K, Ca, Cr, Mn, Fe, Cu, Zn, Br, Rb, Sr, Ba and Pb. The results show that, in the studied areas, analysis of the Nerium oleander plant is a low-cost and substantially efficient option as an environmental pollution biomonitor. (author)
Takahara, Hikari, E-mail: hikari@rigaku.co.jp [Rigaku Corp., 14-8 Akaoji-cho, Takatsuki, Osaka 569-1146 (Japan); Mori, Yoshihiro [Horiba Ltd., 2 Miyanohigashi, Kisshoin, Minami-ku, Kyoto 601-8510 (Japan); Shibata, Harumi [SUMCO Corporation, Seavance North, 1-2-1 Shibaura, Minato-ku, Tokyo 105-8634 (Japan); Shimazaki, Ayako [Toshiba Corporation, 8, Shinsugita-cho, Isogo-ku, Yokohama 235-8522 (Japan); Shabani, Mohammad B. [Mitsubishi Material Corporation, 1-297, Kitabukuro-cho, Omiya-ku, Saitama 330-8508 (Japan); Yamagami, Motoyuki [Rigaku Corp., 14-8 Akaoji-cho, Takatsuki, Osaka 569-1146 (Japan); Yabumoto, Norikuni [Analysis Atelier Co., 4-36-4, Yoyogi, Shibuya-ku, Tokyo 151-0053 (Japan); Nishihagi, Kazuo [Horiba Ltd., 2 Miyanohigashi, Kisshoin, Minami-ku, Kyoto 601-8510 (Japan); Gohshi, Yohichi [Tsukuba University, 1-1-1, Tennodai, Tsukuba, Ibaraki 305-8571 (Japan)
2013-12-01
Vapor phase treatment (VPT) was under investigation by the International Organization for Standardization/Technical Committee 201/Working Group 2 (ISO/TC201/WG2) to improve the detection limit of total reflection X-ray fluorescence spectroscopy (TXRF) for trace metal analysis of silicon wafers. Round robin test results have confirmed that TXRF intensity increased with VPT for intentional contamination with 5 × 10^9 and 5 × 10^10 atoms/cm^2 of Fe and Ni. The magnitude of the intensity enhancement varied greatly (1.2-4.7 in VPT factor) among the participating laboratories, though reproducible results could be obtained for the average of mapping measurements. SEM observation showed that particles of various features, sizes, and surface densities formed on the wafer after VPT. The particle morphology seems to have some impact on the VPT efficiency. High resolution SEM observation revealed that a certain number of dots of SiO2, silicate and/or carbon gathered to form a particle, and the heavy metals (Ni and Fe in this study) were segregated on it. The amount and shape of the residue should be important for controlling the VPT factor. - Highlights: • This paper presents a summary of study results of VPT-TXRF by ISO/TC201/WG2. • Our goal is to analyze trace metallic contamination on silicon wafers at concentrations below 1 × 10^10 atoms/cm^2. • The efficiency and mechanism of VPT are discussed on the basis of several round robin tests and systematic studies.
Greenwald, R. A.; Frissell, N. A.; de Larquier, S.
2016-12-01
In this paper, we evaluate the performance of three methods used by HF radars in the SuperDARN network for determining the ground ranges to ionospheric scattering volumes. Each method uses a somewhat different approach, but the same equivalent-path analysis. We also show that Snell's law can be added to this analysis to determine the refractive index of each scattering volume and thereby correct Doppler velocity measurements for ionospheric refraction. Two of these methods make their predictions using the group range to the scattering volume and a virtual height model, while the third method uses the group range and the elevation angle of each backscattered return. The effectiveness of each of these methods is evaluated using ray tracing analyses through the International Reference Ionosphere. Ray tracing analysis provides determinations of the initial elevation angle, group range, ground range, and refractive index of each ionospheric volume that backscatters signals to the radar. The initial or final elevation angle and the group range are used as inputs to the geolocation methods, and the ground range and refractive index serve as reference data against which the predictions of the geolocation methods can be evaluated. We find that the methods using virtual height models actually change the initial elevation angle determined from ray tracing to a different elevation angle that is consistent with the virtual height model. Due to this change, predictions of the ground range and refractive index of scattering volumes located with virtual-height models are rarely consistent with the predictions obtained from ray tracing. In contrast, the geolocation method that uses the group range and initial or final elevation angle yields predictions that are in good agreement with ray tracing. Modifications to the equivalent-path analysis are required to obtain consistent predictions of the ground range and refractive index of backscatter from the topside F-layer.
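The virtual-height geolocation the paper evaluates can be sketched from the Earth-centered triangle relating radar, scattering volume, and Earth's center: given a group range treated as a straight path and an assumed virtual height, the law of cosines yields both the implied elevation angle and the great-circle ground range. This is a geometric illustration, not SuperDARN's operational code:

```python
import math

R_E = 6371.0  # mean Earth radius, km

def implied_elevation(group_range_km, virtual_height_km):
    """Elevation angle consistent with a straight path of length r reaching
    the assumed virtual height h (law of cosines at the radar vertex)."""
    r, h = group_range_km, virtual_height_km
    return math.asin(((R_E + h) ** 2 - R_E ** 2 - r ** 2) / (2.0 * R_E * r))

def ground_range(group_range_km, virtual_height_km):
    """Great-circle ground range to the scattering volume under the same model."""
    r, h = group_range_km, virtual_height_km
    cos_gamma = (R_E ** 2 + (R_E + h) ** 2 - r ** 2) / (2.0 * R_E * (R_E + h))
    return R_E * math.acos(cos_gamma)
```

Note that the elevation returned here depends only on the group range and the virtual height model, which is exactly the paper's point: such methods implicitly replace the measured elevation angle with one consistent with the model.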
Arindam Pal
2007-01-01
This paper presents an evaluation of the MIMO performance of three candidate antenna array designs, each embedded within a PDA footprint, using indoor wideband channel measurements at 5.2 GHz alongside channel simulations. A channel model employing the plane-wave approximation was used to combine the embedded antenna radiation patterns of the candidate devices, obtained from far-field pattern measurements, with multipath component parameters from an indoor ray-tracer. The 4-element candidate arrays were each constructed using a different type of antenna element, and despite the diverse element directivities, pattern characteristics, and polarization purities, all three devices were constructed to fully exploit diversity in polarization, space, and angle. Thus, low correlation and high information-theoretic capacity were observed in each case. A good match between the model and the measurements is also demonstrated, especially for 2×2 MIMO subsets of identically or orthogonally polarized linear slot antennas. The interdependencies between the channel XPD, directional spread, and pathloss, and their respective impact on channel capacity, are also discussed in this paper.
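The information-theoretic capacity mentioned above is conventionally the Shannon capacity of the channel matrix with equal power per transmit antenna (no channel knowledge at the transmitter); a minimal sketch, illustrative rather than the authors' exact channel model:

```python
import numpy as np

def mimo_capacity_bps_hz(H, snr_linear):
    """Shannon capacity (bits/s/Hz) of a MIMO channel matrix H with equal
    power allocation across the nt transmit antennas."""
    nr, nt = H.shape
    HH = H @ H.conj().T  # nr x nr Gram matrix of the channel
    M = np.eye(nr) + (snr_linear / nt) * HH
    return float(np.real(np.log2(np.linalg.det(M))))
```

For measured channels, averaging this quantity over channel realizations (and normalizing H) gives the ergodic capacity figures typically quoted in MIMO measurement campaigns.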
Silva Marina P
2012-07-01
Background: The application and better understanding of traditional and new breast tumor biomarkers and prognostic factors are increasing, owing to their ability to identify individuals at high risk of breast cancer who may benefit from preventive interventions. Biomarkers can also make it possible for physicians to design an individualized treatment for each patient. Previous studies showed that trace elements (TEs) determined by X-Ray Fluorescence (XRF) techniques are found in significantly higher concentrations in neoplastic breast tissues (malignant and benign) when compared with normal tissues. The aim of this work was to evaluate the potential of TEs, determined by the Energy Dispersive X-Ray Fluorescence (EDXRF) technique, as biomarkers and prognostic factors in breast cancer. Methods: Using EDXRF, we determined Ca, Fe, Cu, and Zn trace element concentrations in 106 samples of normal and breast cancer tissues. Cut-off values for each TE were determined through Receiver Operating Characteristic (ROC) analysis of the TE distributions. These values were used to set positive or negative expression, which was subsequently correlated with clinical prognostic factors through Fisher's exact test and the chi-square test. Kaplan-Meier survival curves were also evaluated to assess the effect of TE expression on overall patient survival. Results: Concentrations of TEs are higher in neoplastic tissues (malignant and benign) when compared with normal tissues. ROC analysis showed that TEs can be considered tumor biomarkers because, after establishing a cut-off value, it was possible to classify tissues as normal or neoplastic, as well as to distinguish different types of cancer. TE expression was found to be statistically correlated with age and menstrual status. The survival curves estimated by the Kaplan-Meier method showed that patients with positive expression for Cu presented a poor
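The ROC-based cut-off determination described in the Methods can be illustrated with a simple Youden-index search over candidate thresholds. This is a generic sketch; the abstract does not specify the authors' optimality criterion, so the Youden index (sensitivity + specificity - 1) is an assumption:

```python
import numpy as np

def youden_cutoff(concentrations, is_neoplastic):
    """Return the concentration cut-off maximizing the Youden index,
    calling a sample 'positive expression' at or above the cut-off."""
    values = np.asarray(concentrations, float)
    labels = np.asarray(is_neoplastic, bool)
    best_cut, best_j = None, -1.0
    for cut in np.unique(values):
        pred = values >= cut
        sens = float(np.mean(pred[labels]))    # true positive rate
        spec = float(np.mean(~pred[~labels]))  # true negative rate
        j = sens + spec - 1.0
        if j > best_j:
            best_j, best_cut = j, float(cut)
    return best_cut, best_j
```

Samples can then be dichotomized at the returned cut-off before running Fisher's exact test or a Kaplan-Meier comparison.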
Jie Ji
2012-09-01
This study aims to determine the necessity of applying a mirror coating to the side of a truncated solid dielectric CPC (compound parabolic concentrator), since ray tracing analysis has revealed that some of the incoming rays do not undergo total internal reflection, even within the half acceptance angle of the CPC. An experiment was designed and conducted indoors and outdoors to study the effect of mirror coating on the optical performance of a solid dielectric CPC. Ray tracing was also employed for detailed analysis, and its results are compared with the measurements. Based on these, a concept of partial coating is proposed and verified through simulation. The results show that a partly coated solid dielectric CPC may have better optical efficiency than a solid CPC without coating over a certain range of incidence angles.
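Whether a ray escapes or is totally internally reflected at the dielectric wall is governed by the critical angle from Snell's law; a minimal check (textbook optics, not the study's ray tracer):

```python
import math

def tir_occurs(incidence_angle_rad, n_dielectric, n_outside=1.0):
    """True if a ray inside the dielectric, hitting the wall at the given
    angle from the surface normal, undergoes total internal reflection."""
    critical = math.asin(n_outside / n_dielectric)
    return incidence_angle_rad > critical
```

Rays failing this test at the CPC side wall are the ones that would be lost without a mirror coating, which is the motivation for coating only the wall regions where such rays strike.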
Wang, Xiao-Huan; Meng, Qing-Fen; Dong, Ya-Ping; Chen, Mei-Da; Li, Wu
2010-03-01
A rapid multi-element analysis method for clay mineral samples is described. The method utilized a polarized wavelength-dispersive X-ray fluorescence spectrometer (Axios PW4400) with a maximum tube power of 4,000 W. It was developed for the determination of As, Mn, Co, Cu, Cr, Dy, Ga, Mo, P, Pb, Rb, S, Sr, Ni, Cs, Ta, Th, Ti, U, V, Y, Zn, Zr, MgO, K2O, Na2O, CaO, Fe2O3, Al2O3, SiO2 and so on. Thirty elements in clay mineral species were measured by X-ray fluorescence spectrometry with pressed powder pellets. Spectral interferences, in particular the indirect interferences of each element, were studied, and a method to distinguish interferences between neighboring elements in the periodic table was put forward. The measuring conditions were mainly investigated, and the selected background positions as well as spectral-overlap corrections for the trace elements were also discussed. It was found that indirect spectral overlap lines are as important as direct spectral overlap lines. To account for indirect spectral overlap, some elements that do not themselves require analysis, such as Bi, Sn and W, were also added to the element channels. The relative standard deviation (RSD) was in the range of 0.01% to 5.45%, except for the three elements Mo, Cs and Ta. The detection limits, precisions and accuracies for most elements using this method can meet the requirements of sample analysis of clay mineral species.
Donatelli, Jeffrey J.; Sethian, James A.
2014-01-01
X-ray nanocrystallography allows the structure of a macromolecule to be determined from a large ensemble of nanocrystals. However, several parameters, including crystal sizes, orientations, and incident photon flux densities, are initially unknown and images are highly corrupted with noise. Autoindexing techniques, commonly used in conventional crystallography, can determine orientations using Bragg peak patterns, but only up to crystal lattice symmetry. This limitation results in an ambiguity in the orientations, known as the indexing ambiguity, when the diffraction pattern displays less symmetry than the lattice and leads to data that appear twinned if left unresolved. Furthermore, missing phase information must be recovered to determine the imaged object’s structure. We present an algorithmic framework to determine crystal size, incident photon flux density, and orientation in the presence of the indexing ambiguity. We show that phase information can be computed from nanocrystallographic diffraction using an iterative phasing algorithm, without extra experimental requirements, atomicity assumptions, or knowledge of similar structures required by current phasing methods. The feasibility of this approach is tested on simulated data with parameters and noise levels common in current experiments. PMID:24344317
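The iterative phasing described above belongs to the family of alternating-projection methods; a generic error-reduction sketch of that family (a simplified stand-in, not the authors' algorithm — it omits the nanocrystallography-specific handling of crystal size, flux density, and the indexing ambiguity, and the positivity/support constraints are illustrative assumptions):

```python
import numpy as np

def error_reduction(measured_magnitude, support, n_iter=100, seed=0):
    """Fienup-style error reduction: alternately impose the measured Fourier
    modulus and a real-space support/positivity constraint."""
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0.0, 2.0 * np.pi, measured_magnitude.shape)
    g = np.fft.ifft2(measured_magnitude * np.exp(1j * phase))
    for _ in range(n_iter):
        G = np.fft.fft2(g)
        G = measured_magnitude * np.exp(1j * np.angle(G))   # modulus constraint
        g = np.fft.ifft2(G)
        g = np.where(support, g.real.clip(min=0.0), 0.0)    # support + positivity
    return g
```

In practice such loops are run from many random starts, and more robust variants (e.g. hybrid input-output) are preferred when error reduction stagnates.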
Peille, Philippe; Ceballos, Maria Teresa; Cobo, Beatriz; Wilms, Joern; Bandler, Simon; Smith, Stephen J.; Dauser, Thomas; Brand, Thorsten; den Hartog, Roland; de Plaa, Jelle; Barret, Didier; den Herder, Jan-Willem; Piro, Luigi; Barcons, Xavier; Pointecouteau, Etienne
2016-07-01
The X-ray Integral Field Unit (X-IFU) microcalorimeter on board Athena, with its focal plane comprising 3840 Transition Edge Sensors (TESs) operating at 90 mK, will provide unprecedented spectral-imaging capability in the 0.2-12 keV energy range. It will rely on on-board digital processing of the current pulses induced by the heat deposited in the TES absorbers, so as to recover the energy of each individual event. Assessing the capabilities of the pulse reconstruction is required to understand the overall scientific performance of the X-IFU, notably in terms of energy resolution degradation with both increasing energy and count rate. Using synthetic data streams generated by the X-IFU End-to-End simulator, we present here a comprehensive benchmark of various pulse reconstruction techniques, ranging from standard optimal filtering to more advanced algorithms based on noise covariance matrices. Besides deriving the spectral resolution achieved by the different algorithms, a first assessment of the computing power and ground calibration needs is presented. Overall, all methods show similar performance, with the reconstruction based on noise covariance matrices showing the best improvement with respect to the standard optimal filtering technique. Due to prohibitive calibration needs, this method might however not be applicable to the X-IFU, and the best compromise currently appears to be the so-called resistance space analysis, which also features very promising high count rate capabilities.
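Standard optimal filtering, the baseline technique in this benchmark, estimates a pulse's amplitude by weighting the data against a pulse template, with each frequency bin inversely weighted by the noise power there; a minimal frequency-domain sketch (illustrative, not the X-IFU pipeline):

```python
import numpy as np

def optimal_filter_amplitude(record, template, noise_psd=None):
    """Least-squares amplitude of `template` in `record`, with frequency bins
    weighted by 1/noise_psd. A flat PSD reduces this to a matched filter."""
    R = np.fft.rfft(record)
    S = np.fft.rfft(template)
    if noise_psd is None:
        noise_psd = np.ones(S.shape, dtype=float)  # white-noise assumption
    num = np.sum((np.conj(S) * R / noise_psd).real)
    den = np.sum((np.abs(S) ** 2 / noise_psd).real)
    return num / den
```

The recovered amplitude is then converted to energy through a calibration curve; covariance-matrix methods generalize this by dropping the assumption that noise is stationary and uncorrelated between frequency bins.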
Remmers, Julia; Beirle, Steffen; Doerner, Steffen; Wagner, Thomas
2013-04-01
Multi-Axis (MAX-) DOAS instruments observe scattered sunlight under various, mostly slant, elevation angles. From such observations, information on tropospheric profiles of trace gases and aerosols can be retrieved. MAX-DOAS observations can be used to quantify emissions and to study chemical processes in the atmosphere. By measuring (horizontally and vertically) averaged concentrations, the technique can serve as a link between in-situ and satellite measurements; in this way, satellite observations of tropospheric trace gases can be validated. IMAX (Parametrized Inversion for MAX-DOAS measurements) is a parametrized method to retrieve vertical profiles of trace gases (such as H2O, NO2, HCHO, CHOCHO) and aerosols. No online calculations are necessary, since look-up tables (LUTs) calculated with a Monte Carlo based radiative transfer model are used. In this manner it is user-friendly, easy to distribute, and applicable to any measurement location. The measurements shown here took place in the Maldives in March 2012, during the CARDEX campaign. Simultaneous sun photometer, lidar, and UAV measurements provide the possibility to validate the new algorithm. We present time series of trace gas concentration and aerosol extinction profiles, and we discuss the effects of clouds on the retrieved results.
Comparison of VTEC from ground-based space geodetic techniques based on ray-traced mapping factors
Heinkelmann, Robert; Alizadeh, M. Mahdi; Schuh, Harald; Deng, Zhiguo; Zus, Florian; Etemadfard, M. Hossein
2016-07-01
For the derivation of vertical total electron content (VTEC) from slant total electron content (STEC), usually a standard approach is used based on mapping functions that assume a single-layer model of the ionosphere (e.g. IERS Conventions 2010). In our study we test the standard approach against a recently developed alternative which is based on station specific ray-traced mapping factors. For the evaluation of this new mapping concept, we compute VTEC at selected Very Long Baseline Interferometry (VLBI) stations using the dispersive delays and the corresponding formal errors obtained by observing extra-galactic radio sources at two radio frequencies in S- and X-bands by the permanent geodetic/astrometric program organized by the IVS (International VLBI Service for Geodesy and Astrometry). Additionally, by applying synchronous sampling and a consistent analysis configuration, we determine VTEC at Global Navigation Satellite System (GNSS) antennas using GPS (Global Positioning System) and/or GLONASS (Globalnaja nawigazionnaja sputnikowaja Sistema) observations provided by the IGS (International GNSS Service) that are operated in the vicinity of the VLBI antennas. We compare the VTEC time series obtained by the individual techniques over a period of about twenty years and describe their characteristics qualitatively and statistically. The length of the time series allows us to assess the long-term climatology of ionospheric VTEC during the last twenty years.
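The standard single-layer approach the authors test against can be written down directly: the slant observation is mapped to the vertical using the zenith angle at the pierce point of an assumed thin ionospheric shell. A minimal sketch (the shell height is an assumed parameter, commonly a few hundred km):

```python
import math

R_E = 6371.0   # mean Earth radius, km
H_ION = 450.0  # assumed single-layer (thin shell) height, km

def stec_to_vtec(stec, zenith_angle_rad, shell_height_km=H_ION):
    """Single-layer mapping: VTEC = STEC * cos(z'), where z' is the zenith
    angle at the ionospheric pierce point of the thin shell."""
    sin_zp = R_E / (R_E + shell_height_km) * math.sin(zenith_angle_rad)
    return stec * math.sqrt(1.0 - sin_zp ** 2)
```

The ray-traced mapping factors studied in the paper replace this single geometric factor with station-specific values that account for the actual electron density distribution along the path.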
Fu, Lei
2017-05-11
Full-waveform inversion of land seismic data tends to get stuck in a local minimum associated with the waveform misfit function. This problem can be partly mitigated by using an initial velocity model that is close to the true velocity model. This initial starting model can be obtained by inverting traveltimes with ray-tracing traveltime tomography (RT) or wave-equation traveltime (WT) inversion. We have found that WT can provide a more accurate tomogram than RT by inverting the first-arrival traveltimes, and empirical tests suggest that RT is more sensitive to the additive noise in the input data than WT. We present two examples of applying WT and RT to land seismic data acquired in western Saudi Arabia. One of the seismic experiments investigated the water-table depth, and the other one attempted to detect the location of a buried fault. The seismic land data were inverted by WT and RT to generate the P-velocity tomograms, from which we can clearly identify the water table depth along the seismic survey line in the first example and the fault location in the second example.
Custodio, P.J.; Carvalho, M.L. [Centro Fisica Atomica, Universidade de Lisboa, Av. Prof. Gama Pinto, 2, 1649-003, Lisboa (Portugal); Nunes, F. [Hospital Garcia de Orta, Almada (Portugal)
2003-04-01
This work is an application of energy dispersive X-ray fluorescence (EDXRF) as an analytical technique for trace element determination in human membrane and placenta, and for correlating elemental concentrations in both tissues. Whole samples were collected during delivery from healthy mothers with full-term pregnancies. The mothers' ages were between 25 and 40 years, and the weights of the infants ranged from 2.56 to 4.05 kg. Samples were lyophilised and analysed without any chemical treatment. No significant differences in the elemental content of placenta and membrane samples were observed, except for Ca. Very low levels of Se, As and Pb were observed in all the analysed samples. Zn, considered one of the key elements in newborn health, was not significantly different in the analysed samples, all of which originated from healthy mothers and healthy babies. The obtained values agree with the literature except for Ca, which is much higher in the studied samples. (orig.)
Connolly, G D; Lowe, M J S; Temple, J A G; Rokhlin, S I
2010-05-01
The use of ultrasonic arrays has increased dramatically within recent years due to their ability to perform multiple types of inspection and to produce images of the structure through post-processing of received signals. Phased arrays offer many advantages over conventional transducers in the inspection of materials that are inhomogeneous with spatially varying anisotropic properties. In this paper, the arrays are focused on austenitic steel welds as a representative inhomogeneous material. The method of ray-tracing through a previously developed model of an inhomogeneous weld is shown, with particular emphasis on the difficulties presented by material inhomogeneity. The delay laws for the structure are computed and are used to perform synthetic focusing at the post-processing stage of signal data acquired by the array. It is demonstrated for a simulated austenitic weld that by taking material inhomogeneity and anisotropy into account, superior reflector location (and hence, superior sizing) results when compared to cases where these are ignored. The image is thus said to have been corrected. Typical images are produced from both analytical data in the frequency domain and data from finite element simulations in the time domain in a variety of wave modes, including cases with mode conversion and reflections.
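Delay laws like those computed here can be sketched, for the simple homogeneous-medium case, as time-of-flight differences from each array element to the focal point; the paper's contribution is precisely replacing the straight-ray distances below with ray-traced paths through the inhomogeneous, anisotropic weld:

```python
import numpy as np

def focal_delay_law(element_positions, focal_point, wave_speed):
    """Firing delays (s) that focus a phased array at a point, assuming a
    homogeneous isotropic medium with a single wave speed."""
    pos = np.asarray(element_positions, float)
    d = np.linalg.norm(pos - np.asarray(focal_point, float), axis=1)
    t = d / wave_speed        # time of flight per element
    return t.max() - t        # farthest element fires first
```

The same delays, applied in reverse to received signals, implement the synthetic focusing used at the post-processing stage.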
Stevens, John [Univ. of California, Berkeley, CA (United States)
2013-12-01
Ray tracing was used to perform optical optimization of arrays of photovoltaic microrods and explore the interaction between light and bubbles of oxygen gas on the surface of the microrods. The incident angle of light was varied over a wide range. The percent of incident light absorbed by the microrods and reflected by the bubbles was computed over this range. It was found that, for the 10 μm diameter, 100 μm tall SrTiO3 microrods simulated in the model, the optimal center-to-center spacing was 14 μm for a square grid. This geometry produced 75% average and 90% maximum absorbance. For a triangular grid using the same microrods, the optimal center-to-center spacing was 14 μm. This geometry produced 67% average and 85% maximum absorbance. For a randomly laid out grid of 5 μm diameter, 100 μm tall SrTiO3 microrods with an average center-to-center spacing of 20 μm, the average absorption was 23% and the maximum absorption was 43%. For a 50% areal coverage fraction of bubbles on the absorber surface, between 2%-20% of the incident light energy was reflected away from the rods by the bubbles, depending upon incident angle and bubble morphology.
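The reported grid geometries can be related to areal coverage with simple packing arithmetic; this is basic geometry for context, not the ray-tracing model itself, and the function names are illustrative:

```python
import math

def rod_coverage_square(diameter, pitch):
    """Areal fraction covered by rods on a square grid
    (one rod circle per pitch x pitch unit cell)."""
    return math.pi * diameter ** 2 / 4.0 / pitch ** 2

def rod_coverage_triangular(diameter, pitch):
    """Triangular (hexagonal) grid: one rod per rhombic unit cell
    of area pitch^2 * sqrt(3)/2."""
    return math.pi * diameter ** 2 / 4.0 / (pitch ** 2 * math.sqrt(3) / 2.0)
```

For the 10 μm rods at 14 μm pitch, the square grid covers roughly 40% of the surface; a triangular grid at the same pitch packs the rods somewhat more densely.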
Pelzers, R S; Yu, Q L; Mangkuto, R A
2014-10-01
This article aims to understand the radiation behavior within a photo-reactor, following the ISO 22197-1:2007 standard. The RADIANCE lighting simulation tool, based on the backward ray-tracing modeling method, is employed for numerical computation of the radiation field. The reflection of the glass cover in the photo-reactor and of the test sample influences the amount of irradiance received by the test-sample surface in the photo-reactor setup. The reflection of a white sample limits the irradiance reduction by the glass cover to 1.4%, but darker samples can lead to an overestimation of up to 9.8% when used in the same setup. This overestimation could introduce considerable error into the interpretation of experiments. Furthermore, this method demonstrates that the kinetics of indoor photocatalytic pollutant degradation can be refined through radiation modeling of the reactor setup. In addition, RADIANCE may aid in future modeling of the more complex indoor environment, where radiation significantly affects photocatalytic activity.
Sassen, Kenneth; Knight, Nancy C.; Takano, Yoshihide; Heymsfield, Andrew J.
1994-01-01
During the 1986 Project FIRE (First International Satellite Cloud Climatology Project Regional Experiment) field campaign, four 22 deg halo-producing cirrus clouds were studied jointly from a ground-based polarization lidar and an instrumented aircraft. The lidar data show the vertical cloud structure and the relative position of the aircraft, which collected a total of 84 slides by impaction, preserving the ice crystals for later microscopic examination. Although many particles were too fragile to survive impaction intact, a large fraction of the identifiable crystals were columns and radial bullet rosettes, with both displaying internal cavitations and radial plate-column combinations. Particles that were solid or displayed only a slight amount of internal structure were relatively rare, which shows that the usual model postulated by halo theorists, i.e., the randomly oriented, solid hexagonal crystal, is inappropriate for typical cirrus clouds. With the aid of new ray-tracing simulations for hexagonal hollow-ended column and bullet-rosette models, we evaluate the effects of more realistic ice-crystal structures on halo formation and lidar depolarization and consider why the common halo is not more common in cirrus clouds.
Jensen, K. A.; Ripoll, J.-F.; Wray, A. A.; Joseph, D.; ElHafi, M.
2004-01-01
Five computational methods for solution of the radiative transfer equation in an absorbing-emitting, non-scattering gray medium were compared on a 2 m JP-8 pool fire. The temperature and absorption coefficient fields were taken from a synthetic fire due to the lack of a complete set of experimental data for fires of this size. These quantities were generated by a code that has been shown to agree well with the limited quantity of relevant data in the literature. Reference solutions to the governing equation were determined using the Monte Carlo method and a ray tracing scheme with high angular resolution. Solutions using the discrete transfer method, the discrete ordinate method (DOM) with both S_4 and LC_11 quadratures, and a moment model using the M_1 closure were compared to the reference solutions in both isotropic and anisotropic regions of the computational domain. DOM LC_11 is shown to be more accurate than the commonly used S_4 quadrature technique, especially in anisotropic regions of the fire domain. This represents the first study in which the M_1 method was applied to a combustion problem occurring in a complex three-dimensional geometry. The M_1 results agree well with the other solution techniques, which is encouraging for future applications to similar problems since it is computationally the least expensive technique. Moreover, the M_1 results are comparable to DOM S_4.
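The reference solutions above march the radiative transfer equation along individual rays. A minimal sketch of that marching step for a gray, non-scattering medium with piecewise-constant properties is given below; the temperature and absorption-coefficient values are placeholders, not the synthetic-fire fields of the study:

```python
import numpy as np

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def march_intensity(kappa, T, ds, I0=0.0):
    """March the gray, non-scattering RTE along one ray:
        dI/ds = kappa * (Ib - I),   Ib = sigma * T^4 / pi
    kappa, T are per-cell arrays along the ray; ds is the cell length (m).
    Uses the exact per-cell solution for piecewise-constant properties.
    """
    I = I0
    for k, t in zip(kappa, T):
        Ib = SIGMA * t**4 / np.pi
        tau = k * ds                     # cell optical thickness
        I = I * np.exp(-tau) + Ib * (1.0 - np.exp(-tau))
    return I

# isothermal slab: intensity relaxes toward the blackbody value
kappa = np.full(200, 0.5)    # absorption coefficient, 1/m
T = np.full(200, 1200.0)     # K, roughly pool-fire temperatures
I = march_intensity(kappa, T, ds=0.05)
```

For an isothermal path this marching reproduces the analytic result I = Ib(1 − e^(−κL)), which is a convenient sanity check before applying the scheme to a real temperature field.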
Ray-trace simulation of CuInS(Se)₂ quantum dot based luminescent solar concentrators.
Hu, Xiangmin; Kang, Ruidan; Zhang, Yongyou; Deng, Luogen; Zhong, Haizheng; Zou, Bingsuo; Shi, Li-Jie
2015-07-27
To enhance the performance of luminescent solar concentrators (LSCs), there is an increasing need to search for novel emissive materials with broad absorption and large Stokes shifts. I-III-VI colloidal CuInS2- and CuInSe2-based nanocrystals, which exhibit strong photoluminescence emission in the visible to near-infrared region with large Stokes shifts, are expected to improve performance in luminescent solar concentrator applications. In this work, the performance of CuInS(Se)2 quantum dots in a simple planar LSC is evaluated by applying Monte Carlo ray-trace simulation. A systematic parameter study was conducted to optimize the performance. Optimized photon concentration ratios of 0.34 for CuInS2-doped and 1.25 for CuInSe2-doped LSCs are obtained from the simulation. The results demonstrate that CuInSe2-based nanocrystals are particularly interesting for luminescent solar concentrator applications, especially in combination with low-cost Si solar cells.
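A full Monte Carlo LSC ray trace like the one in this work tracks absorption, re-emission, and waveguide losses; as a hedged sketch of just one ingredient, the fraction of isotropically re-emitted photons trapped in a planar waveguide by total internal reflection can be estimated by sampling emission directions (function name and the sample count are illustrative assumptions):

```python
import numpy as np

def tir_trapping_fraction(n, samples=200_000, seed=0):
    """Monte Carlo estimate of the fraction of isotropically re-emitted
    photons trapped in a planar slab by total internal reflection.
    A photon escapes through a face when its polar angle to the face
    normal lies inside the escape cone, sin(theta_c) = 1/n."""
    rng = np.random.default_rng(seed)
    cos_theta = rng.uniform(-1.0, 1.0, samples)  # isotropic directions
    cos_c = np.sqrt(1.0 - 1.0 / n**2)            # cosine of critical angle
    trapped = np.abs(cos_theta) < cos_c          # outside both escape cones
    return trapped.mean()

frac = tir_trapping_fraction(1.5)  # ~0.745 for a PMMA-like index n = 1.5
```

The Monte Carlo estimate can be checked against the closed form sqrt(1 − 1/n²), which is the standard trapping efficiency of an ideal planar LSC waveguide.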
Use of a ray-based reconstruction algorithm to accurately quantify preclinical microSPECT images.
Vandeghinste, Bert; Van Holen, Roel; Vanhove, Christian; De Vos, Filip; Vandenberghe, Stefaan; Staelens, Steven
2014-01-01
This work aimed to measure the in vivo quantification errors obtained when ray-based iterative reconstruction is used in micro-single-photon emission computed tomography (SPECT). This was investigated with an extensive phantom-based evaluation and two typical in vivo studies using 99mTc and 111In, measured on a commercially available cadmium zinc telluride (CZT)-based small-animal scanner. Iterative reconstruction was implemented on the GPU using ray tracing, including (1) scatter correction, (2) computed tomography-based attenuation correction, (3) resolution recovery, and (4) edge-preserving smoothing. It was validated using a National Electrical Manufacturers Association (NEMA) phantom. The in vivo quantification error was determined for two radiotracers: [99mTc]DMSA in naive mice (n = 10 kidneys) and [111In]octreotide in mice (n = 6) inoculated with a xenograft neuroendocrine tumor (NCI-H727). The measured energy resolution is 5.3% for 140.51 keV (99mTc), 4.8% for 171.30 keV, and 3.3% for 245.39 keV (111In). For 99mTc, an uncorrected quantification error of 28 ± 3% is reduced to 8 ± 3%. For 111In, the error reduces from 26 ± 14% to 6 ± 22%. The in vivo error obtained with 99mTc-dimercaptosuccinic acid ([99mTc]DMSA) is reduced from 16.2 ± 2.8% to -0.3 ± 2.1% and from 16.7 ± 10.1% to 2.2 ± 10.6% with [111In]octreotide. Absolute quantitative in vivo SPECT is possible without explicit system matrix measurements. An absolute in vivo quantification error smaller than 5% was achieved and exemplified for both [99mTc]DMSA and [111In]octreotide.
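The paper's GPU implementation adds scatter, attenuation, and resolution corrections on top of ray-based iterative reconstruction; as a hedged sketch of only the core idea, a bare MLEM update on a toy system (matrix and activities are illustrative, not from the study) looks like this:

```python
import numpy as np

def mlem(A, counts, n_iter=200):
    """Maximum-likelihood EM reconstruction for Poisson projection data:
        x <- x * [A^T (counts / (A x))] / (A^T 1)
    A      : system matrix (rays x voxels), e.g. ray-traced intersection lengths
    counts : measured counts per ray
    """
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0)            # sensitivity image A^T 1
    for _ in range(n_iter):
        proj = A @ x
        proj[proj == 0] = 1e-12     # guard against division by zero
        x *= (A.T @ (counts / proj)) / sens
    return x

# toy 2-voxel, 3-ray system with noiseless data: MLEM recovers the activity
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
true_x = np.array([3.0, 5.0])
x_hat = mlem(A, A @ true_x)
```

In the ray-based approach the rows of A are generated on the fly by ray tracing rather than measured explicitly, which is what makes quantification without a measured system matrix possible.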
Kolkoori, Sanjeevareddy
2014-07-01
Austenitic welds and dissimilar welds are extensively used in primary circuit pipes and pressure vessels in nuclear power plants, chemical industries, and fossil-fuelled power plants because of their high fracture toughness and their resistance to corrosion and creep at elevated temperatures. However, cracks may initiate in these weld materials during the fabrication process or under stress in service. It is therefore very important to evaluate the structural integrity of these materials using highly reliable non-destructive testing (NDT) methods. Ultrasonic non-destructive inspection of austenitic welds and dissimilar weld components is complicated by their anisotropic columnar grain structure, which leads to beam splitting and beam deflection. Simulation tools play an important role in developing advanced, reliable ultrasonic testing (UT) techniques and in optimizing experimental parameters for the inspection of austenitic welds and dissimilar weld components. The main aim of the thesis is to develop a 3D ray tracing model for quantitative evaluation of ultrasonic wave propagation in an inhomogeneous anisotropic austenitic weld material. Inhomogeneity in the anisotropic weld material is represented by discretization into several homogeneous layers. In the ray tracing model, ultrasonic ray paths are traced during energy propagation through the various discretized layers of the material, and at each interface the problem of reflection and transmission is solved. The influence of anisotropy on ultrasonic reflection and transmission behaviour in an anisotropic austenitic weld material is quantitatively analyzed in three dimensions. The ultrasonic beam directivity in columnar-grained austenitic steel is determined three-dimensionally using Lamb's reciprocity theorem. The developed ray tracing model evaluates the transducer-excited ultrasonic fields accurately by taking into account the directivity of the transducer, the divergence of the ray bundle, the density of rays, and phase
Baba, Y.; Shimoyama, I.; Hirao, N.
2016-10-01
In order to determine the chemical states of radioactive cesium (137Cs or 134Cs) sorbed in clay minerals, the chemical states of cesium as well as of the other alkali metals (sodium and rubidium) sorbed in micaceous oxides have been investigated by X-ray photoelectron spectroscopy (XPS). Since the number of atoms of radioactive cesium is extremely small, we specifically focused on the chemical states of trace-level alkali metals. For this purpose, we measured XPS under the X-ray total reflection (TR) condition. For cesium, it was shown that an ultra-trace amount of cesium down to about 100 pg cm^-2 can be detected by TR-XPS. This amount corresponds to about 200 Bq of 137Cs (t_{1/2} = 30.2 y). It was thus demonstrated that ultra-trace amounts of cesium, corresponding to radioactive-cesium levels, can be measured by TR-XPS. As to the chemical states, it was found that the core-level binding energy in TR-XPS for trace-level cesium shifted to the lower-energy side compared with that for a thicker layer. A reverse tendency is observed for sodium. Based on charge transfer within a simple point-charge model, it is concluded that the chemical bond between the alkali metal and the micaceous oxide is more polarized for an ultra-thin layer than for a thick layer.
Yun, W.; Lewis, S.; Stripe, B.; Chen, S.; Reynolds, D.; Spink, I.; Lyon, A.
2015-12-01
We are developing a patent-pending x-ray microprobe with unprecedented performance attributes: working distances of >2 cm, narrow spectral bandwidth, and large x-ray flux. The outstanding performance is enabled by: (1) a revolutionary new type of high-flux x-ray source designed to be >10X brighter than the brightest rotating-anode x-ray source available; (2) an axially symmetric x-ray mirror lens with large solid-angle collection and high focusing efficiency; and (3) a detector configuration that enables the collection of 10X more x-rays than current microXRF designs. The sensitivity will be ppm-scale, far surpassing charged particle analysis (e.g. EPMA and SEM-EDS), with >1000X the throughput of the leading micro-XRF systems. Despite the introduction of a number of laboratory microXRF systems in the past decade, the state of the art has been limited primarily by low resolution (~30 μm) and low throughput, which is substantially attributable to a combination of low x-ray source brightness and poor-performance x-ray optics. Here we present our initial results in removing the x-ray source bottleneck, using a novel x-ray source based on Fine Anode Array Source Technology (Sigray FAAST™). When coupled with our proprietary high-efficiency x-ray mirror lens, the throughput achieved is comparable to that of many synchrotron microXRF beamlines. Potential applications of the x-ray microprobe include high-throughput mapping of mineralogy at high resolution, including trace elements such as rare earth metals and deposits (e.g. siderite, clays), with ppm sensitivity, providing information on properties such as permeability and elastic/mechanical behavior, and providing compositional information for Digital Rock. Additional applications include those in which the limited penetration of electrons prevents adequate statistics, such as determining the concentration of precious minerals in mine tailings.
Causin, Valerio; Marega, Carla; Carresi, Pietro; Schiavone, Sergio; Marigo, Antonio
2007-05-03
Thirty-three shopping bags, commonly encountered in the packaging of drug doses, were characterized by wide angle X-ray diffraction (WAXD). Using this single technique, without sample preparation, nearly all the considered samples could be differentiated, achieving a discriminating power of 0.992. The rather large degree of variability existing in grocery bags, even though they are mass produced, was shown, confirming that these items can be useful in tracing the source of illicit drug doses.
M Grafe; M Landers; R Tappero; P Austin; B Gan; A Grabsch; C Klauber
2011-12-31
We describe the application of quantitative evaluation of mineralogy by scanning electron microscopy in combination with techniques commonly available at hard X-ray microprobes to define the mineralogical environment of a bauxite residue core segment with the more specific aim of determining the speciation of trace metals (e.g., Ti, V, Cr, and Mn) within the mineral matrix. Successful trace metal speciation in heterogeneous matrices, such as those encountered in soils or mineral residues, relies on a combination of techniques including spectroscopy, microscopy, diffraction, and wet chemical and physical experiments. Of substantial interest is the ability to define the mineralogy of a sample to infer redox behavior, pH buffering, and mineral-water interfaces that are likely to interact with trace metals through adsorption, coprecipitation, dissolution, or electron transfer reactions. Quantitative evaluation of mineralogy by scanning electron microscopy coupled with micro-focused X-ray diffraction, micro-X-ray fluorescence, and micro-X-ray absorption near edge structure (µXANES) spectroscopy provided detailed insights into the composition of mineral assemblages and their effect on trace metal speciation during this investigation. In the sample investigated, titanium occurs as poorly ordered ilmenite, as rutile, and is substituted in iron oxides. Manganese's spatial correlation to Ti is closely linked to ilmenite, where it appears to substitute for Fe and Ti in the ilmenite structure based on its µXANES signature. Vanadium is associated with ilmenite and goethite but always assumes the +4 oxidation state, whereas chromium is predominantly in the +3 oxidation state and solely associated with iron oxides (goethite and hematite) and appears to substitute for Fe in the goethite structure.
van Aardt, J. A.; van Leeuwen, M.; Kelbe, D.; Kampe, T.; Krause, K.
2015-12-01
Remote sensing is widely accepted as a useful technology for characterizing the Earth surface in an objective, reproducible, and economically feasible manner. To date, the calibration and validation of remote sensing data sets and biophysical parameter estimates remain challenging due to the requirement to sample large areas for ground-truth data collection and the restriction to sample these data within narrow temporal windows centered around flight campaigns or satellite overpasses. The computer graphics community has taken significant steps to ameliorate some of these challenges by providing the ability to generate synthetic images based on geometrically and optically realistic representations of complex targets and imaging instruments. These synthetic data can be used for conceptual and diagnostic tests of instrumentation prior to sensor deployment, or to examine linkages between biophysical characteristics of the Earth surface and at-sensor radiance. In the last two decades, the use of image generation techniques for remote sensing of the vegetated environment has evolved from the simulation of simple, homogeneous, hypothetical vegetation canopies to advanced scenes and renderings with a high degree of photo-realism. Reported virtual scenes comprise up to 100M surface facets; however, due to the tight coupling between hardware and software development, the full potential of image generation techniques for forestry applications remains to be fully explored. In this presentation, we examine the potential computer graphics techniques have for the analysis of forest structure-function relationships and demonstrate techniques that provide for the modeling of extremely high-faceted virtual forest canopies comprising billions of scene elements. We demonstrate the use of ray tracing simulations for the analysis of gap size distributions and characterization of foliage clumping within spatial footprints that allow for a tight matching between characteristics
Kim, Jee Hoon; Lee, Joon Woo; Ahn, Tae In; Shin, Jong Hwa; Park, Kyung Sub; Son, Jung Eek
2016-01-01
Canopy photosynthesis has typically been estimated using mathematical models with the following assumptions: the light interception inside the canopy declines exponentially with canopy depth, and the photosynthetic capacity is affected by light interception as a result of acclimation. However, in actual situations, light interception in the canopy is quite heterogeneous, depending on environmental factors such as location, microclimate, leaf area index, and canopy architecture. It is important to apply these factors in an analysis. The objective of the current study is to estimate the canopy photosynthesis of paprika (Capsicum annuum L.) by simulating the intercepted irradiation of the canopy using 3D ray tracing together with the photosynthetic capacity of each layer. By inputting the structural data of an actual plant, the 3D architecture of paprika was reconstructed using graphic software (Houdini FX, Canada). The light curves and A/C_i curve of each layer were measured to parameterize the Farquhar, von Caemmerer, and Berry (FvCB) model. A difference in photosynthetic capacity within the canopy was observed. With the intercepted irradiation data and photosynthetic parameters of each layer, the photosynthesis rate of an entire plant was estimated by integrating the calculated photosynthesis rate of each layer. The estimated photosynthesis rate of the entire plant showed good agreement with measurements of the plant in a closed chamber used for validation. From these results, this method is considered a reliable tool to predict canopy photosynthesis from light interception, and can be extended to analyze canopy photosynthesis under actual greenhouse conditions.
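The per-layer integration described above can be sketched as follows. As a hedged illustration, a non-rectangular hyperbola light response stands in for the full FvCB model, and all irradiances, capacities, and leaf areas are made-up values, not the paprika measurements:

```python
import numpy as np

def layer_photosynthesis(I, amax, alpha=0.05, theta=0.7):
    """Non-rectangular hyperbola light response for one canopy layer
    (a common simplification of the FvCB model used in the study).
    I    : absorbed PPFD (umol m^-2 s^-1)
    amax : light-saturated rate of that layer (acclimation: lower deeper down)
    """
    b = alpha * I + amax
    return (b - np.sqrt(b**2 - 4.0 * theta * alpha * I * amax)) / (2.0 * theta)

# per-layer irradiance from ray tracing (top to bottom) and acclimated Amax
I_layers = np.array([1200.0, 600.0, 250.0, 80.0])
amax_layers = np.array([25.0, 18.0, 12.0, 6.0])
leaf_area = np.array([0.3, 0.5, 0.5, 0.4])   # m^2 of leaf per layer
canopy_rate = np.sum(layer_photosynthesis(I_layers, amax_layers) * leaf_area)
```

The whole-plant rate is then the leaf-area-weighted sum over layers, which is the integration step the study validates against closed-chamber measurements.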
Flandes, Alberto; Spilker, Linda; Déau, Estelle
2016-10-01
Saturn's rings are a complex collection of icy particles with diameters from about 1 μm to a few meters. Their natural window of study is the infrared, because their temperatures lie between 40 K and 120 K. The main driver of the ring temperature is direct solar radiation, together with solar radiation reflected off Saturn's atmosphere. The second most important energy source is the infrared radiation coming from Saturn itself. The study of the variation of ring temperatures, or, in general, their thermal behavior, may provide important information on their composition, structure, and dynamics. Models that consider these and other energy sources are able to explain, to a first approximation, the observed temperature variations of the rings. The challenge for these models is to accurately describe the variation of illumination on the rings, i.e., how the illuminated and non-illuminated regions of the ring particles change at different observation geometries. This shadowing mainly depends on the optical depth, as well as on the general structure of the rings. In this work, we present a semi-analytical model that considers the main energy sources of the rings and their average properties (e.g., optical depth, particle size range, and vertical distribution). To deal with the shadowing at specific geometries, the model uses the ray-tracing technique. The goal is to describe the ring temperatures observed by the Composite Infrared Spectrometer (CIRS) onboard the Cassini spacecraft, which has been in orbit around Saturn since 2004. So far, the model is able to reproduce some of the general features of specific regions of the A, B, and C rings.
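The ray-tracing part of such a model handles the shadowing that a simple analytic treatment cannot. The analytic baseline it improves on is a Beer-Lambert slant-path transmission through a homogeneous ring slab, sketched below; the optical-depth values are illustrative, not CIRS-derived:

```python
import numpy as np

def slab_transmission(tau_normal, elevation_deg):
    """Fraction of incident sunlight passing through a homogeneous ring
    slab of normal optical depth tau_normal, illuminated at a solar
    elevation angle B above the ring plane (Beer-Lambert slant path)."""
    mu = np.sin(np.radians(elevation_deg))  # cosine of angle to ring normal
    return np.exp(-tau_normal / mu)

# a B-ring-like and a C-ring-like optical depth at 15 deg solar elevation
t_B = slab_transmission(1.5, 15.0)
t_C = slab_transmission(0.1, 15.0)
```

The optically thinner ring transmits far more sunlight at the same illumination geometry; capturing how individual particles shadow each other beyond this average picture is what the ray tracing is for.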
张昕; 刘月巍; 王斌; 季仲贞
2004-01-01
The Spectral Statistical Interpolation (SSI) analysis system of NCEP is used to assimilate meteorological data from Global Positioning Satellite System (GPS/MET) refraction angles with the variational technique. Verified against radiosondes, including GPS/MET observations in the analysis makes an overall improvement to the analysis variables of temperature, winds, and water vapor. However, the variational model with the ray-tracing method is quite expensive for numerical weather prediction and climate research. For example, about 4 000 GPS/MET refraction angles need to be assimilated to produce an ideal global analysis, and just one iteration of minimization takes more than 24 hours of CPU time on NCEP's Cray C90 computer. Although efforts have been made to reduce the computational cost, it is still prohibitive for operational data assimilation. In this paper, a parallel version of the three-dimensional variational data assimilation model of GPS/MET occultation measurements, suitable for massively parallel processor architectures, is developed. The divide-and-conquer strategy is used to achieve parallelism and is implemented by message passing. The authors present the principles of the code's design and examine its performance on state-of-the-art parallel computers in China. The results show that this parallel model scales favorably as the number of processors is increased. With the Memory-IO technique implemented by the authors, the wall clock time per iteration used for assimilating 1420 refraction angles is reduced from 45 s to 12 s using 1420 processors. This suggests that the new parallelized code has the potential to be useful in numerical weather prediction (NWP) and climate studies.
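The divide-and-conquer strategy amounts to splitting the observations across workers, evaluating partial cost contributions independently, and reducing by summation. As a hedged sketch only (threads stand in for the message passing of the MPP implementation, and all function names and values are illustrative):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def partial_cost(obs_chunk, model_chunk, sigma=1.0):
    """Observation-term cost J_o for one chunk of refraction angles."""
    r = (obs_chunk - model_chunk) / sigma
    return 0.5 * float(np.dot(r, r))

def total_cost(obs, model, n_workers=4):
    """Divide-and-conquer evaluation of J_o: split the observations,
    evaluate chunks concurrently, then reduce by summation (the role
    played by message passing in the parallel code)."""
    chunks_o = np.array_split(obs, n_workers)
    chunks_m = np.array_split(model, n_workers)
    with ThreadPoolExecutor(n_workers) as ex:
        return sum(ex.map(partial_cost, chunks_o, chunks_m))

obs = np.linspace(0.0, 1.0, 1420)   # 1420 refraction angles, as in the text
model = np.zeros(1420)
J = total_cost(obs, model)
```

Because the observation terms are independent, the chunked sum equals the serial sum exactly, which is what lets the minimization scale with the number of processors.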
Ghammraoui, Bahaa; Badal, Andreu; Popescu, Lucretiu M
2016-04-21
Coherent scatter computed tomography (CSCT) is a reconstructive x-ray imaging technique that yields the spatially resolved coherent-scatter cross section of the investigated object, revealing structural information of the tissue under investigation. In the original CSCT proposals, the reconstruction of images from coherently scattered x-rays is done at each scattering angle separately using analytic reconstruction. In this work we develop a maximum likelihood estimation of scatter components algorithm (ML-ESCA) that iteratively reconstructs images using a few material component basis functions from coherent scatter projection data. The proposed algorithm combines the measured scatter data at different angles into one reconstruction equation with only a few component images. It also accounts for data acquisition statistics and physics, modeling effects such as the polychromatic energy spectrum and the detector response function. We test the algorithm with simulated projection data obtained with a pencil beam setup using a new version of the MC-GPU code, a graphics processing unit (GPU) version of the PENELOPE Monte Carlo particle transport simulation code, which incorporates an improved model of x-ray coherent scattering using experimentally measured molecular interference functions. The results obtained for breast imaging phantoms using adipose and glandular tissue cross sections show that the new algorithm can separate imaging data into basic adipose and water components at radiation doses comparable to those of breast computed tomography. Simulation results also show the potential for imaging microcalcifications. Overall, the component images obtained with the ML-ESCA algorithm have a less noisy appearance than the images obtained with the conventional filtered back projection algorithm for each individual scattering angle. An optimization study of x-ray energy range selection for breast CSCT is also presented.
Izaurieta, Fernando; Rodríguez, Eduardo
2011-01-01
Let Γ_a be Dirac matrices in d-dimensional Minkowski spacetime, and let β_i = B_i^{ab} Γ_ab, where Γ_ab = Γ_[a Γ_b] and the B_i^{ab} are arbitrary antisymmetric tensors. The trace of the symmetrized product of an odd number of β-matrices vanishes identically. The trace of the symmetrized product of 2n β-matrices can be written as a sum of certain B-contractions over the integer partitions of n, with every term multiplied by a numerical factor α. We provide a general algorithm to compute these α-coefficients for any d and up to any desired value of n. The algorithm uses random matrices to generate a linear system of equations whose solution is the set of coefficients for a given n. A recurrence relation among these coefficients is shown to hold in all analyzed cases and is used to greatly simplify the computation for large values of n. Numerical values for the α-coefficients are given for n = 1, ..., 7.
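The odd-trace statement above can be checked numerically in d = 4, in the same spirit as the paper's use of random matrices. The sketch below builds the Dirac matrices in the Dirac representation, forms β-matrices from random antisymmetric B tensors, and evaluates symmetrized traces (all helper names are our own):

```python
import numpy as np
from itertools import permutations
from functools import reduce
from math import factorial

# Dirac matrices in d = 4 (Dirac representation)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
g0 = np.kron(np.diag([1.0, -1.0]).astype(complex), np.eye(2, dtype=complex))
gammas = [g0] + [np.kron(np.array([[0, 1], [-1, 0]], dtype=complex), s)
                 for s in (sx, sy, sz)]

def beta(B):
    """beta = B^{ab} Gamma_ab, with Gamma_ab = (Gamma_a Gamma_b - Gamma_b Gamma_a)/2."""
    out = np.zeros((4, 4), dtype=complex)
    for a in range(4):
        for b in range(4):
            out += B[a, b] * 0.5 * (gammas[a] @ gammas[b] - gammas[b] @ gammas[a])
    return out

def sym_trace(mats):
    """Trace of the fully symmetrized product of the given matrices."""
    tot = sum(np.trace(reduce(np.matmul, p)) for p in permutations(mats))
    return tot / factorial(len(mats))

rng = np.random.default_rng(1)
betas = []
for _ in range(3):
    M = rng.standard_normal((4, 4))
    betas.append(beta(M - M.T))   # random antisymmetric B_i^{ab}

t1 = sym_trace(betas[:1])   # odd number of betas: should vanish
t3 = sym_trace(betas)       # odd number of betas: should vanish
```

Note that an individual term like tr(Γ_01 Γ_12 Γ_20) is generally nonzero; it is only the sum over all orderings that cancels, which is exactly what the symmetrized trace captures.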
A. S. M. Zahid Kausar
2014-01-01
Although ray-tracing-based propagation prediction models are popular for indoor radio wave propagation characterization, most do not provide an integrated approach to achieving optimum coverage, which is a key part of wireless network design. In this paper, an accelerated three-dimensional ray tracing technique is presented in which rough surface scattering is included to make the ray tracing more accurate. The rough surface is represented by microfacets, making it possible to compute the scattered field in all possible directions. New optimization techniques, dual quadrant skipping (DQS) and the closest object finder (COF), are implemented for fast characterization of wireless communications and to make the ray tracing technique more efficient. In conjunction with the ray tracing technique, a probability-based coverage optimization algorithm is incorporated to form a compact solution for indoor propagation prediction. The proposed technique decreases the ray tracing time by omitting unnecessary objects using the DQS technique and by decreasing the ray-object intersection time using the COF technique. The coverage optimization algorithm, based on probability theory, finds the minimum number of transmitters and their corresponding positions needed to achieve optimal indoor wireless coverage. Both the space and time complexities of the proposed algorithm improve on existing algorithms. To verify the proposed ray tracing technique and coverage algorithm, detailed simulation results for different scattering factors, antenna types, and operating frequencies are presented, and the technique is further verified against experimental results.
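The paper's DQS and COF implementations are not given in this record, but the closest-object idea, keeping only the nearest ray-object intersection, since later hits cannot be the first interaction, can be sketched for spheres (a hypothetical stand-in for arbitrary scene objects):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Smallest positive ray parameter t of a unit-direction ray hitting a
    sphere, or None if the ray misses it."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c              # direction is unit length, so a = 1
    if disc < 0.0:
        return None
    for t in ((-b - math.sqrt(disc)) / 2.0, (-b + math.sqrt(disc)) / 2.0):
        if t > 1e-9:
            return t
    return None

def closest_object(origin, direction, spheres):
    """Closest-object finder: of all objects the ray intersects, keep only
    the nearest hit as (object index, t)."""
    best = None
    for idx, (center, radius) in enumerate(spheres):
        t = ray_sphere_hit(origin, direction, center, radius)
        if t is not None and (best is None or t < best[1]):
            best = (idx, t)
    return best
```

A real COF would also prune objects before the per-object test, which is where the reported speedup comes from.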
Algorithms for a hand-held miniature x-ray fluorescence analytical instrument
Elam, W.T.; Newman, D.; Ziemba, F. [and others]
1998-12-31
The purpose of this joint program was to provide technical assistance with the development of a Miniature X-ray Fluorescence (XRF) Analytical Instrument. This new XRF instrument is designed to overcome the weaknesses of currently available commercial spectrometers. Currently available XRF spectrometers (for a complete list see reference 1) convert spectral information to sample composition using the influence-coefficients technique or the fundamental-parameters method. They require either a standard sample with a composition relatively close to the unknown or a detailed knowledge of the sample matrix. They also require a highly trained operator, and the results often depend on the capabilities of the operator. In addition, almost all existing field-portable, hand-held instruments use radioactive sources for excitation. Regulatory limits restrict such sources to relatively weak excitation, which limits all current hand-held XRF instruments to poor detection limits and/or long data collection times, in addition to the licensing requirements and disposal problems of radioactive sources. The new XRF instrument was developed jointly by Quantrad Sensor, Inc., the Naval Research Laboratory (NRL), and the Department of Energy (DOE). This report describes the analysis algorithms developed by NRL for the new instrument and the software that embodies them.
Accelerating statistical image reconstruction algorithms for fan-beam x-ray CT using cloud computing
Srivastava, Somesh; Rao, A. Ravishankar; Sheinin, Vadim
2011-03-01
Statistical image reconstruction algorithms potentially offer many advantages to x-ray computed tomography (CT), e.g. lower radiation dose. But, their adoption in practical CT scanners requires extra computation power, which is traditionally provided by incorporating additional computing hardware (e.g. CPU-clusters, GPUs, FPGAs etc.) into a scanner. An alternative solution is to access the required computation power over the internet from a cloud computing service, which is orders-of-magnitude more cost-effective. This is because users only pay a small pay-as-you-go fee for the computation resources used (i.e. CPU time, storage etc.), and completely avoid purchase, maintenance and upgrade costs. In this paper, we investigate the benefits and shortcomings of using cloud computing for statistical image reconstruction. We parallelized the most time-consuming parts of our application, the forward and back projectors, using MapReduce, the standard parallelization library on clouds. From preliminary investigations, we found that a large speedup is possible at a very low cost. But, communication overheads inside MapReduce can limit the maximum speedup, and a better MapReduce implementation might become necessary in the future. All the experiments for this paper, including development and testing, were completed on the Amazon Elastic Compute Cloud (EC2) for less than $20.
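The abstract above parallelizes the projectors with MapReduce. As a rough single-machine illustration of that split (not the authors' cloud implementation, and simplified to parallel-beam, nearest-bin backprojection rather than their fan-beam projector), the map step produces one partial image per view and the reduce step sums them:

```python
import numpy as np
from functools import reduce

def backproject_view(args):
    """'Map' step: smear one parallel-beam view back across the image grid
    (nearest-detector-bin interpolation; a real projector models the full
    fan-beam geometry and ray weights)."""
    angle, sino_row, size = args
    img = np.zeros((size, size))
    c, s = np.cos(angle), np.sin(angle)
    half = size // 2
    for i in range(size):
        for j in range(size):
            t = (j - half) * c + (i - half) * s   # detector coordinate
            k = int(round(t)) + len(sino_row) // 2
            if 0 <= k < len(sino_row):
                img[i, j] += sino_row[k]
    return img

def backproject(angles, sinogram, size):
    """'Reduce' step: sum the per-view partial images. On a cluster the map
    calls run on many workers and the reduce combines their outputs."""
    partials = map(backproject_view,
                   [(a, row, size) for a, row in zip(angles, sinogram)])
    return reduce(np.add, partials)
```

Because each view is independent in the map step, the communication the paper identifies as the bottleneck happens only in the reduce.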
Qiao Liya; Wan Xiuhua; Cai Xiaogu; Balamurali Vasudevan; Xiong Ying; Tan Jiaxuan; Guan Zheng
2014-01-01
Background: The evaluation of retinal image quality in cataractous eyes has gained importance, and clinical modulation transfer functions (MTF) can be obtained with an aberrometer and a double-pass (DP) system. This study aimed to compare MTFs derived from a ray tracing aberrometer and a DP system in early cataractous and normal eyes. Methods: There were 128 subjects with 61 control eyes and 67 eyes with early cataract, defined according to the Lens Opacities Classification System III. A laser ray-tracing wavefront aberrometer (iTrace) and a double-pass system (OQAS) assessed ocular MTF for 6.0 mm pupil diameters following dilation. Areas under the MTF (AUMTF) and their correlations were analyzed. Stepwise multiple regression analysis assessed factors affecting the differences between iTrace- and OQAS-derived AUMTF for the early cataract group. Results: For both the early cataract and control groups, iTrace-derived MTFs were higher than OQAS-derived MTFs across a range of spatial frequencies (P < 0.01). No significant difference between the two groups occurred for iTrace-derived AUMTF, but the early cataract group had significantly smaller OQAS-derived AUMTF than the control group (P < 0.01). AUMTF determined by both techniques demonstrated significant correlations with nuclear opacities, higher-order aberrations (HOAs), visual acuity, and contrast sensitivity functions, while the OQAS-derived AUMTF also correlated significantly with age and cortical opacity grade. The factors significantly affecting the difference between iTrace and OQAS AUMTF were root-mean-squared HOAs (standardized beta coefficient = -0.63, P < 0.01) and age (standardized beta coefficient = 0.26, P < 0.01). Conclusions: MTFs determined from an iTrace aberrometer and a DP system (OQAS) differ significantly in early cataractous and normal subjects. Correlations with visual performance were higher for the DP system. OQAS-derived MTF may be useful as an indicator of visual performance in early cataractous eyes.
Jufriadif Na`am
2016-12-01
Dental caries, commonly known as cavities, is tooth decay caused by bacterial infection. The infection causes demineralization and hence destruction of the hard tissues of the teeth. Diagnosis of dental caries is conventionally carried out with the help of radiographic films. This research develops algorithms based on the mMG method for identifying dental caries in digital panoramic dental X-ray images. Three variants of the algorithm are used in this study: normal mMG, enhancement mMG, and smooth mMG. The study uses MATLAB to perform dental caries detection with all three algorithms on a dataset of 225 digital panoramic dental X-ray images in .png format, applying edge detection to the tooth objects. The results help identify caries in the teeth.
Kolkoori, Sanjeevareddy; Hoehne, Christian; Prager, Jens; Rethmeier, Michael; Kreutzbruck, Marc
2014-02-01
Quantitative evaluation of ultrasonic C-scan images in homogeneous and layered anisotropic austenitic materials is of general importance for understanding the influence of anisotropy on wave fields during ultrasonic non-destructive testing and evaluation of these materials. In this contribution, a three dimensional ray tracing method is presented for evaluating ultrasonic C-scan images quantitatively in general homogeneous and layered anisotropic austenitic materials. The directivity of the ultrasonic ray source in general homogeneous columnar grained anisotropic austenitic steel material (including layback orientation) is obtained in three dimensions based on Lamb's reciprocity theorem. As a prerequisite for ray tracing model, the problem of ultrasonic ray energy reflection and transmission coefficients at an interface between (a) isotropic base material and anisotropic austenitic weld material (including layback orientation), (b) two adjacent anisotropic weld metals and (c) anisotropic weld metal and isotropic base material is solved in three dimensions. The influence of columnar grain orientation and layback orientation on ultrasonic C-scan image is quantitatively analyzed in the context of ultrasonic testing of homogeneous and layered austenitic steel materials. The presented quantitative results provide valuable information during ultrasonic characterization of homogeneous and layered anisotropic austenitic steel materials. Copyright © 2013 Elsevier B.V. All rights reserved.
Melli, Seyed Ali, E-mail: sem649@mail.usask.ca [Department of Electrical and Computer Engineering, University of Saskatchewan, Saskatoon, SK (Canada); Wahid, Khan A. [Department of Electrical and Computer Engineering, University of Saskatchewan, Saskatoon, SK (Canada); Babyn, Paul [Department of Medical Imaging, University of Saskatchewan, Saskatoon, SK (Canada); Montgomery, James [College of Medicine, University of Saskatchewan, Saskatoon, SK (Canada); Snead, Elisabeth [Western College of Veterinary Medicine, University of Saskatchewan, Saskatoon, SK (Canada); El-Gayed, Ali [College of Medicine, University of Saskatchewan, Saskatoon, SK (Canada); Pettitt, Murray; Wolkowski, Bailey [College of Agriculture and Bioresources, University of Saskatchewan, Saskatoon, SK (Canada); Wesolowski, Michal [Department of Medical Imaging, University of Saskatchewan, Saskatoon, SK (Canada)
2016-01-11
Synchrotron source propagation-based X-ray phase contrast computed tomography is increasingly used in pre-clinical imaging. However, it typically requires a large number of projections, and subsequently a large radiation dose, to produce high quality images. To improve the applicability of this imaging technique, reconstruction algorithms that can reduce the radiation dose and acquisition time without degrading image quality are needed. The proposed research focused on using a novel combination of Douglas–Rachford splitting and randomized Kaczmarz algorithms to solve large-scale total variation based optimization in a compressed sensing framework to reconstruct 2D images from a reduced number of projections. Visual assessment and quantitative performance evaluations of a synthetic abdomen phantom and real reconstructed image of an ex-vivo slice of canine prostate tissue demonstrate that the proposed algorithm is competitive in reconstruction process compared with other well-known algorithms. An additional potential benefit of reducing the number of projections would be reduction of time for motion artifact to occur if the sample moves during image acquisition. Use of this reconstruction algorithm to reduce the required number of projections in synchrotron source propagation-based X-ray phase contrast computed tomography is an effective form of dose reduction that may pave the way for imaging of in-vivo samples.
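One half of the combination described above, the randomized Kaczmarz iteration, is standard enough to sketch on its own (toy dense system; the paper applies it inside a Douglas-Rachford splitting of a total-variation problem, which is not reproduced here):

```python
import numpy as np

def randomized_kaczmarz(A, b, iters=500, seed=0):
    """Randomized Kaczmarz: repeatedly project the iterate onto the
    hyperplane {x : A[i].x = b[i]} of one row, chosen with probability
    proportional to its squared norm (Strohmer-Vershynin sampling)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms = np.sum(A * A, axis=1)
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(m, p=row_norms / row_norms.sum())
        x = x + (b[i] - A[i] @ x) / row_norms[i] * A[i]
    return x

# Consistent toy system: the iterates converge to the exact solution.
A = np.array([[3.0, 1.0], [1.0, 2.0], [2.0, -1.0]])
x_true = np.array([1.0, -2.0])
x_hat = randomized_kaczmarz(A, A @ x_true)
```

Each iteration touches only one row of the system matrix, which is what makes the method attractive for the large, sparse projection matrices of CT.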
Desai, Naeem M.; Lionheart, William R. B.
2016-11-01
We give an explicit plane-by-plane filtered back-projection reconstruction algorithm for the transverse ray transform of symmetric second rank tensor fields on Euclidean three-space, using data from rotation about three orthogonal axes. We show that in the general case two-axis data is insufficient, but we give an explicit reconstruction procedure for the potential case with two-axis data. We describe a numerical implementation of the three-axis algorithm and give reconstruction results for simulated data.
Somoza, J.R. [California Univ., Berkeley (United States). Dept. of Chemistry; Lawrence Berkeley Lab., CA (United States)]; Szoeke, H. [Lawrence Livermore National Lab., CA (United States)]; Goodman, D.M. [Lawrence Livermore National Lab., CA (United States)]; Beran, P. [Lawrence Livermore National Lab., CA (United States)]; Truckses, D. [Wisconsin Univ., Madison, WI (United States). Dept. of Biochemistry]; Kim, S.H. [California Univ., Berkeley (United States). Dept. of Chemistry; Lawrence Berkeley Lab., CA (United States)]; Szoeke, A. [Lawrence Livermore National Lab., CA (United States)]
1995-09-01
The holographic method makes use of partially modeled electron density and experimentally measured structure-factor amplitudes to recover electron density corresponding to the unmodeled part of a crystal structure. This paper describes a fast algorithm that makes it possible to apply the holographic method to sizable crystallographic problems. The algorithm uses positivity constraints on the electron density and can incorporate a 'target' electron density, making it similar to solvent flattening. The potential for applying the holographic method to macromolecular X-ray crystallography is assessed using both synthetic and experimental data.
Forbang, R Teboh [John Hopkins University, Baltimore, MD (United States)
2014-06-01
Purpose: MultiPlan, the treatment planning system for the CyberKnife robotic radiosurgery system, offers two approaches to dose computation: Ray-Tracing (RT), the default technique, and Monte Carlo (MC), an option. RT is deterministic but accounts for primary heterogeneity only. MC, on the other hand, has an uncertainty associated with the calculation results; its advantage is that it additionally accounts for heterogeneity effects on the scattered dose. Not all sites will benefit from MC. The goal of this work was to focus on central nervous system (CNS) tumors and dosimetrically compare treatment plans computed with RT versus MC. Methods: Treatment plans were computed using both RT and MC for sites covering (a) the brain, (b) C-spine, (c) upper T-spine, (d) lower T-spine, (e) L-spine and (f) sacrum. RT was first used to compute clinically valid treatment plans. Then the same treatment parameters (monitor units, beam weights, etc.) were used in the MC algorithm to compute the dose distribution. The plans were then compared for tumor coverage to illustrate any difference. All MC calculations were performed at 1% uncertainty. Results: Using the RT technique, the tumor coverage for the brain, C-spine (C3-C7), upper T-spine (T4-T6), lower T-spine (T10), L-spine (L2) and sacrum was 96.8%, 93.1%, 97.2%, 87.3%, 91.1%, and 95.3%, respectively. The corresponding tumor coverage based on the MC approach was 98.2%, 95.3%, 87.55%, 88.2%, 92.5%, and 95.3%. It should be noted that the acceptable planning target coverage for our clinical practice is >95%; coverage can be compromised for spine tumors to spare normal tissues such as the spinal cord. Conclusion: For treatment planning involving the CNS, RT and MC appear to be similar for most sites except the T-spine area, where most of the beams traverse lung tissue. In this case, MC is highly recommended.
Amberger, Martin A.; Hoeltig, Michael [University of Hamburg, Institute for Inorganic and Applied Chemistry, Martin-Luther-King-Platz 6, D-20146 Hamburg (Germany); Broekaert, Jose A.C., E-mail: jose.broekaert@chemie.uni-hamburg.d [University of Hamburg, Institute for Inorganic and Applied Chemistry, Martin-Luther-King-Platz 6, D-20146 Hamburg (Germany)
2010-02-15
The use of slurry sampling total reflection X-ray fluorescence spectrometry (SlS-TXRF) for the direct determination of Ca, Cr, Cu, Fe, Mn and Ti in four boron nitride powders is described. Measurements of the zeta potential showed that slurries with good stability can be obtained by adding polyethylenimine (PEI) at a concentration of 0.1 wt.% and adjusting the pH to 4. To optimize the concentration of boron nitride in the slurries, the net line intensities and the signal-to-background ratios were determined for the trace elements Ca and Ti as well as for the internal standard element Ga at boron nitride concentrations ranging from 1 to 30 mg mL⁻¹. As a compromise between high net line intensities and high signal-to-background ratios, a concentration of 5 mg mL⁻¹ of boron nitride was found suitable and was used for all further measurements. The limits of detection of SlS-TXRF for the boron nitride powders were found to range from 0.062 to 1.6 µg g⁻¹ for Cu and Ca, respectively. They are thus higher than those obtained in solid sampling and slurry sampling graphite furnace atomic absorption spectrometry (SoS-GFAAS, SlS-GFAAS) as well as those of solid sampling electrothermal evaporation inductively coupled plasma optical emission spectrometry (SoS-ETV-ICP-OES). For Ca and Fe as well as for Cu and Fe, however, they were found to be lower than for GFAAS and for ICP-OES subsequent to wet chemical digestion, respectively. The universal applicability of SlS-TXRF to the analysis of samples with a wide variety of matrices could be demonstrated by the analysis of certified reference materials such as SiC, Al₂O₃, powdered bovine liver and borate ore with a single calibration. The correlation coefficients of the plots of the values found for Ca, Fe and Ti by SlS-TXRF in the boron nitride powders as well as in the aforementioned samples versus the reference values for the respective
Model-based x-ray energy spectrum estimation algorithm from CT scanning data with spectrum filter
Li, Lei; Wang, Lin-Yuan; Yan, Bin
2016-10-01
With the development of technology, traditional X-ray CT cannot meet modern medical and industrial needs for component distinction and identification. This is due to an inconsistency between the X-ray imaging system and the reconstruction algorithm. In current CT systems, the X-ray spectrum produced by the X-ray source is continuous over an energy range determined by the tube voltage and energy filter, and the attenuation coefficient of the object varies with X-ray energy. The distribution of the X-ray energy spectrum therefore plays an important role in beam-hardening correction, dual-energy CT image reconstruction and dose calculation. However, due to the highly ill-conditioned and ill-posed nature of the system equations for transmission measurement data, statistical fluctuations of X-ray quanta and noise pollution, it is very hard to obtain a stable and accurate spectrum estimate using existing methods. In this paper, a model-based X-ray energy spectrum estimation method from CT scanning data with an energy spectrum filter is proposed. First, transmission measurement data were accurately acquired by CT scanning and measurement using phantoms with different energy spectrum filters. Second, a physically meaningful X-ray tube spectrum model was established using weighted Gaussian functions, a priori information such as the continuity of bremsstrahlung and the specificity of characteristic emission, and estimates of the average attenuation coefficient. The model parameters were optimized to obtain the best estimate of the filtered spectrum. Finally, the original energy spectrum was reconstructed from the filtered spectrum estimate using a priori information about the filter. Experimental results demonstrate that the stability and accuracy of X-ray energy spectrum estimation using the proposed method are significantly improved.
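The weighted-Gaussian core of such a spectrum model can be sketched as follows. This is only the mixture-fitting step under idealized assumptions (known node energies and width, noiseless data, plain least squares with clipping instead of the paper's constrained, physics-informed optimization):

```python
import numpy as np

def gaussian_basis(energies, centers, width):
    """Each column is one Gaussian 'node' of the spectrum model."""
    return np.exp(-0.5 * ((energies[:, None] - centers[None, :]) / width) ** 2)

def fit_spectrum(energies, measured, centers, width):
    """Least-squares weights for the Gaussian mixture; negative weights are
    clipped to zero as a crude stand-in for a properly constrained solver."""
    G = gaussian_basis(energies, centers, width)
    w, *_ = np.linalg.lstsq(G, measured, rcond=None)
    return np.clip(w, 0.0, None)
```

With noiseless synthetic data generated from the same basis, the weights are recovered exactly; the paper's contribution is making this estimation stable for real, noisy transmission data.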
Tibaldo, L
2015-01-01
Cosmic rays up to at least PeV energies are usually described in the framework of an elementary scenario that involves acceleration by objects that are located in the disk of the Milky Way, such as supernova remnants or massive star-forming regions, and then diffusive propagation throughout the Galaxy. Details of the propagation process are so far inferred mainly from the composition of cosmic rays measured near the Earth and then extrapolated to the whole Galaxy. The details of the propagation in the Galactic halo and the escape into the intergalactic medium remain uncertain. The densities of cosmic rays in specific locations can be traced via the gamma rays they produce in inelastic collisions with clouds of interstellar gas. Therefore, we analyze 73 months of Fermi-LAT data from 300 MeV to 10 GeV in the direction of several high- and intermediate-velocity clouds that are located in the halo of the Milky Way. These clouds are supposed to be free of internal sources of cosmic rays and hence any gamma-ray emi...
Marguí, E., E-mail: eva.margui@udg.edu [Department of Chemistry, University of Girona, Campus Montilivi, 17071 Girona (Spain); Zawisza, B.; Skorek, R. [Institute of Chemistry, University of Silesia, Szkolna 9, 40-006 Katowice (Poland); Theato, T. [SPECTRO Analytical Instruments GmbH, Boschstr. 10, 47533 Kleve (Germany); Queralt, I. [Laboratory of X-Ray Analytical Applications, Institute of Earth Sciences Jaume Almera, CSIC, Solé Sabarís s/n, 08028 Barcelona (Spain); Hidalgo, M. [Department of Chemistry, University of Girona, Campus Montilivi, 17071 Girona (Spain); Sitko, R. [Institute of Chemistry, University of Silesia, Szkolna 9, 40-006 Katowice (Poland)
2013-10-01
This study aimed to achieve improved instrumental sensitivity and detection limits for the multielement determination of V, Cr, Mn, Fe, Co, Ni, Cu, Zn, Ga, Se, Pb and Cd in liquid samples using different X-ray fluorescence (XRF) configurations (a benchtop energy-dispersive X-ray fluorescence spectrometer, a benchtop polarised energy-dispersive X-ray fluorescence spectrometer and a wavelength-dispersive X-ray fluorescence spectrometer). The preconcentration of metals from liquid solutions consisted of a solid-phase extraction using carbon nanotubes (CNTs) as solid sorbents. After the extraction step, the aqueous sample was filtered and the CNTs with the adsorbed elements were collected on a filter paper, which was analyzed directly by XRF. The calculated detection limits were in all cases in the low ng mL⁻¹ range. Nevertheless, the results indicate the benefits, in terms of sensitivity, of using polarized X-ray sources with different secondary targets in comparison to conventional XRF systems, above all if Cd determination is required. The developed methodologies, using the aforementioned equipment, have been applied to multielement determination in water samples from an industrial area of Poland. - Highlights: • Use of carbon nanotubes for preconcentration of V, Cr, Mn, Fe, Co, Ni, Cu, Zn, Ga, Se, Pb and Cd • Combination of this preconcentration procedure with different XRF systems • Benefit of using polarized X-ray sources for trace element determination.
Baba, Y., E-mail: baba.yuji@jaea.go.jp; Shimoyama, I.; Hirao, N.
2016-10-30
Highlights: • Total-reflection XPS spectra of Na, Rb, and Cs on micaceous oxide were measured. • A detection limit of 100 pg cm⁻² was achieved for Cs, corresponding to 200 Bq of ¹³⁷Cs (t₁/₂ = 30.2 y). • Cs sorbed in micaceous oxides is found to be ionically bonded with oxygen atoms. - Abstract: In order to determine the chemical states of radioactive cesium (¹³⁷Cs or ¹³⁴Cs) sorbed in clay minerals, the chemical states of cesium as well as the other alkali metals (sodium and rubidium) sorbed in micaceous oxides have been investigated by X-ray photoelectron spectroscopy (XPS). Since the number of atoms of radioactive cesium is extremely small, we focused especially on the chemical states of trace-level alkali metals. For this purpose, we measured XPS under the X-ray total reflection (TR) condition. For cesium, it was shown that ultra-trace amounts down to about 100 pg cm⁻² can be detected by TR-XPS. This amount corresponds to about 200 Bq of ¹³⁷Cs (t₁/₂ = 30.2 y). It was thus demonstrated that ultra-trace amounts of cesium at radioactive-cesium levels can be measured by TR-XPS. As to the chemical states, it was found that the core-level binding energy in TR-XPS for trace-level cesium shifted to the lower-energy side compared with that for a thicker layer. A reverse tendency is observed for sodium. Based on charge transfer within a simple point-charge model, it is concluded that the chemical bond between the alkali metal and the micaceous oxide is more polarized for an ultra-thin layer than for a thick layer.
Gastellu-Etchegorry, Jean-Philippe; Yin, Tiangang; Lauret, Nicolas; Grau, Eloi; Rubio, Jeremy; Cook, Bruce D.; Morton, Douglas C.; Sun, Guoqing
2016-01-01
Light Detection And Ranging (LiDAR) provides unique data on the 3-D structure of atmosphere constituents and the Earth's surface. Simulating LiDAR returns for different laser technologies and Earth scenes is fundamental for evaluating and interpreting signal and noise in LiDAR data. Different types of models are capable of simulating LiDAR waveforms of Earth surfaces. Semi-empirical and geometric models can be imprecise because they rely on simplified simulations of Earth surfaces and light interaction mechanisms. On the other hand, Monte Carlo ray tracing (MCRT) models are potentially accurate but require long computational time. Here, we present a new LiDAR waveform simulation tool that is based on the introduction of a quasi-Monte Carlo ray tracing approach in the Discrete Anisotropic Radiative Transfer (DART) model. Two new approaches, the so-called "box method" and "Ray Carlo method", are implemented to provide robust and accurate simulations of LiDAR waveforms for any landscape, atmosphere and LiDAR sensor configuration (view direction, footprint size, pulse characteristics, etc.). The box method accelerates the selection of the scattering direction of a photon in the presence of scatterers with non-invertible phase function. The Ray Carlo method brings traditional ray-tracking into MCRT simulation, which makes computational time independent of LiDAR field of view (FOV) and reception solid angle. Both methods are fast enough for simulating multi-pulse acquisition. Sensitivity studies with various landscapes and atmosphere constituents are presented, and the simulated LiDAR signals compare favorably with their associated reflectance images and Laser Vegetation Imaging Sensor (LVIS) waveforms. The LiDAR module is fully integrated into DART, enabling more detailed simulations of LiDAR sensitivity to specific scene elements (e.g., atmospheric aerosols, leaf area, branches, or topography) and sensor configuration for airborne or satellite LiDAR sensors.
Wee, Tae-Kwon; Kuo, Ying-Hwa; Lee, Dong-Kyou
2010-12-01
A two-dimensional curved ray tracer (CRT) is developed to study the propagation path of radio signals across a heterogeneous planetary atmosphere. The method, designed to achieve improvements in both computational efficiency and accuracy over conventional straight-line methods, takes rays' first-order bending into account to better describe curved raypaths in the stratified atmosphere. CRT is then used to simulate the phase path from GPS radio occultation (RO). The merit of the ray tracing approach in GPS RO is explicit consideration of horizontal variation in the atmosphere, which may lead to a sizable error but is disregarded in traditional retrieval schemes. In addition, direct modeling of the phase path takes advantage of simple error characteristics in the measurement. With provision of ionospheric and neutral atmospheric refractive indices, in this effort, rays are traced along the full range of GPS-low Earth orbiting (LEO) radio links just as the measurements are made in real life. Here, ray shooting is employed to realize the observed radio links with controlled accuracy. CRT largely reproduces the very measured characteristics of GPS signals. When compared, the measured and simulated phases show remarkable agreement. The cross validation between CRT and GPS RO has confirmed not only the strength of CRT but also the high accuracy of GPS RO measurements. The primary motivation for this study is enabling effective quality control for GPS RO data, overcoming a complicated error structure in the high-level data. CRT has also shown a great deal of potential for improved utilization of GPS RO data for geophysical research.
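The details of the CRT above are not given in this record, but the generic core of ray tracing through a continuously varying refractive index, RK4 integration of the ray equations, can be sketched in two dimensions (n and grad_n are user-supplied callables; a real atmospheric tracer adds Earth curvature, ray shooting to hit the receiver, and bending-angle bookkeeping):

```python
import numpy as np

def trace_ray(r0, dir0, n, grad_n, step=0.01, n_steps=1000):
    """RK4 integration of the ray equations dr/dt = T, dT/dt = n * grad(n),
    where T = n * dr/ds and dt = ds / n (a standard change of variable that
    removes the 1/n factor from the right-hand side)."""
    d0 = np.array(dir0, dtype=float)
    r0 = np.array(r0, dtype=float)
    y = np.concatenate([r0, n(r0) * d0 / np.linalg.norm(d0)])

    def deriv(state):
        r, T = state[:2], state[2:]
        return np.concatenate([T, n(r) * grad_n(r)])

    path = [y[:2].copy()]
    for _ in range(n_steps):
        k1 = deriv(y)
        k2 = deriv(y + 0.5 * step * k1)
        k3 = deriv(y + 0.5 * step * k2)
        k4 = deriv(y + step * k3)
        y = y + (step / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        path.append(y[:2].copy())
    return np.array(path)
```

In a homogeneous medium the gradient vanishes and the integrator reproduces a straight line, a useful sanity check before adding a stratified refractivity profile.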
First-order convex feasibility algorithms for x-ray CT
Sidky, Emil Y.; Jørgensen, Jakob Heide; Pan, Xiaochuan
2013-01-01
Purpose: Iterative image reconstruction (IIR) algorithms in computed tomography (CT) are based on algorithms for solving a particular optimization problem. Design of the IIR algorithm, therefore, is aided by knowledge of the solution to the optimization problem on which it is based. Oftentimes, however, it is impractical to achieve an accurate solution to the optimization of interest, which complicates the design of IIR algorithms. This issue is particularly acute for CT with a limited angular-range scan, which leads to poorly conditioned system matrices and difficult-to-solve optimization problems. In this paper, we develop IIR algorithms which solve a certain type of optimization called convex feasibility. The convex feasibility approach can provide alternatives to unconstrained optimization approaches and at the same time allow for rapidly convergent algorithms for their solution, thereby facilitating...
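The convex feasibility setting can be illustrated with the classic alternating-projection method (POCS). This is not the paper's first-order algorithm, just a minimal sketch of what "find a point in the intersection of convex sets" means, using two toy sets:

```python
import numpy as np

def pocs(x0, projectors, iters=100):
    """Projection onto convex sets (POCS): cyclically apply each projector.
    If the sets intersect, the iterates converge to a point in the
    intersection, i.e. a solution of the convex feasibility problem."""
    x = np.array(x0, dtype=float)
    for _ in range(iters):
        for proj in projectors:
            x = proj(x)
    return x

# Two convex sets: the nonnegative orthant, and the hyperplane a.x = b.
a = np.array([1.0, 1.0, 1.0])
b = 3.0
proj_nonneg = lambda x: np.clip(x, 0.0, None)         # nearest nonnegative point
proj_plane = lambda x: x + (b - a @ x) / (a @ a) * a  # orthogonal projection
x = pocs([-1.0, 0.5, 4.0], [proj_nonneg, proj_plane], iters=50)
```

In CT the sets would instead encode data fidelity (e.g. a tolerance on the projection residual) and image constraints such as nonnegativity.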
Kohei Arai
2012-06-01
A simulation method for sea water containing spherical and non-spherical particles of suspended solids and phytoplankton, based on Monte Carlo ray tracing (MCRT), is proposed for identifying non-spherical species of phytoplankton. The simulation results validate the proposed MCRT model and show that spherical and non-spherical particle shapes in sea water can, to some extent, be distinguished. Simulations with different particle shapes, prolate and oblate, show that the degree of polarization (DP) depends on shape. Therefore, non-spherical phytoplankton can be identified through polarization measurements of the ocean.
Kohei Arai
2013-04-01
A comparative study of linear and nonlinear mixed-pixel models, in which pixels of remote sensing satellite images are composed of several ground cover materials mixed together, is conducted for satellite image analysis. The mixed-pixel models are based on Cierniewski's ground surface reflectance model. The comparison is carried out using Monte Carlo ray tracing (MCRT) simulations. The simulation study clarifies the difference between the linear and nonlinear mixed-pixel models and validates the simulation model.
Barbosa, Rommel M; Batista, Bruno L; Barião, Camila V; Varrique, Renan M; Coelho, Vinicius A; Campiglia, Andres D; Barbosa, Fernando
2015-10-01
A practical and simple method for verifying the authenticity of organic sugarcane samples, based on machine-learning algorithms and trace element determination by inductively coupled plasma mass spectrometry, is proposed. Reference ranges for 32 chemical elements in 22 sugarcane samples (13 organic and 9 non-organic) were established, and two algorithms, Naive Bayes (NB) and Random Forest (RF), were evaluated to classify the samples. Accurate results (>90%) were obtained when using all 32 elements. However, accuracy improved (to 95.4% for NB) when only eight elements (Rb, U, Al, Sr, Dy, Nb, Ta, Mo), chosen by a feature selection algorithm, were employed. Thus, a fingerprint based on trace element levels combined with machine-learning classification algorithms may serve as a simple alternative for authenticity evaluation of organic sugarcane samples.
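The Naive Bayes classifier used above is simple enough to sketch from scratch. The data below are invented for illustration (two hypothetical element concentrations per sample), not the paper's measurements:

```python
import math
from collections import defaultdict

def fit_gnb(X, y):
    """Gaussian naive Bayes: per-class log-prior plus per-feature mean and
    variance (a small variance floor avoids division by zero)."""
    by_class = defaultdict(list)
    for xi, yi in zip(X, y):
        by_class[yi].append(xi)
    model = {}
    for c, rows in by_class.items():
        n = len(rows)
        cols = list(zip(*rows))
        means = [sum(col) / n for col in cols]
        var = [sum((v - m) ** 2 for v in col) / n + 1e-9
               for col, m in zip(cols, means)]
        model[c] = (math.log(n / len(X)), means, var)
    return model

def predict_gnb(model, x):
    """Pick the class maximizing log-prior + sum of log Gaussian densities,
    i.e. features are treated as conditionally independent given the class."""
    best, best_lp = None, -math.inf
    for c, (log_prior, means, var) in model.items():
        lp = log_prior + sum(
            -0.5 * math.log(2 * math.pi * v) - (xi - m) ** 2 / (2 * v)
            for xi, m, v in zip(x, means, var))
        if lp > best_lp:
            best, best_lp = c, lp
    return best
```

With only 22 samples and 32 features, the independence assumption and the feature selection step reported in the abstract matter a great deal; this sketch shows only the core classification rule.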
Bieberle, M; Hampel, U
2015-06-13
Tomographic image reconstruction is based on recovering an object distribution from its projections, which have been acquired from all angular views around the object. If the angular range is limited to less than 180° of parallel projections, typical reconstruction artefacts arise when standard algorithms are used. To compensate for this, specialized algorithms using a priori information about the object need to be applied. The application behind this work is ultrafast limited-angle X-ray computed tomography of two-phase flows. Here, only a binary distribution of the two phases needs to be reconstructed, which reduces the complexity of the inverse problem. To solve it, a new level-set reconstruction algorithm (LSR) is proposed. It includes one force-function term that accounts for matching the projection data and one that incorporates a curvature-dependent smoothing of the phase boundary. The algorithm has been validated using simulated as well as measured projections of known structures, and its performance has been compared to the algebraic reconstruction technique and a binary derivative of it. The validation, as well as the application of the level-set reconstruction to a dynamic two-phase flow, demonstrated its applicability and its advantages over the other reconstruction algorithms.
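The update structure of such a level-set evolution can be sketched as follows; the data force here uses simple row sums as a stand-in for the real limited-angle projector, and the step sizes and weighting are assumptions:

```python
import numpy as np

def curvature(phi, eps=1e-8):
    """Mean curvature of the level-set function via central differences:
    divergence of the normalized gradient field."""
    gy, gx = np.gradient(phi)
    norm = np.sqrt(gx**2 + gy**2) + eps
    nyy, _ = np.gradient(gy / norm)
    _, nxx = np.gradient(gx / norm)
    return nxx + nyy

def level_set_step(phi, measured, dt=0.2, beta=0.5):
    """One LSR-style update: a data-matching force (backprojected row-sum
    residual, standing in for the true projector) plus curvature smoothing."""
    binary = (phi > 0).astype(float)             # current binary phase estimate
    residual = measured - binary.sum(axis=1)     # 1D "projection" mismatch
    data_force = np.tile(residual[:, None], (1, phi.shape[1]))
    return phi + dt * (data_force + beta * curvature(phi))
```

Each step pushes the zero level set of phi so that the implied binary object better matches the measured projections, while the curvature term keeps the phase boundary smooth.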
Cardille, J. A.
2015-12-01
With the opening of the Landsat archive, researchers have a vast new data source teeming with imagery and potential. Beyond Landsat, data from other sensors are newly available as well, including ALOS/PALSAR, Sentinel-1 and -2, MERIS, and many more. Google Earth Engine, developed to organize these immense data sets and provide analysis tools for them, is an ideal platform for researchers trying to sift through huge image stacks: it offers nearly unlimited processing power and storage with a straightforward programming interface. Yet labeling forest change through time remains challenging given the current state of the art for interpreting remote sensing image sequences, and combining data from very different imaging platforms remains quite difficult. To address these challenges, we developed the BULC (Bayesian Updating of Land Cover) algorithm, designed for the continuous updating of land-cover classifications through time in large data sets. The algorithm ingests data from any of the wide variety of earth-resources sensors; it maintains a running estimate of the land-cover probabilities and the most probable class at all time points along a sequence of events. Here we compare BULC results from two study sites that witnessed considerable forest change in the last 40 years: the Pacific Northwest of the United States and the Mato Grosso region of Brazil. In Brazil, we incorporated rough classifications from more than 100 images of varying quality, mixing imagery from more than 10 different sensors. In the Pacific Northwest, we used BULC to identify forest changes due to logging and urbanization from 1973 to the present. Both regions yielded classification sequences that were better than many of the component days, effectively ignoring clouds and other unwanted signal while fusing the information contained on several platforms. As we leave remote sensing's data-poor era and enter a period with multiple looks at Earth's surface from multiple sensors over a short period of ...
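The running probability update at the heart of BULC-style fusion can be sketched with a simple Bayesian step (a minimal illustration, assuming a single scalar accuracy for each incoming classification; not the published BULC implementation):

```python
def bulc_update(prior, observed, accuracy):
    """One Bayesian update of per-class land-cover probabilities.
    prior:    dict class -> probability (sums to 1)
    observed: class label from one (possibly noisy) classification
    accuracy: assumed probability that the classification is correct,
              with errors spread evenly over the other classes."""
    n = len(prior)
    posterior = {}
    for c, p in prior.items():
        likelihood = accuracy if c == observed else (1.0 - accuracy) / (n - 1)
        posterior[c] = likelihood * p
    z = sum(posterior.values())          # renormalize
    return {c: v / z for c, v in posterior.items()}
```

Feeding a sequence of noisy per-date classifications through this update drives the probability of the consistently observed class toward 1, which is how a sequence can end up better than many of its component days.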
Doert, Marlene [Technische Universitaet Dortmund (Germany); Ruhr-Universitaet Bochum (Germany); Einecke, Sabrina [Technische Universitaet Dortmund (Germany); Errando, Manel [Barnard College, Columbia University, New York City (United States)
2015-07-01
The second Fermi-LAT source catalog (2FGL) is the deepest all-sky survey of the gamma-ray sky currently available to the community. Out of the 1873 catalog sources, 576 remain unassociated. We present a search for active galactic nuclei (AGN) among these unassociated objects, which aims at a reduction of the number of unassociated gamma-ray sources and a more complete characterization of the population of gamma-ray emitting AGN. Our study uses two complementary machine learning algorithms which are individually trained on the gamma-ray properties of associated 2FGL sources and thereafter applied to the unassociated sample. The intersection of the two methods yields a high-confidence sample of 231 AGN candidate sources. We estimate the performance of the classification by taking inherent differences between the samples of associated and unassociated 2FGL sources into account. A search for infrared counterparts and first results from follow-up studies in the X-ray band using Swift satellite data for a subset of our AGN candidates are also presented.
Hu, Jian-Ying; Hirokawa, Takeshi; Nishiyama, Fumitaka; Kimura, Goji; Kiso, Yoshiyuki; Ito, Kazuaki; Shoto, Eiji [Hiroshima Univ., Higashi-Hiroshima (Japan). Faculty of Engineering
1993-12-31
A misch metal, an alloy of light rare earth elements, was analyzed by a new coupled analytical method, ITP-PIXE (isotachophoresis combined with particle-induced X-ray emission). The sample solution, containing ca. 1 mg of misch metal, was separated and fractionated using a preparative isotachophoretic analyzer. The dropwise fractions, containing nanomole amounts of rare earth elements, were analyzed off-line by PIXE. The matrix effect in the X-ray measurement was reduced by the isotachophoretic removal of the dominant lanthanoids and the preconcentration of the trace elements of interest. Consequently, the minor elements Sm, Gd, Tb, Dy, Ho, Er, Yb, and Y could be determined accurately. The least abundant element found was Yb (4 ppm, i.e., 4 ng in a 1 mg sample). The good accuracy of the ITP-PIXE method was also demonstrated for several model samples of lanthanoids, in which La was the dominant element and thirteen lanthanoids were the minor elements; the ratio was varied from 500:1 to 50000:1. Even in the case of 50000:1, ca. 10% accuracy was achieved for each minor element except for Sm (23%), Gd (17%), and Yb (18%). The analytical results by ITP-PIXE were compared with those obtained by ICP-AES (inductively coupled plasma atomic emission spectrometry). (author)
Feizi, Sepehr; Delfazayebaher, Siamak; Ownagh, Vahid; Sadeghpour, Fatemeh
2017-08-03
To evaluate the agreement between total corneal astigmatism calculated by vector summation of anterior and posterior corneal astigmatism (TCAVec) and total corneal astigmatism measured by ray tracing (TCARay). This study enrolled a total of 204 right eyes of 204 normal subjects. The eyes were measured using a Galilei double Scheimpflug analyzer. The measured parameters included simulated keratometric astigmatism using the keratometric index, anterior corneal astigmatism using the corneal refractive index, posterior corneal astigmatism, and TCARay. TCAVec was derived by vector summation of the astigmatism on the anterior and posterior corneal surfaces. The magnitudes and axes of TCAVec and TCARay were compared. The Pearson correlation coefficient and Bland-Altman plots were used to assess the relationship and agreement between TCAVec and TCARay, respectively. The mean TCAVec and TCARay magnitudes were 0.76±0.57 D and 1.00±0.78 D, respectively; the difference was statistically significant. The vector summation and ray tracing methods therefore cannot be used interchangeably, as there was a systematic error between the TCAVec and TCARay magnitudes.
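The TCAVec-style vector summation can be sketched using the standard double-angle representation of astigmatism (a minimal illustration; the analyzer software's exact convention may differ):

```python
import math

def astig_to_vec(magnitude, axis_deg):
    """Double-angle representation: astigmatism of given magnitude (D)
    and axis (degrees) as a 2D vector."""
    a = math.radians(2 * axis_deg)
    return magnitude * math.cos(a), magnitude * math.sin(a)

def vec_sum_astigmatism(ant_mag, ant_axis, post_mag, post_axis):
    """Vector summation of anterior and posterior corneal astigmatism.
    Returns (magnitude in D, axis in degrees within 0-180)."""
    x1, y1 = astig_to_vec(ant_mag, ant_axis)
    x2, y2 = astig_to_vec(post_mag, post_axis)
    x, y = x1 + x2, y1 + y2
    mag = math.hypot(x, y)
    axis = math.degrees(math.atan2(y, x)) / 2.0
    return mag, axis % 180.0
```

Astigmatisms at the same axis add in magnitude, while perpendicular axes partially cancel, which is why the summed value can differ systematically from a direct ray-traced measurement.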
A constrained, total-variation minimization algorithm for low-intensity X-ray CT
Sidky, Emil Y; Ullberg, Christer; Pan, Xiaochuan
2010-01-01
Purpose: We develop an iterative image-reconstruction algorithm for application to low-intensity computed tomography (CT) projection data, which is based on constrained total-variation (TV) minimization. The algorithm design focuses on recovering structure on length scales comparable to a detector-bin width. Method: Recovering resolution on the scale of a detector bin requires that the pixel size be much smaller than the bin width. The resulting image array contains many more pixels than there are data, and this undersampling is overcome with a combination of Fourier upsampling of each projection and the use of constrained TV minimization, as suggested by compressive sensing. The presented pseudo-code for solving constrained TV minimization is designed to yield an accurate solution to this optimization problem within 100 iterations. Results: The proposed image-reconstruction algorithm is applied to a low-intensity scan of a rabbit with a thin wire, to test resolution. The proposed algorithm is compared with filtere...
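A minimal unconstrained surrogate of TV-regularized reconstruction looks like the following (plain gradient descent on a smoothed 1D TV penalty, not the paper's constrained algorithm; the step size and penalty weight are assumptions):

```python
import numpy as np

def tv_reconstruct(A, b, n_iter=500, step=0.01, lam=0.1, eps=1e-6):
    """Minimize ||Ax - b||^2 + lam * sum |x_{i+1} - x_i| (smoothed)
    by gradient descent, as a toy stand-in for constrained TV minimization."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad_data = 2 * A.T @ (A @ x - b)
        d = np.diff(x)
        g = d / np.sqrt(d**2 + eps)      # smoothed sign of neighbor differences
        grad_tv = np.zeros_like(x)       # dTV/dx_j = sign(d_{j-1}) - sign(d_j)
        grad_tv[:-1] -= g
        grad_tv[1:] += g
        x -= step * (grad_data + lam * grad_tv)
    return x
```

With fewer measurements than unknowns, the data term alone is underdetermined; the TV term supplies the piecewise-constancy prior that lets the undersampled system still produce a useful image, which is the compressive-sensing idea the abstract invokes.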
Berry, Jonna Elizabeth [Iowa State Univ., Ames, IA (United States)
2016-10-25
This dissertation describes a variety of studies on the determination of trace elements in samples of forensic importance. Laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS) was used to determine the trace element composition of numerous lipstick samples. The lipstick samples were determined to be homogeneous, and most samples of similar colors were readily distinguishable at a 95% confidence interval based on trace element composition. Numerous strands of a multi-strand speaker cable were also analyzed by LA-ICP-MS. The strands in this study were spatially heterogeneous in trace element composition. In actual forensic applications, the possibility of spatial heterogeneity must be considered, especially in cases where only small samples (e.g., copper wire fragments after an explosion) are available. The effects of many unpredictable variables, such as weather, temperature, and human activity, on the retention of gunshot residue (GSR) around projectile wounds were assessed with LA-ICP-MS. Skin samples around gunshot and stab wounds, and larvae feeding in and around the wounds on decomposing pig carcasses, were analyzed for elements consistent with GSR (Sb, Pb, Ba, and Cu). These elements were detected at higher levels in skin and larvae samples around the gunshot wounds than around the stab wounds for an extended period throughout decomposition, in both a winter and a summer study. After decomposition, radiographic images of the pig bones containing possible damage from bullets revealed metallic particles embedded within a number of bones. Metallic particles within the bones were analyzed with X-ray K-edge densitometry and determined to contain lead, indicating that bullet residue can be retained throughout decomposition and detected within bones containing projectile trauma.
Pham, Mai Quyen; Ducros, Nicolas; Nicolas, Barbara
2017-03-01
Spectral computed tomography (CT) exploits the measurements obtained by a photon counting detector to reconstruct the chemical composition of an object. In particular, spectral CT has shown a very good ability to image K-edge contrast agents. Spectral CT is an inverse problem that can be addressed by solving two subproblems, namely the basis material decomposition (BMD) problem and the tomographic reconstruction problem. In this work, we focus on the BMD problem, which is ill-posed and nonlinear. The BMD problem is classically either linearized, which enables reconstruction based on compressed sensing methods, or solved nonlinearly with no explicit regularization scheme. In a previous communication, we proposed a nonlinear regularized Gauss-Newton (GN) algorithm; however, that algorithm can only be applied to convex regularization functionals, which excludes the lp functionals with p < 1. The approach is evaluated on a thorax phantom made of soft tissue, bone, and gadolinium, which is scanned with a 90-kV x-ray tube and a 3-bin photon counting detector.
Lehmer, B.D; Brandt, W.N.; Schneider, D.P.; Steffen, A.T.; Alexander, D.M.; Bell, E.F.; Hornschemeier, A.E.; McIntosh, D.H.; Bauer, F.E.; Gilli, R.; Mainieri, V.; Silverman, J.D.; Tozzi, P.; Wolf, C.
2008-01-01
We report on the X-ray evolution over the last approx. 9 Gyr of cosmic history (i.e., since z = 1.4) of late-type galaxy populations in the Chandra Deep Field-North and Extended Chandra Deep Field-South (CDF-N and E-CDF-S, respectively; jointly CDFs) survey fields. Our late-type galaxy sample consists of 2568 galaxies, which were identified using rest-frame optical colors and HST morphologies. We utilized X-ray stacking analyses to investigate the X-ray emission from these galaxies, emphasizing the contributions from normal galaxies that are not dominated by active galactic nuclei (AGNs). Over this redshift range, we find significant increases (factors of approx. 5-10) in the X-ray-to-optical mean luminosity ratio (L_X/L_B) and the X-ray-to-stellar-mass mean ratio (L_X/M_*) for galaxy populations selected by L_B and M_*, respectively. When analyzing galaxy samples selected via SFR, we find that the mean X-ray-to-SFR ratio (L_X/SFR) is consistent with being constant over the entire redshift range for galaxies with SFR = 1-100 solar masses/yr, thus demonstrating that X-ray emission can be used as a robust indicator of star-formation activity out to z approx. 1.4. We find that the star-formation activity (as traced by X-ray luminosity) per unit stellar mass in a given redshift bin increases with decreasing stellar mass over the redshift range z = 0.2-1, which is consistent with previous studies of how star-formation activity depends on stellar mass. Finally, we extend our X-ray analyses to Lyman break galaxies at z approx. 3 and estimate that L_X/L_B at z approx. 3 is similar to its value at z = 1.4.
Dittmann, Jonas
2016-01-01
Cone beam projection is an essential and particularly time-consuming part of any iterative tomographic reconstruction algorithm. On current graphics hardware, the amount and pattern of memory accesses in particular become a limiting factor when read-only textures cannot be used. With the final objective of accelerating iterative reconstruction techniques, a non-oversampling, Joseph-like ray-tracing projection algorithm for three dimensions is presented, featuring both a branchless sampling loop and a cache-friendly memory access pattern. An interpretation of the employed interpolation scheme is given with respect to the effective beam and voxel models it implies. The method is further compared to existing techniques, and the modifications required to implement further voxel and beam shape models are outlined. Both memory access rates and total run time are benchmarked on a current consumer-grade graphics processing unit and explicitly compared to the performance of a classic Digital Differential Analyzer (DDA) algorithm. T...
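For contrast with the proposed non-oversampling method, the sampling loop of a classic DDA-style projector can be sketched as follows (nearest-voxel sampling with a fixed step; a Joseph-type scheme instead takes one interpolated sample per voxel slice):

```python
def dda_line_integral(volume, start, direction, step=0.5):
    """DDA-style forward projection for one ray: march along the ray with
    a fixed step, sample the containing voxel, accumulate step * value.
    volume: 2D list of lists; start: (x, y); direction: unit vector."""
    nx, ny = len(volume[0]), len(volume)
    x, y = start
    total = 0.0
    while 0 <= x < nx and 0 <= y < ny:
        total += step * volume[int(y)][int(x)]  # nearest-voxel sampling
        x += direction[0] * step
        y += direction[1] * step
    return total
```

The fixed-step loop touches memory at a rate set by the step size rather than the voxel grid, which is one reason memory access patterns dominate the cost on GPUs and why the interpolated, one-sample-per-slice alternative pays off.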
Islam, M J; Reza, A W; Kausar, A S M Z; Ramiah, H
2014-01-01
The advent of technology and the increasing use of wireless networks have led to the development of the Wireless Body Area Network (WBAN) to continuously monitor changes in physiological data in a cost-efficient manner. While numerous studies on wave propagation characterization have addressed intrabody communication, this study emphasizes the wave propagation characterization between the control units (CUs) and the wireless access point (AP) in a hospital scenario. Ray tracing is a tool to predict the rays that characterize the wave propagation, but it takes huge simulation time, especially when multiple transmitters are involved in transmitting physiological data in a realistic hospital environment. Therefore, this study develops an accelerated ray tracing method based on the nearest-neighbor cell and prior knowledge of intersection techniques. Besides this, a red-black tree is used to store objects in the hospital environment and provide a faster retrieval mechanism for them. To prove the superiority of the approach, a detailed complexity analysis and calculations of reflection and transmission coefficients are also presented in this paper. The results show that the proposed method is about 1.51, 2.1, and 2.9 times faster than the Object Distribution Technique (ODT), Space Volumetric Partitioning (SVP), and Angular Z-Buffer (AZB) methods, respectively. To show the various effects on received power at 60 GHz, a few comparisons are made; it is found that, on average, received power attenuations of -9.44 dBm, -8.23 dBm, and -9.27 dBm should be considered when the human, AP, and CU move in a given hospital scenario.
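The reflection and transmission coefficients mentioned above are commonly computed from the Fresnel equations; a sketch for s-polarization at a dielectric boundary is shown below (textbook formulas, not the paper's specific 60 GHz material model):

```python
import cmath

def fresnel_s(n1, n2, theta_i):
    """Fresnel amplitude reflection/transmission coefficients for
    s-polarization at a boundary between media with (possibly complex)
    refractive indices n1 and n2; theta_i is the incidence angle in radians."""
    cos_i = cmath.cos(theta_i)
    sin_t = n1 * cmath.sin(theta_i) / n2      # Snell's law
    cos_t = cmath.sqrt(1 - sin_t**2)
    r = (n1 * cos_i - n2 * cos_t) / (n1 * cos_i + n2 * cos_t)
    t = 2 * n1 * cos_i / (n1 * cos_i + n2 * cos_t)
    return r, t
```

A ray tracer applies |r|^2 at each wall reflection and the corresponding transmitted power at each wall penetration, which is where the per-path attenuations in the received-power figures come from.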
Tibaldo, L.; Digel, S. W.; Franckowiak, A.; Moskalenko, I. V.; Negro, M.; Orlando, E.; Porter, T. A.; Reimer, O. [W. W. Hansen Experimental Physics Laboratory, Kavli Institute for Particle Astrophysics and Cosmology, Department of Physics and SLAC National Accelerator Laboratory, Stanford University, Stanford, CA 94305 (United States); Casandjian, J. M.; Grenier, I. A.; Marshall, D. J. [Laboratoire AIM, CEA-IRFU/CNRS/Université Paris Diderot, Service d’Astrophysique, CEA Saclay, F-91191 Gif sur Yvette (France); Jóhannesson, G. [Science Institute, University of Iceland, IS-107 Reykjavik (Iceland); Strong, A. W., E-mail: ltibaldo@slac.stanford.edu, E-mail: digel@stanford.edu [Max-Planck Institut für extraterrestrische Physik, D-85748 Garching (Germany)
2015-07-10
It is widely accepted that cosmic rays (CRs) up to at least PeV energies are Galactic in origin. Accelerated particles are injected into the interstellar medium where they propagate to the farthest reaches of the Milky Way, including a surrounding halo. The composition of CRs coming to the solar system can be measured directly and has been used to infer the details of CR propagation that are extrapolated to the whole Galaxy. In contrast, indirect methods, such as observations of γ-ray emission from CR interactions with interstellar gas, have been employed to directly probe the CR densities in distant locations throughout the Galactic plane. In this article we use 73 months of data from the Fermi Large Area Telescope in the energy range between 300 MeV and 10 GeV to search for γ-ray emission produced by CR interactions in several high- and intermediate-velocity clouds (IVCs) located at up to ∼7 kpc above the Galactic plane. We achieve the first detection of IVCs in γ rays and set upper limits on the emission from the remaining targets, thereby tracing the distribution of CR nuclei in the halo for the first time. We find that the γ-ray emissivity per H atom decreases with increasing distance from the plane at 97.5% confidence level. This corroborates the notion that CRs at the relevant energies originate in the Galactic disk. The emissivity of the upper intermediate-velocity Arch hints at a 50% decline of CR densities within 2 kpc from the plane. We compare our results to predictions of CR propagation models.
Petitgirard, Sylvain; Daniel, Isabelle; Dabin, Yves; Cardon, Hervé; Tucoulou, Rémi; Susini, Jean
2009-03-01
We present a new diamond anvil cell (DAC), hereafter called the fluoX DAC, dedicated to x-ray fluorescence (XRF) analysis of trace elements in fluids under high pressure and high temperature, to at least 10 GPa and 1273 K. This new setup has allowed measurement of Rb, Sr, Y, and Zr at concentrations of 50 ppm, up to 5.6 GPa and 1273 K. The fluoX DAC features an optimized shielding and collection geometry that reduces the background level in the XRF spectrum; consequently, minimum detection limits of 0.3 ppm were calculated for the abovementioned elements in this new setup. Coupled to the hard x-ray focusing beamline ID22 (ESRF, France), this new DAC offers the possibility of analyzing, in situ at high pressure and high temperature, ppm-level concentrations of heavy elements, rare earth elements, and first-row transition metals, which are of prime importance in geochemical processes. The fluoX DAC is also suitable for x-ray diffraction over the same high pressure-temperature range.
Selig, Marco; Oppermann, Niels; Enßlin, Torsten A
2014-01-01
We analyze the 5.5-year all-sky data from the Fermi Large Area Telescope, restricted to gamma-ray photons with energies between 0.6 and 307.2 GeV. Raw count maps show a superposition of diffuse and point-like contributions and are subject to shot noise and instrumental artifacts. Using the D3PO inference algorithm, we model the observed photon counts as the sum of a diffuse and a point-like photon flux, convolved with the instrumental beam and subject to Poissonian shot noise. The D3PO algorithm performs a Bayesian inference in this setting without the use of spatial or spectral templates; i.e., it removes the shot noise, deconvolves the instrumental response, and yields estimates for the two flux components separately. The non-parametric reconstruction uncovers the morphology of the diffuse photon flux up to several hundred GeV. We present an all-sky spectral index map for the diffuse component. We show that the diffuse gamma-ray flux can phenomenologically be described by only two distinct components: a soft co...
Khuder, A. [Department of Chemistry, Atomic Energy Commission, P.O. Box 6091, Damascus (Syrian Arab Republic)], E-mail: scientific2@aec.org.sy; Sawan, M.Kh.; Karjou, J. [Department of Chemistry, Atomic Energy Commission, P.O. Box 6091, Damascus (Syrian Arab Republic); Razouk, A.K. [Department of Agriculture, Atomic Energy Commission, P.O. Box 6091, Damascus (Syrian Arab Republic)
2009-07-15
X-ray fluorescence (XRF) and total-reflection X-ray fluorescence (TXRF) techniques are well suited for the multi-element determination of K, Ca, Mn, Fe, Cu, Zn, Rb, and Sr in Syrian medicinal plant species. The accuracy and precision of both techniques were verified by analyzing the Standard Reference Materials (SRM) peach leaves (1547) and apple leaves (1515). Good agreement between the measured concentrations of these elements and the certified values was obtained, with errors of less than 10.7% for TXRF and 15.8% for XRF. The determination of Br was acceptable only by XRF, with an error of less than 24%. Furthermore, the XRF method showed very good applicability for the determination of K, Ca, Mn, Fe, Cu, Zn, Rb, Sr, and Br in infusions of different Syrian medicinal plant species, namely anise (Anisum vulgare), licorice root (Glycyrrhiza glabra), and white wormwood (Artemisia herba-alba).
Pingbo, An; Li, Wang; Hongxi, Lu; Zhiguo, Yu; Lei, Liu; Xin, Xi; Lixia, Zhao; Junxi, Wang; Jinmin, Li
2016-06-01
The internal quantum efficiency (IQE) of light-emitting diodes can be calculated as the ratio of the external quantum efficiency (EQE) to the light extraction efficiency (LEE). The EQE can be measured experimentally, but the LEE is difficult to calculate because of the complicated LED structures. In this work, a model was established to calculate the LEE by combining the transfer matrix formalism with an in-plane ray tracing method. With the calculated LEE, the IQE was determined and agreed well with the values obtained by the ABC model and the temperature-dependent photoluminescence method. The proposed method makes the determination of the IQE more practical and convenient. Project supported by the National Natural Science Foundation of China (Nos. 11574306, 61334009), the China International Science and Technology Cooperation Program (No. 2014DFG62280), and the National High Technology Program of China (No. 2015AA03A101).
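The IQE = EQE / LEE relation is simple to apply once an LEE estimate exists; the sketch below pairs it with the textbook single-surface escape-cone estimate of LEE (a stand-in for the paper's transfer-matrix plus ray-tracing model; the refractive index value in the test is an assumption):

```python
import math

def escape_cone_lee(n_semiconductor, n_outside=1.0):
    """Single-surface escape-cone estimate of light extraction efficiency:
    the fraction of isotropically emitted photons that hit the top surface
    inside the critical angle for total internal reflection."""
    theta_c = math.asin(n_outside / n_semiconductor)
    return 0.5 * (1.0 - math.cos(theta_c))

def iqe(eqe, lee):
    """Internal quantum efficiency from measured EQE and modeled LEE."""
    return eqe / lee
```

For a high-index semiconductor, the escape cone is narrow and the LEE is only a few percent from a bare surface, which is why a more careful wave-plus-ray model of the full layer stack is needed before dividing it out of the EQE.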
Ezoe, Masako; Sasaki, Miho; Hokura, Akiko; Nakai, Izumi [Tokyo Univ. of Science, Faculty of Science, Tokyo (Japan); Terada, Yasuko [Japan Synchrotron Radiation Research Inst., Mikazuki, Hyogo (Japan); Yoshinaga, Tatsuki; Tukamoto, Katsumi [Tokyo Univ., Ocean Research Inst., Tokyo (Japan); Hagiwara, Atsushi [Nagasaki Univ., Graduate School of Science and Technology, Bunkyou, Nagasaki (Japan)
2002-10-01
Two-dimensional imaging and quantitative analysis of trace elements in the rotifer Brachionus plicatilis, a zooplankton species, were carried out by synchrotron radiation X-ray fluorescence analysis (SR-XRF). The XRF imaging revealed that female rotifers accumulate Fe and Zn in the digestive organ and Fe, Zn, Cu, and Ca in the sexual organs, while the Mn level is high in the head. From a quantitative analysis by inductively coupled plasma mass spectrometry (ICP-MS), we found that rotifers eat chlorella and accumulate the above elements in the body. The results of the quantitative analyses of Mn, Cu, and Zn by SR-XRF in a single sample are in fair agreement with the average values determined by ICP-MS analyses, which were obtained by measuring a large number of rotifers digested in nitric acid. The present study demonstrates that SR-XRF is an effective tool for trace element analysis of a single rotifer individual. (author)
Krishna, A. Keshav; Khanna, Tarun C.; Mohan, K. Rama
2016-08-01
This paper introduces a calibration procedure and provides the data achieved for accuracy, precision, reproducibility and the detection limits for major (Si, Al, Fe, Mn, Mg, Ca, Na, K, Ti, P) and trace (Ba, Cr, Cu, Hf, La, Nb, Ni, Pb, Rb, Sr, Ta, Th, U, Y, Zn, Zr) elements in the routine analysis of geological and environmental samples. Forty-two rock and soil reference materials were used to calibrate and evaluate the analytical method using a sequential wavelength dispersive X-ray fluorescence spectrometer. Samples were prepared as fused glass discs and analysis performed with a total measuring time of thirty-one minutes. Another set of twelve independent reference materials were analyzed for the evaluation of accuracy. The detection limits and accuracy obtained for the trace elements (1-2 mg/kg) are adequate both for geochemical exploration and environmental studies. The fitness for purpose of the results was also evaluated by the quality criteria test proposed by the International Global Geochemical Mapping Program (IGCP) from which it can be deduced that the method is adequate considering geochemical mapping application and accuracy obtained is within the expected interval of certified values in most cases.
Rucker, Dale F.; Ferré, Ty P. A.
2004-08-01
A MATLAB program was developed to invert first-arrival travel time picks from zero-offset profiling borehole ground penetrating radar traces to obtain the electromagnetic wave propagation velocities in soil. Zero-offset profiling refers to a mode of operation wherein the centers of the bistatic antennae are lowered to the same depth below ground for each measurement. The inversion uses a simulated annealing optimization routine, whereby the model attempts to reduce the root-mean-square error between the measured and modeled travel times by perturbing the velocities in a ray tracing routine. Measurement uncertainty is incorporated by presenting the ensemble mean and standard deviation from the results of a Monte Carlo simulation. The program features a pre-processor to modify or delete travel time information from the profile before inversion, and post-processing that presents the ensemble statistics of the water contents inferred from the velocity profile. The program includes a novel application of a graphical user interface to animate the velocity fitting routine.
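The inversion loop can be sketched as follows: a toy one-way vertical travel-time forward model plus a simulated-annealing search (velocities in m/ns; the cooling schedule, perturbation size, and bounds are assumptions, and the real program traces rays rather than assuming vertical propagation):

```python
import math
import random

def travel_time(velocities, thickness):
    """One-way vertical travel time (ns) to the bottom of each layer."""
    t, out = 0.0, []
    for v in velocities:
        t += thickness / v
        out.append(t)
    return out

def invert_sa(observed, thickness, bounds=(0.05, 0.3), n_iter=5000, seed=1):
    """Simulated-annealing inversion: perturb one layer velocity at a time,
    accepting worse models with Boltzmann probability under a toy schedule."""
    rng = random.Random(seed)
    n = len(observed)
    v = [sum(bounds) / 2.0] * n                  # start mid-range
    def rms(vel):
        pred = travel_time(vel, thickness)
        return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, observed)) / n)
    err = rms(v)
    for k in range(n_iter):
        temp = 0.01 * (1.0 - k / n_iter) + 1e-6  # linear cooling
        cand = list(v)
        i = rng.randrange(n)
        cand[i] = min(bounds[1], max(bounds[0], cand[i] + rng.gauss(0, 0.01)))
        cand_err = rms(cand)
        if cand_err < err or rng.random() < math.exp((err - cand_err) / temp):
            v, err = cand, cand_err
    return v, err
```

Repeating the inversion with travel times perturbed by the picking uncertainty, and collecting the ensemble mean and standard deviation of the resulting velocity profiles, reproduces the Monte Carlo treatment described above.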
Accelerated Algorithm of Ray Casting in Medical Volume Rendering%医学体绘制的一种快速光线投射算法
牛翠霞; 范辉; 杜慧秋
2006-01-01
An accelerated direct volume rendering (DVR) algorithm for medical data sets is discussed. Building on several DVR acceleration techniques, an efficient ray-casting algorithm is proposed that improves on the traditional ray-casting algorithm. The algorithm mainly applies polygon scan conversion and discretization/voxelization of the cast rays. It uses Graham's convex hull algorithm, together with an intersection algorithm against the x, y, z families of planes, to clip the data set and the cast rays.
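The paper's convex-hull and plane-family clipping is specific to its data structures, but the purpose it serves, restricting each cast ray to the occupied part of the volume, is conventionally met with the slab method, sketched here:

```python
def clip_ray_to_box(origin, direction, box_min, box_max):
    """Slab-method clipping of a ray against an axis-aligned volume:
    returns the parameter interval (t_near, t_far) where the ray is inside
    the box, or None if the ray misses it entirely."""
    t_near, t_far = float('-inf'), float('inf')
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-12:               # ray parallel to this slab
            if o < lo or o > hi:
                return None
            continue
        t1, t2 = (lo - o) / d, (hi - o) / d
        if t1 > t2:
            t1, t2 = t2, t1
        t_near, t_far = max(t_near, t1), min(t_far, t2)
        if t_near > t_far:               # slabs' intervals do not overlap
            return None
    return t_near, t_far
```

Sampling only between t_near and t_far skips all the empty space in front of and behind the data, which is the same saving the convex-hull clipping is after.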
Mampuya, Wambaka Ange [Department of Radiation Oncology and Image–Applied Therapy, Graduate School of Medicine, Kyoto University, Kyoto (Japan); Matsuo, Yukinori, E-mail: ymatsuo@kuhp.kyoto-u.ac.jp [Department of Radiation Oncology and Image–Applied Therapy, Graduate School of Medicine, Kyoto University, Kyoto (Japan); Nakamura, Akira; Nakamura, Mitsuhiro; Mukumoto, Nobutaka; Miyabe, Yuki; Narabayashi, Masaru; Sakanaka, Katsuyuki; Mizowaki, Takashi; Hiraoka, Masahiro [Department of Radiation Oncology and Image–Applied Therapy, Graduate School of Medicine, Kyoto University, Kyoto (Japan)
2013-04-01
The objective of this study was to evaluate the differences in dose-volumetric data obtained using the analytical anisotropic algorithm (AAA) vs the x-ray voxel Monte Carlo (XVMC) algorithm for stereotactic body radiation therapy (SBRT) for lung cancer. Dose-volumetric data from 20 patients treated with SBRT for solitary lung cancer, generated using the iPlan XVMC for the Novalis system consisting of a 6-MV linear accelerator and micro-multileaf collimators, were recalculated with the AAA in Eclipse using the same monitor units and identical beam setup. The mean isocenter dose was 100.2% and 98.7% of the prescribed dose according to XVMC and AAA, respectively. Mean values of the maximal dose (D_max), the minimal dose (D_min), and the dose received by 95% of the volume (D_95) for the planning target volume (PTV) with XVMC were 104.3%, 75.1%, and 86.2%, respectively. When recalculated with the AAA, those values were 100.8%, 77.1%, and 85.4%, respectively. Mean dose parameter values considered for the normal lung, namely the mean lung dose, V_5, and V_20, were 3.7 Gy, 19.4%, and 5.0% for XVMC and 3.6 Gy, 18.3%, and 4.7% for the AAA, respectively. All of these dose-volumetric differences between the two algorithms were within 5% of the prescribed dose. The effects of PTV size and tumor location on the differences in dose parameters for the PTV between the AAA and XVMC were evaluated. A significant effect of the PTV on the difference in D_95 between the AAA and XVMC was observed (p = 0.03). Differences in the marginal doses, namely D_min and D_95, were statistically significant between peripherally and centrally located tumors (p = 0.04 and p = 0.02, respectively). Tumor location and volume might have an effect on the differences in dose-volumetric parameters. The differences between the AAA and XVMC were considered to be within an acceptable range (<5 percentage points).
3D weighting in cone beam image reconstruction algorithms: ray-driven vs. pixel-driven.
Tang, Xiangyang; Nilsen, Roy A; Smolin, Alex; Lifland, Ilya; Samsonov, Dmitry; Taha, Basel
2008-01-01
A 3D weighting scheme has been proposed previously to reconstruct images from both helical and axial scans in state-of-the-art volumetric CT scanners for diagnostic imaging. Such a 3D weighting can be implemented in either a ray-driven or a pixel-driven manner, depending on the available computation resources. An experimental study is conducted in this paper to evaluate the difference between the ray-driven and pixel-driven implementations of the 3D weighting from the perspective of image quality, while their computational complexity is analyzed theoretically. Computer-simulated data and several phantoms, such as the helical body phantom and humanoid chest phantom, are employed in the experimental study, showing that both the ray-driven and pixel-driven 3D weighting provide superior image quality for diagnostic imaging in clinical applications. With the availability of image reconstruction engines of increasing computational power, it is believed that the pixel-driven 3D weighting will be dominantly employed in state-of-the-art volumetric CT scanners in clinical applications.
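To illustrate the pixel-driven style of implementation discussed above, the sketch below backprojects a parallel-beam sinogram by computing, for every image pixel, its detector coordinate at each view and linearly interpolating between the two nearest detector bins; a ray-driven implementation would instead walk each ray and distribute its value over the pixels it crosses. This is a generic textbook backprojector, not the paper's 3D weighting scheme:

```python
import numpy as np

def pixel_driven_backproject(sinogram, angles, n):
    """Pixel-driven parallel-beam backprojection onto an n x n image.

    For each pixel, the detector coordinate t at view angle theta is
    t = x*cos(theta) + y*sin(theta); the projection value is linearly
    interpolated between the two adjacent detector bins.
    """
    img = np.zeros((n, n))
    centre = (n - 1) / 2.0
    ys, xs = np.mgrid[0:n, 0:n]
    xs = xs - centre
    ys = ys - centre
    n_det = sinogram.shape[1]
    det_centre = (n_det - 1) / 2.0
    for proj, theta in zip(sinogram, angles):
        t = xs * np.cos(theta) + ys * np.sin(theta) + det_centre
        i0 = np.clip(np.floor(t).astype(int), 0, n_det - 2)
        w = t - i0  # linear interpolation weight between detector bins
        img += (1 - w) * proj[i0] + w * proj[i0 + 1]
    return img * np.pi / len(angles)
```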
Trace determination of uranium in fertilizer samples by total reflection X-ray fluorescence
N L Misra; Sangita Dhara; Arijeet Das; G S Lodha; S K Aggarwal; I Varga
2011-02-01
Uranium is reported to be present in phosphate fertilizers. The recovery of uranium from fertilizers is important because it can be used as fuel in nuclear reactors and also because of environmental concerns. For both of these activities, suitable methods for determining uranium at trace levels in these fertilizers are required. Studies have been initiated towards such TXRF determination of uranium, and the results are reported in the present paper. For the TXRF determinations, the fertilizer samples were processed with nitric acid and the uranium present was separated by solvent extraction using tri-n-butyl phosphate as the extractant. The organic phase containing uranium was equilibrated with 1.5% suprapure nitric acid to bring the uranium into the aqueous phase. This aqueous phase was mixed with Y as an internal standard, and the TXRF spectra were measured by depositing the samples on float glass supports. The amounts of uranium in four fertilizer samples of Hungarian origin were determined by processing these TXRF spectra. Uranium concentrations in two fertilizer samples were found to be in the range of 4–6 /, whereas the other two fertilizer samples did not show the presence of uranium. The precision of the TXRF determination of uranium was found to be better than 8% (1).
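The internal-standard quantification step described above follows the usual TXRF relation: the analyte concentration scales with the ratio of its fluorescence intensity to that of the yttrium internal standard, corrected by a relative sensitivity factor. A hypothetical sketch; the intensities, concentration, and sensitivity in the usage line are illustrative placeholders, not values from the paper:

```python
def txrf_concentration(intensity_analyte, intensity_is, conc_is, rel_sensitivity):
    """Internal-standard quantification used in TXRF.

    conc_analyte = (I_analyte / I_IS) * conc_IS / S_rel,
    where S_rel is the analyte's fluorescence sensitivity relative to the
    internal standard (here yttrium).
    """
    return (intensity_analyte / intensity_is) * conc_is / rel_sensitivity

# Illustrative numbers only: 1200 analyte counts, 3000 Y counts,
# 10 units of Y added, assumed relative sensitivity 0.8.
conc_u = txrf_concentration(1200.0, 3000.0, 10.0, 0.8)
```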
Miksat, J.; Müller, T. M.; Wenzel, F.
2008-07-01
Finite difference (FD) simulation of elastic wave propagation is an important tool in geophysical research. As large-scale 3-D simulations are only feasible on supercomputers or clusters, and even then the simulations are limited to long periods compared to the model size, 2-D FD simulations are widespread. Whereas in general 3-D heterogeneous structures it is not possible to infer the correct amplitude and waveform from 2-D simulations, in 2.5-D heterogeneous structures some inferences are possible. In particular, Vidale & Helmberger developed an approach that simulates 3-D waveforms using 2-D FD experiments only. However, their method requires a special FD source implementation technique that is based on a source definition which is no longer used in present-day FD codes. In this paper, we derive a conversion between 2-D and 3-D Green tensors that allows us to simulate 3-D displacement seismograms using 2-D FD simulations and the actual ray path determined in the geometrical-optics limit. We give the conversion for a source of a given seismic moment that is implemented by incrementing the components of the stress tensor. We thus present a hybrid modelling procedure involving 2-D FD and kinematic ray-tracing techniques. The applicability is demonstrated by numerical experiments of elastic wave propagation for models of different complexity.
Wahlberg, J.S.
1981-01-01
Low levels of selenium (0.1-500 ppm) in both organic and inorganic geologic materials can be semiquantitatively measured by isolating Se as a thin film for presentation to an energy-dispersive X-ray fluorescence spectrometer. Suitably pulverized samples are first digested by fusing with a mixture of Na2CO3 and Na2O2. The fusion cake is dissolved in distilled water, buffered with NH4Cl, and filtered to remove Si and the R2O3 group. A carrier solution of Na2TeO4, plus solid KI, hydrazine sulfate, and Na2SO3, is added to the filtrate. The solution is then vacuum-filtered through a 0.45-µm pore-size filter disc. The filter, with the thin film of precipitate, is supported between two sheets of Mylar film for analysis. Good agreement is shown between data reported in this study and literature values reported by epithermal neutron-activation analysis and spectrofluorimetry. The method can be made quantitative by utilizing a secondary precipitation to assure complete recovery of the Se. The X-ray method offers fast turn-around time and a reasonably high production rate. © 1981.
Alxneit, I. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)]
1999-08-01
The program RAY was developed to perform Monte Carlo simulations of the flux distribution in solar reactors in connection with an arbitrary heliostat field. The code accounts for the shading of the incoming rays from the sun by the reactor supporting tower, as well as for full blocking and shading of the heliostats among themselves. A simplified falling particle reactor (FPR) was evaluated. A central receiver field was used with a total area of 311 m{sup 2} composed of 176 round, focusing heliostats. No attempt was undertaken to optimise either the geometry of the heliostat field or the aiming strategy of the heliostats. The FPR was evaluated at two different geographic latitudes (-8.23W/47.542N; PSI and -8.23W/20.0N) and during the course of a day (May 30{sup th}). The incident power passing through the reactor aperture and the flux density distribution within the FPR were calculated. (author) 3 figs., 1 tab., 3 refs.
Imaging, Detection, and Identification Algorithms for Position-Sensitive Gamma-Ray Detectors
Wahl, Christopher G.
Three-dimensional-position-sensitive semiconductors record both the locations and energies of gamma-ray interactions with high resolution, enabling spectroscopy and imaging of gamma-ray-emitting materials. Imaging enables the detection of point sources of gamma rays in an otherwise extended-source background, even when the background spectrum is unknown and may share the point source's spectrum. The generalized likelihood ratio test (GLRT) and source-intensity test (SIT) are applied to this situation to detect one or more unshielded point sources from a library of isotopes in a spectrally unknown or known background when the background intensity varies spatially by a factor of two or less. In addition to estimating the number of sources present, their activities, isotopes, and directions from the detector are estimated. Experimental and some simulated results are presented for a single detector and an 18-detector array of 2 cm by 2 cm by 1.5 cm CdZnTe crystals and compared with the performance of spectral-only detection when the background and source are assumed to be spectrally different. Furthermore, the expected detection performance of the 18-detector array system is investigated statistically using experimental data in the case where the background is distinct spectrally from the point source and the possible source location and isotopic identity are known. Including imaging gave at least 7% higher SNR compared to ignoring the image dimension. Also, imaging methods based on the maximum-likelihood, expectation-maximization method are introduced to determine the spatial distribution of isotopes and to find the activity distributions within targets moving with known motion through a radioactive background. Software has also been developed to support the analysis of the data from 3D-position-sensitive spectroscopic systems, for a range of detector designs and applications. The software design and unique features that allow fast multidimensional data analysis are presented.
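The GLRT described above compares the maximized likelihood of a background-plus-source model against background alone. A toy sketch for Poisson-distributed counts in a few imaging/spectral bins, with the single source amplitude fitted by a coarse grid search; the real system also maximizes over source direction and an isotope library:

```python
import math

def glrt_statistic(counts, background, source_shape):
    """Generalized likelihood ratio test for a point source in Poisson counts.

    H0: counts ~ Poisson(background)
    H1: counts ~ Poisson(background + a * source_shape), a >= 0 fitted by ML.
    Returns 2 * log(L1_hat / L0); the ML fit over the amplitude `a` is a
    simple grid search here for clarity.
    """
    best = 0.0
    for a in (x * 0.01 for x in range(0, 2001)):  # grid over source amplitude
        ll = 0.0
        for n, b, s in zip(counts, background, source_shape):
            mu0, mu1 = b, b + a * s
            ll += n * (math.log(mu1) - math.log(mu0)) - (mu1 - mu0)
        best = max(best, 2.0 * ll)
    return best
```

With counts matching the background the statistic stays at zero; an excess in bins matching the source shape drives it up, and the detection threshold sets the false-alarm rate.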
Flamant, Julien; Le Bihan, Nicolas; Martin, Andrew V; Manton, Jonathan H
2016-05-01
In three-dimensional (3D) single particle imaging with x-ray free-electron lasers, particle orientation is not recorded during measurement but is instead recovered as a necessary step in the reconstruction of a 3D image from the diffraction data. Here we use harmonic analysis on the sphere to cleanly separate the angular and radial degrees of freedom of this problem, providing new opportunities to efficiently use data and computational resources. We develop the expansion-maximization-compression algorithm into a shell-by-shell approach and implement an angular bandwidth limit that can be gradually raised during the reconstruction. We study the minimum number of patterns and minimum rotation sampling required for a desired angular and radial resolution. These extensions provide new avenues to improve computational efficiency and speed of convergence, which are critically important considering the very large datasets expected from experiment.
Near-field x-ray phase contrast imaging and phase retrieval algorithm
Zhu Hua-Feng; Xie Hong-Lan; Gao Hong-Yi; Chen Jian-Wen; Li Ru-Xin; Xu Zhi-Zhan
2005-01-01
Theoretical analyses of x-ray diffraction phase contrast imaging and the near-field phase retrieval method are presented. A new variant of the near-field intensity distribution is derived with the optimal phase imaging distance and the spatial frequency of the object taken into account. Numerical examples of phase retrieval using simulated data are also given. On this basis, the influence of the detecting distance and the polychromaticity of the radiation on the phase contrast image and the retrieved phase distribution is discussed. The present results should be useful in practical applications of in-line phase contrast imaging.
Maruthi, Y. A.; Das, N. Lakshmana; Ramprasad, S.; Ram, S. S.; Sudarshan, M.
2015-08-01
The present study focuses on the quantitative analysis of elements in school chalk to ensure the safety of its use. Elements such as calcium (Ca), aluminum (Al), iron (Fe), silicon (Si) and chromium (Cr) were analyzed in settled chalk dust samples collected from five classrooms (CD-1) and in a second set of unused chalk samples collected from the local market (CD-2) using energy-dispersive X-ray fluorescence (ED-XRF) spectroscopy. The presence of these elements in significant concentrations in school chalk confirmed that it is an irritant and an occupational hazard. It is suggested to use protective equipment such as filtered masks for the mouth and nose, and chalk holders. This study also suggests using alternatives such as digital boards, marker boards and PowerPoint presentations to mitigate the occupational hazard of classroom chalk.
Analysis of Full Charge Reconstruction Algorithms for X-Ray Pixelated Detectors
Baumbaugh, A.; /Fermilab; Carini, G.; /SLAC; Deptuch, G.; /Fermilab; Grybos, P.; /AGH-UST, Cracow; Hoff, J.; /Fermilab; Siddons, P., Maj.; /Brookhaven; Szczygiel, R.; /AGH-UST, Cracow; Trimpl, M.; Yarema, R.; /Fermilab
2012-05-21
The natural diffusive spread of charge carriers during their drift towards the collecting electrodes in planar, segmented detectors results in a division of the original cloud of carriers between neighboring channels. This paper presents the analysis of algorithms, implementable with reasonable circuit resources, whose task is to prevent degradation of the detective quantum efficiency in highly granular, digital pixel detectors. The immediate motivation of the work is a photon science application requiring simultaneous timing spectroscopy and 2D position sensitivity. Leading-edge discrimination, provided it can be freed from uncertainties associated with charge sharing, is used for timing the events. The analyzed solutions can naturally be extended to amplitude spectroscopy with pixel detectors.
董建军; 杨正华; 曹柱荣; 韦敏习; 詹夏宇; 刘慎业; 丁永坤
2011-01-01
The spatial resolution of a KBA X-ray microscope is studied with ray-tracing simulation and experimental tests. In the experiment, the imaging object is an Au grid, backlit by X-rays produced by the interaction of the 9th laser beam with a Cu target on the Shen-guang II laser facility. The spatial resolution of the KBA X-ray microscope is found to be asymmetric about the center of its field of view. Moreover, the experimental data show that the variation of resolution in the direction of decreasing grazing-incidence angle is smaller than that in the direction of increasing angle, and the resolution asymmetry is about 30% relative to the field center.
Ran, Jing; Wang, Dejian; Wang, Can; Zhang, Gang; Yao, Lipeng
2014-08-01
Portable X-ray fluorescence (PXRF) spectrometry may be very suitable for fast and effective environmental assessment and source identification of trace metals in soils. In this study, topsoils (0-10 cm) at 139 sites were scanned in situ for total trace metal (Cr, Cu, Ni, Pb and Zn) and arsenic concentrations by PXRF in a typical town in the Yangtze Delta region of Jiangsu province, China. To validate the utility of PXRF, 53 samples were collected from the scanning sites for the determination of selected trace metals using conventional methods. Based on the trace metal concentrations detected by in situ PXRF, the contamination extent and sources of trace metals were studied via the geo-accumulation index, multivariate analysis and geostatistics. The trace metal concentrations determined by PXRF were similar to those obtained via conventional chemical analysis. The median concentrations of As, Cr, Cu, Ni, Pb and Zn in soils were 10.8, 56.4, 41.5, 43.5, 33.5, and 77.7 mg kg(-1), respectively. The distribution patterns of Cr, Cu, Ni, Pb, and Zn were mostly affected by anthropogenic sources, while As was mainly derived from lithogenic sources. Overall, PXRF has been successfully applied to contamination assessment and source identification of trace metals in soils.
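The geo-accumulation index used above for contamination assessment is Mueller's Igeo = log2(Cn / (1.5 Bn)), where Cn is the measured concentration and Bn the geochemical background; the factor 1.5 absorbs natural background variability. A minimal sketch; the background value in the usage line is an assumed placeholder, not one from the study:

```python
import math

def geoaccumulation_index(concentration, background):
    """Mueller geo-accumulation index: Igeo = log2(Cn / (1.5 * Bn)).

    Igeo <= 0 is class 0 (practically uncontaminated); each unit step is one
    contamination class, up to class 6 (Igeo > 5).
    """
    return math.log2(concentration / (1.5 * background))

# Illustrative: Zn median from the study vs an ASSUMED background of 25.9 mg/kg.
igeo_zn = geoaccumulation_index(77.7, 25.9)
```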
Zacharias Kamarianakis
2014-07-01
This paper presents the architecture of a software platform implemented in C++ for the purpose of testing and evaluating reconstruction algorithms in X-ray imaging. The fundamental elements of the platform are classes, tied together in a logical hierarchy. Real-world objects such as an X-ray source or a flat detector can be defined and implemented as instances of corresponding classes. Various operations (e.g. 3D transformations; loading, saving and filtering of images; creation of planar or curved objects of various dimensions) have been incorporated into the software tool as class methods as well. The user can easily set up any arrangement of the imaging-chain objects in 3D space and experiment with many different trajectories and configurations. Selected 3D volume reconstructions using simulated data acquired in specific scanning trajectories are used as a demonstration of the tool. The platform is considered a basic tool for future investigations of new reconstruction methods in combination with various scanning configurations.
Ritter, A.; Hyde, E. A.; Q. A. Parker
2013-01-01
We present a fast and portable re-implementation of Piskunov and Valenti's optimal-extraction algorithm (Piskunov & Valenti, 2002) in C/C++, together with full uncertainty propagation, improved cosmic-ray removal, and an optimal background-subtraction algorithm. This re-implementation can be used with IRAF and most existing data-reduction packages and leads to signal-to-noise ratios close to the Poisson limit. The algorithm is very stable and operates on spectra from a wide range of instruments (...
Traces of co-evolution in high z X-ray selected and submm-luminous QSOs
Khan-Ali, A; Page, M J; Stevens, J A; Mateos, S; Symeonidis, M; Orjales, J M Cao
2014-01-01
We present a detailed study of an X-ray selected sample of 5 submillimeter-bright QSOs at $z\sim2$, where the highest rates of star formation (SF) and further growth of black holes (BH) occur. This sample is therefore a great laboratory to investigate the co-evolution of star formation and AGN. We present here the analysis of the spectral energy distributions (SEDs) of the 5 QSOs, including new data from Herschel PACS and SPIRE. Both AGN components (direct and reprocessed) and star formation are needed to model their SEDs. From the SEDs and their UV-optical spectra we have estimated the masses of the black holes ($M_{BH} = 10^9 - 10^{10} M_{SUN}$) and the bolometric luminosities of the AGN ($L_{BOL} = (0.8-20) \times 10^{13} L_{SUN}$). These objects show very high luminosities in the far-infrared range (at H/ULIRG levels) and very high rates of SF (SFR = 400-1400 $M_{SUN}$/yr). Given their current SFR and their BH masses, we deduce that their host galaxies must already be very massive, or would not have time to...
Woelfl, S.; Óvári, M.; Nimptsch, J.; Neu, T. R.; Mages, M.
2016-02-01
Element determination in plankton is important for the assessment of metal contamination of aquatic environments. Until recently, it has been difficult to determine the elemental content of rotifers or ciliates derived from natural plankton samples because of the difficulty in handling and separating these fragile organisms. The aim of this study was to evaluate methods for the separation of rotifers and large ciliates from natural plankton samples (µg range dry weight) and the subsequent analysis of their elemental content using total-reflection X-ray fluorescence spectrometry (TXRF). Plankton samples were collected from different aquatic environments (three lakes, one river) in Chile, Argentina and Hungary. From one to eighty specimens of five rotifer species (Brachionus calyciflorus, Brachionus falcatus, Asplanchna sieboldii, Asplanchna sp., Philodina sp.) and four to twelve specimens of one large ciliate (Stentor amethystinus) were prepared according to the dry method originally developed for microcrustaceans, and analysed by TXRF following in situ microdigestion. Our results demonstrated that it is possible to process these small and fragile organisms (individual dry mass: 0.17-9.39 µg per individual) via careful washing and preparation procedures. We found species-dependent differences in the element mass fractions for some of the elements studied (Cr, Mn, Fe, Ni, Cu, Zn, As, Pb), especially for Cu, Fe and Mn. One large rotifer species (A. sieboldii) also showed a negative correlation between individual dry weight and the element content for Pb, Ni and Cr. We conclude that our application of the in situ microdigestion-TXRF method is suitable even for rotifers and ciliates, greatly expanding the possibilities for the use of plankton in biomonitoring of metal contamination in aquatic environments.
Ibrahim Gaafar
2015-12-01
This study is an attempt to use gamma-ray spectrometric measurements and VLF-EM data to identify the subsurface structure and map uranium mineralization along the El Sela shear zone, South Eastern Desert of Egypt. Many injections, more or less mineralized with uranium and associated with alteration processes, have been recorded in the El Sela shear zone. As results from previous work show, the emplacement of these injections is structurally controlled and well defined by large shear zones striking in an ENE–WSW direction and crosscut by NW–SE to NNW–SSE fault sets. The VLF method has been applied to map the structure and the presence of radioactive minerals delineated by the detection of high uranium mineralization. The electromagnetic survey was carried out to detect the presence of shallow and deep conductive zones that cross the granites along ENE–WSW fracturing directions and to map their spatial distribution. The survey comprised seventy N–S spectrometry and VLF-EM profiles with 20 m separation. The resulting data were displayed as composite maps for K, eU and eTh as well as a VLF-Fraser map. Twelve profiles with 100 m separation were selected for detailed description. The VLF-EM data were interpreted qualitatively as well as quantitatively using the Fraser and the Karous–Hjelt filters. Fraser-filtered data and relative current-density pseudo-sections indicate the presence of shallow and deep conductive zones that cross the granites along ENE–WSW shearing directions. High uranium concentrations found just above the higher apparent current-density zones that coincide with the El Sela shear zone indicate a positive relation between conductivity and the occurrence of uranium minerals. This allows us to infer that the anomalies detected by the VLF-EM data are due to the highly conductive shear zone enriched with uranium mineralization extending for more than 80 m.
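The Fraser filtering mentioned above converts tilt-angle crossovers over a conductor into peaks by differencing sums of adjacent reading pairs. A minimal sketch; the sign convention varies between authors, and this one follows F_i = (y_{i+2} + y_{i+3}) - (y_i + y_{i+1}):

```python
def fraser_filter(tilt_angles):
    """Fraser filter for VLF-EM tilt-angle profiles.

    F_i = (y[i+2] + y[i+3]) - (y[i] + y[i+1]): a crossover (sign change)
    in the raw tilt angles becomes a peak in the filtered profile, while
    slowly varying regional trends are suppressed. Output is 3 samples
    shorter than the input.
    """
    y = tilt_angles
    return [(y[i + 2] + y[i + 3]) - (y[i] + y[i + 1]) for i in range(len(y) - 3)]
```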
Polygon clipping algorithm based on dual-strategies tracing and grid partition
汪荣峰; 廖学军
2012-01-01
An effective algorithm for polygon clipping which supports simple polygons, including concave polygons and polygons with holes inside, is presented in this paper. This algorithm can also be used to calculate the set-theoretic difference and union of two polygons. Most analogous algorithms classify intersection points as entry points and exit points, and then generate output polygons by tracing vertices. Different from these algorithms, this paper classifies intersection points as ordinary intersection points and vertex intersection points, and designs different tracing strategies for each. By using these strategies alternately and recursively, a stable tracing process that copes with degenerate input is put forward. To improve the efficiency of edge intersection, which is the bottleneck of polygon clipping, the polygon with the greater number of edges is partitioned into grid cells, and an algorithm for edge partition based on the Bresenham line-drawing algorithm is put forward. At the end of this paper, the algorithm is compared with existing algorithms, and the results show that it is faster.
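The grid-partition step above can be sketched as follows: each edge is registered in the grid cells it passes through, and intersection tests are then attempted only for edge pairs that share a cell. This sketch uses a simple oversampled line traversal in place of the paper's Bresenham-based partition, so it illustrates the idea rather than the published algorithm:

```python
def edge_cells(p0, p1, cell):
    """Grid cells crossed by segment p0 -> p1, via an oversampled traversal.

    Sampling at twice the cell rate along the segment is a crude stand-in
    for a Bresenham walk; it may register a few extra boundary cells, which
    only adds candidates and never loses a true intersection pair.
    """
    (x0, y0), (x1, y1) = p0, p1
    steps = max(abs(x1 - x0), abs(y1 - y0)) / cell
    n = max(1, int(steps * 2))
    cells = set()
    for k in range(n + 1):
        t = k / n
        x, y = x0 + t * (x1 - x0), y0 + t * (y1 - y0)
        cells.add((int(x // cell), int(y // cell)))
    return cells

def candidate_pairs(edges_a, edges_b, cell=1.0):
    """Pairs (i, j) of edges from two polygons that share at least one cell."""
    grid = {}
    for i, e in enumerate(edges_a):
        for c in edge_cells(*e, cell):
            grid.setdefault(c, []).append(i)
    pairs = set()
    for j, e in enumerate(edges_b):
        for c in edge_cells(*e, cell):
            for i in grid.get(c, []):
                pairs.add((i, j))
    return pairs
```

Only the pairs returned need an exact segment-intersection test, which is where the speedup over all-pairs intersection comes from.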
A revised partiality model and post-refinement algorithm for X-ray free-electron laser data
Ginn, Helen Mary [Wellcome Trust Centre for Human Genetics, Roosevelt Drive, Oxford OX3 7BN (United Kingdom); Brewster, Aaron S.; Hattne, Johan [Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, CA 94720 (United States); Evans, Gwyndaf; Wagner, Armin [Harwell Science and Innovation Campus, Fermi Avenue, Didcot OX11 0QX (United Kingdom); Grimes, Jonathan M. [Wellcome Trust Centre for Human Genetics, Roosevelt Drive, Oxford OX3 7BN (United Kingdom); Harwell Science and Innovation Campus, Fermi Avenue, Didcot OX11 0QX (United Kingdom); Sauter, Nicholas K. [Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, CA 94720 (United States); Sutton, Geoff [Wellcome Trust Centre for Human Genetics, Roosevelt Drive, Oxford OX3 7BN (United Kingdom); Stuart, David Ian, E-mail: dave@strubi.ox.ac.uk [Wellcome Trust Centre for Human Genetics, Roosevelt Drive, Oxford OX3 7BN (United Kingdom); Harwell Science and Innovation Campus, Fermi Avenue, Didcot OX11 0QX (United Kingdom)
2015-05-23
An updated partiality model and post-refinement algorithm for XFEL snapshot diffraction data is presented and confirmed by observing anomalous density for S atoms at an X-ray wavelength of 1.3 Å. Research towards using X-ray free-electron laser (XFEL) data to solve structures using experimental phasing methods such as sulfur single-wavelength anomalous dispersion (SAD) has been hampered by shortcomings in the diffraction models for X-ray diffraction from FELs. Owing to errors in the orientation matrix and overly simple partiality models, researchers have required large numbers of images to converge to reliable estimates for the structure-factor amplitudes, which may not be feasible for all biological systems. Here, data for cytoplasmic polyhedrosis virus type 17 (CPV17) collected at 1.3 Å wavelength at the Linac Coherent Light Source (LCLS) are revisited. A previously published definition of a partiality model for reflections illuminated by self-amplified spontaneous emission (SASE) pulses is built upon, which defines a fraction between 0 and 1 based on the intersection of a reflection with a spread of Ewald spheres modelled by a super-Gaussian wavelength distribution in the X-ray beam. A method of post-refinement to refine the parameters of this model is suggested. This has generated a merged data set with an overall discrepancy (by calculating the R{sub split} value) of 3.15% to 1.46 Å resolution from a 7225-image data set. The atomic numbers of C, N and O atoms in the structure are distinguishable in the electron-density map. There are 13 S atoms within the 237 residues of CPV17, excluding the initial disordered methionine. These only possess 0.42 anomalous scattering electrons each at 1.3 Å wavelength, but the 12 that have single predominant positions are easily detectable in the anomalous difference Fourier map. It is hoped that these improvements will lead towards XFEL experimental phase determination and structure determination by sulfur SAD and will
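The partiality model described above assigns each reflection a fraction between 0 and 1 according to how much of the super-Gaussian wavelength spectrum falls between the Ewald spheres at which the reflection enters and exits the diffraction condition. A schematic numeric version; the exponent and bandwidth in the test values are illustrative assumptions, not the refined parameters of the paper:

```python
import math

def super_gaussian(lam, mu, sigma, n_exp):
    """Unnormalised super-Gaussian spectral weight."""
    return math.exp(-0.5 * abs((lam - mu) / sigma) ** n_exp)

def partiality(lam_enter, lam_exit, mu, sigma, n_exp=1.5, steps=2000):
    """Fraction (0..1) of the SASE spectrum between the wavelengths at which
    the reflection enters and exits the spread of Ewald spheres.

    Both integrals are midpoint sums over +/- 6 sigma of the spectrum.
    """
    lo, hi = mu - 6 * sigma, mu + 6 * sigma
    dx = (hi - lo) / steps
    total = sum(super_gaussian(lo + (k + 0.5) * dx, mu, sigma, n_exp)
                for k in range(steps)) * dx
    a, b = max(lam_enter, lo), min(lam_exit, hi)
    if b <= a:
        return 0.0
    dx2 = (b - a) / steps
    part = sum(super_gaussian(a + (k + 0.5) * dx2, mu, sigma, n_exp)
               for k in range(steps)) * dx2
    return part / total
```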
Woelfl, Stefan; Mercado, Susana; Villalobos, Lorena [Instituto de Zoologia, Universidad Austral de Chile, Casilla 567, Valdivia (Chile); Mages, Margarete; Ovari, Mihaly [Department of Inland Water Research Magdeburg, UFZ Centre for Environmental Research Leipzig-Halle, Brueckstrasse 3a, 39114, Magdeburg (Germany); Encina, Francisco [Escuela de Ciencias Ambientales, Facultad de Ciencias, Universidad Catolica de Temuco, Montt 056, Temuco (Chile)
2004-02-01
First results are described from the application of a recently developed dry method for the determination of elements in single specimens of freshwater microcrustaceans, using total reflection X-ray fluorescence spectrometry (TXRF). This method is a powerful, non-destructive technique for quantifying the trace element content of minute biological samples with a dry weight of 3-50 {mu}g. Three different freshwater microcrustaceans were sampled, from the natural, uncontaminated Lake Laja and from the artificial Rapel reservoir, which is slightly contaminated by drainage water from a copper mine. Single specimens of Daphnia pulex, Bosmina chilensis, and Ceriodaphnia dubia were prepared using a modification of the dry method and measured by TXRF. The results showed that the As, Mn, Fe, Ni, Zn, and Cu content and the bioaccumulation of these metals were usually significantly different between the microcrustaceans from the two lakes. The largest difference was found for Cu, which was eight times more concentrated in the two microcrustaceans from Rapel reservoir than in D. pulex from Lake Laja. (orig.)
Isik, Hakan
This study is premised on the observation that student conceptions of optics appear to be unrelated to the student characteristics of gender, age, years since high school graduation, or previous academic experiences. This study investigated the relationships between student characteristics and student performance on image-formation test items, and the changes in student conceptions of optics after an introductory inquiry-based physics course. Data were collected from 39 college students who were involved in an inquiry-based physics course teaching topics of geometrical optics. Student data concerning characteristics and previous experiences with optics and mathematics were collected. Assessment of student understanding of optics knowledge for pinholes, plane mirrors, refraction, and convex lenses was collected with the Test of Image Formation with Light-Ray Tracing instrument. Total-scale and subscale scores representing the optics instrument content were derived from student pretest and posttest responses. The types of knowledge needed to answer each optics item correctly were categorized as situational, conceptual, procedural, and strategic knowledge. These types of knowledge were associated with student correct and incorrect responses to each item to explain the existence of, and changes in, student scientific and naive conceptions. Correlation and stepwise multiple regression analyses were conducted to identify the student characteristics and academic experiences that significantly predicted scores on the subscales of the test. The results showed that student experience with calculus was a significant predictor of student performance on the total scale as well as on the refraction subscale of the Test of Image Formation with Light-Ray Tracing. A combination of student age and previous academic experience with precalculus was a significant predictor of student performance on the pretest pinhole subscale. Student characteristic of years since high school graduation
[No author listed]
2000-01-01
The pricing of electricity transmission requires determining how much use each generator is making of a transmission line and what each generator's contribution to the system losses is. Such problems cannot be solved using Kirchhoff's laws alone. This paper proposes two current decomposition axioms, based on which theories and models are established for the current tracing problem. To create an efficient algorithm, graph theory is employed. It is proved that there is no directed circuit in a directed current distribution graph. According to this theorem, a very simple and efficient algorithm based on a recursive elimination process is suggested. A simple example is used to explain the algorithm.
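The recursive elimination that the no-directed-circuit theorem enables can be sketched with a proportional-sharing pass over the acyclic flow graph: process nodes in topological order and split each outgoing line's flow among generators in proportion to their share of the node's total inflow. This is a generic illustration of current/flow tracing under assumed lossless flows, not the paper's exact axioms:

```python
def trace_generation_shares(injections, lines):
    """Proportional-sharing tracing on a lossless directed flow graph.

    injections: {node: MW injected by the generator at that node}.
    lines: list of (from_node, to_node, flow_MW); the flow graph must be
    acyclic (guaranteed by the theorem above) and every node with outflow
    must have nonzero inflow.
    Returns {(from, to): {generator_node: MW share of that line's flow}}.
    """
    nodes = set(injections) | {u for u, _, _ in lines} | {v for _, v, _ in lines}
    indeg = {n: 0 for n in nodes}
    for _, v, _ in lines:
        indeg[v] += 1
    # Per-node inflow mix: {node: {generator: MW arriving from that generator}}
    mix = {n: ({n: injections[n]} if injections.get(n, 0) else {}) for n in nodes}
    order = [n for n in nodes if indeg[n] == 0]  # topological frontier
    shares = {}
    i = 0
    while i < len(order):
        u = order[i]
        i += 1
        total = sum(mix[u].values())
        for a, v, f in lines:
            if a != u:
                continue
            shares[(a, v)] = {g: f * m / total for g, m in mix[u].items()}
            for g, s in shares[(a, v)].items():
                mix[v][g] = mix[v].get(g, 0.0) + s
            indeg[v] -= 1
            if indeg[v] == 0:
                order.append(v)
    return shares
```

For example, with generators injecting 60 MW at A and 40 MW at B, both feeding C, a 100 MW line C to D carries a 60/40 split attributable to A and B.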
Hirokawa, Shunji; Abrar Hossain, M; Kihara, Yuichi; Ariyoshi, Shogo
2008-12-01
In this paper, we propose three ideas to improve a kinematic estimation algorithm for total knee arthroplasty (TKA). The first is a two-step estimation algorithm that improves estimation accuracy by excluding certain assumptions needed for the pattern-matching algorithm reported by Banks and Hodge. The second is incorporating a 3D geometric articulation model into the algorithm to substantially improve estimation accuracy for the depth translation and to introduce contact-point trajectories between the articular surfaces. The third is an algorithm that performs the estimation even when the silhouettes of the two components overlap. To assess our algorithm's potential for clinical application, we carried out two experiments. First, we used a robot to position the prosthesis. Estimation accuracy was checked by comparing the input data to the robot with the estimates from X-ray photographs. Incorporating our articulation model remarkably reduced the error in the depth translation. Next, we performed a clinical assessment by applying the algorithm and articulation model to fluoroscopy images of a patient who had recently undergone TKA.
Jewett, C.; Anghel, V.N.P. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada); Armitage, J.; Boudjemline, K.; Botte, J. [Carleton Univ., Dept. of Physics, Ottawa, Ontario (Canada); Bryman, D. [Advanced Applied Physics Solutions, Vancouver, British Columbia (Canada); Univ. of British Columbia, Vancouver, British Columbia (Canada); Bueno, J. [Advanced Applied Physics Solutions, Vancouver, British Columbia (Canada); Charles, E. [Canada Border Services Agency, Ottawa, Ontario (Canada); Cousins, T. [International Safety Research, Ottawa, Ontario (Canada); Didsbury, R. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada); Erhardt, L. [Defence Research and Development Canada, Ottawa, Ontario (Canada); Erlandson, A. [Carleton Univ., Dept. of Physics, Ottawa, Ontario (Canada); Gallant, G. [Canada Border Services Agency, Ottawa, Ontario (Canada); Jason, A. [Los Alamos National Laboratory, Los Alamos (United States); Jonkmans, G. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada); Liu, Z. [Advanced Applied Physics Solutions, Vancouver, British Columbia (Canada); Univ. of British Columbia, Vancouver, British Columbia (Canada); McCall, M.; Noel, S. [International Safety Research, Ottawa, Ontario (Canada); Oakham, F.G. [Carleton Univ., Dept. of Physics, Ottawa, Ontario (Canada); TRIUMF, Vancouver, British Columbia, (Canada); Ong, D.; Stocki, T. [Health Canada, Ottawa, Ontario (Canada); Thompson, M. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada); Waller, D. [Defence Research and Development Canada, Ottawa, Ontario (Canada)
2011-07-01
The Cosmic Ray Inspection and Passive Tomography (CRIPT) collaboration is developing a cosmic ray muon tomography system to identify Special Nuclear Materials (SNM) in cargo containers. In order to gauge the viability of the technique, and to determine the best detector type, GEANT4 was used to simulate the passage of cosmic ray muons through a cargo container. The scattering density estimation (SDE) algorithm was developed and tested with data from these simulations to determine how well it could reconstruct the interior of a container. The simulation results revealed the ability of cosmic ray muon tomography techniques to image spheres of lead-shielded SNM, such as uranium or plutonium, in a cargo container loaded with granite slabs. (author)
Baran, A. J.; Hesse, Evelyn; Sourdeval, Odran
2017-03-01
Future satellite missions, from 2022 onwards, will obtain near-global measurements of cirrus at microwave and sub-millimetre frequencies. To realise the potential of these observations, fast and accurate light-scattering methods are required to calculate scattered millimetre and sub-millimetre intensities from complex ice crystals. Here, the applicability of the ray tracing with diffraction on facets (RTDF) method in predicting the bulk scalar optical properties and phase functions of randomly oriented hexagonal ice columns and hexagonal ice aggregates at millimetre frequencies is investigated. The applicability of RTDF is shown to be acceptable down to size parameters of about 18, between the frequencies of 243 and 874 GHz. It is demonstrated that RTDF is generally well within about 10% of T-matrix solutions obtained for the scalar optical properties assuming hexagonal ice columns. Moreover, on replacing electromagnetic scalar optical property solutions obtained for the hexagonal ice aggregate with the RTDF counterparts at size parameter values of about 18 or greater, the bulk scalar optical properties can be calculated to generally well within ±5% of an electromagnetic-based database. The RTDF-derived bulk scalar optical properties result in brightness temperature errors generally within about ±4 K at 874 GHz; errors arising from differing microphysics assumptions can easily exceed this. Similar findings hold for the bulk scattering phase functions. This is because the scattering solutions are dominated by the processes of diffraction and reflection, both of which are well described by RTDF. The impact of centimetre-sized complex ice crystals on interpreting cirrus polarisation measurements at sub-millimetre frequencies is discussed.
Krumer, Zachar; van Sark, Wilfried G. J. H. M.; de Mello Donegá, Celso; Schropp, Ruud E. I.
2013-09-01
Luminescent solar concentrators (LSCs) are low cost photovoltaic devices, which reduce the amount of semiconductor material needed per unit area of a photovoltaic solar energy converter by means of concentration. The device is comprised of a thin plastic plate in which luminescent species (fluorophores) have been incorporated. The fluorophores absorb the solar light and radiatively re-emit part of the energy. Total internal reflection traps most of the emitted light inside the plate and wave-guides it to a narrow side facet with a solar cell attached, where conversion into electricity occurs. The efficiency of such devices is as yet rather low, due to several loss mechanisms, of which self-absorption is of high importance. Combined ray-tracing and Monte Carlo simulation is a widely used tool for efficiency estimations of LSC devices prior to manufacturing. We have applied this method to a model experiment, in which we analysed the impact of self-absorption on the LSC efficiency of fluorophores with different absorption/emission spectral overlap (Stokes shift): several organic dyes and semiconductor quantum dots (single compound and type-II core/shell). These results are compared with those obtained experimentally, demonstrating good agreement. The validated model is used to investigate systematically the influence of spectral separation and luminescence quantum efficiency on the intensity loss caused by increased self-absorption. The results are used to adopt a quantity called the self-absorption cross-section and establish it as a reliable criterion for the self-absorption properties of materials, one that can be obtained from fundamental data and has a more universal scope of application than the currently used Stokes shift.
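As a toy illustration of the Monte Carlo treatment of self-absorption described above, the following sketch reduces the full ray trace to three event probabilities per emission (quantum yield, trapping by total internal reflection, and self-absorption en route to the edge). All parameter names and the lumped-probability model are illustrative assumptions, not the paper's simulation.

```python
import random

def lsc_edge_efficiency(n_photons, self_abs_prob, quantum_yield, trap_prob):
    """Minimal Monte Carlo sketch of LSC self-absorption losses.

    Each absorbed photon is re-emitted with probability quantum_yield and
    stays trapped by total internal reflection with probability trap_prob;
    on its way to the edge it is self-absorbed with probability
    self_abs_prob, which restarts the emission/trapping gamble.
    Returns the fraction of photons reaching the edge-mounted solar cell.
    """
    random.seed(0)  # fixed seed for reproducibility of the sketch
    reached = 0
    for _ in range(n_photons):
        alive = True
        while alive:
            if random.random() > quantum_yield * trap_prob:
                alive = False      # lost: non-radiative decay or escape cone
            elif random.random() < self_abs_prob:
                continue           # self-absorbed: another emission event
            else:
                reached += 1       # waveguided to the edge
                alive = False
    return reached / n_photons
```

The geometric series of re-emission events gives the analytic efficiency q·t·(1−s)/(1−q·t·s), which the simulation approaches for large photon counts; a larger Stokes shift corresponds to a smaller self_abs_prob.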
Sidky, Emil Y; Pan, Xiaochuan
2012-01-01
Iterative image reconstruction (IIR) algorithms in Computed Tomography (CT) are based on algorithms for solving a particular optimization problem. Design of the IIR algorithm, therefore, is aided by knowledge of the solution to the optimization problem on which it is based. Often, however, it is impractical to achieve an accurate solution to the optimization of interest, which complicates the design of IIR algorithms. This issue is particularly acute for CT with a limited angular-range scan, which leads to poorly conditioned system matrices and difficult-to-solve optimization problems. In this article, we develop IIR algorithms which solve a certain type of optimization called convex feasibility. The convex feasibility approach can provide alternatives to unconstrained optimization approaches and at the same time allow for efficient algorithms for their solution -- thereby facilitating the IIR algorithm design process. An accelerated version of the Chambolle-Pock (CP) algorithm is adapted to various convex fea...
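A minimal illustration of the convex-feasibility idea (not the accelerated Chambolle-Pock algorithm of the article): alternately project the iterate onto each convex set until a point in their intersection is found. Here the sets are the measurement hyperplanes, swept Kaczmarz-style, and the nonnegative orthant; the function name and toy system are assumptions for the sketch.

```python
def pocs_kaczmarz(rows, b, n_unknowns, sweeps=50):
    """Convex-feasibility sketch via projections onto convex sets (POCS).

    Sequentially project onto each measurement hyperplane
    {x : a_i . x = b_i} (one Kaczmarz sweep), then onto the nonnegative
    orthant {x : x >= 0}, and repeat. Each set is convex, so the iterates
    approach a point in the intersection whenever one exists.
    """
    x = [0.0] * n_unknowns
    for _ in range(sweeps):
        for a, bi in zip(rows, b):
            dot = sum(ai * xi for ai, xi in zip(a, x))
            norm2 = sum(ai * ai for ai in a)
            step = (bi - dot) / norm2
            # Orthogonal projection onto the hyperplane a . x = bi.
            x = [xi + step * ai for ai, xi in zip(a, x)]
        x = [max(xi, 0.0) for xi in x]  # projection onto x >= 0
    return x
```

For a consistent system whose solution is already nonnegative, the iterates land exactly in the intersection of all the sets; in limited-angle CT the appeal is that feasibility only asks for *a* point satisfying all constraints, sidestepping a poorly conditioned optimization.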
Nowlan, C. R.; Liu, X.; Janz, S. J.; Leitch, J. W.; Al-Saadi, J. A.; Chance, K.; Cole, J.; Delker, T.; Follette-Cook, M. B.; Gonzalez Abad, G.; Good, W. S.; Kowalewski, M. G.; Loughner, C.; Pickering, K. E.; Ruppert, L.; Soo, D.; Szykman, J.; Valin, L.; Zoogman, P.
2016-12-01
The Geostationary Trace gas and Aerosol Sensor Optimization (GeoTASO) and the GEO-CAPE Airborne Simulator (GCAS) instruments are pushbroom sensors capable of making remote sensing measurements of air quality and ocean color. Originally developed as test-bed instruments for the Geostationary Coastal and Air Pollution Events (GEO-CAPE) decadal survey, these instruments are now also part of risk reduction for the upcoming Tropospheric Emissions: Monitoring of Pollution (TEMPO) and Geostationary Environment Monitoring Spectrometer (GEMS) geostationary satellite missions, and will provide validation capabilities after the satellite instruments are in orbit. GeoTASO and GCAS flew on two different aircraft in their first intensive air quality field campaigns during the DISCOVER-AQ missions over Texas in 2013 and Colorado in 2014. GeoTASO was also deployed in 2016 during the KORUS-AQ field campaign to make measurements of trace gases and aerosols over Korea. GeoTASO and GCAS collect spectra of backscattered solar radiation in the UV and visible that can be used to derive 2-D maps of trace gas columns below the aircraft at spatial resolutions on the order of 250 x 500 m. We present spatially resolved maps of trace gas retrievals of ozone, nitrogen dioxide, formaldehyde and sulfur dioxide over urban areas and power plants from flights during the field campaigns, and comparisons with data from ground-based spectrometers, in situ monitoring instruments, and satellites.
The Geostationary Trace gas and Aerosol Sensor Optimization (GeoTASO) airborne instrument is a test bed for upcoming air quality satellite instruments that will measure backscattered ultraviolet, visible and near-infrared light from geostationary orbit. GeoTASO flew on the NASA F...
Acciarri, R.; et al.
2017-08-10
The development and operation of Liquid-Argon Time-Projection Chambers for neutrino physics has created a need for new approaches to pattern recognition in order to fully exploit the imaging capabilities offered by this technology. Whereas the human brain can excel at identifying features in the recorded events, it is a significant challenge to develop an automated, algorithmic solution. The Pandora Software Development Kit provides functionality to aid the design and implementation of pattern-recognition algorithms. It promotes the use of a multi-algorithm approach to pattern recognition, in which individual algorithms each address a specific task in a particular topology. Many tens of algorithms then carefully build up a picture of the event and, together, provide a robust automated pattern-recognition solution. This paper describes details of the chain of over one hundred Pandora algorithms and tools used to reconstruct cosmic-ray muon and neutrino events in the MicroBooNE detector. Metrics that assess the current pattern-recognition performance are presented for simulated MicroBooNE events, using a selection of final-state event topologies.
Acciarri, R.; An, R.; Anthony, J.; Asaadi, J.; Auger, M.; Bagby, L.; Balasubramanian, S.; Baller, B.; Barnes, C.; Barr, G.; Bass, M.; Bay, F.; Bishai, M.; Blake, A.; Bolton, T.; Camilleri, L.; Caratelli, D.; Carls, B.; Castillo Fernandez, R.; Cavanna, F.; Chen, H.; Church, E.; Cianci, D.; Cohen, E.; Collin, G. H.; Conrad, J. M.; Convery, M.; Crespo-Anadón, J. I.; Del Tutto, M.; Devitt, D.; Dytman, S.; Eberly, B.; Ereditato, A.; Escudero Sanchez, L.; Esquivel, J.; Fadeeva, A. A.; Fleming, B. T.; Foreman, W.; Furmanski, A. P.; Garcia-Gamez, D.; Garvey, G. T.; Genty, V.; Goeldi, D.; Gollapinni, S.; Graf, N.; Gramellini, E.; Greenlee, H.; Grosso, R.; Guenette, R.; Hackenburg, A.; Hamilton, P.; Hen, O.; Hewes, J.; Hill, C.; Ho, J.; Horton-Smith, G.; Hourlier, A.; Huang, E.-C.; James, C.; Jan de Vries, J.; Jen, C.-M.; Jiang, L.; Johnson, R. A.; Joshi, J.; Jostlein, H.; Kaleko, D.; Karagiorgi, G.; Ketchum, W.; Kirby, B.; Kirby, M.; Kobilarcik, T.; Kreslo, I.; Laube, A.; Li, Y.; Lister, A.; Littlejohn, B. R.; Lockwitz, S.; Lorca, D.; Louis, W. C.; Luethi, M.; Lundberg, B.; Luo, X.; Marchionni, A.; Mariani, C.; Marshall, J.; Martinez Caicedo, D. A.; Meddage, V.; Miceli, T.; Mills, G. B.; Moon, J.; Mooney, M.; Moore, C. D.; Mousseau, J.; Murrells, R.; Naples, D.; Nienaber, P.; Nowak, J.; Palamara, O.; Paolone, V.; Papavassiliou, V.; Pate, S. F.; Pavlovic, Z.; Piasetzky, E.; Porzio, D.; Pulliam, G.; Qian, X.; Raaf, J. L.; Rafique, A.; Rochester, L.; Rudolf von Rohr, C.; Russell, B.; Schmitz, D. W.; Schukraft, A.; Seligman, W.; Shaevitz, M. H.; Sinclair, J.; Smith, A.; Snider, E. L.; Soderberg, M.; Söldner-Rembold, S.; Soleti, S. R.; Spentzouris, P.; Spitz, J.; St. John, J.; Strauss, T.; Szelc, A. M.; Tagg, N.; Terao, K.; Thomson, M.; Toups, M.; Tsai, Y.-T.; Tufanli, S.; Usher, T.; Van De Pontseele, W.; Van de Water, R. G.; Viren, B.; Weber, M.; Wickremasinghe, D. A.; Wolbers, S.; Wongjirad, T.; Woodruff, K.; Yang, T.; Yates, L.; Zeller, G. P.; Zennamo, J.; Zhang, C.
2017-01-01
The development and operation of Liquid-Argon Time-Projection Chambers for neutrino physics has created a need for new approaches to pattern recognition in order to fully exploit the imaging capabilities offered by this technology. Whereas the human brain can excel at identifying features in the recorded events, it is a significant challenge to develop an automated, algorithmic solution. The Pandora Software Development Kit provides functionality to aid the design and implementation of pattern-recognition algorithms. It promotes the use of a multi-algorithm approach to pattern recognition, in which individual algorithms each address a specific task in a particular topology. Many tens of algorithms then carefully build up a picture of the event and, together, provide a robust automated pattern-recognition solution. This paper describes details of the chain of over one hundred Pandora algorithms and tools used to reconstruct cosmic-ray muon and neutrino events in the MicroBooNE detector. Metrics that assess the...
Ritter, A; Parker, Q A
2013-01-01
We present a fast and portable re-implementation of Piskunov and Valenti's optimal-extraction algorithm (Piskunov & Valenti, 2002) in C/C++ together with full uncertainty propagation, improved cosmic-ray removal, and an optimal background-subtraction algorithm. This re-implementation can be used with IRAF and most existing data-reduction packages and leads to signal-to-noise ratios close to the Poisson limit. The algorithm is very stable, operates on spectra from a wide range of instruments (slit spectra and fibre feeds), and has been extensively tested on VLT/UVES, ESO/CES, ESO/FEROS, NTT/EMMI, NOT/ALFOSC, STELLA/SES, SSO/WiFeS, and finally, P60/SEDM-IFU data.
Ritter, A.; Hyde, E. A.; Parker, Q. A.
2014-02-01
We present a fast and portable re-implementation of Piskunov and Valenti's optimal-extraction algorithm (Piskunov & Valenti, 2002) in C/C++ together with full uncertainty propagation, improved cosmic-ray removal, and an optimal background-subtraction algorithm. This re-implementation can be used with IRAF and most existing data-reduction packages and leads to signal-to-noise ratios close to the Poisson limit. The algorithm is very stable, operates on spectra from a wide range of instruments (slit spectra and fibre feeds), and has been extensively tested on VLT/UVES, ESO/CES, ESO/FEROS, NTT/EMMI, NOT/ALFOSC, STELLA/SES, SSO/WiFeS, and finally, P60/SEDM-IFU data.
Bi, Lei; Yang, Ping; Liu, Chao; Yi, Bingqi; Baum, Bryan A.; Van Diedenhoven, Bastiaan; Iwabuchi, Hironobu
2014-01-01
A fundamental problem in remote sensing and radiative transfer simulations involving ice clouds is the ability to compute accurate optical properties for individual ice particles. While relatively simple and intuitively appealing, the conventional geometric-optics method (CGOM) is used frequently for the solution of light scattering by ice crystals. Due to the approximations in the ray-tracing technique, the CGOM accuracy is not well quantified, and the resulting uncertainties can impact many applications. Improvements in the Invariant Imbedding T-matrix method (II-TM) and the Improved Geometric-Optics Method (IGOM) provide a mechanism to assess these uncertainties. The results computed by the II-TM+IGOM are considered as a benchmark because the II-TM solves Maxwell's equations from first principles and is applicable to particle size parameters ranging into the domain at which the IGOM has reasonable accuracy. To assess the uncertainties with the CGOM in remote sensing and radiative transfer simulations, two independent optical property datasets of hexagonal columns are developed for sensitivity studies by using the CGOM and the II-TM+IGOM, respectively. Ice cloud bulk optical properties obtained from the two datasets are compared and subsequently applied to retrieve the optical thickness and effective diameter from Moderate Resolution Imaging Spectroradiometer (MODIS) measurements. Additionally, the bulk optical properties are tested in broadband radiative transfer (RT) simulations using the general circulation model (GCM) version of the Rapid Radiative Transfer Model (RRTMG) that is adopted in the National Center for Atmospheric Research (NCAR) Community Atmosphere Model (CAM, version 5.1). For MODIS retrievals, the mean bias of uncertainties of applying the CGOM in shortwave bands (0.86 and 2.13 micrometers) can be up to 5% in the optical thickness and as high as 20% in the effective diameter, depending on cloud optical
Melli, S. Ali; Wahid, Khan A.; Babyn, Paul; Cooper, David M. L.; Gopi, Varun P.
2016-12-01
Synchrotron X-ray Micro Computed Tomography (Micro-CT) is an imaging technique which is increasingly used for non-invasive in vivo preclinical imaging. However, it often requires a large number of projections from many different angles to reconstruct high-quality images leading to significantly high radiation doses and long scan times. To utilize this imaging technique further for in vivo imaging, we need to design reconstruction algorithms that reduce the radiation dose and scan time without reduction of reconstructed image quality. This research is focused on using a combination of gradient-based Douglas-Rachford splitting and discrete wavelet packet shrinkage image denoising methods to design an algorithm for reconstruction of large-scale reduced-view synchrotron Micro-CT images with acceptable quality metrics. These quality metrics are computed by comparing the reconstructed images with a high-dose reference image reconstructed from 1800 equally spaced projections spanning 180°. Visual and quantitative-based performance assessment of a synthetic head phantom and a femoral cortical bone sample imaged in the biomedical imaging and therapy bending magnet beamline at the Canadian Light Source demonstrates that the proposed algorithm is superior to the existing reconstruction algorithms. Using the proposed reconstruction algorithm to reduce the number of projections in synchrotron Micro-CT is an effective way to reduce the overall radiation dose and scan time which improves in vivo imaging protocols.
唐卫东; 关志华; 吴中元
2002-01-01
Most existing multiobjective evolutionary algorithms (MOEAs), such as NPGA (Niched Pareto Genetic Algorithm) and NSGA (Non-dominated Sorting Genetic Algorithm), are based on the Pareto mechanism. In every cycle, these algorithms must sort or compare some or all of the individuals in the population, which is computationally expensive. This paper introduces a Pareto-tracing method based on linear weighting with varying weights, WSTPEA (Weighted Sum Approach and Tracing Pareto Method). Rather than obtaining all possible non-dominated solutions simultaneously, the algorithm obtains one non-dominated solution per cycle; the number of weight changes controls the number of cycles, so that the population as a whole traverses the Pareto curve (or surface). A detailed description and a flow chart of the algorithm are given, two experimental test problems are computed, and the results are analyzed.
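The weight-stepping idea translates directly into code. A minimal sketch with a toy convex bi-objective problem (the candidate grid, the objectives, and the function name are illustrative assumptions, not the paper's benchmarks): each weight value yields one solution of the scalarized problem, and stepping the weight traces out the Pareto curve one point per cycle.

```python
def trace_pareto_weighted_sum(f1, f2, candidates, weights):
    """For each weight w, minimize the scalarized objective
    w*f1(x) + (1-w)*f2(x) over the candidate set. For convex fronts, each
    weight yields one non-dominated point, tracing the Pareto curve."""
    front = []
    for w in weights:
        x = min(candidates, key=lambda x: w * f1(x) + (1 - w) * f2(x))
        front.append((round(f1(x), 6), round(f2(x), 6)))
    return front

f1 = lambda x: x * x             # minimized at x = 0
f2 = lambda x: (x - 2) ** 2      # minimized at x = 2
xs = [i / 100 for i in range(201)]   # candidate solutions in [0, 2]
ws = [i / 10 for i in range(11)]     # weights stepped 0.0, 0.1, ..., 1.0
front = trace_pareto_weighted_sum(f1, f2, xs, ws)
```

The endpoints of `front` are the single-objective optima (4, 0) and (0, 4), and intermediate weights fill in the curve between them. Note the well-known caveat that a pure weighted sum cannot reach points on non-convex regions of a Pareto front.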
Wu, Juan; Lerotic, Mirna; Collins, Sean; Leary, Rowan; Saghi, Zineb; Midgley, Paul; Berejnov, Slava; Susac, Darija; Stumper, Juergen; Singh, Gurvinder; Hitchcock, Adam P
2017-09-12
Soft X-ray spectro-tomography provides three-dimensional (3D) chemical mapping based on natural X-ray absorption properties. Since radiation damage is intrinsic to X-ray absorption, it is important to find ways to maximize signal within a given dose. For tomography, using the smallest number of tilt series images that gives a faithful reconstruction is one such method. Compressed sensing (CS) methods have relatively recently been applied to tomographic reconstruction algorithms, providing faithful 3D reconstructions with a much smaller number of projection images than when conventional reconstruction methods are used. Here, CS is applied in the context of scanning transmission X-ray microscopy tomography. Reconstructions by weighted back-projection, the simultaneous iterative reconstruction technique, and CS are compared. The effects of varying tilt angle increment and angular range for the tomographic reconstructions are examined. Optimization of the regularization parameter in the CS reconstruction is explored and discussed. The comparisons show that CS can provide improved reconstruction fidelity relative to weighted back-projection and simultaneous iterative reconstruction techniques, with increasingly pronounced advantages as the angular sampling is reduced. In particular, missing wedge artifacts are significantly reduced and there is enhanced recovery of sharp edges. Examples of using CS for low-dose scanning transmission X-ray microscopy spectroscopic tomography are presented.
蔡彪; 潘晋孝; 陈平
2011-01-01
Traditional reconstruction algorithms assume that the X-ray is monochromatic, while in actual CT the X-ray is polychromatic. When polychromatic projection data are used to reconstruct images directly, metal artifacts and beam-hardening artifacts appear in the reconstructed images, which reduces image quality and affects medical or industrial diagnosis. This paper considers the continuity of the X-ray spectrum and simulates a statistical reconstruction algorithm based on a continuous X-ray spectrum. First, the continuous spectrum was discretized into monochromatic spectra. Second, according to the workpiece material information and the mass attenuation coefficients corresponding to the X-ray energies, a workpiece material model was formulated based on the continuous spectrum. Finally, using a polychromatic-energy statistical iterative algorithm, reconstruction was carried out from the polychromatic projection data. Simulation experiments show that the algorithm makes full use of the polychromatic nature of the X-ray, reduces the artifacts to a certain extent, and improves the quality of the reconstructed CT image.
Liu, Chen-Yi; Goertzen, Andrew L
2013-07-21
An iterative position-weighted centre-of-gravity algorithm was developed and tested for positioning events in a silicon photomultiplier (SiPM)-based scintillation detector for positron emission tomography. The algorithm used a Gaussian-based weighting function centred at the current estimate of the event location. The algorithm was applied to the signals from a 4 × 4 array of SiPM detectors that used individual channel readout and a LYSO:Ce scintillator array. Three scintillator array configurations were tested: single layer with 3.17 mm crystal pitch, matched to the SiPM size; single layer with 1.5 mm crystal pitch; and dual layer with 1.67 mm crystal pitch and a ½ crystal offset in the X and Y directions between the two layers. The flood histograms generated by this algorithm were shown to be superior to those generated by the standard centre of gravity. The width of the Gaussian weighting function of the algorithm was optimized for different scintillator array setups. The optimal width of the Gaussian curve was found to depend on the amount of light spread. The algorithm required less than 20 iterations to calculate the position of an event. The rapid convergence of this algorithm will readily allow for implementation on a front-end detector processing field programmable gate array for use in improved real-time event positioning and identification.
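A simplified sketch of such an iterative position-weighted centre of gravity (the parameter names, the plain-CoG starting point, and the dictionary layout are illustrative choices, not taken from the paper): each iteration re-weights the detector signals by a Gaussian centred at the current position estimate, suppressing the tails of the light spread that bias the standard centre of gravity.

```python
import math

def iterative_gaussian_cog(signals, sigma=1.0, iters=20):
    """Iterative position-weighted centre of gravity for event positioning.

    signals: {(x, y): amplitude} for each detector element, positions in the
    same units as sigma (the width of the Gaussian weighting function).
    """
    # Start from the plain (unweighted) centre of gravity.
    total = sum(signals.values())
    x = sum(px * s for (px, py), s in signals.items()) / total
    y = sum(py * s for (px, py), s in signals.items()) / total
    for _ in range(iters):
        # Re-weight each signal by a Gaussian centred at the current estimate.
        w = {p: s * math.exp(-((p[0] - x) ** 2 + (p[1] - y) ** 2)
                             / (2.0 * sigma ** 2))
             for p, s in signals.items()}
        wt = sum(w.values())
        x = sum(px * wi for (px, py), wi in w.items()) / wt
        y = sum(py * wi for (px, py), wi in w.items()) / wt
    return x, y
```

As the abstract notes, the optimal sigma depends on the light spread of the scintillator configuration, and convergence in a couple of dozen iterations makes an FPGA implementation plausible.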
Maltz, Jonathan S; Gangadharan, Bijumon; Bose, Supratik; Hristov, Dimitre H; Faddegon, Bruce A; Paidi, Ajay; Bani-Hashemi, Ali R
2008-12-01
Quantitative reconstruction of cone beam X-ray computed tomography (CT) datasets requires accurate modeling of scatter, beam-hardening, beam profile, and detector response. Typically, commercial imaging systems use fast empirical corrections that are designed to reduce visible artifacts due to incomplete modeling of the image formation process. In contrast, Monte Carlo (MC) methods are much more accurate but are relatively slow. Scatter kernel superposition (SKS) methods offer a balance between accuracy and computational practicality. We show how a single SKS algorithm can be employed to correct both kilovoltage (kV) energy (diagnostic) and megavoltage (MV) energy (treatment) X-ray images. Using MC models of kV and MV imaging systems, we map intensities recorded on an amorphous silicon flat panel detector to water-equivalent thicknesses (WETs). Scattergrams are derived from acquired projection images using scatter kernels indexed by the local WET values and are then iteratively refined using a scatter magnitude bounding scheme that allows the algorithm to accommodate the very high scatter-to-primary ratios encountered in kV imaging. The algorithm recovers radiological thicknesses to within 9% of the true value at both kV and megavolt energies. Nonuniformity in CT reconstructions of homogeneous phantoms is reduced by an average of 76% over a wide range of beam energies and phantom geometries.
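In one dimension, the scatter kernel superposition idea can be sketched as follows. This is a toy model: the WET-indexed kernel lookup survives, but the iterative refinement and scatter-magnitude bounding of the real algorithm are reduced to a single superposition pass, and all names are illustrative.

```python
def sks_scatter_estimate(primary, wet, kernels):
    """1-D scatter kernel superposition (SKS) sketch.

    Each detector pixel spawns a centred scatter kernel selected by its
    water-equivalent thickness (WET) and scaled by the primary intensity;
    the scattergram is the superposition of all kernels.

    primary: per-pixel primary intensity
    wet:     per-pixel water-equivalent thickness
    kernels: {wet_bin_value: odd-length kernel list}
    """
    n = len(primary)
    scatter = [0.0] * n
    for i in range(n):
        # Pick the kernel whose WET bin is closest to this pixel's WET.
        bin_key = min(kernels, key=lambda b: abs(b - wet[i]))
        k = kernels[bin_key]
        half = len(k) // 2
        for j, kj in enumerate(k):
            t = i - half + j
            if 0 <= t < n:
                scatter[t] += primary[i] * kj
    return scatter
```

Subtracting the estimated scattergram from the acquired projection would then yield a refined primary image, which in the full algorithm feeds the next WET estimate.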
Trofimov, M. Yu.; Zakharenko, A. D.; Kozitskiy, S. B.
2016-10-01
A mode parabolic equation in ray-centered coordinates for 3D underwater sound propagation is developed, and Gaussian beam tracing for this case is constructed. Test calculations carried out for the ASA wedge benchmark show excellent agreement with the source images method in the case of cross-slope propagation. For propagation at an angle to the cross-slope direction, however, accounting for mode interaction becomes necessary.
Yim, Che Wook; Kim, Song Hyun; Shin, Chang Ho [Hanyang University, Seoul (Korea, Republic of)
2015-05-15
In two-phase flow, the motions of dispersed bubbles influence fluid properties such as heat transfer. In order to analyze how the bubble motion affects the fluid properties, various techniques have been developed. Optical methods such as Laser Doppler Velocimetry (LDV) and Particle Image Velocimetry (PIV) have been used for the analysis of single-phase flow; however, they have significant application problems in that they cannot be used for opaque fluids or two-phase flows. The phase Doppler method, another optical method, can be applied to two-phase flow analysis, but it also has difficulty analyzing opaque flows. In a previous study, the x-ray PIV method was proposed as a technique to measure the flow velocity and obtain the flow vector field; however, there is no appropriate approach to analyze bubble size in two-phase flows. In this study, a technique to estimate bubble size using x-rays is proposed as a preliminary step toward an algorithm for two-phase flow analysis. A reconstruction algorithm for bubble size in two-phase flows using a single x-ray source was proposed. The analysis shows that 3-dimensional bubble size can be estimated from the detection information of multichannel detectors. A preliminary study on multi-bubble cases was also performed; its results show that multiple bubbles can be separated by exploiting the symmetry of the bubbles. The proposed algorithm can detect bubbles in flows of opaque fluids or in nontransparent pipes, which cannot be analyzed by optical methods. It is expected that the proposed method can be utilized to inspect bubbles in two-phase bubbly flow.
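Under a simple Beer-Lambert model, the multichannel idea above admits a compact sketch (the single-view geometry, the function name, and the sphere assumption are illustrative, not the paper's algorithm): each channel's excess transmission gives the gas chord along its ray, and for a spherical bubble the chord profile integrates to πR², from which the radius follows.

```python
import math

def bubble_radius_from_profile(intensities, baseline, mu, pitch):
    """Estimate a spherical bubble's radius from a 1-D multichannel profile.

    Beer-Lambert: with the bubble present, each channel's ray skips a gas
    chord c of negligible attenuation, so I = baseline * exp(mu * c) and
    c = ln(I / baseline) / mu.  For a sphere, c(y) = 2*sqrt(R^2 - y^2),
    whose integral over y is pi*R^2, giving R = sqrt(sum(c_i)*pitch/pi).

    intensities: per-channel intensity with the bubble present
    baseline:    per-channel intensity with liquid only (no bubble)
    mu:          linear attenuation coefficient of the liquid (1/mm)
    pitch:       detector channel spacing (mm)
    """
    area = sum(math.log(i / baseline) / mu for i in intensities) * pitch
    return math.sqrt(area / math.pi)
```

Channels whose rays miss the bubble contribute zero (I equals the baseline), so the estimate uses all channels and, unlike a max-chord estimate, does not require the bubble centre to line up with any one channel.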
Nowlan, Caroline R.; Liu, Xiong; Leitch, James W.; Chance, Kelly; González Abad, Gonzalo; Liu, Cheng; Zoogman, Peter; Cole, Joshua; Delker, Thomas; Good, William; Murcray, Frank; Ruppert, Lyle; Soo, Daniel; Follette-Cook, Melanie B.; Janz, Scott J.; Kowalewski, Matthew G.; Loughner, Christopher P.; Pickering, Kenneth E.; Herman, Jay R.; Beaver, Melinda R.; Long, Russell W.; Szykman, James J.; Judd, Laura M.; Kelley, Paul; Luke, Winston T.; Ren, Xinrong; Al-Saadi, Jassim A.
2016-06-01
The Geostationary Trace gas and Aerosol Sensor Optimization (GeoTASO) airborne instrument is a test bed for upcoming air quality satellite instruments that will measure backscattered ultraviolet, visible and near-infrared light from geostationary orbit. GeoTASO flew on the NASA Falcon aircraft in its first intensive field measurement campaign during the Deriving Information on Surface Conditions from Column and Vertically Resolved Observations Relevant to Air Quality (DISCOVER-AQ) Earth Venture Mission over Houston, Texas, in September 2013. Measurements of backscattered solar radiation between 420 and 465 nm collected on 4 days during the campaign are used to determine slant column amounts of NO2 at 250 m × 250 m spatial resolution with a fitting precision of 2.2 × 10^15 molecules cm^-2. These slant columns are converted to tropospheric NO2 vertical columns using a radiative transfer model and trace gas profiles from the Community Multiscale Air Quality (CMAQ) model. Total column NO2 from GeoTASO is well correlated with ground-based Pandora observations (r = 0.90 on the most polluted and cloud-free day of measurements and r = 0.74 overall), with GeoTASO NO2 slightly higher for the most polluted observations. Surface NO2 mixing ratios inferred from GeoTASO using the CMAQ model show good correlation with NO2 measured in situ at the surface during the campaign (r = 0.85). NO2 slant columns from GeoTASO also agree well with preliminary retrievals from the GEO-CAPE Airborne Simulator (GCAS) which flew on the NASA King Air B200 (r = 0.81, slope = 0.91). Enhanced NO2 is resolvable over areas of traffic NOx emissions and near individual petrochemical facilities.
C. R. Nowlan
2015-12-01
The Geostationary Trace gas and Aerosol Sensor Optimization (GeoTASO) airborne instrument is a testbed for upcoming air quality satellite instruments that will measure backscattered ultraviolet, visible and near-infrared light from geostationary orbit. GeoTASO flew on the NASA Falcon aircraft in its first intensive field measurement campaign during the Deriving Information on Surface Conditions from Column and Vertically Resolved Observations Relevant to Air Quality (DISCOVER-AQ) Earth Venture Mission over Houston, Texas in September 2013. Measurements of backscattered solar radiation between 420–465 nm collected on four days during the campaign are used to determine slant column amounts of NO2 at 250 m × 250 m spatial resolution with a fitting precision of 2.2 × 10^15 molecules cm−2. These slant columns are converted to tropospheric NO2 vertical columns using a radiative transfer model and trace gas profiles from the Community Multiscale Air Quality (CMAQ) model. Total column NO2 from GeoTASO is well correlated with ground-based Pandora observations (r = 0.90) on the most polluted and cloud-free day of measurements, with GeoTASO NO2 slightly higher for the most polluted observations. Surface NO2 mixing ratios inferred from GeoTASO using the CMAQ model show good correlation with NO2 measured in situ at the surface during the campaign (r = 0.91 for the most polluted day). NO2 slant columns from GeoTASO also agree well with preliminary retrievals from the GEO-CAPE Airborne Simulator (GCAS), which flew on the NASA King Air B200 (r = 0.84, slope = 0.94). Enhanced NO2 is resolvable over areas of traffic NOx emissions and near individual petrochemical facilities.
Cozer, Thamara C.; Conceicao, Andre L.C.; Paschuk, Sergei A.; Rocha, Anna S.S. da; Fagundes, Alana C.F.; Maciel, Karla F.R.; Pimentel, Gustavo R.O.; Badelli, Juliana C., E-mail: thamara.cozer@gmail.com, E-mail: alconceicao@utfpr.edu.br, E-mail: sergei@utfpr.edu.br, E-mail: anna@utfpr.edu.br, E-mail: alanacarolinef@gmail.com, E-mail: karla_rimanski@hotmail.com, E-mail: g_rop@hotmail.com, E-mail: jubadellin@gmail.com [Universidade Tecnologica Federal do Parana (UTFPR), Curitiba, PR (Brazil). Lab. de Espectroscopia de Raio-X
2015-07-01
Studies performed with canines indicate that breast tumors are among the main neoplasias affecting these animals, representing 25% to 50% of all tumor types; moreover, half of them are classified as malignant. Recent research on humans has associated the presence of certain trace elements with the development of breast neoplasia, and since the breast tissue composition of canines is very similar to that of humans, the same behavior is expected. A very effective technique to identify and determine trace element concentrations is EDXRF; however, studies in this area are scarce in the literature. Therefore, in this work an approach was developed to quantify with high sensitivity the main trace elements present in these tumors. For this purpose, calibration curves were determined from standard samples diluted in water, with concentrations of Ca, Fe, Cu and Zn ranging from 400 mg/kg to 35 mg/kg, from 20 mg/kg to 2 mg/kg, from 10 mg/kg to 1 mg/kg and from 100 mg/kg to 10 mg/kg, respectively. All calibration curves were linearly fitted, and based on this behavior the sensitivity of the approach for quantifying the concentrations of the trace elements mentioned above was determined. Studies in this area have great potential, because EDXRF represents a quick, practical and non-destructive alternative for quantifying trace elements. (author)
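The calibration step the abstract describes — a linear fit of fluorescence intensity against known concentrations, from which the method's sensitivity follows — can be sketched as below. The intensity values, the blank deviation and the detection-limit rule are illustrative assumptions, not values from the study:

```python
def linear_fit(x, y):
    """Ordinary least-squares fit y = a*x + b; returns (a, b)."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(u * v for u, v in zip(x, y))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Illustrative calibration points for one element: concentration (mg/kg)
# vs. measured fluorescence intensity (arbitrary units, ~2 counts per mg/kg).
conc = [35, 100, 200, 300, 400]
intensity = [70.1, 200.5, 399.8, 600.2, 799.9]

slope, intercept = linear_fit(conc, intensity)
# The slope of the calibration curve is the sensitivity for that element;
# a detection limit can then be estimated with the common 3-sigma rule.
sigma_blank = 1.5                 # assumed std. deviation of blank readings
lod = 3 * sigma_blank / slope     # in mg/kg
```

The same fit is repeated per element (Ca, Fe, Cu, Zn), each over its own concentration range.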
王锡凡; 王秀丽
2000-01-01
The pricing of electricity transmission requires determining how much use each generator is making of a transmission line and what each generator's contribution to the system losses is. Such problems cannot be solved using Kirchhoff's laws alone. This paper proposes two current decomposition axioms, on the basis of which theories and models are established for the current trace problem. To create an efficient algorithm, graph theory is employed. It is proved that there is no directed circuit in a directed current distribution graph. According to this theorem, a very simple and efficient algorithm based on a recursive elimination process is suggested. A simple example is used to explain the algorithm.
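A minimal sketch of the proportional-sharing idea behind such current tracing, assuming a lossless flow graph: because the directed current distribution graph has no directed circuit (the theorem above), nodes can be processed in topological order, and each node's inflow mix of generator contributions is passed proportionally to its outflows. The function name and the toy three-line network are illustrative, not the paper's model:

```python
from collections import defaultdict

def trace_flows(lines, generation):
    """Proportional-sharing flow tracing on a directed, circuit-free flow graph.

    lines: dict {(u, v): flow} with flow directed u -> v (lossless).
    generation: dict {node: injected generation}.
    Returns {(u, v): {generator_node: share_of_line_flow}}.
    """
    nodes = set(generation)
    for u, v in lines:
        nodes.update((u, v))
    # Topological order exists because there is no directed circuit.
    indeg = {n: 0 for n in nodes}
    for u, v in lines:
        indeg[v] += 1
    order, queue = [], [n for n in nodes if indeg[n] == 0]
    while queue:
        n = queue.pop()
        order.append(n)
        for u, v in lines:
            if u == n:
                indeg[v] -= 1
                if indeg[v] == 0:
                    queue.append(v)
    # Mix of generator contributions arriving at each node, in flow units.
    mix = {n: defaultdict(float) for n in nodes}
    shares = {}
    for n in order:
        inflow = sum(f for (u, v), f in lines.items() if v == n)
        total_in = generation.get(n, 0.0) + inflow
        if generation.get(n, 0.0) > 0:
            mix[n][n] += generation[n]
        for (u, v), f in lines.items():
            if u == n:  # each outflow carries the node's mix proportionally
                shares[(u, v)] = {g: f * amt / total_in
                                  for g, amt in mix[n].items()}
                for g, amt in shares[(u, v)].items():
                    mix[v][g] += amt
    return shares
```

On a toy network where generators A (100) and B (50) both feed node C, which feeds D, the line C→D is attributed 100 units to A and 50 to B.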
Ionosphere hybrid modeling method for short-wave ray tracing
栗伟珉; 苏东林; 阎照文; 刘焱
2012-01-01
Based on the international reference ionosphere (IRI) model and the quasi-parabolic segments (QPS) model, a new hybrid ionosphere modeling method for short-wave ray tracing is proposed. Using ray tracing, the trajectories of short-wave rays in the ionosphere were simulated under both the hybrid model and the traditional IRI model, yielding the group paths of the radio waves. Comparison of the simulated results with ionospheric oblique-incidence sounding measurements shows that, for the mid-latitude region of China, the ray tracing accuracy under the hybrid model is better than that under the traditional IRI model. The hybrid modeling method also relaxes the input requirements of the QPS model and extends the range of application of the QPS model in ray tracing.
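For reference, a single quasi-parabolic layer of the kind the QPS model stitches together can be written down directly. The functional form below is the standard QP layer (zero at the layer base, parabolic-like peak); the F2-layer numbers are illustrative assumptions, not values from the paper:

```python
def quasi_parabolic_density(r, nm, rm, ym):
    """Electron density of a quasi-parabolic (QP) layer at radius r.

    nm: peak density, rm: radius of the peak, ym: layer semi-thickness.
    The layer base is taken at rb = rm - ym; density is clamped to zero
    outside the layer. (Standard QP form; the paper's QPS model joins
    several such segments into a full profile.)
    """
    rb = rm - ym
    val = nm * (1.0 - ((r - rm) / ym) ** 2 * (rb / r) ** 2)
    return max(val, 0.0)

# Illustrative F2-layer numbers: peak density 1e12 m^-3 at 300 km altitude,
# semi-thickness 100 km, Earth radius 6371 km.
RE = 6371.0
peak = quasi_parabolic_density(RE + 300.0, 1e12, RE + 300.0, 100.0)
base = quasi_parabolic_density(RE + 200.0, 1e12, RE + 300.0, 100.0)
```

Ray tracing through such a layer is attractive because the QP form admits analytic group-path expressions, which is what the hybrid method exploits.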
Un-Hong Wong
2014-01-01
In this paper, we model the reflectance of the lunar regolith by a new method combining Monte Carlo ray tracing and Hapke's model. Existing modeling methods exploit either a radiative transfer model or a geometric optical model. However, the measured data from an Interference Imaging spectrometer (IIM) on an orbiter are affected not only by the composition of minerals but also by environmental factors, and these factors cannot be well addressed by a single model alone. Our method implements Monte Carlo ray tracing to simulate large-scale effects, such as reflection off the topography of the lunar soil, and Hapke's model to calculate the reflection intensity of the internal scattering effects of lunar soil particles. Therefore, both the large-scale and microscale effects are considered, providing more accurate modeling of the reflectance of the lunar regolith. Simulation results using the Lunar Soil Characterization Consortium (LSCC) data and the Chang'E-1 elevation map show that our method is effective and useful. We have also applied our method to Chang'E-1 IIM data to remove the influence of lunar topography on the reflectance of the lunar soil and to generate more realistic visualizations of the lunar surface.
van der Horst, A J; Miller-Jones, J C A; Linford, J D; Gorosabel, J; Russell, D M; Postigo, A de Ugarte; Lundgren, A A; Taylor, G B; Maitra, D; Guziy, S; Belloni, T M; Kouveliotou, C; Jonker, P G; Kamble, A; Paragi, Z; Homan, J; Kuulkers, E; Granot, J; Altamirano, D; Buxton, M M; Castro-Tirado, A; Fender, R P; Garrett, M A; Gehrels, N; Hartmann, D H; Kennea, J A; Krimm, H A; Mangano, V; Ramirez-Ruiz, E; Romano, P; Wijers, R A M J; Wijnands, R; Yang, Y J
2013-01-01
MAXI J1659-152 was discovered on 2010 September 25 as a new X-ray transient, initially identified as a gamma-ray burst, but was later shown to be a new X-ray binary with a black hole as the most likely compact object. Dips in the X-ray light curves have revealed that MAXI J1659-152 is the shortest period black hole candidate identified to date. Here we present the results of a large observing campaign at radio, sub-millimeter, near-infrared (nIR), optical and ultraviolet (UV) wavelengths. We have combined this very rich data set with the available X-ray observations to compile a broadband picture of the evolution of this outburst. We have performed broadband spectral modeling, demonstrating the presence of a spectral break at radio frequencies and a relationship between the radio spectrum and X-ray states. Also, we have determined physical parameters of the accretion disk and put them into context with respect to the other parameters of the binary system. Finally, we have investigated the radio-X-ray and nIR/...
Cheney, James; Ahmed, Amal
2008-01-01
Provenance is information about the origin, derivation, ownership, or history of an object. It has recently been studied extensively in scientific databases and other settings due to its importance in helping scientists judge data validity, quality and integrity. However, most models of provenance have been stated as ad hoc definitions motivated by informal concepts such as "comes from", "influences", "produces", or "depends on". These models lack clear formalizations describing in what sense the definitions capture these intuitive concepts. This makes it difficult to compare approaches, evaluate their effectiveness, or argue about their validity. We introduce provenance traces, a general form of provenance for the nested relational calculus (NRC), a core database query language. Provenance traces can be thought of as concrete data structures representing the operational semantics derivation of a computation; they are related to the traces that have been used in self-adjusting computation, but differ in impor...
Fajstrup, Lisbeth; Goubault, Eric; Haucourt, Emmanuel;
2012-01-01
of concurrent languages, where programs are interpreted as directed topological spaces, and study its properties in order to devise an algorithm for computing dihomotopy classes of execution paths. In particular, our algorithm is able to compute a control-flow graph for concurrent programs, possibly containing...
Vasilkov, Alexander; Qin, Wenhan; Krotkov, Nickolay; Lamsal, Lok; Spurr, Robert; Haffner, David; Joiner, Joanna; Yang, Eun-Su; Marchenko, Sergey
2017-01-01
Most satellite nadir ultraviolet and visible cloud, aerosol, and trace-gas algorithms make use of climatological surface reflectivity databases. For example, cloud and NO2 retrievals for the Ozone Monitoring Instrument (OMI) use monthly gridded surface reflectivity climatologies that do not depend upon the observation geometry. In reality, reflection of incoming direct and diffuse solar light from land or ocean surfaces is sensitive to the sun-sensor geometry. This dependence is described by the bidirectional reflectance distribution function (BRDF). To account for the BRDF, we propose to use a new concept of geometry-dependent Lambertian equivalent reflectivity (LER). Implementation within the existing OMI cloud and NO2 retrieval infrastructure requires changes only to the input surface reflectivity database. The geometry-dependent LER is calculated using a vector radiative transfer model with high spatial resolution BRDF information from the Moderate Resolution Imaging Spectroradiometer (MODIS) over land and the Cox-Munk slope distribution over ocean with a contribution from water-leaving radiance. We compare the geometry-dependent and climatological LERs for two wavelengths, 354 and 466 nm, that are used in OMI cloud algorithms to derive cloud fractions. A detailed comparison of the cloud fractions and pressures derived with climatological and geometry-dependent LERs is carried out. Geometry-dependent LER and corresponding retrieved cloud products are then used as inputs to our OMI NO2 algorithm. We find that replacing the climatological OMI-based LERs with geometry-dependent LERs can increase NO2 vertical columns by up to 50 % in highly polluted areas; the differences include both BRDF effects and biases between the MODIS and OMI-based surface reflectance data sets. Only minor changes to NO2 columns (within 5 %) are found over unpolluted and overcast areas.
Barbosa, Rommel Melgaço; Nacano, Letícia Ramos; Freitas, Rodolfo; Batista, Bruno Lemos; Barbosa, Fernando
2014-09-01
This article aims to evaluate two machine learning algorithms, decision trees and naïve Bayes (NB), for egg classification (free-range versus battery eggs). The database used for the study consisted of 15 chemical elements (As, Ba, Cd, Co, Cs, Cu, Fe, Mg, Mn, Mo, Pb, Se, Sr, V, and Zn) determined in 52 egg samples (20 free-range and 32 battery eggs) by inductively coupled plasma mass spectrometry. Our results demonstrate that decision trees and NB associated with the mineral contents of eggs provide a high level of accuracy (above 80% and 90%, respectively) for classification between free-range and battery eggs and can be used as an alternative method for adulteration evaluation.
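As an illustration of the NB side of such a classifier, a Gaussian naive Bayes model over element concentrations can be built from scratch in a few lines. The element profiles below are invented toy numbers, not the study's data, and `train_gnb`/`predict_gnb` are hypothetical helper names:

```python
import math
from collections import defaultdict

def train_gnb(samples, labels):
    """Fit a Gaussian naive Bayes model: per-class prior plus per-feature
    mean and variance (features treated as conditionally independent)."""
    by_class = defaultdict(list)
    for x, y in zip(samples, labels):
        by_class[y].append(x)
    model, total = {}, len(samples)
    for y, rows in by_class.items():
        n, d = len(rows), len(rows[0])
        means = [sum(r[i] for r in rows) / n for i in range(d)]
        variances = [sum((r[i] - means[i]) ** 2 for r in rows) / n + 1e-9
                     for i in range(d)]  # small floor avoids zero variance
        model[y] = (math.log(n / total), means, variances)
    return model

def predict_gnb(model, x):
    """Return the class with the highest Gaussian log posterior."""
    best, best_lp = None, -math.inf
    for y, (log_prior, means, variances) in model.items():
        lp = log_prior
        for xi, m, v in zip(x, means, variances):
            lp += -0.5 * math.log(2 * math.pi * v) - (xi - m) ** 2 / (2 * v)
        if lp > best_lp:
            best, best_lp = y, lp
    return best

# Toy two-element profiles (e.g. [Sr, Zn] in mg/kg); free-range eggs are
# assumed here to run higher in both elements -- illustrative numbers only.
X = [[3.1, 52], [2.9, 49], [3.3, 55], [1.1, 30], [0.9, 28], [1.2, 33]]
y = ["free-range"] * 3 + ["battery"] * 3
model = train_gnb(X, y)
pred = predict_gnb(model, [3.0, 50])
```

The decision-tree half of the comparison would typically come from a library rather than be hand-rolled.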
Wavefront construction Kirchhoff migration with ray-amplitude corrections
Fehler, Michael C.; Hildebrand, S. T. (Steve T.); Huang, L. (Lian-Jie); Alde, D. M. (Douglas M.)
2002-01-01
Kirchhoff migration using ray tracing travel times has been a popular imaging method for many years. There are significant limitations in the ability of Kirchhoff migration using only first arrivals to reliably image regions of complex structure. Thus, new methods for imaging have been sought. One approach for improving imaging capability is to use ray tracing methods that allow the calculation of multiple-valued travel time tables to be used in migration. Additional improvements in ray-based imaging methods may be obtained by including amplitudes and phases of rays calculated using some ray tracing approach. One approach for calculating multiple-valued travel time tables along with estimates of amplitudes and phases is the use of wavefront construction ray tracing. We introduce our wavefront construction-based migration algorithm and present some example images obtained using the method. We compare the images obtained with those obtained using a dual-domain wave-equation migration method that we call Extended Local Rytov Fourier migration method.
Burns, Jack O.; Datta, Abhirup; Hallman, Eric J.
2016-06-01
Galaxy clusters are assembled through large and small mergers which are the most energetic events ("bangs") since the Big Bang. Cluster mergers "stir" the intracluster medium (ICM) creating shocks and turbulence which are illuminated by ~Mpc-sized radio features called relics and halos. These shocks heat the ICM and are detected in x-rays via thermal emission. Disturbed morphologies in x-ray surface brightness and temperatures are direct evidence for cluster mergers. In the radio, relics (in the outskirts of the clusters) and halos (located near the cluster core) are also clear signposts of recent mergers. Our recent ENZO cosmological simulations suggest that around a merger event, radio emission peaks very sharply (and briefly) while the x-ray emission rises and decays slowly. Hence, a sample of galaxy clusters that shows both luminous x-ray emission and radio relics/halos are good candidates for very recent mergers. We are in the early stages of analyzing a unique sample of 48 galaxy clusters with (i) known radio relics and/or halos and (ii) significant archival x-ray observations (>50 ksec) from Chandra and/or XMM. We have developed a new x-ray data analysis pipeline, implemented on parallel processor supercomputers, to create x-ray surface brightness, high fidelity temperature, and pressure maps of these clusters in order to study merging activity. The temperature maps are made using three different map-making techniques: Weighted Voronoi Tessellation, Adaptive Circular Binning, and Contour Binning. In this talk, we will show preliminary results for several clusters, including Abell 2744 and the Bullet cluster. This work is supported by NASA ADAP grant NNX15AE17G.
Dil, Ebrahim Alipanahpour; Ghaedi, Mehrorang; Asfaram, Arash; Mehrabi, Fatemeh; Bazrafshan, Ali Akbar; Ghaedi, Abdol Mohammad
2016-11-01
In this study, an ultrasound-assisted dispersive solid-phase micro-extraction method combined with spectrophotometry (USA-DSPME-UV), based on activated carbon modified with Fe2O3 nanoparticles (Fe2O3-NPs-AC), was developed for pre-concentration and determination of safranin O (SO). The efficiency of the USA-DSPME-UV method may be affected by pH, amount of adsorbent, ultrasound time and eluent volume, and the extent and magnitude of their contributions to the response (in terms of main and interaction effects) were studied using central composite design (CCD) and artificial neural network-genetic algorithms (ANN-GA). Adjustment of the experimental conditions suggested by ANN-GA (pH 6.5, 1.1 mg of adsorbent, 10 min of ultrasound and 150 μL of eluent) led to the best operating performance, with a low LOD (6.3 ng mL⁻¹) and LOQ (17.5 ng mL⁻¹) over the range of 25–3500 ng mL⁻¹. Subsequently, the SO content in real water and wastewater samples was successfully determined, with recoveries between 93.27% and 99.41% and RSD lower than 3%.
Pokhrel, Damodar; Murphy, Martin J.; Todor, Dorin A.; Weiss, Elisabeth; Williamson, Jeffrey F. [Department of Radiation Oncology, School of Medicine, Virginia Commonwealth University, Richmond, Virginia 23298 (United States)
2010-09-15
Purpose: To experimentally validate a new algorithm for reconstructing the 3D positions of implanted brachytherapy seeds from postoperatively acquired 2D cone-beam CT (CBCT) projection images. Methods: The iterative forward projection matching (IFPM) algorithm finds the 3D seed geometry that minimizes the sum of the squared intensity differences between computed projections of an initial estimate of the seed configuration and radiographic projections of the implant. In-house machined phantoms, containing arrays of 12 and 72 seeds, respectively, are used to validate this method. Also, four ¹⁰³Pd postimplant patients are scanned using an ACUITY digital simulator. Three to ten x-ray images are selected from the CBCT projection set and processed to create binary seed-only images. To quantify IFPM accuracy, the reconstructed seed positions are forward projected and overlaid on the measured seed images to find the nearest-neighbor distance between measured and computed seed positions for each image pair. Also, the estimated 3D seed coordinates are compared to known seed positions in the phantom and clinically obtained VariSeed planning coordinates for the patient data. Results: For the phantom study, the seed localization error is (0.58 ± 0.33) mm. For all four patient cases, the mean registration error is better than 1 mm when compared against the measured seed projections. IFPM converges in 20-28 iterations, with a computation time of about 1.9-2.8 min/iteration on a 1 GHz processor. Conclusions: The IFPM algorithm avoids the need to match corresponding seeds in each projection as required by standard back-projection methods. The authors' results demonstrate ~1 mm accuracy in reconstructing the 3D positions of brachytherapy seeds from the measured 2D projections. This algorithm also successfully localizes overlapping clustered and highly migrated seeds in the implant.
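The core of forward projection matching — adjust an estimated seed geometry until its computed projections match the measured ones — can be illustrated on a toy single-seed, parallel-beam case in 2D. Everything here (function names, view angles, the gradient-descent matcher) is a simplified sketch, not the authors' algorithm:

```python
import math

def project(point, angles):
    """Parallel-beam projection of a 2D point: signed coordinate along
    each view's detector axis."""
    x, y = point
    return [x * math.cos(a) + y * math.sin(a) for a in angles]

def ifpm_point(measured, angles, iters=200, step=0.4):
    """Toy forward-projection matching: gradient-descend a single seed's
    2D position so that its computed projections match the measured ones,
    minimizing the sum of squared projection differences."""
    x, y = 0.0, 0.0  # initial estimate of the seed position
    for _ in range(iters):
        gx = gy = 0.0
        for a, m in zip(angles, measured):
            r = (x * math.cos(a) + y * math.sin(a)) - m  # residual per view
            gx += 2 * r * math.cos(a)
            gy += 2 * r * math.sin(a)
        x -= step * gx / len(angles)
        y -= step * gy / len(angles)
    return x, y

angles = [0.0, math.pi / 3, 2 * math.pi / 3]   # three x-ray view angles
true_seed = (12.5, -4.0)
measured = project(true_seed, angles)          # stand-in for radiographs
est = ifpm_point(measured, angles)
```

The real algorithm works with full intensity images and many seeds simultaneously, which is precisely what lets it avoid per-seed correspondence matching.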
Jeon, Jong Ho; Nakajima, Kazuhisa; Kim, Hyung Taek; Rhee, Yong Joo; Pathak, Vishwa Bandhu; Cho, Myung Hoon; Shin, Jung Hun; Yoo, Byung Ju; Hojbota, Calin; Jo, Sung Ha; Shin, Kang Woo; Sung, Jae Hee; Lee, Seung Ku; Cho, Byeoung Ick; Choi, Il Woo; Nam, Chang Hee
2015-12-01
We present a high-flux, broadband gamma-ray spectrometry capable of characterizing the betatron radiation spectrum over the photon energy range from 10 keV to 20 MeV with respect to the peak photon energy, spectral bandwidth, and unique discrimination from background radiations, using a differential filtering spectrometer and the unfolding procedure based on the Monte Carlo code GEANT4. These properties are experimentally verified by measuring betatron radiation from a cm-scale laser wakefield accelerator (LWFA) driven by a 1-PW laser, using a differential filtering spectrometer consisting of a 15-filter and image plate stack. The gamma-ray spectra were derived by unfolding the photostimulated luminescence (PSL) values recorded on the image plates, using the spectrometer response matrix modeled with the Monte Carlo code GEANT4. The accuracy of unfolded betatron radiation spectra was assessed by unfolding the test PSL data simulated with GEANT4, showing an ambiguity of less than 20% and clear discrimination from the background radiation with less than 10%. The spectral analysis of betatron radiation from laser wakefield-accelerated electron beams with energies up to 3 GeV revealed radiation spectra characterized by synchrotron radiation with the critical photon energy up to 7 MeV. The gamma-ray spectrometer and unfolding method presented here facilitate an in-depth understanding of betatron radiation from LWFA process and a novel radiation source of high-quality photon beams in the MeV regime.
李昕; 张勇; 刘君华; 吴浩扬
2001-01-01
Gas-sensor-array technology was combined with a neural network trained by a genetic algorithm to detect four trace fault-characteristic gases in power transformer oil: H2, C2H4 and C2H2 at volume fractions of 1–70 ppm, and CO at 50–550 ppm. The results show that a single network has good generalization ability, but its recognition accuracy near certain values does not meet practical requirements. Since recognition of the critical (threshold) values of fault-characteristic gases in transformer oil is important for early fault diagnosis of power transformers, a new technique is proposed to further improve the recognition accuracy at critical values: multiple neural networks with a genetic algorithm. This method can identify the species and concentrations of the fault gases over a wide range while also recognizing values near the critical thresholds accurately, meeting the requirements of practical operating conditions.
Evolutionary algorithm for optimization of nonimaging Fresnel lens geometry.
Yamada, N; Nishikawa, T
2010-06-21
In this study, an evolutionary algorithm (EA), which consists of genetic and immune algorithms, is introduced to design the optical geometry of a nonimaging Fresnel lens; this lens generates the uniform flux concentration required for a photovoltaic cell. Herein, a design procedure that incorporates a ray-tracing technique in the EA is described, and the validity of the design is demonstrated. The results show that the EA automatically generated a unique geometry of the Fresnel lens; the use of this geometry resulted in better uniform flux concentration with high optical efficiency.
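The evolutionary loop itself can be sketched independently of the optics: mutate a population of candidate geometries, score each with a fitness function, and keep the best. The `evolve` helper and the stand-in `nonuniformity` objective below are illustrative assumptions; the paper's EA additionally includes an immune component and scores each candidate lens by ray tracing:

```python
import random

def evolve(fitness, dim, generations=300, pop_size=20, sigma=0.3, seed=1):
    """Minimal elitist evolutionary loop: Gaussian mutation, truncation
    selection, and a slowly decaying mutation width."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        children = [[g + rng.gauss(0, sigma) for g in parent] for parent in pop]
        pop = sorted(pop + children, key=fitness)[:pop_size]  # keep the best
        sigma *= 0.99  # anneal the mutation step
    return pop[0]

def nonuniformity(genome):
    """Stand-in objective: how far a 'flux profile' is from flat and centered.
    A real run would instead score the Fresnel lens facet angles encoded in
    the genome with a ray-traced flux-uniformity metric on the cell."""
    mean = sum(genome) / len(genome)
    return sum((g - mean) ** 2 for g in genome) + mean ** 2

best = evolve(nonuniformity, dim=5)
```

Only the fitness function couples the loop to the lens design, which is why EAs pair naturally with a black-box ray tracer.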
Offset Trace-Based Video Quality Evaluation After Network Transport
Seeling, P.; Reisslein, M.; Fitzek, Frank
2006-01-01
after networking transport that includes losses and delays. In this work, we provide (i) an overview of frame dependencies that have to be taken into consideration when working with video traces, (ii) an algorithmic approach that combines traditional video traces and offset distortion traces to determine the video quality or distortion after lossy network transport, (iii) offset distortion and quality characteristics, and (iv) the offset distortion trace format and tools to create offset distortion traces.
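The combination step (ii) can be sketched for the simplest concealment policy, redisplaying the last decoded frame: the distortion of each lost frame is looked up in the offset distortion trace of the frame shown in its place. The trace layout and numbers below are illustrative assumptions, and real traces must additionally honor the frame dependencies of point (i) (a lost reference frame also invalidates frames that depend on it), which this sketch ignores:

```python
def sequence_distortion(offset_trace, lost):
    """Per-frame distortion after lossy transport, from an offset distortion trace.

    offset_trace[f][k]: distortion of displaying frame f in place of frame
    f + k (k = 0 means the frame itself was shown; illustrative layout).
    lost: set of frame indices that did not arrive in time.
    A lost frame is concealed by redisplaying the last decoded frame.
    """
    distortions = []
    last_good = None
    for f in range(len(offset_trace)):
        if f not in lost:
            last_good = f
            distortions.append(offset_trace[f][0])
        elif last_good is not None:
            distortions.append(offset_trace[last_good][f - last_good])
        else:
            distortions.append(float("inf"))  # nothing to conceal with yet
    return distortions

# Toy 4-frame trace: distortion grows with the display offset k.
trace = [[1.0, 5.0, 9.0, 13.0],
         [1.2, 5.5, 9.5, 13.5],
         [0.8, 4.8, 8.8, 12.8],
         [1.1, 5.2, 9.2, 13.2]]
d = sequence_distortion(trace, lost={2})
```

With frame 2 lost, frame 1 is redisplayed in its place, so the sequence distortion is read from frame 1's offset-1 entry.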
文韬; 洪添胜; 李立君; 张南峰; 李震; 郭鑫
2014-01-01
Bactrocera dorsalis (Hendel) is an invasive pest that occurs frequently, seriously harms the growth of fruit trees, and has been ranked an important quarantine object in many countries and regions. The regular manual survey used as the routine prediction method for B. dorsalis, based on adult trapping and monitoring devices deployed in orchards, has not met the requirement of real-time, precise monitoring and warning. With the development of science and technology, automatic machine monitoring methods for pests have been studied, including detection of acoustic characteristics, radar monitoring and spectral monitoring. Given the random, migratory and hiding behavior of B. dorsalis, combining the above monitoring approaches with the traditional method raises problems of timing, processing and cost. To achieve precise monitoring of B. dorsalis, machine vision technology is used in this paper as an in-field automatic detection method for the adults. Because, according to previous machine vision research results, the mean shift algorithm in color space loses track of a B. dorsalis object when multiple objects come close together, a fusion algorithm based on mean shift and Kalman filter theory is proposed to optimize multi-object trajectory tracking, using color analysis of the moving objects and the background in the monitoring zones. The recurrence relation of the adults' moving trace was obtained, and the position coordinates and the X- and Y-components of velocity in the 2D plane were extracted by image processing and matching technologies in this algorithm. By analyzing the linear minimum-variance state-sequence estimation theory of the dynamic system and the recurrence relation of the adults' moving trace, the model
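The Kalman side of such a mean shift/Kalman fusion can be illustrated with a plain constant-velocity filter run on one coordinate of the trace; in the 2D case, one such filter per axis predicts the position used to re-seed mean shift when objects overlap. The parameter values and the toy measurement sequence are illustrative assumptions:

```python
def kalman_cv_track(measurements, dt=1.0, q=0.01, r=0.25):
    """1D constant-velocity Kalman filter (run once per axis for a 2D trace).

    State is [position, velocity]; measurements are noisy positions.
    Returns the filtered positions. Plain-Python 2x2 matrix algebra keeps
    the sketch self-contained; q and r are illustrative noise settings.
    """
    x = [measurements[0], 0.0]            # state estimate
    p = [[1.0, 0.0], [0.0, 1.0]]          # state covariance
    out = []
    for z in measurements:
        # Predict: x = F x, P = F P F^T + Q, with F = [[1, dt], [0, 1]].
        x = [x[0] + dt * x[1], x[1]]
        p = [[p[0][0] + dt * (p[1][0] + p[0][1]) + dt * dt * p[1][1] + q,
              p[0][1] + dt * p[1][1]],
             [p[1][0] + dt * p[1][1], p[1][1] + q]]
        # Update with the measurement z of the position component (H = [1, 0]).
        s = p[0][0] + r                   # innovation covariance
        k = [p[0][0] / s, p[1][0] / s]    # Kalman gain
        innov = z - x[0]
        x = [x[0] + k[0] * innov, x[1] + k[1] * innov]
        p = [[(1 - k[0]) * p[0][0], (1 - k[0]) * p[0][1]],
             [p[1][0] - k[1] * p[0][0], p[1][1] - k[1] * p[0][1]]]
        out.append(x[0])
    return out

# Noisy observations of a fly moving at a constant 1 unit/frame.
zs = [0.1, 0.9, 2.2, 2.8, 4.1, 5.0, 5.9, 7.1, 8.0, 9.1]
track = kalman_cv_track(zs)
```

The filter's one-step prediction is what bridges frames where the color-based tracker momentarily loses the target.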
Feng, Chang; Cooray, Asantha; Keating, Brian
2017-02-01
The extragalactic γ-ray background and its spatial anisotropy could potentially contain a signature of dark matter (DM) annihilation or particle decay. Astrophysical foregrounds, such as blazars and star-forming galaxies (SFGs), however, dominate the γ-ray background, precluding an easy detection of the signal associated with the DM annihilation or decay in the background intensity spectrum. The DM imprint on the γ-ray background is expected to be correlated with large-scale structure tracers. In some cases, such a cross-correlation is even expected to have a higher signal-to-noise ratio than the auto-correlation. One reliable tracer of the DM distribution in the large-scale structure is lensing of the cosmic microwave background (CMB), and the cosmic infrared background (CIB) is a reliable tracer of SFGs. We analyze Fermi-LAT data taken over 92 months and study the cross-correlation with Planck CMB lensing, Planck CIB, and Fermi-γ maps. We put upper limits on the DM annihilation cross-section from the cross-power spectra with the γ-ray background anisotropies. The unbiased power spectrum estimation is validated with simulations that include cross-correlated signals. We also provide a set of systematic tests and show that no significant contaminations are found for the measurements presented here. Using the γ-ray background map from data gathered over 92 months, we find the best constraint on the DM annihilation with a 1σ confidence level upper limit of 10⁻²⁵–10⁻²⁴ cm³ s⁻¹ when the mass of DM particles is between 20 and 100 GeV.
Local algorithm for computing complex travel time based on the complex eikonal equation.
Huang, Xingguo; Sun, Jianguo; Sun, Zhangqing
2016-04-01
The traditional algorithm for computing the complex travel time, e.g., dynamic ray tracing method, is based on the paraxial ray approximation, which exploits the second-order Taylor expansion. Consequently, the computed results are strongly dependent on the width of the ray tube and, in regions with dramatic velocity variations, it is difficult for the method to account for the velocity variations. When solving the complex eikonal equation, the paraxial ray approximation can be avoided and no second-order Taylor expansion is required. However, this process is time consuming. In this case, we may replace the global computation of the whole model with local computation by taking both sides of the ray as curved boundaries of the evanescent wave. For a given ray, the imaginary part of the complex travel time should be zero on the central ray. To satisfy this condition, the central ray should be taken as a curved boundary. We propose a nonuniform grid-based finite difference scheme to solve the curved boundary problem. In addition, we apply the limited-memory Broyden-Fletcher-Goldfarb-Shanno technology for obtaining the imaginary slowness used to compute the complex travel time. The numerical experiments show that the proposed method is accurate. We examine the effectiveness of the algorithm for the complex travel time by comparing the results with those from the dynamic ray tracing method and the Gauss-Newton Conjugate Gradient fast marching method.
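For contrast with the complex-eikonal approach, the smooth-medium integration that ordinary (and dynamic) ray tracing builds on can be sketched with a fourth-order Runge-Kutta step on the ray equations d/ds(n dr/ds) = ∇n. The linear index profile, step sizes and function names below are illustrative assumptions, not the paper's solver:

```python
import math

def trace_ray(n, grad_n, start, direction, ds=0.01, steps=1000):
    """RK4 integration of the 2D ray equations d/ds(n dr/ds) = grad n.

    State is (x, y, Tx, Ty) with T = n * unit tangent, so dr/ds = T/n and
    dT/ds = grad n. A sketch of smooth-medium kinematic ray tracing only;
    the complex travel time machinery of the paper is not reproduced here.
    """
    x, y = start
    n0 = n(x, y)
    norm = math.hypot(*direction)
    state = (x, y, n0 * direction[0] / norm, n0 * direction[1] / norm)

    def deriv(s):
        x, y, tx, ty = s
        ni = n(x, y)
        gx, gy = grad_n(x, y)
        return (tx / ni, ty / ni, gx, gy)

    for _ in range(steps):
        k1 = deriv(state)
        k2 = deriv(tuple(s + 0.5 * ds * k for s, k in zip(state, k1)))
        k3 = deriv(tuple(s + 0.5 * ds * k for s, k in zip(state, k2)))
        k4 = deriv(tuple(s + ds / 6 * 0 + ds * k for s, k in zip(state, k3)))
        state = tuple(s + ds / 6 * (a + 2 * b + 2 * c + d)
                      for s, a, b, c, d in zip(state, k1, k2, k3, k4))
    return state[0], state[1]

# Refractive index increasing with y: the ray should bend toward larger y.
n = lambda x, y: 1.0 + 0.1 * y
grad = lambda x, y: (0.0, 0.1)
end = trace_ray(n, grad, start=(0.0, 0.0), direction=(1.0, 0.0))
```

A ray launched along +x through this gradient curves upward toward the higher index, as expected from Fermat's principle.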
Pokhrel, Damodar
Interstitial and intracavitary brachytherapy plays an essential role in the management of several malignancies. However, the achievable accuracy of brachytherapy treatment for prostate and cervical cancer is limited by the lack of intraoperative planning and adaptive replanning. A major problem in implementing TRUS-based intraoperative planning is the inability of TRUS to accurately localize individual seed poses (positions and orientations) relative to the prostate volume during or after the implantation. For locally advanced cervical cancer patients, manual drawing of the source positions on orthogonal films cannot localize the full 3D intracavitary brachytherapy (ICB) applicator geometry. A new iterative forward projection matching (IFPM) algorithm can explicitly localize each individual seed/applicator by iteratively matching computed projections of the post-implant patient with the measured projections. This thesis describes the adaptation and implementation of a novel IFPM algorithm that addresses hitherto unsolved problems in the localization of brachytherapy seeds and applicators. The prototype implementation of the 3-parameter point-seed IFPM algorithm was experimentally validated using a few cone-beam CT (CBCT) projections of both phantom and post-implant patient datasets. Geometric uncertainty due to gantry angle inaccuracy was incorporated. The IFPM algorithm was then extended to a 5-parameter elongated line-seed model which automatically reconstructs individual seed orientation as well as position. The accuracy of this algorithm was tested using both synthetic measured projections of clinically realistic Model-6711 125I seed arrangements and measured projections of an in-house precision-machined prostate implant phantom that allows the orientations and locations of up to 100 seeds to be set to known values. The seed reconstruction error for the simulation was less than 0.6 mm/3°. For the physical phantom experiments, IFPM absolute accuracy for
Improved electron probe microanalysis of trace elements in quartz
Donovan, John J.; Lowers, Heather; Rusk, Brian G.
2011-01-01
Quartz occurs in a wide range of geologic environments throughout the Earth's crust. The concentration and distribution of trace elements in quartz provide information such as temperature and other physical conditions of formation. Trace element analyses with modern electron-probe microanalysis (EPMA) instruments can achieve 99% confidence detection of ~100 ppm with fairly minimal effort for many elements in samples of low to moderate average atomic number such as many common oxides and silicates. However, trace element measurements below 100 ppm in many materials are limited, not only by the precision of the background measurement, but also by the accuracy with which background levels are determined. A new "blank" correction algorithm has been developed and tested on both Cameca and JEOL instruments, which applies a quantitative correction to the emitted X-ray intensities during the iteration of the sample matrix correction based on a zero level (or known trace) abundance calibration standard. This iterated blank correction, when combined with improved background fit models, and an "aggregate" intensity calculation utilizing multiple spectrometer intensities in software for greater geometric efficiency, yields a detection limit of 2 to 3 ppm for Ti and 6 to 7 ppm for Al in quartz at 99% t-test confidence with similar levels for absolute accuracy.
Ashraf Ragab Mohamed
2014-09-01
Recent advances in computational capabilities have motivated the development of more sophisticated models to simulate cement-based hydration. However, the input parameters for such models, obtained from SEM-X-ray image analyses, are quite complicated and hinder their versatile application. This paper addresses the use of artificial neural networks (ANNs) to predict the SEM-X-ray image data of cement-based materials (surface area fraction and the cement phases' correlation functions). ANNs have been used to correlate these data, already obtained for 21 types of cement, to basic cement data (cement compounds and fineness). Two approaches have been proposed: the ANN method and the ANN-regression method. Comparisons have shown that the ANN proves effective in predicting the surface area fraction, while the ANN-regression is computationally more suitable for the correlation functions. Results show good agreement between the proposed techniques and the actual data with respect to hydration products, degree of hydration, and simulated images.
Moreira, Silvana [Universidade Estadual de Campinas, SP (Brazil). Faculdade de Engenharia Civil, Arquitetura e Urbanismo]. E-mail: Silvana@fec.unicamp.br; Vives, Ana Elisa S. de [Universidade Metodista de Piracicaba (UNIMEP), Santa Barbara D' Oeste, SP (Brazil). Faculdade de Engenharia, Arquitetura e Urbanismo]. E-mail: aesvives@unimep.br; Brienza, Sandra Maria B. [Universidade Metodista de Piracicaba (UNIMEP), Santa Barbara D' Oeste, SP (Brazil) Faculdade de Ciencias Matematicas, da Natureza e de Tecnologia da Informacao]. E-mail: sbrienza@unimep.br; Medeiros, Jean Gabriel S.; Tomazello Filho, Mario [Sao Paulo Univ., Piracicaba, SP (Brazil). Escola Superior de Agricultura Luiz de Queiroz]. E-mail: jeangm@esalq.usp.br; mtomazel@esalq.usp.br; Zucchi, Orgheda L.A.D. [Sao Paulo Univ., Ribeirao Preto, SP (Brazil). Faculdade de Ciencias Farmaceuticas]. E-mail: olzucchi@fcfrp.usp.br; Nascimento Filho, Virgilio F. [Centro de Energia Nuclear na Agricultura (CENA), Piracicaba, SP (Brazil). Lab. de Instrumentacao Nuclear]. E-mail: virgilio@cena.usp.br
2005-07-01
This paper aims to study the effect of environmental pollution on tree development, as a way to evaluate the use of trees as bioindicators in urban and rural areas. Sample collection was carried out in Piracicaba city, Sao Paulo State, which presents high levels of environmental contamination of water, soil and air due to industrial activities, vehicle exhaust, sugar-cane leaf burning during harvesting, etc. The species Caesalpinia peltophoroides ('Sibipiruna') was selected because it is widely used in urban arborization. The analytical technique of total reflection X-ray fluorescence (TXRF) was employed to identify and quantify the elements and metals of nutritional and toxicological importance in the wood samples. The analysis was done at the Brazilian Synchrotron Light Laboratory, using a white beam for excitation and a Si(Li) detector for characteristic X-ray detection. The elements P, K, Ca, Ti, Fe, Sr, Ba and Pb were quantified. (author)
Hsu, Ming-Ying; Lin, Yu-Chuan; Chan, Chia-Yen; Lin, Wei-Cheng; Chan, Shenq-Tsong; Huang, Ting-Ming
2012-10-01
This study of a Cassegrain telescope system discusses the effect of the corrector-lens thermal OPD (Optical Path Difference) on optical performance. The corrector-lens assembly includes several components, such as the corrector lens, lens mount, spacer, mount barrel and retainer. Heat transfer from the surroundings to the lens barrel causes aberration in the optical system. Meanwhile, for off-axis ray paths, the OPD must account for the incidence and emergence points on the lens. The temperature distribution of the corrector lens is calculated from a heat-transfer analysis of the lens barrel, and the thermal distortion and stress are solved with FEM (Finite Element Method) software. The temperature results are then weighted along each incident ray path to calculate the thermal OPD. The thermal OPD in the Z-direction can be fitted by rigid-body motion and Zernike polynomials, and the fitting results are used to evaluate the thermal effect on the corrector-lens assembly in the telescope system.
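The final fitting step, removing rigid-body motion before the Zernike expansion, is a linear least-squares problem. A minimal sketch of the rigid-body part only (function and variable names are ours, not from the paper):

```python
import numpy as np

def fit_rigid_body(x, y, dz):
    """Least-squares fit of surface displacements dz(x, y) to a
    rigid-body model: piston + tip*x + tilt*y.  The paper continues
    the expansion with Zernike polynomials; only the rigid-body
    terms are sketched here.

    Returns the fitted (piston, tip, tilt) and the residual surface,
    which would then be passed on to a Zernike fit."""
    A = np.column_stack([np.ones_like(x), x, y])   # design matrix
    coef, *_ = np.linalg.lstsq(A, dz, rcond=None)
    return coef, dz - A @ coef
```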
Feng, Chang; Keating, Brian
2016-01-01
The extragalactic γ-ray background, and its spatial anisotropy, could potentially contain a signature of dark matter annihilation or particle decay. Astrophysical foregrounds, such as blazars and star-forming galaxies, however, dominate the γ-ray background, precluding an easy detection of the signal associated with the dark matter annihilation or decay in the background intensity spectrum. The dark matter imprint on the γ-ray background is expected to be correlated with large-scale structure tracers. In some cases such a cross-correlation is even expected to have a higher signal-to-noise ratio than the auto-correlation. A reliable tracer of the dark matter distribution in the large-scale structure is lensing of the cosmic microwave background (CMB) and the cosmic infrared background (CIB) is a reliable tracer of star-forming galaxies. We analyze Fermi-LAT data taken over 92 months and study the cross-correlation with Planck CMB lensing, Planck CIB, and Fermi-γ maps. We put upper l...
Accelerating optimization by tracing valley
Li, Qing-Xiao; He, Rong-Qiang; Lu, Zhong-Yi
2016-06-01
We propose an algorithm to accelerate optimization when an objective function locally resembles a long narrow valley. In such a case, a conventional optimization algorithm usually wanders with too many tiny steps in the valley. The new algorithm approximates the valley bottom locally by a parabola that is obtained by fitting a set of successive points generated recently by a conventional optimization method. Then large steps are taken along the parabola, accompanied by fine adjustment to trace the valley bottom. The effectiveness of the new algorithm has been demonstrated by accelerating the Newton trust-region minimization method and the Levenberg-Marquardt method on the nonlinear fitting problem in exact diagonalization dynamical mean-field theory and on the classic minimization problem of the Rosenbrock function. Speedups of many times have been achieved for both problems, demonstrating the high efficiency of the new algorithm.
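The fit-and-extrapolate step described above can be sketched in a few lines. This is our schematic reading of the idea, not the authors' code: the recent iterates are parameterized by arc length, each coordinate is fitted by a quadratic, and a large step is taken beyond the last point; the "fine adjustment" re-centering is omitted.

```python
import numpy as np

def valley_step(points, extrapolate=2.0):
    """Fit a parabola through the most recent iterates of a
    conventional optimizer and take a large step along it.
    `points` is a (k, d) array with k >= 3; the extrapolation
    factor is an illustrative choice."""
    points = np.asarray(points, dtype=float)
    seg = np.linalg.norm(np.diff(points, axis=0), axis=1)
    t = np.concatenate([[0.0], np.cumsum(seg)])   # arc-length parameter
    coeffs = np.polyfit(t, points, deg=2)         # one quadratic per coordinate
    t_new = t[-1] + extrapolate * seg[-1]         # step beyond the last point
    return coeffs[0] * t_new**2 + coeffs[1] * t_new + coeffs[2]
```

For collinear iterates the fitted "parabola" degenerates to the line through them, so the step is a plain extrapolation, which is a convenient sanity check.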
Virtual X-ray imaging techniques in an immersive casting simulation environment
Li, Ning [Visual Computing and Virtual Reality Laboratory, Department of Computer Science and Engineering, Ewha Womans University, 405-1, Ewha-SK Telecom Building, 11-1, Daehyun-dong, Seodaemun-gu, 120-750 Seoul (Korea, Republic of)]. E-mail: lining@ewha.ac.kr; Kim, Sung-Hee [Visual Computing and Virtual Reality Laboratory, Department of Computer Science and Engineering, Ewha Womans University, 405-1, Ewha-SK Telecom Building, 11-1, Daehyun-dong, Seodaemun-gu, 120-750 Seoul (Korea, Republic of); Suh, Ji-Hyun [Visual Computing and Virtual Reality Laboratory, Department of Computer Science and Engineering, Ewha Womans University, 405-1, Ewha-SK Telecom Building, 11-1, Daehyun-dong, Seodaemun-gu, 120-750 Seoul (Korea, Republic of); Cho, Sang-Hyun [Center for e-Design, Korea Institute of Industrial Technology, 7-47, Songdo-Dong, Yeonsu-Ku, Inchon (Korea, Republic of); Choi, Jung-Gil [Center for e-Design, Korea Institute of Industrial Technology, 7-47, Songdo-Dong, Yeonsu-Ku, Inchon (Korea, Republic of); Kim, Myoung-Hee [Visual Computing and Virtual Reality Laboratory, Department of Computer Science and Engineering, Ewha Womans University, 405-1, Ewha-SK Telecom Building, 11-1, Daehyun-dong, Seodaemun-gu, 120-750 Seoul (Korea, Republic of) and Center for Computer Graphics and Virtual Reality, Ewha Womans University, 400, Ewha-SK Telecom Building, 11-1, Daehyun-dong, Seodaemun-gu, 120-750 Seoul (Korea, Republic of)]. E-mail: mhkim@ewha.ac.kr
2007-08-15
A computer code was developed to simulate radiographs of complex casting products in a CAVE™-like environment. The simulation is based on deterministic algorithms and ray tracing techniques. The aim of this study is to examine CAD/CAE/CAM models at the design stage, to optimize the design and inspect predicted defective regions with fast speed, good accuracy and small numerical expense. The present work discusses the algorithms for the radiography simulation of the CAD/CAM model and proposes algorithmic solutions adapted from the ray-box intersection algorithm and the octree data structure specifically for radiographic simulation of the CAE model. The stereoscopic visualization of the full-size product in the immersive casting simulation environment, as well as the virtual X-ray images of castings, provides an effective tool for the design and evaluation of foundry processes by engineers and metallurgists.
Sirito de Vives, Ana Elisa [School of Civil Engineering, Architecture and Urban Design Methodist University of Piracicaba, Rodovia Santa Barbara D' Oeste/Iracemapolis, km 01, 13450-000 Santa Barbara D' Oeste, SP (Brazil)]. E-mail: aesvives@unimep.br; Moreira, Silvana [State University of Campinas - UNICAMP/FEC (Brazil); Brienza, Sandra Maria Boscolo [School of Civil Engineering, Architecture and Urban Design Methodist University of Piracicaba, Rodovia Santa Barbara D' Oeste/Iracemapolis, km 01, 13450-000 Santa Barbara D' Oeste, SP (Brazil); Silva Medeiros, Jean Gabriel [University of Sao Paulo - USP/ ESALQ (Brazil); Tomazello Filho, Mario Tomazello [University of Sao Paulo - USP/ ESALQ (Brazil); Araujo Domingues Zucchi, Orgheda Luiza [University of Sao Paulo - USP/FCFRP (Brazil); Nascimento Filho, Virgilio Franco do [University of Sao Paulo - USP/CENA (Brazil)
2006-11-15
This paper aims to study the environmental pollution in the tree development, in order to evaluate its use as bioindicator in urban and country sides. The sample collection was carried out in Piracicaba city, Sao Paulo State, which presents high level of environmental contamination in water, soil and air, due to industrial activities, vehicles combustion, sugar-cane leaves burning in the harvesting, etc. The species Caesalpinia peltophoroides ('Sibipiruna') was selected because it is widely used in urban forestation. Synchrotron Radiation Total Reflection X-ray Fluorescence technique (SR-TXRF) was employed to identify and quantify the elements and metals of nutritional and toxicological importance in the wood samples. The analysis was performed in the Brazilian Synchrotron Light Source Laboratory, using a white beam for excitation and a Si(Li) detector for X-ray detection. In several samples, P, K, Ca, Ti, Fe, Sr, Ba and Pb were quantified. The K/Ca, K/P and Pb/Ca ratios were found to decrease towards the bark.
Hamdi Sahraoui
2016-08-01
Lead (Pb), zinc (Zn) and cadmium (Cd) levels in soils surrounding the Lakhouat mine (North-West of Tunisia) were measured. The total concentration of these elements in the soil samples was determined in situ by portable X-ray fluorescence (XRF) and compared to the traditional digestion method using inductively coupled plasma optical emission spectroscopy (ICP-OES). Statistical analyses were performed to determine whether significant differences existed between the instrumental techniques, which included simple correlations by regression lines and t-tests for mean comparison. The statistical analysis demonstrated that no statistically significant differences were observed for the Pb concentrations; however, for the Zn and Cd concentrations, the t-test showed significant differences between the inst
Sen, N; Roy, N K; Das, A K
1989-06-01
Separation by solvent extraction followed by X-ray fluorescence spectrometry has been used for determination of molybdenum and tungsten in rocks and minerals. Samples are decomposed either by heating with a mixture of hydrofluoric acid and perchloric acid or by fusion with potassium pyrosulphate, followed by extraction of molybdenum and tungsten with N-benzoylphenylhydroxylamine in toluene from 4-5M sulphuric acid medium. The extract is collected on a mass of cellulose powder, which is dried in vacuum, mixed thoroughly and pressed into a disc for XRF measurements. The method is free from all matrix effects and needs no mathematical corrections for interelement effects. The method is suitable for determination of molybdenum and tungsten in geological materials down to ppm levels, with reasonable precision and accuracy.
Ribstein, B.; Achatz, U.
2016-09-01
Gravity waves (GWs) play an important role in atmospheric dynamics. Due to their short wavelengths, they must be parameterized in current weather forecast models, which cannot resolve them explicitly. Here we are the first to report the possibility and implications of an online GW parameterization in a linear but global model that incorporates horizontal GW propagation and the effects of transients and of horizontal background gradients on GW dynamics. The GW parameterization is based on a ray-tracer model with a spectral formulation that is safe against numerical instabilities due to caustics. The global model integrates the linearized primitive equations to obtain solar tides (STs), with a seasonally dependent reference climatology, forced by a climatological daily cycle of the tropospheric and stratospheric heating, and the (instantaneous) GW momentum and buoyancy flux convergences resulting from the ray tracer. Under a more conventional "single-column" approximation, where GWs only propagate vertically and do not respond to horizontal gradients of the resolved flow, GW impacts are shown to be significantly changed in comparison with "full" experiments, leading to significant differences in ST amplitudes and phases, pointing at a sensitive issue of GW parameterizations in general. In the full experiment, significant semidiurnal STs arise even if the tidal model is only forced by diurnal heating rates. This indicates that an important part of the tidal signal is forced directly by GWs via their momentum and buoyancy deposition. In general, the effect of horizontal GW propagation and the GW response to horizontal large-scale flow gradients is observed rather in nonmigrating than in migrating tidal components.
Trace Elements in Ovaries: Measurement and Physiology.
Ceko, Melanie J; O'Leary, Sean; Harris, Hugh H; Hummitzsch, Katja; Rodgers, Raymond J
2016-04-01
Traditionally, research in the field of trace element biology and human and animal health has largely depended on epidemiological methods to demonstrate involvement in biological processes. These studies were typically followed by trace element supplementation trials or attempts at identification of the biochemical pathways involved. With the discovery of biological molecules that contain the trace elements, such as matrix metalloproteinases containing zinc (Zn), cytochrome P450 enzymes containing iron (Fe), and selenoproteins containing selenium (Se), much of the current research focuses on these molecules, and, hence, only indirectly on trace elements themselves. This review focuses largely on two synchrotron-based X-ray techniques: X-ray absorption spectroscopy and X-ray fluorescence imaging, which can be used to identify the in situ speciation and distribution of trace elements in tissues, using our recent studies of bovine ovaries, where the distribution of Fe, Se, Zn, and bromine were determined. It also discusses the value of other techniques, such as inductively coupled plasma mass spectrometry, used to garner information about the concentrations and elemental state of the trace elements. These applications to measure trace elemental distributions in bovine ovaries at high resolutions provide new insights into possible roles for trace elements in the ovary.
Setiani, Tia Dwi; Suprijadi, Haryanto, Freddy
2016-03-01
Monte Carlo (MC) is one of the most powerful techniques for simulation in X-ray imaging. The MC method can simulate radiation transport within matter with high accuracy and provides a natural way to simulate radiation transport in complex systems. One of the MC-based codes widely used for radiographic image simulation is MC-GPU, a code developed by Andrea Basal. This study aimed to investigate the computation time of X-ray imaging simulation on a GPU (Graphics Processing Unit) compared to a standard CPU (Central Processing Unit). Furthermore, the effect of physical parameters on the quality of radiographic images and a comparison of the image quality resulting from simulation on the GPU and CPU are evaluated in this paper. The simulations were run on a CPU in serial, and on two GPUs with 384 cores and 2304 cores. In the GPU simulation, each core calculates one photon, so a large number of photons are calculated simultaneously. Results show that the simulations on the GPU were significantly accelerated compared to the CPU. The simulations on the 2304-core GPU were about 64-114 times faster than on the CPU, while the simulations on the 384-core GPU were about 20-31 times faster than on a single CPU core. Another result shows that optimum image quality was obtained with the number of histories starting from 10^8 and energies from 60 keV to 90 keV. Analyzed by a statistical approach, the quality of the GPU and CPU images is essentially the same.
Magalhães, T.; Carvalho, M. L.; Von Bohlen, A.; Becker, M.
2010-06-01
In this work, total-reflection X-ray fluorescence (TXRF) was used to analyse healthy and cancerous tissues of the same individual along several contiguous thin sections of each tissue. Thirty-two samples (16 pairs) of breast tissue, 30 samples (15 pairs) of intestine tissue and 10 samples (5 pairs) of stomach tissue were analysed. The samples were obtained in civil hospitals of Germany (Dortmund) and Portugal (Lisbon). The elemental distribution of P, S, Cl, K, Ca, Cr, Mn, Fe, Ni, Cu, Zn, Se, Br, Rb, Sr and Pb in these samples was studied. Descriptive statistics based on bar graphics and hypothesis tests, as well as an automatic classification based on hierarchical grouping analysis, were used for the analysed tissues. It was shown that the behaviour of the elements is tissue dependent. Some elements, like P and K, exhibit the same behaviour in all the analysed tissue types: they have increased concentrations in all cancerous tissues. In contrast, other elements like Br show completely different behaviour depending on the tissue: similar concentrations in healthy and cancerous stomach, decreased levels in cancerous colon tissues and enhanced concentrations in breast were observed. Moreover, cancerous tissues present decreased Se concentrations in colon and increased concentrations in breast.
Takahashi, Atsushi; Igarashi, Shukuro; Ueki, Yasuo [Ibaraki Univ., Hitachi (Japan). Faculty of Engineering; Yamaguchi, Hitoshi [National Research Inst. for Metals, Ibaraki (Japan)
2000-11-01
A homogeneous liquid-liquid extraction method for 36 metal ions with diethyldithiocarbamate was studied. As a result, 11 metal ions were extracted as metal-chelates. Under the experimental conditions, the maximum concentration factor was 500 (i.e., 0.1 mL of sedimented liquid phase was produced from 50 mL of aqueous phase). Moreover, the proposed method was utilized as a preconcentration method for X-ray fluorescence analysis of these metals. The recovery of each metal was ca. 97-100%. All calibration curves were linear over the range of 5.0 × 10⁻⁷ mol L⁻¹ to 1.0 × 10⁻⁵ mol L⁻¹. The detection limits were at the 10⁻⁸ mol L⁻¹ levels and the relative standard deviations were below 5% (5 determinations). When the proposed method was used for the determination of contaminants in a synthetic sample (Al-based alloy model) and of components in an Au-Pd alloy, the results were satisfactory. (orig.)
Mohammad Javad Dargahi
2011-01-01
High data rate acoustic transmission is required for diverse underwater operations such as the retrieval of large amounts of data from bottom packages and real-time transmission of signals from underwater sensors. The major obstacle to underwater acoustic communication is the interference of multipath signals due to surface and bottom reflections. High-speed acoustic transmission over a shallow-water channel characterized by small grazing angles presents formidable difficulties: the reflection losses associated with such small angles are low, causing large amplitudes in multipath signals. In this paper, based on the results obtained from practical measurements in the Persian Gulf and available data about sound speed variations at different depths, we propose a simple but effective model for a shallow-water short-range multipath acoustic channel. Based on ray theory, mathematical modeling of multipath effects is carried out. In the channel modeling, the attenuation due to wave scattering at the surface and bottom reflections for different grazing angles and bottom types is also considered. In addition, we consider the attenuation due to absorption by different materials and ambient noises such as sea-state noise, shipping noise, thermal noise and turbulence. We use a three-dimensional hydrodynamic model (COHERENS) in a fully prognostic mode to study the circulation and water mass properties of the Persian Gulf, a large inverse estuary. Maximum sound speed occurs during the summer in the Persian Gulf and decreases gradually moving from the Strait of Hormuz to the north-western part of the Gulf. A gradual decrease in sound speed profiles with depth was commonly observed in almost all parts of the Gulf; however, an exception occurred in the Strait of Hormuz during the winter. The results of the model are in very good agreement with our observations.
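Under the (strong) simplifying assumption of an isovelocity channel, the delays of the first surface- and bottom-reflected arrivals in such a ray model have closed forms via the method of images. A toy sketch (names and defaults are ours; the paper's model further includes the measured sound-speed profile, reflection and absorption losses, and ambient noise):

```python
import math

def multipath_delays(depth, src_z, rcv_z, rng, c=1500.0):
    """Travel times (s) of the direct, surface-reflected, and
    bottom-reflected eigenrays in an isovelocity shallow-water
    channel of the given depth, via the method of images.
    All depths and the horizontal range are in meters."""
    direct = math.hypot(rng, src_z - rcv_z) / c
    surface = math.hypot(rng, src_z + rcv_z) / c                # image above the surface
    bottom = math.hypot(rng, 2.0 * depth - src_z - rcv_z) / c   # image below the bottom
    return direct, surface, bottom
```

The spread between these delays sets the channel's delay spread, which is exactly the multipath interference the abstract identifies as the obstacle to high-rate transmission.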
Nicholas, Sarah L.; Erickson, Melinda L.; Woodruff, Laurel G.; Knaeble, Alan R.; Marcus, Matthew A.; Lynch, Joshua K.; Toner, Brandy M.
2017-08-01
Arsenic (As) is a geogenic contaminant affecting groundwater in geologically diverse systems globally. Arsenic release from aquifer sediments to groundwater is favored when biogeochemical conditions, especially oxidation-reduction (redox) potential, in aquifers fluctuate. The specific objective of this research is to identify the solid-phase sources and geochemical mechanisms of release of As in aquifers of the Des Moines Lobe glacial advance. The overarching concept is that conditions present at the aquifer-aquitard interfaces promote a suite of geochemical reactions leading to mineral alteration and release of As to groundwater. A microprobe X-ray absorption spectroscopy (μXAS) approach is developed and applied to rotosonic drill core samples to identify the solid-phase speciation of As in aquifer, aquitard, and aquifer-aquitard interface sediments. This approach addresses the low solid-phase As concentrations, as well as the fine-scale physical and chemical heterogeneity of the sediments. The spectroscopy data are analyzed using novel cosine-distance and correlation-distance hierarchical clustering for Fe 1s and As 1s μXAS datasets. The solid-phase Fe and As speciation is then interpreted using sediment and well-water chemical data to propose solid-phase As reservoirs and release mechanisms. The results confirm that in two of the three locations studied, the glacial sediment forming the aquitard is the source of As to the aquifer sediments. The results are consistent with three different As release mechanisms: (1) desorption from Fe (oxyhydr)oxides, (2) reductive dissolution of Fe (oxyhydr)oxides, and (3) oxidative dissolution of Fe sulfides. The findings confirm that glacial sediments at the interface between aquifer and aquitard are geochemically active zones for As. The diversity of As release mechanisms is consistent with the geographic heterogeneity observed in the distribution of elevated-As wells.
Walther, D; Bartha, G; Morris, M
2001-05-01
A pivotal step in electrophoresis sequencing is the conversion of the raw, continuous chromatogram data into the actual sequence of discrete nucleotides, a process referred to as basecalling. We describe a novel algorithm for basecalling implemented in the program LifeTrace. Like Phred, currently the most widely used basecalling software program, LifeTrace takes processed trace data as input. It was designed to be tolerant to variable peak spacing by means of an improved peak-detection algorithm that emphasizes local chromatogram information over global properties. LifeTrace is shown to generate high-quality basecalls and reliable quality scores. It proved particularly effective when applied to MegaBACE capillary sequencing machines. In a benchmark test of 8372 dye-primer MegaBACE chromatograms, LifeTrace generated 17% fewer substitution errors, 16% fewer insertion/deletion errors, and 2.4% more aligned bases to the finished sequence than did Phred. For two sets totaling 6624 dye-terminator chromatograms, the performance improvement was 15% fewer substitution errors, 10% fewer insertion/deletion errors, and 2.1% more aligned bases. The processing time required by LifeTrace is comparable to that of Phred. The predicted quality scores were in line with observed quality scores, permitting direct use for quality clipping and in silico single nucleotide polymorphism (SNP) detection. Furthermore, we introduce a new type of quality score associated with every basecall: the gap-quality. It estimates the probability of a deletion error between the current and the following basecall. This additional quality score improves detection of single basepair deletions when used for locating potential basecalling errors during the alignment. We also describe a new protocol for benchmarking that we believe better discerns basecaller performance differences than methods previously published.
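For context, the per-base quality scores that both Phred and LifeTrace report (including LifeTrace's gap-quality) live on the standard Phred scale, which maps an estimated error probability p to a score Q = -10 log10(p). A minimal sketch of the conversion:

```python
import math

def phred_quality(p_error):
    """Phred-scale quality score for an estimated error probability:
    Q = -10 * log10(p_error).  E.g. p = 0.001 gives Q = 30."""
    return -10.0 * math.log10(p_error)

def error_prob(q):
    """Inverse mapping: estimated error probability for a score q."""
    return 10.0 ** (-q / 10.0)
```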
Ghosh, D.
2007-07-01
Homogeneous nucleation rates of the n-alcohols and the n-alkanes have been determined by combining information from two sets of supersonic Laval nozzle expansion experiments under identical conditions. The nucleation rates J = N/Δt_Jmax for the n-alcohols are in the range of 1 × 10¹⁷
钱昕; 殷庆纵; 王栋
2012-01-01
In the infrared line-tracing automatic control of an intelligent lunar exploration vehicle, an array of reflective infrared photoelectric sensors is used for path detection. The nonlinear analog quantities are detected by A/D sampling, the sensor data and relative errors are processed, and a variable-scale incremental PID control algorithm with real-time nonlinear tuning of the control parameters is introduced to realize intelligent closed-loop control. In addition, an incremental PD formula is derived using fuzzy control technology, and the sensitivity and stability of the vehicle's line tracing are improved by adjusting the parameters in the program. It has been shown in practice that this method can shorten the control response time of the lunar exploration vehicle, reduce random interference, optimize the tracing path, and effectively overcome oscillation during driving, slow response in turns, and poor mobility.
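The incremental PID law the abstract builds on has a standard textbook form, in which the controller outputs the change in the control signal rather than its absolute value. A minimal sketch (the variable-scale and fuzzy tuning of the gains described in the abstract are not reproduced; the gains are illustrative):

```python
class IncrementalPID:
    """Incremental (velocity-form) PID controller.  The output is
    the change in the control signal:
        du_k = Kp*(e_k - e_{k-1}) + Ki*e_k + Kd*(e_k - 2*e_{k-1} + e_{k-2})
    """

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.e1 = 0.0   # e_{k-1}
        self.e2 = 0.0   # e_{k-2}

    def step(self, error):
        du = (self.kp * (error - self.e1)
              + self.ki * error
              + self.kd * (error - 2.0 * self.e1 + self.e2))
        self.e2, self.e1 = self.e1, error
        return du
```

Because only increments are emitted, the controller is bumpless on gain changes, which is what makes online (variable-scale or fuzzy) retuning of Kp, Ki, Kd practical.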
Simulation of Ray Tracing in a Misaligned Optical System under Mechanical Vibration
沈东富; 刘顺发; 扈宏毅
2013-01-01
Optical systems mounted on moving platforms inevitably suffer mirror misalignment, and describing beam propagation through such misaligned systems is a pressing problem. Using Matlab symbolic computation, the equation of a misaligned mirror surface is derived from two coordinate rotations and a coordinate translation; the direction vector of a ray after the misaligned mirror is then obtained from the vector form of the law of reflection, and a propagation model of the misaligned optical path is built by ray tracing. In the worked example, the telescope structure of an apparatus serves as the optical model. The transient-response solver of Patran_Nastran yields the misalignment displacements of the mirrors when sinusoidal vibrations are applied simultaneously in the x, y and z directions. The propagation model then gives the maximum jitter angles of the ray through the primary-mirror center (the boresight) after reflection by the primary mirror, secondary mirror and fold mirror as 0.0034°, 0.0161° and 0.0177°, respectively, together with the trajectories of the spot on the target plane and in space.
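The vector form of the law of reflection used above, r = d − 2(d·n)n with d the incoming direction and n the unit mirror normal, can be sketched directly; the 0.1° tilt example is illustrative, not the paper's telescope geometry.

```python
import math

def reflect(d, n):
    """Reflect direction d off a surface with unit normal n: r = d - 2(d.n)n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2 * dot * ni for di, ni in zip(d, n))

d = (0.0, 0.0, -1.0)                 # ray travelling along -z
n = (0.0, 0.0, 1.0)                  # untilted mirror facing +z
print(reflect(d, n))                 # -> (0.0, 0.0, 1.0)

theta = math.radians(0.1)            # small mirror tilt about the y-axis
n_tilted = (math.sin(theta), 0.0, math.cos(theta))
r = reflect(d, n_tilted)             # reflected ray deviates by 2*theta
```

A mirror tilt of θ deflects the reflected ray by 2θ, which is why small vibration-induced misalignments translate into measurable boresight jitter.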
A GPU-Based Algorithm for the Simulation of X-Ray Cone-Beam Imaging
杨涛; 赵星
2011-01-01
To accelerate the simulation of X-ray cone-beam imaging, a GPU (graphics processing unit) based algorithm is proposed in this paper. The algorithm generates an X-ray image by accumulating the contributions of the voxels traversed by each X-ray. The intersection lengths of these voxels with a ray are calculated by classifying the intersection types, which avoids the time-consuming dynamic branches of the well-known incremental Siddon algorithm. To improve image quality, samples along each ray are computed with the GPU's hardware-supported linear interpolation instead of the nearest-neighbour interpolation used by the incremental Siddon algorithm. A projection-calculation experiment on the three-dimensional Shepp-Logan phantom shows that the simulation is on average 44% faster than the GPU-based incremental Siddon algorithm while achieving visibly better image quality. Finally, the proposed algorithm is validated in an experiment with real measured data.
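The interpolated sampling along a ray can be sketched in 2-D; the grid, ray, and step size are illustrative assumptions, and the paper's exact intersection-length classification and GPU kernel are not reproduced here.

```python
def bilinear(img, x, y):
    """Bilinearly interpolate img (list of rows) at fractional (x, y)."""
    x0, y0 = int(x), int(y)
    fx, fy = x - x0, y - y0
    x1 = min(x0 + 1, len(img[0]) - 1)
    y1 = min(y0 + 1, len(img) - 1)
    top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
    bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
    return top * (1 - fy) + bot * fy

def project(img, origin, direction, n_steps, step):
    """Approximate the line integral along a ray by equal-step sampling."""
    ox, oy = origin
    dx, dy = direction
    total = 0.0
    for k in range(n_steps):
        x, y = ox + k * step * dx, oy + k * step * dy
        total += bilinear(img, x, y) * step
    return total

phantom = [[1.0] * 4 for _ in range(4)]   # uniform 4x4 "object"
p = project(phantom, (0.0, 1.5), (1.0, 0.0), n_steps=30, step=0.1)
print(p)  # path length through a unit-density object
```

On a GPU this interpolation is provided essentially for free by the texture hardware, which is the design point the abstract exploits.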
Epidemic contact tracing via communication traces.
Katayoun Farrahi
Traditional contact tracing relies on knowledge of the interpersonal network of physical interactions, where contagious outbreaks propagate. However, due to privacy constraints and noisy data assimilation, this network is generally difficult to reconstruct accurately. Communication traces obtained by mobile phones are known to be good proxies for the physical interaction network, and they may provide a valuable tool for contact tracing. Motivated by this assumption, we propose a model for contact tracing, where an infection is spreading in the physical interpersonal network, which can never be fully recovered; and contact tracing is occurring in a communication network which acts as a proxy for the first. We apply this dual model to a dataset covering 72 students over a 9 month period, for which both the physical interactions as well as the mobile communication traces are known. Our results suggest that a wide range of contact tracing strategies may significantly reduce the final size of the epidemic, by mainly affecting its peak of incidence. However, we find that for low overlap between the face-to-face and communication interaction network, contact tracing is only efficient at the beginning of the outbreak, due to rapidly increasing costs as the epidemic evolves. Overall, contact tracing via mobile phone communication traces may be a viable option to arrest contagious outbreaks.
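The dual-network setup can be caricatured in a few lines: infection spreads on a physical-contact graph, while tracing quarantines the communication-graph neighbours of each known case. The graphs, transmission probability, and one-step tracing rule below are illustrative assumptions, not the paper's model.

```python
import random

def spread_with_tracing(physical, comm, seed, steps, rng):
    """Toy SI spread on `physical` with tracing done via `comm`."""
    infected, quarantined = {seed}, set()
    for _ in range(steps):
        # tracing: quarantine communication-neighbours of known cases
        for i in list(infected):
            quarantined |= set(comm.get(i, []))
        # spreading: each infected node may infect free physical neighbours
        new = set()
        for i in infected:
            for j in physical.get(i, []):
                if j not in quarantined and rng.random() < 0.5:
                    new.add(j)
        infected |= new
    return infected, quarantined

physical = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
comm = {0: [1], 1: [0], 2: [3], 3: [2]}   # partial overlap with physical
inf, quar = spread_with_tracing(physical, comm, seed=0, steps=3,
                                rng=random.Random(1))
print(sorted(inf), sorted(quar))
```

The key quantity the paper studies, overlap between the two edge sets, controls how much of the spread the quarantine step can actually intercept.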
Quantum Hamiltonian Identification from Measurement Time Traces
Zhang, Jun; Sarovar, Mohan
2014-08-01
Precise identification of parameters governing quantum processes is a critical task for quantum information and communication technologies. In this Letter, we consider a setting where system evolution is determined by a parametrized Hamiltonian, and the task is to estimate these parameters from temporal records of a restricted set of system observables (time traces). Based on the notion of system realization from linear systems theory, we develop a constructive algorithm that provides estimates of the unknown parameters directly from these time traces. We illustrate the algorithm and its robustness to measurement noise by applying it to a one-dimensional spin chain model with variable couplings.
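As a toy instance of parameter estimation from a time trace: for a single qubit with H = (ω/2)σ_z prepared along x, the observable ⟨σ_x⟩(t) = cos(ωt), so ω can be read off the oscillation. The zero-crossing estimator below is a stand-in assumption; the paper's constructive algorithm is built on system realization and handles general parametrized Hamiltonians and measurement noise.

```python
import math

omega_true = 2.0 * math.pi * 1.5          # rad/s, the parameter to identify
dt, n = 1e-3, 4000
trace = [math.cos(omega_true * k * dt) for k in range(n)]  # time trace

# successive zero crossings of cos(ωt) are half a period apart
crossings = [k for k in range(1, n) if trace[k - 1] * trace[k] < 0]
half_period = (crossings[-1] - crossings[0]) * dt / (len(crossings) - 1)
omega_est = math.pi / half_period
print(f"estimated ω = {omega_est:.3f}, true ω = {omega_true:.3f}")
```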
Room Acoustical Simulation Algorithm Based on the Free Path Distribution
VORLÄNDER, M.
2000-04-01
A new algorithm is presented which provides estimates of impulse responses in rooms. It is applicable to arbitrarily shaped rooms, and thus also to non-diffuse spaces such as workrooms or offices, where sound propagation curves are of interest for noise control. In the case of concert halls and opera houses, the method enables very fast prediction of room acoustical criteria such as reverberation time, strength or clarity. The method is based on low-resolution ray tracing with recording of the free paths. Estimates of impulse responses are derived by evaluating the free path distribution and the free path transition probabilities.
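The core free-path recording step can be sketched for a 2-D rectangular room; the room size, ray count, and purely specular walls are illustrative assumptions. In 2-D, the mean free path of isotropic rays in a convex room is πA/U (area over perimeter), which the recorded distribution should reproduce.

```python
import math, random

def free_paths(lx, ly, n_rays, n_hits, rng):
    """Trace rays in an lx-by-ly room and record wall-to-wall distances."""
    paths = []
    for _ in range(n_rays):
        x, y = rng.uniform(0, lx), rng.uniform(0, ly)
        a = rng.uniform(0, 2 * math.pi)
        dx, dy = math.cos(a), math.sin(a)
        for _ in range(n_hits):
            # distance to the next wall along (dx, dy)
            tx = (lx - x) / dx if dx > 0 else (x / -dx if dx < 0 else math.inf)
            ty = (ly - y) / dy if dy > 0 else (y / -dy if dy < 0 else math.inf)
            t = min(tx, ty)
            paths.append(t)
            x, y = x + t * dx, y + t * dy
            if tx < ty:
                dx = -dx          # specular reflection at an x-wall
            else:
                dy = -dy          # specular reflection at a y-wall
    return paths

paths = free_paths(1.0, 1.0, n_rays=200, n_hits=50, rng=random.Random(0))
mean = sum(paths) / len(paths)
print(f"mean free path ≈ {mean:.3f}")  # 2-D theory: π·A/U = π/4 ≈ 0.785
```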
Algorithms for Protein Structure Prediction
Paluszewski, Martin
The problem of predicting the three-dimensional structure of a protein given its amino acid sequence is one of the most important open problems in bioinformatics. One of the carbon atoms in amino acids is the Cα-atom, and the overall structure of a protein is often represented by a so-called Cα-trace. Here we present three different approaches for reconstruction of Cα-traces from predictable measures. In our first approach [63, 62], the Cα-trace is positioned on a lattice and a tabu-search algorithm is applied to find minimum energy structures. The energy function is based on half-sphere-exposure (HSE) … and is more robust than standard Monte Carlo search. In the second approach for reconstruction of Cα-traces, an exact branch and bound algorithm has been developed [67, 65]. The model is discrete and makes use of secondary structure predictions, HSE, CN and radius of gyration. We show how to compute good lower bounds for partial structures very fast. Using these lower bounds, we are able to find global minimum structures in a huge conformational space in reasonable time. We show that many of these global minimum structures are of good quality compared to the native structure. Our branch and bound algorithm … is competitive in quality and speed with other state-of-the-art decoy generation algorithms. Our third Cα-trace reconstruction approach is based on bee-colony optimization [24]. We demonstrate why this algorithm has some important properties that make it suitable for protein structure prediction. Our approach …
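The tabu-search component of the first approach can be sketched generically: recently applied moves are forbidden for a fixed tenure, which helps escape local minima that trap plain greedy or Monte Carlo search. The bitstring toy energy below stands in for the lattice Cα-trace energy (HSE etc.); the tenure, aspiration rule, and move set are illustrative assumptions.

```python
def tabu_search(energy, n_bits, iters, tenure=5):
    """Minimize energy over bitstrings with single-bit-flip tabu search."""
    state = [0] * n_bits
    best, best_e = state[:], energy(state)
    tabu = {}                          # bit index -> iteration it becomes legal
    for it in range(iters):
        cand = None
        for i in range(n_bits):
            nxt = state[:]
            nxt[i] ^= 1
            e = energy(nxt)
            # skip tabu moves unless they beat the best seen (aspiration)
            if tabu.get(i, 0) > it and e >= best_e:
                continue
            if cand is None or e < cand[1]:
                cand = (i, e, nxt)
        i, e, state = cand
        tabu[i] = it + tenure          # forbid re-flipping bit i for a while
        if e < best_e:
            best, best_e = state[:], e
    return best, best_e

target = [1, 0, 1, 1, 0, 1]
energy = lambda s: sum(a != b for a, b in zip(s, target))  # toy Hamming energy
best, best_e = tabu_search(energy, 6, iters=20)
print(best, best_e)
```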