Sample records for computer generated holograms

  1. Use of Computer-Generated Holograms in Security Hologram Applications

    Directory of Open Access Journals (Sweden)

    Bulanovs A.


Full Text Available The article discusses the use of computer-generated holograms (CGHs) as one of the security features in relief-phase protective holograms. An improved method of calculating CGHs is presented, based on a ray-tracing approach for the case of interference of parallel rays.

  2. Wavefront reconstruction using computer-generated holograms (United States)

    Schulze, Christian; Flamm, Daniel; Schmidt, Oliver A.; Duparré, Michael


We propose a new method to determine the wavefront of a laser beam, based on modal decomposition using computer-generated holograms (CGHs). The beam under test illuminates a CGH with a specific, inscribed transmission function that enables the measurement of modal amplitudes and phases by evaluating the first diffraction order of the hologram. Since we use an angular multiplexing technique, our method is innately capable of real-time measurements of amplitude and phase, yielding the complete information about the optical field. A measurement of the Stokes parameters, and hence of the polarization state, makes it possible to calculate the Poynting vector. Two wavefront reconstruction possibilities are outlined: reconstruction from the phase for scalar beams and reconstruction from the Poynting vector for inhomogeneously polarized beams. To quantify single aberrations, the reconstructed wavefront is decomposed into Zernike polynomials. Our technique is applied to beams emerging from different kinds of multimode optical fibers, such as step-index, photonic crystal, and multicore fibers; in this work, results are shown for a step-index fiber as an example and compared to a Shack-Hartmann measurement that serves as a reference.
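
The modal decomposition described above can be sketched numerically: the quantity the CGH evaluates optically in its first diffraction order is a correlation (inner product) between the beam and each mode. The sketch below is illustrative only, using an assumed Hermite-Gaussian basis and grid parameters rather than the authors' fiber-mode setup.

```python
import numpy as np

# Illustrative modal decomposition on a Hermite-Gaussian basis (assumed
# basis and grid; the paper decomposes into fiber modes via a CGH).
N, w0 = 128, 1.0
x = np.linspace(-4, 4, N)
X, Y = np.meshgrid(x, x)

def hg(m, n):
    """Normalized Hermite-Gaussian mode HG_mn sampled on the grid."""
    Hm = np.polynomial.hermite.Hermite.basis(m)(np.sqrt(2) * X / w0)
    Hn = np.polynomial.hermite.Hermite.basis(n)(np.sqrt(2) * Y / w0)
    mode = Hm * Hn * np.exp(-(X**2 + Y**2) / w0**2)
    return mode / np.linalg.norm(mode)

# Test beam with known modal amplitudes and phases.
beam = 0.8 * hg(0, 0) + 0.6 * np.exp(1j * 0.5) * hg(1, 0)

# The CGH measures these inner products optically in its first diffraction
# order; here they are evaluated numerically.
for m, n in [(0, 0), (1, 0)]:
    c = np.vdot(hg(m, n), beam)
    print((m, n), round(abs(c), 3), round(np.angle(c), 3))
```

The recovered magnitudes and phases match the coefficients used to build the test beam, which is exactly the information the hologram's first diffraction order carries.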

  3. Binary encoded computer generated holograms for temporal phase shifting. (United States)

    Amphawan, Angela


    The trend towards real-time optical applications predicates the need for real-time interferometry. For real-time interferometric applications, rapid processing of computer generated holograms is crucial as the intractability of rapid phase changes may compromise the input to the system. This paper introduces the design of a set of binary encoded computer generated holograms (CGHs) for real-time five-frame temporal phase shifting interferometry using a binary amplitude spatial light modulator. It is suitable for portable devices with constraints in computational power. The new set of binary encoded CGHs is used for measuring the phase of the generated electric field for a real-time selective launch in multimode fiber. The processing time for the new set of CGHs was reduced by up to 65% relative to the original encoding scheme. The results obtained from the new interferometric technique are in good agreement with the results obtained by phase shifting by means of a piezo-driven flat mirror.
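
Five-frame temporal phase shifting of the kind referenced above is commonly implemented with the Hariharan five-step formula for frames shifted in steps of pi/2; the sketch below uses synthetic fringes with assumed parameters and is one standard algorithm, not necessarily the paper's exact scheme.

```python
import numpy as np

def five_frame_phase(frames):
    """Hariharan five-step estimate from frames phase-shifted by pi/2."""
    i1, i2, i3, i4, i5 = frames
    return np.arctan2(2.0 * (i2 - i4), 2.0 * i3 - i1 - i5)

# Synthetic fringes with a known phase ramp (illustrative parameters).
xs = np.linspace(-np.pi / 2, np.pi / 2, 256)
true_phase = 1.2 * xs                        # stays inside (-pi, pi]
shifts = [-np.pi, -np.pi / 2, 0.0, np.pi / 2, np.pi]
frames = [1.0 + 0.8 * np.cos(true_phase + s) for s in shifts]

recovered = five_frame_phase(frames)
print(np.allclose(recovered, true_phase))  # True
```

The arctan2 form returns the wrapped phase directly, so no separate quadrant logic is needed.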

  4. Computer Generated Hologram System for Wavefront Measurement System Calibration (United States)

    Olczak, Gene


    Computer Generated Holograms (CGHs) have been used for some time to calibrate interferometers that require nulling optics. A typical scenario is the testing of aspheric surfaces with an interferometer placed near the paraxial center of curvature. Existing CGH technology suffers from a reduced capacity to calibrate middle and high spatial frequencies. The root cause of this shortcoming is as follows: the CGH is not placed at an image conjugate of the asphere due to limitations imposed by the geometry of the test and the allowable size of the CGH. This innovation provides a calibration system where the imaging properties in calibration can be made comparable to the test configuration. Thus, if the test is designed to have good imaging properties, then middle and high spatial frequency errors in the test system can be well calibrated. The improved imaging properties are provided by a rudimentary auxiliary optic as part of the calibration system. The auxiliary optic is simple to characterize and align to the CGH. Use of the auxiliary optic also reduces the size of the CGH required for calibration and the density of the lines required for the CGH. The resulting CGH is less expensive than the existing technology and has reduced write error and alignment error sensitivities. This CGH system is suitable for any kind of calibration using an interferometer when high spatial resolution is required. It is especially well suited for tests that include segmented optical components or large apertures.

  5. Modelling the spatial shape of nondiffracting beams: Experimental generation of Frozen Waves via computer generated holograms


    Vieira, Tárcio A.; Zamboni-Rached, Michel; Gesualdi, Marcos R. R.


In this paper we experimentally implement the spatial shape modelling of nondiffracting optical beams via computer-generated holograms. The results reported here are the experimental confirmation of the so-called Frozen Wave method, developed a few years ago. Optical beams of this type have potential applications in optical tweezers, medicine, atom guiding, remote sensing, etc.

  6. OAM beams from incomplete computer generated holograms projected onto a DMD (United States)

    Zambale, Niña Angelica F.; Doblado, Gerald John H.; Hermosa, Nathaniel


We show that optical beams with orbital angular momentum (OAM) can be generated even with incomplete computer-generated holograms (CGHs). These holograms are made such that random portions of them contain no information. We observe that although the beams produced with these holograms are less intense, they maintain their shape and their topological charges are not affected. Furthermore, we show that a superposition of two or more beams can be created using separate incomplete CGHs interspersed together. Our result is significant especially since most methods to generate beams with OAM for various applications rely on pixelated devices or optical elements with imperfections.
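
A minimal sketch of such an incomplete hologram: a binary fork grating (linear carrier plus an ell-fold azimuthal phase) with random tiles blanked out. The charge, grating period, tile size, and blanking fraction below are illustrative assumptions, not the authors' values.

```python
import numpy as np

rng = np.random.default_rng(0)
N, ell, period, tile = 256, 3, 16, 8   # all illustrative assumptions

y, x = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]
theta = np.arctan2(y, x)

# Binary fork grating: linear carrier plus ell-fold azimuthal phase.
phase = ell * theta + 2 * np.pi * x / period
cgh = (np.mod(phase, 2 * np.pi) < np.pi).astype(np.uint8)

# "Incomplete" hologram: blank random tiles so they carry no information.
blank = np.kron(rng.random((N // tile, N // tile)) < 0.3,
                np.ones((tile, tile), dtype=bool))
cgh_incomplete = np.where(blank, 0, cgh)

print(cgh_incomplete.shape, int(cgh_incomplete.max()))
```

Displaying `cgh_incomplete` on a binary amplitude modulator would still put charge-ell vortices in the first diffraction orders, at reduced intensity, which is the effect the abstract reports.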

  7. Floating full-color image with computer-generated alcove rainbow hologram (United States)

    Yamaguchi, Takeshi; Yoshikawa, Hiroshi


We have investigated a floating full-color image display based on the computer-generated hologram (CGH). A floating image, when utilized as a 3D display, gives a strong impression to the viewer. In our previous study, by changing the CGH shape from a flat type to a half-cylindrical type, the floating image from the output CGH attained a viewing angle of nearly 180 degrees. However, since the previous CGH had no wavelength selectivity, the reconstructed image had only a single color. In addition, the huge calculation amount of the fringe pattern was a big problem. Therefore, we now propose a rainbow-type computer-generated alcove hologram. To decrease the calculation amount, the rainbow hologram sacrifices the vertical parallax; it can also reconstruct an image under white light illumination. Compared with our previous Fresnel-type study, the calculation becomes 165 times faster. After calculation, we print the hologram with a fringe printer and evaluate the reconstructed floating full-color images. The rainbow hologram is recorded by using a horizontal slit to limit the vertical parallax; by changing this slit into a half-cylindrical slit, the wide-viewing-angle floating display can reconstruct a full-color image.

  8. Depth compensating calculation method of computer-generated holograms using symmetry and similarity of zone plates (United States)

    Wei, Hui; Gong, Guanghong; Li, Ni


Computer-generated hologram (CGH) is a promising 3D display technology, but it is challenged by a heavy computation load and vast memory requirements. To solve these problems, a depth-compensating CGH calculation method based on the symmetry and similarity of zone plates is proposed and implemented on a graphics processing unit (GPU). An improved LUT method is put forward to compute the distances between object points and hologram pixels in the XY direction. The concept of a depth-compensating factor is defined and used for calculating the holograms of points at different depth positions, instead of layer-based methods. The proposed method is suitable for arbitrarily sampled objects, with lower memory usage and higher computational efficiency compared to other CGH methods. The effectiveness of the proposed method is validated by numerical and optical experiments.
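
The shift symmetry of zone plates that such LUT methods exploit can be sketched as follows: one oversized table of squared XY distances is computed once, and the fringe of any object point is obtained by slicing a shifted window from it. All parameters below are assumed for illustration; the paper's depth-compensating factor itself is not reproduced here.

```python
import numpy as np

N = 512
pitch, wavelength = 8e-6, 532e-9   # illustrative assumptions

# One oversized table of squared XY distances, computed once; every
# point's zone plate is a shifted window into it (shift symmetry).
idx = np.arange(-N, N) * pitch
r2_lut = idx[None, :] ** 2 + idx[:, None] ** 2     # shape (2N, 2N)

def point_fringe(px, py, z):
    """Fresnel zone-plate fringe of one object point via LUT slicing."""
    r2 = r2_lut[N - py:2 * N - py, N - px:2 * N - px]  # (N, N) window
    return np.exp(1j * np.pi * r2 / (wavelength * z))

hologram = point_fringe(100, 200, 0.10) + point_fringe(300, 300, 0.12)
print(hologram.shape)  # (512, 512)
```

The slicing replaces a per-point distance computation with an index offset, which is where the memory/compute trade-off of LUT methods comes from.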

  9. High-efficiency photorealistic computer-generated holograms based on the backward ray-tracing technique (United States)

    Wang, Yuan; Chen, Zhidong; Sang, Xinzhu; Li, Hui; Zhao, Linmin


Holographic displays can provide the complete optical wave field of a three-dimensional (3D) scene, including the depth perception. However, it often takes a long computation time to produce traditional computer-generated holograms (CGHs), even without complex and photorealistic rendering. The backward ray-tracing technique is able to render photorealistic, high-quality images, and its high degree of parallelism noticeably reduces the computation time. Here, a high-efficiency photorealistic computer-generated hologram method is presented based on the backward ray-tracing technique. Rays are launched and traced in parallel under different illuminations and circumstances. Experimental results demonstrate the effectiveness of the proposed method. Compared with the traditional point-cloud CGH, the computation time is decreased to 24 s for reconstructing a 3D object of 100 × 100 rays with continuous depth change.

  10. Full parallax three-dimensional computer generated hologram with occlusion effect using ray casting technique

    International Nuclear Information System (INIS)

    Zhang, Hao; Tan, Qiaofeng; Jin, Guofan


Holographic display is capable of reconstructing the whole optical wave field of a three-dimensional (3D) scene. It is the only 3D display technique that can produce all the depth cues. With the development of computing technology and spatial light modulators, computer-generated holograms (CGHs) can now be used to produce dynamic 3D images of synthetic objects. Computational holography becomes highly complicated and demanding when it is employed to produce real 3D images. Here we present a novel algorithm for generating a full-parallax 3D CGH with the occlusion effect, which is an important property of 3D perception but has often been neglected in fully computed hologram synthesis. The ray casting technique, which is widely used in computer graphics, is introduced to handle the occlusion issue of CGH computation. Horizontally and vertically distributed rays are projected from each hologram sample to the 3D objects to obtain the complex amplitude distribution. The occlusion issue is handled by performing ray casting calculations for all the hologram samples. The proposed algorithm places no restriction on, and makes no approximation to, the 3D objects, and hence it can produce reconstructed images with a correct shading effect and no visible artifacts. A programmable graphics processing unit (GPU) is used to perform parallel calculation; this is possible because each hologram sample can be computed independently. To demonstrate the performance of our proposed algorithm, an optical experiment is performed to reconstruct the 3D scene using a phase-only spatial light modulator. We can easily perceive the accommodation cue by focusing our eyes on different depths of the scene, and the motion parallax cue with the occlusion effect by moving our eyes around. The experimental results confirm that the CGHs produced by our algorithm can successfully reconstruct 3D images with all the depth cues.
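
A one-dimensional toy version of per-sample ray casting can illustrate the occlusion idea: from each hologram sample, ray angles to the object points are binned, and within each angular bin only the nearest point contributes. The two-point scene, bin width, and wavelength are illustrative assumptions, far simpler than the full-parallax algorithm above.

```python
import numpy as np

# 1-D toy scene: the back point sits directly behind the front one.
points = np.array([[0.0, 0.10],    # (x, z) front
                   [0.0, 0.20]])   # (x, z) back, occluded on-axis
wavelength, pitch, N = 532e-9, 8e-6, 256
cone = 0.002                       # angular bin half-width (assumed)

xs = (np.arange(N) - N // 2) * pitch
hologram = np.zeros(N, complex)
for i, x in enumerate(xs):
    dx, dz = points[:, 0] - x, points[:, 1]
    ang = np.arctan2(dx, dz)
    bins = np.round(ang / (2 * cone)).astype(int)
    visible = np.ones(len(points), dtype=bool)
    for b in np.unique(bins):
        members = np.where(bins == b)[0]
        nearest = members[np.argmin(dz[members])]
        visible[members] = False          # occlusion: each bin keeps
        visible[nearest] = True           # only its nearest point
    r = np.hypot(dx, dz)
    field = np.exp(2j * np.pi * r / wavelength) / r
    hologram[i] = field[visible].sum()

print(hologram.shape)  # (256,)
```

Near the hologram center both points fall in the same angular bin, so only the front point contributes; toward the edges the rays separate and the back point becomes visible again, which is the expected view-dependent occlusion.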

  11. 3D computer generated medical holograms using spatial light modulators

    Directory of Open Access Journals (Sweden)

    Ahmed Sheet


Full Text Available The aim of this work is to electronically generate the diffraction patterns of medical images and then to optically reconstruct the corresponding holograms to be displayed in space. This method is proposed as an attempt to find a smart alternative to expensive and perishable recording plates.

  12. Accelerated algorithm for three-dimensional computer generated hologram based on the ray-tracing method (United States)

    Xie, Z. W.; Zang, J. L.; Zhang, Y.


An accelerated algorithm for three-dimensional computer-generated holograms (CGHs) based on the ray-tracing method is proposed. The complex amplitude distribution from the center point of an object is calculated in advance, and the field distributions of the remaining points on the hologram plane can be obtained by applying a small translation and an aberration correction to the pre-calculated field. A static two-dimensional car, a three-dimensional teapot, and a dynamic three-dimensional rotating teapot are reconstructed from CGHs calculated with the accelerated algorithm to prove its validity. The simulation results demonstrate that the accelerated algorithm is eight times faster than the conventional ray-tracing algorithm.

  13. Topological transformation of fractional optical vortex beams using computer generated holograms (United States)

    Maji, Satyajit; Brundavanam, Maruthi M.


    Optical vortex beams with fractional topological charges (TCs) are generated by the diffraction of a Gaussian beam using computer generated holograms embedded with mixed screw-edge dislocations. When the input Gaussian beam has a finite wave-front curvature, the generated fractional vortex beams show distinct topological transformations in comparison to the integer charge optical vortices. The topological transformations at different fractional TCs are investigated through the birth and evolution of the points of phase singularity, the azimuthal momentum transformation, occurrence of critical points in the transverse momentum and the vorticity around the singular points. This study is helpful to achieve better control in optical micro-manipulation applications.

  14. Speckle noise reduction for computer generated holograms of objects with diffuse surfaces (United States)

    Symeonidou, Athanasia; Blinder, David; Ahar, Ayyoub; Schretter, Colas; Munteanu, Adrian; Schelkens, Peter


Digital holography is mainly used today for metrology and microscopic imaging, and is emerging as an important potential technology for future holographic television. To generate the holographic content, computer-generated holography (CGH) techniques convert geometric descriptions of 3D scene content into holograms. To model different surface types, an accurate model of light propagation has to be considered, including, for example, specular and diffuse reflection. In previous work, we proposed a fast CGH method for point cloud data using multiple wavefront recording planes, look-up tables (LUTs) and occlusion processing. This work extends our method to account for diffuse reflections, enabling the rendering of deep 3D scenes in high resolution with wide viewing angle support. This is achieved by modifying the spectral response of the light propagation kernels contained in the look-up tables. However, holograms encoding diffuse reflective surfaces depict significant amounts of speckle noise, a problem inherent to holography. Hence, techniques to reduce speckle noise are evaluated in this paper. Moreover, we also propose a technique to suppress aperture diffraction during numerical, view-dependent rendering by apodizing the hologram. Results are compared visually and in terms of their respective computational efficiency. The experiments show that by modelling diffuse reflection in the LUTs, a more realistic yet computationally efficient framework for generating high-resolution CGH is achieved.

  15. Occlusion culling and calculation for a computer generated hologram using spatial frequency index method (United States)

    Zhao, Kai; Huang, Yingqing; Yan, Xingpeng; Jiang, Xiaoyu


A spatial frequency index method is proposed to cull occlusions and generate a hologram. Object points with the same spatial frequency are put into a set because they mutually occlude one another. The hidden surfaces of the three-dimensional (3D) scene are quickly removed by culling the object points in each set that are furthest from the hologram plane. The phases of the plane waves, which depend only on the spatial frequencies, are precomputed and stored in a table. According to the spatial frequency of the object points, the phases of the plane waves for generating fringes are obtained directly from the table. Three 3D scenes are chosen to verify the spatial frequency index method, and both numerical simulation and optical reconstruction are performed. Experimental results demonstrate that the proposed method can cull the hidden surfaces of the 3D scene correctly, and the occlusion effect of the 3D scene is well reproduced. The computational speed is better than that of conventional methods, but the process is still time-consuming.

  16. Compression of computer generated phase-shifting hologram sequence using AVC and HEVC (United States)

    Xing, Yafei; Pesquet-Popescu, Béatrice; Dufaux, Frederic


With the capability of achieving twice the compression ratio of Advanced Video Coding (AVC) with similar reconstruction quality, High Efficiency Video Coding (HEVC) is expected to become the new leading video coding technique. In order to reduce the storage and transmission burden of digital holograms, in this paper we propose to use HEVC for compressing phase-shifting digital hologram sequences (PSDHS). By simulating phase-shifting digital holography (PSDH) interferometry, interference patterns between illuminated three-dimensional (3D) virtual objects and the stepwise phase-changed reference wave are generated as digital holograms. The hologram sequences are obtained from the movement of the virtual objects and compressed by AVC and HEVC. The experimental results show that AVC and HEVC are efficient at compressing PSDHS, with HEVC giving better performance. Good compression rate and reconstruction quality can be obtained with bitrates above 15,000 kbps.

  17. Three-dimensional imaging using computer-generated holograms synthesized from 3-D Fourier spectra

    International Nuclear Information System (INIS)

    Yatagai, Toyohiko; Miura, Ken-ichi; Sando, Yusuke; Itoh, Masahide


Computer-generated holograms (CGHs) synthesized from projection images of real existing objects are considered. A series of projection images is recorded both vertically and horizontally with an incoherent light source and a color CCD. According to the principles of computed tomography (CT), the 3-D Fourier spectrum is calculated from several projection images of the objects, and the Fresnel CGH is synthesized using a part of the 3-D Fourier spectrum. This method has the following advantages. First, reconstructed images without blur in any direction are obtained, owing to the two-dimensional scanning in recording. Second, since simple projection images of objects rather than interference fringes are recorded, a coherent light source is not necessary. Moreover, when a color CCD is used in recording, it is easily possible to record and reconstruct colorful objects. Finally, we demonstrate the reconstruction of biological objects.

  18. Fast calculation method of computer-generated hologram using a depth camera with point cloud gridding (United States)

    Zhao, Yu; Shi, Chen-Xiao; Kwon, Ki-Chul; Piao, Yan-Ling; Piao, Mei-Lan; Kim, Nam


    We propose a fast calculation method for a computer-generated hologram (CGH) of real objects that uses a point cloud gridding method. The depth information of the scene is acquired using a depth camera and the point cloud model is reconstructed virtually. Because each point of the point cloud is distributed precisely to the exact coordinates of each layer, each point of the point cloud can be classified into grids according to its depth. A diffraction calculation is performed on the grids using a fast Fourier transform (FFT) to obtain a CGH. The computational complexity is reduced dramatically in comparison with conventional methods. The feasibility of the proposed method was confirmed by numerical and optical experiments.
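
The layer-wise diffraction calculation with an FFT can be sketched with the standard angular-spectrum propagator: points grouped by depth form a layer, and each layer is propagated with a single FFT pair regardless of how many points it holds. The grid, pitch, and layer depths below are assumed for illustration.

```python
import numpy as np

def angular_spectrum(field, z, wavelength, pitch):
    """Free-space propagation of a sampled field by distance z (FFT pair)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=pitch)
    fx2 = fx[None, :] ** 2 + fx[:, None] ** 2
    kz = 2 * np.pi * np.sqrt(np.maximum(0.0, wavelength ** -2 - fx2))
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * z))

# Points gridded by depth: each depth layer costs one FFT pair, however
# many points it holds (grid, pitch, and depths are assumed values).
N, pitch, wl = 256, 8e-6, 532e-9
layers = {0.10: [(64, 64), (64, 192)], 0.12: [(192, 128)]}

hologram = np.zeros((N, N), complex)
for z, pts in layers.items():
    layer = np.zeros((N, N), complex)
    for r, c in pts:
        layer[r, c] = 1.0
    hologram += angular_spectrum(layer, z, wl, pitch)

print(hologram.shape)  # (256, 256)
```

Because the per-layer cost is one FFT pair, grouping point-cloud data into depth grids replaces per-point fringe summation with a handful of FFTs, which is the source of the speed-up the abstract describes.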

  19. Occlusion processing for computer generated hologram by conversion between the wavefront and light-ray information (United States)

    Wakunami, Koki; Yamaguchi, Masahiro


In the field of computational holography for three-dimensional (3D) display, the mutual occlusion of objects is one of the crucial issues. We propose a new mutual occlusion processing method that uses a conversion between light-ray and wavefront representations on a virtual plane, called the ray-sampling (RS) plane, located near the occluding object. The wavefront coming from the background scene is converted into light-ray information at the RS plane by using a Fourier transform based on angular spectrum theory; the converted light rays are then overwritten with those from the occluding object in the light-ray domain as an occlusion culling process. The ray information after the occlusion process is reconverted into a wavefront by an inverse Fourier transform at each RS point, and the wave propagation from the RS plane to the hologram is computed by general light diffraction computation techniques. Since light-ray information is used for the occlusion processing, our approach can realize a correct occlusion effect with a simple algorithm. In addition, a high-resolution 3D image can be reconstructed with the wavefront-based technique. In numerical simulations, we demonstrate that our approach can realize correct occlusion culling for a deep 3D scene with plural objects over varying observation angles and focusing distances.

  20. Computer-generated holograms (CGH) realization: the integration of dedicated software tool with digital slides printer (United States)

    Guarnieri, Vittorio; Francini, Franco


The latest generation of digital printers is usually characterized by a spatial resolution high enough to allow the designer to realize a binary CGH directly on a transparent film, avoiding photographic reduction techniques. These devices are able to produce slides or offset prints. Furthermore, services supplied by commercial printing companies provide an inexpensive method to rapidly verify the validity of a design by means of a test-and-trial process. Notably, this low-cost approach appears to be suitable for a didactical environment. On the basis of these considerations, a set of software tools able to design CGHs has been developed. The guidelines inspiring the work have been the following: (1) a ray-tracing approach, considering the object to be reproduced as a source of spherical waves; (2) optimization and speed-up of the algorithms used, in order to produce portable code, runnable on several hardware platforms. In this paper, calculation methods to obtain some fundamental geometric functions (points, lines, curves) are described. Furthermore, by the juxtaposition of these primitive functions it is possible to produce the holograms of more complex objects. Many examples of generated CGHs are presented.

  1. Enhancing performance of LCoS-SLM as adaptive optics by using computer-generated holograms modulation software (United States)

    Tsai, Chun-Wei; Lyu, Bo-Han; Wang, Chen; Hung, Cheng-Chieh


We have developed multi-function and easy-to-use modulation software based on the LabVIEW system. There are four main functions in this modulation software: computer-generated hologram (CGH) generation, CGH reconstruction, image trimming, and special phase distribution. Based on this CGH modulation software, we can enhance the performance of a liquid crystal on silicon spatial light modulator (LCoS-SLM) so that it behaves like a diffractive optical element (DOE), and use it in various adaptive optics (AO) applications. Through the development of special phase distributions, we are going to use the LCoS-SLM with the CGH modulation software in AO technology, such as optical microscope systems. When the LCoS-SLM panel is integrated in an optical microscope system, it can be placed in the illumination path or in the image-forming path. The LCoS-SLM provides a program-controllable liquid crystal array for the optical microscope; it dynamically changes the amplitude or phase of light, which gives the system an obvious advantage: flexibility.

  2. Ultrafast layer based computer-generated hologram calculation with sparse template holographic fringe pattern for 3-D object. (United States)

    Kim, Hak Gu; Man Ro, Yong


In this paper, we propose a new ultrafast layer-based CGH calculation that exploits the sparsity of the hologram fringe pattern in each 3-D object layer. Specifically, we devise a sparse template holographic fringe pattern. The holographic fringe pattern on a depth layer can be rapidly calculated by adding the sparse template holographic fringe pattern at each object point position. Since the size of the sparse template holographic fringe pattern is much smaller than that of the CGH plane, the computational load can be significantly reduced. Experimental results show that the proposed method takes 10-20 ms for 1024 × 1024 pixels while providing visually plausible results.
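
The sparse-template idea can be sketched as follows: one small fringe template is precomputed and simply added into the CGH plane at each object-point position. Template size and optical parameters are illustrative assumptions; a real implementation would choose the template support from the fringe's energy distribution.

```python
import numpy as np

N, T = 512, 64                              # CGH side; template side (T << N)
wavelength, pitch, z = 532e-9, 8e-6, 0.05   # illustrative assumptions

# Precompute one small zone-plate template for this depth layer.
t = (np.arange(T) - T // 2) * pitch
r2 = t[None, :] ** 2 + t[:, None] ** 2
template = np.exp(1j * np.pi * r2 / (wavelength * z))

# Layer CGH: add the template at each object-point position.
cgh = np.zeros((N, N), complex)
points = [(100, 100), (250, 300), (400, 150)]
for r, c in points:
    cgh[r - T // 2:r + T // 2, c - T // 2:c + T // 2] += template

print(cgh.shape)  # (512, 512)
```

Each point costs T × T additions instead of N × N evaluations, which is the source of the reduced computational load claimed above.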

  3. Fast distributed large-pixel-count hologram computation using a GPU cluster. (United States)

    Pan, Yuechao; Xu, Xuewu; Liang, Xinan


Large-pixel-count holograms are an essential part of large-size holographic three-dimensional (3D) display, but the generation of such holograms is computationally demanding. In order to address this issue, we have built a graphics processing unit (GPU) cluster with 32.5 Tflop/s computing power and implemented distributed hologram computation on it with speed improvement techniques such as shared memory on the GPU, GPU-level adaptive load balancing, and node-level load distribution. Using these techniques on the GPU cluster, we have achieved a 71.4-fold computation speed increase for 186M-pixel holograms. Furthermore, we have used diffraction limits and subdivision of holograms to overcome the GPU memory limit in computing large-pixel-count holograms. 745M-pixel and 1.80G-pixel holograms were computed in 343 and 3326 s, respectively, for more than 2 million object points with RGB colors. Color 3D objects with 1.02M points were successfully reconstructed from a 186M-pixel hologram computed in 8.82 s with all three speed improvement techniques. It is shown that distributed hologram computation using a GPU cluster is a promising approach to increasing the computation speed of large-pixel-count holograms for large-size holographic display.

  4. Design of computer-generated hologram with ring focus for nonmechanical corneal trephination with Er:YAG laser in penetrating keratoplasty. (United States)

    Langenbucher, A; Seitz, B; Kus, M M; van der Heyd, G; Köchle, M; Naumann, G O


To calculate a beam-shaping optical element for a homogeneous intensity distribution within a focal ring, to be used in nonmechanical trephination with the Er:YAG laser in penetrating keratoplasty instead of a spot-guiding device. The phase distribution behind a holographic optical element (HOE) can be described by adding the hologram phase φH(u) to the beam phase φE(u): kψ(u) = φH(u) + φE(u), with k = 2π/λ, where u denotes the coordinates inside the hologram aperture and λ the laser wavelength. To avoid discontinuous wavefronts leading to speckle noise, a smooth phase function is necessary. After transforming the hologram aperture coordinates into the focal-plane coordinates x at a focal distance f, ψ can be retrieved from the slope equation ∇ψ(u) = (x(u) − u)/f. Creating a ring focus can be reduced to an essentially one-dimensional problem by separation of variables, owing to the symmetry condition. We calculated a computer-generated eight-level phase-only HOE with 4096 × 4096 pixels from a Gaussian-distributed 2.94 µm Er:YAG laser spot with a beam diameter of 10 mm and a focal distance of 100 mm. Thereby, a ring focus with an inner/outer radius of 7/8 mm can be created. To avoid a Poisson's spot, the symmetry of the problem was broken by circular modulation of the phase, leading to a spiral-like structure. The calculated efficiency of the HOE, relating the energy within the ring to the total energy, was 91%. With an HOE it is possible to redistribute the energy along the desired focal ring. The HOE design can be adapted to the intensity distribution of the impinging laser beam with its characteristic aperture shape. A circularly homogeneous corneal trephination depth is possible, because the energy fluctuation from pulse to pulse does not locally affect the ablation process.
A ring focus for the Er:YAG laser has the potential to render superfluous a manual beam control via micromanipulator and to allow a more rapid and more

  5. GPU-based implementation of an accelerated SR-NLUT based on N-point one-dimensional sub-principal fringe patterns in computer-generated holograms

    Directory of Open Access Journals (Sweden)

    Hee-Min Choi


Full Text Available An accelerated spatial-redundancy-based novel look-up table (A-SR-NLUT) method, based on a new concept of the N-point one-dimensional sub-principal fringe pattern (N-point 1-D sub-PFP), is implemented on a graphics processing unit (GPU) for fast calculation of computer-generated holograms (CGHs) of three-dimensional (3-D) objects. Since the proposed method can generate the N-point two-dimensional (2-D) PFPs for CGH calculation from the pre-stored N-point 1-D PFPs, the loading time of the N-point PFPs on the GPU can be dramatically reduced, which results in a great increase in the computational speed of the proposed method. Experimental results confirm that the average calculation time for one object point has been reduced by 49.6% and 55.4% compared to those of the conventional 2-D SR-NLUT methods for the cases of 2-point and 3-point SR maps, respectively.

  6. Two schemes for rapid generation of digital video holograms using PC cluster (United States)

    Park, Hanhoon; Song, Joongseok; Kim, Changseob; Park, Jong-Il


    Computer-generated holography (CGH), which is a process of generating digital holograms, is computationally expensive. Recently, several methods/systems of parallelizing the process using graphic processing units (GPUs) have been proposed. Indeed, use of multiple GPUs or a personal computer (PC) cluster (each PC with GPUs) enabled great improvements in the process speed. However, extant literature has less often explored systems involving rapid generation of multiple digital holograms and specialized systems for rapid generation of a digital video hologram. This study proposes a system that uses a PC cluster and is able to more efficiently generate a video hologram. The proposed system is designed to simultaneously generate multiple frames and accelerate the generation by parallelizing the CGH computations across a number of frames, as opposed to separately generating each individual frame while parallelizing the CGH computations within each frame. The proposed system also enables the subprocesses for generating each frame to execute in parallel through multithreading. With these two schemes, the proposed system significantly reduced the data communication time for generating a digital hologram when compared with that of the state-of-the-art system.
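
Frame-level parallelism of this kind can be sketched with a thread pool: each worker computes a complete (toy) frame hologram, rather than splitting a single frame's samples across workers. The per-frame CGH below is a deliberately trivial single-point fringe with assumed parameters, not the paper's GPU/cluster pipeline.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def frame_hologram(frame_idx, N=256):
    """Toy per-frame CGH: a single moving point's Fresnel fringe."""
    wavelength, pitch, z = 532e-9, 8e-6, 0.1   # assumed values
    x = (np.arange(N) - N // 2) * pitch
    cx = (frame_idx - 8) * 10 * pitch          # point drifts across frames
    r2 = (x[None, :] - cx) ** 2 + x[:, None] ** 2
    return np.exp(1j * np.pi * r2 / (wavelength * z))

# Frame-level parallelism: each worker computes a whole frame, instead of
# splitting one frame's samples across workers.
with ThreadPoolExecutor(max_workers=4) as pool:
    video = list(pool.map(frame_hologram, range(16)))

print(len(video), video[0].shape)  # 16 (256, 256)
```

Because the frames are independent, this scheme needs no inter-worker communication during computation, which is the property the proposed system exploits to cut data communication time.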

  7. Effective memory reduction of the novel look-up table with one-dimensional sub-principle fringe patterns in computer-generated holograms. (United States)

    Kim, Seung-Cheol; Kim, Jae-Man; Kim, Eun-Soo


    We propose a novel approach to massively reduce the memory size of the novel look-up table (N-LUT) for computer-generated holograms by employing one-dimensional (1-D) sub-principal fringe patterns (sub-PFPs). Two-dimensional (2-D) PFPs used in the conventional N-LUT method are decomposed into a pair of 1-D sub-PFPs through a trigonometric relation. These 1-D sub-PFPs are then pre-calculated and stored in the proposed method, which results in a remarkable reduction of the memory size of the N-LUT. Experimental results reveal that the memory capacities of the LUT, N-LUT, and proposed methods are 149.01 TB, 2.29 GB, and 1.51 MB, respectively, for a 3-D object with 500 × 500 × 256 image points, which means the memory of the proposed method is reduced by factors of 103 × 10^6 and 1.55 × 10^3 compared to the conventional LUT and N-LUT methods, respectively.
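The trigonometric decomposition can be illustrated numerically. With u(x) = πx²/(λz), the 2-D fringe cos(u(x) + u(y)) equals cos u(x)·cos u(y) − sin u(x)·sin u(y), so only 1-D cosine and sine tables need to be stored. The parameters below (λ = 532 nm, z = 0.2 m, 8 µm pitch) are assumed for illustration, not taken from the paper:

```python
import math

WL, Z, PITCH, N = 532e-9, 0.2, 8e-6, 64  # illustrative parameters

# Pre-store 1-D tables (these replace a full 2-D PFP).
u = [math.pi * ((i - N // 2) * PITCH) ** 2 / (WL * Z) for i in range(N)]
cos_1d = [math.cos(v) for v in u]
sin_1d = [math.sin(v) for v in u]

def pfp_from_1d(ix, iy):
    """Rebuild one 2-D PFP sample from the 1-D sub-PFP tables."""
    return cos_1d[ix] * cos_1d[iy] - sin_1d[ix] * sin_1d[iy]

def pfp_direct(ix, iy):
    """Direct 2-D computation, for comparison."""
    return math.cos(u[ix] + u[iy])

err = max(abs(pfp_from_1d(i, j) - pfp_direct(i, j))
          for i in range(N) for j in range(N))
```

Storing two N-entry tables instead of one N×N pattern is where the memory saving comes from: 2N versus N² samples per principal fringe pattern.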

  8. Highly efficient electron vortex beams generated by nanofabricated phase holograms

    Energy Technology Data Exchange (ETDEWEB)

    Grillo, Vincenzo, E-mail: [CNR-Istituto Nanoscienze, Centro S3, Via G Campi 213/a, I-41125 Modena (Italy); CNR-IMEM Parco Area delle Scienze 37/A, I-43124 Parma (Italy); Carlo Gazzadi, Gian [CNR-Istituto Nanoscienze, Centro S3, Via G Campi 213/a, I-41125 Modena (Italy); Karimi, Ebrahim [CNR-Istituto Nanoscienze, Centro S3, Via G Campi 213/a, I-41125 Modena (Italy); Department of Physics, University of Ottawa, 150 Louis Pasteur, Ottawa, Ontario K1N 6N5 (Canada); Mafakheri, Erfan [Dipartimento di Fisica Informatica e Matematica, Università di Modena e Reggio Emilia, via G Campi 213/a, I-41125 Modena (Italy); Boyd, Robert W. [Department of Physics, University of Ottawa, 150 Louis Pasteur, Ottawa, Ontario K1N 6N5 (Canada); Frabboni, Stefano [CNR-Istituto Nanoscienze, Centro S3, Via G Campi 213/a, I-41125 Modena (Italy); Dipartimento di Fisica Informatica e Matematica, Università di Modena e Reggio Emilia, via G Campi 213/a, I-41125 Modena (Italy)


    We propose an improved type of holographic plate suitable for the shaping of electron beams. The plate is fabricated by a focused ion beam on a silicon nitride membrane and introduces a controllable phase shift to the electron wavefunction. We adopted the optimal blazed-profile design for the phase hologram, which results in the generation of highly efficient (25%) electron vortex beams. This approach paves the route towards applications in nano-scale imaging and materials science.

  9. Highly efficient electron vortex beams generated by nanofabricated phase holograms

    International Nuclear Information System (INIS)

    Grillo, Vincenzo; Carlo Gazzadi, Gian; Karimi, Ebrahim; Mafakheri, Erfan; Boyd, Robert W.; Frabboni, Stefano


    We propose an improved type of holographic plate suitable for the shaping of electron beams. The plate is fabricated by a focused ion beam on a silicon nitride membrane and introduces a controllable phase shift to the electron wavefunction. We adopted the optimal blazed-profile design for the phase hologram, which results in the generation of highly efficient (25%) electron vortex beams. This approach paves the route towards applications in nano-scale imaging and materials science.

  10. Fast generation of video hologram patterns by use of motion vectors of three-dimensional objects (United States)

    Dong, Xiao-Bin; Choi, Hee-Min; Kwon, Min-Woo; Kim, Seung-Cheol; Kim, Eun-Soo


    Thus far, various approaches to generating computer-generated holograms (CGHs) of 3-D objects have been suggested, but most of them have been applied to still images rather than video images because of their computational complexity. Recently, a method for fast computation of the CGH patterns of 3-D video images was proposed that combines data compression and novel look-up table (N-LUT) techniques. In this method, temporally redundant data of 3-D video images are removed with the differential pulse code modulation (DPCM) algorithm, and the CGH patterns for these compressed video images are then calculated with the N-LUT method. However, as the 3-D objects move rapidly, the image differences between video frames may increase, which results in a massive growth of the calculation time of the video holograms. Therefore, we propose a novel approach to significantly reduce the computation time of 3-D video holograms by employing a new concept of the motion vector of the 3-D object. In the proposed method, 3-D objects are first segmented from the first frame of the 3-D video, and the CGH patterns for each segmented object are computed with the N-LUT algorithm. Second, motion vectors between each segmented object and the corresponding objects in the consecutive 3-D video frames are calculated. Third, the CGH patterns of each segmented object are shifted according to the calculated motion vectors. Finally, all these shifted CGH patterns are added up to generate the hologram patterns of the consecutive 3-D video frames. To confirm the feasibility of the proposed method, experiments are performed and the results are compared with those of conventional methods in terms of the number of object points and computation time.
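The shifting step exploits the shift-invariance of the fringe pattern: translating an object point translates its hologram contribution, so a stored pattern can be reused instead of recomputed. A 1-D sketch with assumed parameters (not the paper's values):

```python
import math

WL, Z, PITCH, N = 532e-9, 0.2, 8e-6, 48  # illustrative values

def fringe(x0):
    """1-D Fresnel fringe of a point located at pixel position x0."""
    return [math.cos(math.pi * ((x - x0) * PITCH) ** 2 / (WL * Z))
            for x in range(N)]

ref = fringe(10.0)        # CGH of the object in the reference frame
moved = fringe(10.0 + 5)  # same object after moving 5 pixels

# Instead of recomputing, shift the stored pattern by the motion vector.
shifted = [ref[x - 5] for x in range(5, N)]
err = max(abs(a - b) for a, b in zip(shifted, moved[5:]))
```

The comparison skips the 5 pixels that shift in from outside the stored pattern; a full system would also handle these boundary samples and the difference images the abstract mentions.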

  11. Complementary computer generated holography for aesthetic watermarking. (United States)

    Martinez, Christophe; Lemonnier, Olivier; Laulagnet, Fabien; Fargeix, Alain; Tissot, Florent; Armand, Marie Françoise


    We present herein an original solution for the watermarking of holograms in binary graphic arts without unaesthetic diffractive effects. It is based on the Babinet principle of complementary diffractive structures, adapted to Lohmann-type computer-generated holograms. We introduce the concept and demonstrate its value for anti-counterfeiting applications with the decoding of a hidden data matrix. A thermal lithography process is used for the manufacturing of binary graphic arts containing complementary computer-generated holograms.

  12. Independent and arbitrary generation of spots in the 3D space domain with computer generated holograms written on a phase-only liquid crystal spatial light modulator

    International Nuclear Information System (INIS)

    Wang, Dong; Zhang, Jian; Xia, Yang; Wang, Hao


    An improved multiple independent iterative plane algorithm, based on a projection optimization idea, is proposed for the independent and arbitrary generation of one spot or multiple spots in a speckle-suppressed 3D work area. Details of the mathematical expressions of the algorithm are given to show theoretically how it is improved for 3D spot generation. Both simulations and experiments are conducted to investigate the performance of the algorithm for independent and arbitrary 3D spot generation in several different cases. Simulation results agree well with experimental results, which validates the effectiveness of the proposed algorithm. Several additional experiments demonstrate fast and independent generation of four or more spots in the 3D space domain, further confirming the capability and practicality of the algorithm. (paper)

  13. Generation of Binary Off-axis Digital Fresnel Hologram with Enhanced Quality

    Directory of Open Access Journals (Sweden)

    Peter Wai Ming Tsang


    Full Text Available The emergence of high-resolution printers and digital micromirror devices (DMDs) has enabled real, off-axis holograms to be printed or projected onto a screen. As most printers and DMDs can only reproduce binary dots, the pixels in a hologram have to be truncated to 2 levels. However, directly binarizing a hologram leads to severe degradation of its reconstructed image. In this paper, a method for generating a binary off-axis digital Fresnel hologram is reported. A hologram generated with the proposed method is referred to as the "Enhanced Sampled Binary Hologram" (ESBH). The reconstructed image of the ESBH is superior in visual quality to the one obtained with existing techniques, and it is also resistant to noise contamination.
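The direct binarization the abstract warns about is simple sign thresholding. A minimal 1-D sketch (toy fringe with an illustrative carrier frequency, not the paper's method):

```python
import math

N, CARRIER = 64, 0.25  # pixels; carrier frequency in cycles/pixel (illustrative)

# Off-axis hologram of a single point: Fresnel-like chirp plus a tilted carrier.
holo = [math.cos(2 * math.pi * CARRIER * x + 0.002 * x * x) for x in range(N)]

# Direct (sign) binarization: every sample is truncated to one of two levels.
binary = [1 if v >= 0 else 0 for v in holo]
```

This naive two-level truncation discards amplitude information and is what degrades the reconstruction; methods such as the ESBH are designed to choose the binary pattern so that the reconstructed image quality is preserved.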

  14. Comparison of the different approaches to generate holograms from data acquired with a Kinect sensor (United States)

    Kang, Ji-Hoon; Leportier, Thibault; Ju, Byeong-Kwon; Song, Jin Dong; Lee, Kwang-Hoon; Park, Min-Chul


    Data of real scenes acquired in real time with a Kinect sensor can be processed with different approaches to generate a hologram. 3D models can be generated from a point cloud or a mesh representation. The advantage of the point-cloud approach is that the computation process is well established, since it involves only diffraction and propagation of point sources between parallel planes. On the other hand, the mesh representation reduces the number of elements necessary to represent the object. Then, even though the computation time for the contribution of a single element increases compared to a simple point, the total computation time can be reduced significantly. However, the algorithm is more complex, since the propagation of elemental polygons between non-parallel planes must be implemented. Finally, since a depth map of the scene is acquired at the same time as the intensity image, a depth-layer approach can also be adopted. This technique is appropriate for fast computation, since the propagation of an optical wavefront from one plane to another can be handled efficiently with the fast Fourier transform. Fast computation with the depth-layer approach is convenient for real-time applications, but the point-cloud method is more appropriate when high resolution is needed. In this study, since the Kinect can be used to obtain both a point cloud and a depth map, we examine the different approaches that can be adopted for hologram computation and compare their performance.
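The plane-to-plane propagation used by the depth-layer approach can be sketched with the angular spectrum method, a common FFT-based propagator (the parameters are illustrative, and this is not necessarily the exact propagator used by the authors). Propagating forward and then backward should recover the original layer:

```python
import numpy as np

WL, PITCH, N = 532e-9, 10e-6, 128  # wavelength, pixel pitch, grid size (assumed)

def propagate(field, z):
    """Angular-spectrum propagation of a sampled complex field over distance z."""
    fx = np.fft.fftfreq(N, d=PITCH)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 / WL**2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)   # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * H)

# One depth layer: a small square aperture, propagated 5 cm and brought back.
layer = np.zeros((N, N), dtype=complex)
layer[48:80, 48:80] = 1.0
forward = propagate(layer, 0.05)
back = propagate(forward, -0.05)
err = np.max(np.abs(back - layer))
```

Because each layer needs only two FFTs, the per-layer cost is O(N² log N), which is why the abstract calls this approach suitable for fast, real-time computation.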

  15. Fast generation of three-dimensional video holograms by combined use of data compression and lookup table techniques. (United States)

    Kim, Seung-Cheol; Yoon, Jung-Hoon; Kim, Eun-Soo


    Even though there are many methods to generate computer-generated hologram (CGH) patterns of three-dimensional (3D) objects, most of them have been applied to still images rather than video images because of their computational complexity. A new method for fast computation of CGH patterns of 3D video images is proposed, combining data compression and lookup table techniques. Temporally redundant data of the 3D video images are removed with the differential pulse code modulation (DPCM) algorithm, and the CGH patterns for these compressed videos are then generated with the novel lookup table (N-LUT) technique. To confirm the feasibility of the proposed method, experiments with test 3D videos are carried out, and the results are compared with conventional methods in terms of the number of object points and computation time.
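The DPCM stage can be sketched on toy data: only the first frame is stored in full, and later frames keep per-point differences, so CGH updates are needed only where a difference is nonzero. The intensity lists below are hypothetical, not real hologram data:

```python
# Toy 1-D "frames": object intensities per point; most points are static.
frames = [
    [0, 5, 5, 0, 0, 0],
    [0, 5, 5, 3, 0, 0],   # one point appears
    [0, 5, 5, 3, 0, 0],   # nothing changes
]

def dpcm_encode(frames):
    """Keep the first frame; later frames store only point-wise differences."""
    out = [list(frames[0])]
    for prev, cur in zip(frames, frames[1:]):
        out.append([c - p for p, c in zip(prev, cur)])
    return out

def dpcm_decode(coded):
    """Invert the encoding by accumulating the differences."""
    rec = [list(coded[0])]
    for diff in coded[1:]:
        rec.append([r + d for r, d in zip(rec[-1], diff)])
    return rec

coded = dpcm_encode(frames)
# Only nonzero entries require new CGH contributions per frame.
changed = [sum(1 for d in diff if d != 0) for diff in coded[1:]]
```

Here the second frame needs a hologram update for a single point and the third needs none, which is the redundancy the abstract's method exploits; the limitation noted in record 10 above is that rapid motion makes these differences large again.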

  16. Efficient generation of 3D hologram for American Sign Language using look-up table (United States)

    Park, Joo-Sup; Kim, Seung-Cheol; Kim, Eun-Soo


    American Sign Language (ASL) is one of the languages that most helps hearing-impaired people communicate. Current 2-D broadcasting and 2-D movies use ASL to convey information, help viewers understand the situation of a scene, and translate foreign languages. Because of its usefulness, ASL will not disappear in future three-dimensional (3-D) broadcasting or 3-D movies. Meanwhile, several approaches for generating CGH patterns have been suggested, such as the ray-tracing method and the look-up table (LUT) method. However, these methods have drawbacks: they require either long computation times or a huge memory for the look-up table. Recently, a novel LUT (N-LUT) method was proposed for fast generation of the CGH patterns of 3-D objects with a dramatically reduced LUT size and without loss of computational speed. We therefore propose a method to efficiently generate holographic ASL in holographic 3DTV or 3-D movies using the look-up table method. The proposed method largely consists of five steps: construction of the LUT for the ASL images, extraction of characters from scripts or the situation, retrieval of the fringe patterns for these characters from the ASL LUT, composition of the hologram pattern for the 3-D video with the hologram pattern for the ASL, and reconstruction of the holographic 3-D video with ASL. Simulation results confirmed the feasibility of the proposed method for efficient generation of CGH patterns for ASL.

  17. New fully functioning digital hologram recording system and its applications (United States)

    Kang, Der-Kuan; Alcaraz Rivera, Miguel; Báez, Javier; Cruz-Lopez, María Luisa


    We propose a low-cost optical system that is able to generate computer-generated holograms as well as rainbow-type and true-color Lippmann-type holograms. In this system, a micro-imaging system is applied to reduce the digitized optical data/patterns, achieving about 0.45 μm of optical resolution for rainbow hologram recording and about a 60-deg viewing angle for the Lippmann hologram. The system is designed as a turn-key machine that switches between different operating modes. Custom software is applied to calculate and write digital fringe patterns at real-time speed. Applications for the digital rainbow hologram include computer-generated holograms, anti-counterfeiting security labels, and 3-D displays, and the Lippmann hologram supports 2-D/3-D true-color holographic stereogram images. This single-process synthesizing system can be considered a fully functioning hologram printer.

  18. Accelerated computation of hologram patterns by use of interline redundancy of 3-D object images (United States)

    Kim, Seung-Cheol; Choe, Woo-Young; Kim, Eun-Soo


    We present a new approach for accelerated computation of the hologram patterns of a three-dimensional (3-D) image by taking into account its interline redundancy. Interline redundant data of a 3-D image are extracted with the differential pulse code modulation (DPCM) algorithm, and the CGH patterns for these compressed line images are then generated with the novel lookup table (N-LUT) technique. To confirm the feasibility of the proposed method, experiments with four kinds of 3-D test objects are carried out, and the results are compared with conventional methods in terms of the number of object points and the computation time. Experimental results show that the number of calculated object points and the computation time for one object point are reduced by 73.3% and 83.9%, respectively, on average, for the four test 3-D images with the proposed method employing top-down scanning, compared to the conventional method.

  19. Dynamical hologram generation for high speed optical trapping of smart droplet microtools. (United States)

    Lanigan, P M P; Munro, I; Grace, E J; Casey, D R; Phillips, J; Klug, D R; Ces, O; Neil, M A A


    This paper demonstrates spatially selective sampling of the plasma membrane by the implementation of time-multiplexed holographic optical tweezers for smart droplet microtools (SDMs). High-speed (>1000 fps) dynamical hologram generation was computed on the graphics processing unit of a standard display card and controlled by a user-friendly LabVIEW interface. Time-multiplexed binary holograms were displayed in real time and mirrored to a ferroelectric spatial light modulator. SDMs were manufactured with both liquid cores (as previously described) and solid cores, which confer significant advantages in terms of stability, polydispersity, and ease of use. These were coated with a number of detergents, the most successful based upon lipids doped with transfection reagents. In order to validate these, trapped SDMs were maneuvered up to the plasma membrane of giant vesicles containing Nile Red and of human biliary epithelial (BE) colon cancer cells with green fluorescent protein (GFP)-labeled CAAX (a motif belonging to the Ras protein). Bright-field and fluorescence images showed that successful trapping and manipulation of multiple SDMs in x, y, z was achieved with success rates of 30-50%, and that subsequent membrane-SDM interactions led to the uptake of Nile Red or GFP-CAAX into the SDM.

  20. LightLeaves: computer controlled kinetic reflection hologram installation and a brief discussion of earlier work

    International Nuclear Information System (INIS)

    Connors Chen, Betsy


    LightLeaves is an installation combining leaf-shaped, white-light reflection holograms of landscape images with a special kinetic lighting device that houses a lamp and moving leaf-shaped masks. The masks are controlled by an Arduino microcontroller and servomotors that position the masks in front of the illumination source of the holograms. The work is the most recent in a long series of landscapes that combine multi-hologram installations with computer-controlled devices that play with the motion of the holograms, the light, sound, or other elements in the work. LightLeaves was first exhibited at the Peabody Essex Museum in Salem, Massachusetts in a show titled "Eye Spy: Playing with Perception".

  1. Computer generated holographic microtags

    International Nuclear Information System (INIS)

    Sweatt, W.C.


    A microlithographic tag comprising an array of individual computer generated holographic patches having feature sizes between 250 and 75 nanometers is disclosed. The tag is a composite hologram made up of the individual holographic patches and contains identifying information when read out with a laser of the proper wavelength and at the proper angles of probing and reading. The patches are fabricated in a steep angle Littrow readout geometry to maximize returns in the -1 diffracted order. The tags are useful as anti-counterfeiting markers because of the extreme difficulty in reproducing them. 5 figs

  2. Computer-Generated Holographic Matched Filters (United States)

    Butler, Steven Frank

    This dissertation presents techniques for the use of computer-generated holograms (CGHs) for matched filtering. An overview of the supporting technology is provided, including techniques for modifying existing CGH algorithms to serve as matched filters in an optical correlator. It is shown that matched filters produced in this fashion can be modified to improve the signal-to-noise ratio and efficiency over what is possible with conventional holography. The effect and performance of these modifications are demonstrated. In addition, a correction for film nonlinearity in continuous-tone filter production is developed. Computer simulations provide quantitative and qualitative demonstrations of the theoretical principles, with specific examples validated in optical hardware. Conventional and synthetic holograms, both bleached and unbleached, are compared.

  3. Encryption techniques to the design of e-beam-generated digital pixel hologram for anti-counterfeiting (United States)

    Chan, Hau P.; Bao, Nai-Keng; Kwok, Wing O.; Wong, Wing H.


    The application of the Digital Pixel Hologram (DPH) as an anti-counterfeiting technology for products such as commercial goods, credit cards, identity cards, and paper banknotes is growing in importance. It offers many advantages over other anti-counterfeiting tools, including a high diffraction effect, high resolving power, resistance to photocopying with two-dimensional copiers, and the potential for mass production of patterns at very low cost. Recently, we succeeded in fabricating high-definition DPHs with a resolution higher than 2500 dpi for anti-counterfeiting purposes by applying modern optical diffraction theory to computer pattern generation with the aid of electron beam lithography (EBL). In this paper, we introduce five levels of encryption techniques that can be embedded in the design of such DPHs to further improve their anti-counterfeiting performance at negligible added cost. The techniques, in ascending order of decryption complexity, are Gray-level Encryption, Pattern Encryption, Character Encryption, Image Modification Encryption, and Codebook Encryption. A Hong Kong Special Administrative Region (HKSAR) DPH emblem was fabricated at a resolution of 2540 dpi using the facilities housed in our Optoelectronics Research Center. This emblem will be used as an illustration to discuss each encryption idea in detail during the conference.

  4. Optimizing the efficiency of femtosecond-laser-written holograms

    DEFF Research Database (Denmark)

    Wædegaard, Kristian Juncher; Hansen, Henrik Dueholm; Balling, Peter


    Computer-generated binary holograms are written on a polished copper surface using single 800-nm, 120-fs pulses from a 1-kHz-repetition-rate laser system. The hologram efficiency (i.e. the power in the holographically reconstructed image relative to the incoming laser power) is investigated for different hole sizes, allowing determination of the optimal hole size. For a coverage (i.e. relative laser-structured area) of ∼43%, the efficiency reaches ∼10%, which corresponds to a relative power transferred to one reconstructed image of ∼20%. The efficiency as a function of pitch (for fixed coverage) is fairly constant from 2 to 6 μm.

  5. Scanning holograms

    International Nuclear Information System (INIS)

    Natali, S.


    This chapter reports on the scanning of 1000 holograms taken in HOBC at CERN. Each hologram is triggered by an interaction in the chamber, the primary particles being pions at 340 GeV/c. The aim of the experiment is the study of charm production. The holograms, recorded on 50 mm film with the "in-line" technique, can be analyzed by shining a parallel expanded laser beam through the film, obtaining immediately above it the real image of the chamber, which can then be scanned and measured with a technique halfway between emulsions and bubble chambers. The results indicate that holograms can be analyzed as quickly and reliably as in other visual techniques and that large-scale experiments of the same order of magnitude are open to them.

  6. Holograms as Teaching Agents (United States)

    Walker, Robin A.


    Hungarian physicist Dennis Gabor won the Nobel Prize for his invention of holography, whose basic principles he introduced in 1947, but it was not until the invention of the laser in 1960 that research scientists, physicians, technologists, and the general public began to seriously consider the interdisciplinary potential of holography. Questions around whether and when three-dimensional (3-D) images and systems would impact American entertainment and the arts would be answered before educators, instructional designers, and students would discover how much Three-Dimensional Hologram Technology (3DHT) would affect teaching practices and learning environments. In the following International Symposium on Display Holograms (ISDH) poster presentation, the author features a traditional board game as well as a reflection hologram to illustrate conventional and evolving three-dimensional representations and technology for education. Using elements from the American children's toy Operation® (Hasbro, 2005) as well as a reflection hologram of a human brain (Ko, 1998), this poster design highlights the pedagogical effects of 3-D images, games, and systems on learning science. As teaching agents, holograms can be considered substitutes for real objects (human beings, organs, animated characters) as well as agents (pedagogical, avatars, reflective) in various learning environments using many systems (direct, emergent, augmented reality) and electronic tools (cellphones, computers, tablets, television). In order to understand the particular importance of utilizing holography in school, clinical, and public settings, the author identifies advantages and benefits of using 3-D images and technology as instructional tools.

  7. Parallel computing of a digital hologram and particle searching for microdigital-holographic particle-tracking velocimetry

    International Nuclear Information System (INIS)

    Satake, Shin-ichi; Kanamori, Hiroyuki; Kunugi, Tomoaki; Sato, Kazuho; Ito, Tomoyoshi; Yamamoto, Keisuke


    We have developed a parallel algorithm for microdigital-holographic particle-tracking velocimetry. The algorithm is used in (1) numerical reconstruction of a particle image computed from a digital hologram, and (2) searching for particles. The numerical reconstruction from the digital hologram makes use of the Fresnel diffraction equation and the FFT (fast Fourier transform), whereas the particle search algorithm looks for local maxima of gradation in a reconstruction field represented by a 3D matrix. To achieve high-performance computing for both calculations (reconstruction and particle search), two memory partitions are allocated to the 3D matrix. In this matrix, the reconstruction part consists of horizontally placed 2D memory partitions on the x-y plane for the FFT, whereas the particle search part consists of vertically placed 2D memory partitions set along the z axis. Consequently, scalability is obtained in proportion to the number of processor elements; the benchmarks were carried out for parallel computation on an SGI Altix machine.
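The particle-search step, finding local maxima in the reconstructed 3-D volume, can be sketched serially as follows; the memory partitioning described above splits this volume across processors and is not reproduced here. The toy volume and particle positions are hypothetical:

```python
import itertools

def local_maxima_3d(vol):
    """Return (x, y, z) voxels strictly greater than all 26 neighbours."""
    nx, ny, nz = len(vol), len(vol[0]), len(vol[0][0])
    peaks = []
    for x in range(1, nx - 1):
        for y in range(1, ny - 1):
            for z in range(1, nz - 1):
                v = vol[x][y][z]
                if all(v > vol[x + dx][y + dy][z + dz]
                       for dx, dy, dz in itertools.product((-1, 0, 1), repeat=3)
                       if (dx, dy, dz) != (0, 0, 0)):
                    peaks.append((x, y, z))
    return peaks

# Toy reconstruction volume with two bright "particles".
N = 8
vol = [[[0.0] * N for _ in range(N)] for _ in range(N)]
vol[2][3][4] = 1.0
vol[5][5][2] = 0.8
peaks = local_maxima_3d(vol)
```

Because each voxel's test depends only on its local neighbourhood, the volume can be cut into slabs along z (as the paper's vertical partitions do) with only one layer of overlap between processors.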

  8. Phase holograms in polymethyl methacrylate (United States)

    Maker, P. D.; Muller, R. E.


    A procedure is described for the fabrication of complex computer-generated phase holograms in polymethyl methacrylate (PMMA) by means of partial-exposure e-beam lithography and subsequent carefully controlled partial development. Following the development, the pattern appears (rendered in relief) in the PMMA, which then acts as the phase-delay medium. The devices fabricated were designed with 16 equal phase steps per retardation cycle, were up to 3 mm square, and consisted of up to 10 million 0.3-2.0-micron-square pixels. Data files were up to 60 Mb long, and exposure times ranged up to several hours. A Fresnel phase lens was fabricated with diffraction-limited optical performance and 83-percent efficiency.
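The 16-step phase encoding can be sketched as a quantization rule mapping a continuous phase to one of 16 relief levels. The 1000 nm full relief depth below is a hypothetical value for illustration, not from the paper:

```python
import math

LEVELS = 16
STEP = 2 * math.pi / LEVELS

def quantize_phase(phi):
    """Map a phase in radians to one of 16 equal relief levels."""
    return int(round((phi % (2 * math.pi)) / STEP)) % LEVELS

def relief_depth(level, full_depth_nm=1000.0):
    """Etch depth for a level, assuming a hypothetical 1000 nm one-wave relief."""
    return level * full_depth_nm / LEVELS

samples = [quantize_phase(p) for p in (0.0, math.pi / 2, math.pi, 3.9)]
```

With 16 levels the worst-case phase error is π/16, which is why multi-level relief holograms like the Fresnel lens above can approach diffraction-limited efficiency that binary (2-level) holograms cannot.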

  9. Encryption and display of multiple-image information using computer-generated holography with modified GS iterative algorithm (United States)

    Xiao, Dan; Li, Xiaowei; Liu, Su-Juan; Wang, Qiong-Hua


    In this paper, a new scheme for multiple-image encryption and display based on computer-generated holography (CGH) and maximum-length cellular automata (MLCA) is presented. In this scheme, the computer-generated hologram, which carries the information of three primitive images, is first generated by a modified Gerchberg-Saxton (GS) iterative algorithm using three different fractional orders in the fractional Fourier domain. The hologram is then encrypted using an MLCA mask. The ciphertext can be decrypted using the fractional orders combined with the MLCA rules. Numerical simulations and experimental display results verify the validity and feasibility of the proposed scheme.
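The paper's modified GS algorithm operates in the fractional Fourier domain with MLCA encryption, neither of which is reproduced here. The following is a plain Gerchberg-Saxton sketch in the ordinary Fourier domain, showing only the iterate-and-constrain structure on an assumed toy target:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 32
target = np.zeros((N, N))
target[12:20, 12:20] = 1.0  # desired far-field amplitude (toy image)

def gs_phase_hologram(target_amp, iters=30):
    """Plain Gerchberg-Saxton: find a phase-only hologram whose FFT
    amplitude approximates target_amp."""
    field = np.exp(1j * rng.uniform(0, 2 * np.pi, target_amp.shape))
    for _ in range(iters):
        far = np.fft.fft2(field)
        far = target_amp * np.exp(1j * np.angle(far))     # impose target amplitude
        field = np.exp(1j * np.angle(np.fft.ifft2(far)))  # phase-only constraint
    return np.angle(field)

def fidelity_err(phase, target_amp):
    """Norm of the difference between the (scaled) far field and the target."""
    amp = np.abs(np.fft.fft2(np.exp(1j * phase)))
    amp *= np.linalg.norm(target_amp) / np.linalg.norm(amp)
    return np.linalg.norm(amp - target_amp)

start = rng.uniform(0, 2 * np.pi, (N, N))
err0 = fidelity_err(start, target)
err1 = fidelity_err(gs_phase_hologram(target, 30), target)
```

Each iteration alternately enforces the target amplitude in the far field and the phase-only constraint at the hologram plane; the modified algorithm in the abstract replaces the FFT pair with fractional Fourier transforms of three different orders.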

  10. Holograms a cultural history

    CERN Document Server

    Johnston, Sean F


    Holograms have been in the public eye for over a half-century, but their influences have deeper cultural roots. No other visual experience is quite like interacting with holograms; no other cultural product melds the technological sublime with magic and optimism in quite the same way. As holograms have evolved, they have left their audiences alternately fascinated, bemused, inspired or indifferent. From expressions of high science to countercultural art to consumer security, holograms have represented modernity, magic and materialism. Their most pervasive impact has been to galvanize hopeful technological dreams. This book explores how holograms found a place in distinct cultural settings. Engineers, artists, hippies and hobbyists have played with, and dreamed about, holograms. This book explores the technical attractions and cultural uses of the hologram, how they were shaped by what came before them, and how they have matured to shape our notional futures. Today, holograms are in our pockets (as identity do...

  11. Three-directional motion compensation-based novel-look-up-table for video hologram generation of three-dimensional objects freely maneuvering in space. (United States)

    Dong, Xiao-Bin; Kim, Seung-Cheol; Kim, Eun-Soo


    A new three-directional motion compensation-based novel-look-up-table (3DMC-NLUT) method, based on the shift-invariance and thin-lens properties of the NLUT, is proposed for video hologram generation of three-dimensional (3-D) objects moving with large depth variations in space. The input 3-D video frames are grouped into sets of eight in sequence, where the first and remaining seven frames in each set become the reference frame (RF) and general frames (GFs), respectively. Each 3-D video frame is segmented into a set of depth-sliced object images (DOIs). Then x-, y-, and z-directional motion vectors are estimated from blocks and DOIs between the RF and each of the GFs, and with these motion vectors, object motions in space are compensated. Then, only the difference images between the three-directionally motion-compensated RF and each of the GFs are applied to the NLUT for hologram calculation. Experimental results reveal that the average number of calculated object points and the average calculation time of the proposed method are reduced compared to those of the conventional NLUT, TR-NLUT, and MPEG-NLUT by 38.14%, 69.48%, and 67.41% and by 35.30%, 66.39%, and 64.46%, respectively.

  12. Alternative approach to develop digital hologram interaction system by bounding volumes for identifying object collision (United States)

    Cho, Sungjin; Mun, Sungchul; Park, Min-Chul; Ju, Byeong-Kwon; Son, Jung-Young


    Digital holography technology has been considered a powerful method for reconstructing real objects and displaying complete 3D information. Although many studies on holographic displays have been conducted, research on interaction methods for holographic displays is still at an early stage. For developing an appropriate interaction method for digital holograms, a two-way interaction that provides natural interaction between humans and holograms should be considered. However, digital holography technology is not yet developed enough to make holograms capable of naturally responding to human behavior. The purpose of this study was therefore to propose an alternative interaction method that can be applied to interacting with holograms in the future. To propose an intuitive interaction method based on computer-generated objects, we utilized a depth camera, the Kinect, which provides depth information per pixel. Humans and the environment surrounding them were captured by the depth camera. The captured depth images were simulated in a virtual space, and computer graphics objects were generated in the same virtual space. Detailed location information of the humans was continuously extracted to provide natural interaction with the generated objects. To easily identify whether two objects overlapped, bounding volumes were generated around both the humans and the objects. The location information of the bounding volumes was correlated with one another, which made it possible for humans to control the computer-generated objects. We then confirmed the result of the interaction through computer-generated holograms. As a result, we obtained an extreme reduction in computation time, with accuracy within 80%, through the bounding-volume approach.
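A minimal sketch of the bounding-volume overlap test the abstract relies on, assuming axis-aligned bounding boxes (the paper does not specify the box type, and the coordinates below are hypothetical):

```python
def aabb_overlap(a, b):
    """Axis-aligned bounding boxes given as (min_xyz, max_xyz) tuples;
    they collide iff their intervals overlap on every axis."""
    (amin, amax), (bmin, bmax) = a, b
    return all(amin[i] <= bmax[i] and bmin[i] <= amax[i] for i in range(3))

hand = ((0.2, 0.1, 0.5), (0.4, 0.3, 0.7))        # box around the user's hand
hologram = ((0.35, 0.25, 0.6), (0.6, 0.5, 0.9))  # box around a virtual object
far_away = ((2.0, 2.0, 2.0), (2.1, 2.1, 2.1))    # a distant object
```

Testing two boxes costs six comparisons regardless of how many depth-map points lie inside them, which is where the large computation-time reduction over per-point collision checks comes from.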

  13. Bit-mapped Holograms Using Phase Transition Mastering (PTM) and Blu-ray Disks

    International Nuclear Information System (INIS)

    Barnhart, Donald


Due to recent advances in data storage, cloud computing, and Blu-ray mastering technology, it is now straightforward to calculate, store, transfer, and print bit-mapped holograms that use terabytes of data and tera-pixels of information. This presentation reports on the potential of using the phase transition mastering (PTM) process to construct bit-mapped, computer-generated holograms with spatial resolutions of 5000 line-pairs/mm (70 nm pixel width). In particular, for Blu-ray disk production, Sony has developed a complete process that could alternatively be deployed in holographic applications. The PTM process uses a 405 nm laser to write phase patterns onto a layer of imperfect transition metal oxides deposited onto an 8-inch silicon wafer. After the master hologram has been constructed, its imprint can then be cheaply mass-produced with the same process as Blu-ray disks or embossed holograms. Unlike traditional binary holograms made with expensive e-beam lithography, the PTM process has the potential for multiple phase levels using inexpensive optics similar to consumer-grade desktop Blu-ray writers. This PTM process could revolutionise holography for entertainment, industrial, and scientific applications.

  14. Integrated large view angle hologram system with multi-slm (United States)

    Yang, ChengWei; Liu, Juan


Holographic display has recently attracted much attention for its ability to generate real-time 3D reconstructed images. Computer-generated holography (CGH) provides an effective way to produce holograms, and a spatial light modulator (SLM) is used to reconstruct the image. However, the reconstruction system is usually bulky and complex, and the view angle is limited by the pixel size and spatial bandwidth product (SBP) of the SLM. In this paper, a lightweight, portable holographic display system is proposed by integrating the optical elements and host computer units, which significantly reduces the space occupied in the horizontal direction. The CGH is produced based on Fresnel diffraction and the point-source method. To reduce memory usage and image distortion, we use an optimized accurate compressed look-up table method (AC-LUT) to compute the hologram. In the system, six SLMs are tiled along a curved surface, each loading a phase-only hologram for a different viewing angle of the object, so that the horizontal view angle of the reconstructed image can be expanded to about 21.8°.
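The view-angle limit mentioned above follows from the grating equation: an SLM with pixel pitch p can deflect light only within ±arcsin(λ/(2p)), so tiling several SLMs at different angles multiplies the usable range. A rough check, assuming a green laser and a typical 8 μm pitch (neither value is stated in the abstract):

```python
import math

wavelength = 532e-9   # m, green laser (assumed)
pitch = 8e-6          # m, SLM pixel pitch (assumed, typical for phase-only SLMs)

# Maximum diffraction half-angle of one SLM, from the grating equation.
half_angle = math.degrees(math.asin(wavelength / (2 * pitch)))
single_slm_view = 2 * half_angle
print(f"single SLM view angle ≈ {single_slm_view:.1f}°")
print(f"six tiled SLMs        ≈ {6 * single_slm_view:.1f}°")
```

With these assumed numbers a single SLM gives roughly 3.8°, and six tiled SLMs about 22.9°, close to the reported 21.8° (small overlaps between adjacent panels would account for the difference).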

  15. Anticounterfeit holograms in China (United States)

    Hsu, Dahsiung; Zhou, Jing; Pei, Wen; Li, Qiang; Huang, Xuhuai; Cao, Yulin


The Chinese holography industry has been given an enormous boost by the energetic sales and technology transfer of several western businesses. It is a fast-growing industry that can keep up with the domestic demand for anti-counterfeit embossed holograms generated by rife internal product counterfeiting. Tax papers, stamps, plastic cards, identification cards, and many packaged goods are authenticated with embossed holograms. To date, about 1,000 kinds of products in China have been protected with holograms. Anti-counterfeit holograms with secret codes have also been used. After initially depending on imports, China is rapidly developing its own sources of equipment, holographic materials, and embossing substrates, and the quality of this equipment and these materials is improving. The new Chinese Holography Association (CHA), a national industry association aiming to develop the application of holograms and to promote cooperation between organizations, was established in 1993. The CHA has requested affiliation to the International Hologram Manufacturers Association, a move which should improve communication between the Chinese industry and the rest of the world industry.

  16. A new generation in computing

    International Nuclear Information System (INIS)

    Kahn, R.E.


The fifth generation of computers is described. The three disciplines involved in bringing such a new generation to reality are microelectronics; artificial intelligence; and computer systems and architecture. Applications in industry, offices, aerospace, education, health care and retailing are outlined. An analysis is given of research efforts in the US, Japan, UK, and Europe. Fifth-generation programming languages are detailed.

  17. Designing security holograms (United States)

    James, Randy; Long, Michael; Newcomb, Diana


Over the years, holograms have evolved from purely decorative images to bona fide security devices. During this evolution, highly secure technologies have been developed specifically for product and document protection. Maximizing the security potential of these hologram technologies requires a holistic approach: a hologram alone is not enough. To be effective, it must be part of a security program, and that security program needs to inform the design and development of the actual hologram. In the most elementary case the security program can be as simple as applying a tamper-evident label for a one-day event; in a complex implementation it would include multi-level technologies and corresponding verification methods. A holistic approach is accomplished with good planning and articulation of the problem to be solved, and then meeting the defined security objectives. Excellent communication among all the stakeholders in a particular project is critical to its success. The results of this dialogue inform the design of the security hologram.

  18. Fundamental superstrings as holograms

    International Nuclear Information System (INIS)

    Dabholkar, A.; Murthy, S.


The worldsheet of a macroscopic fundamental superstring in the Green-Schwarz light-cone gauge is viewed as a possible boundary hologram of the near horizon region of a small black string. For toroidally compactified strings, the hologram has global symmetries of AdS_3 × S^(d-1) × T^(8-d) (d = 3, ..., 8), only some of which extend to local conformal symmetries. We construct the bulk string theory in detail for the particular case of d = 3. The symmetries of the hologram are correctly reproduced from this exact worldsheet description in the bulk. Moreover, the central charge of the boundary Virasoro algebra obtained from the bulk agrees with the Wald entropy of the associated small black holes. This construction provides an exact CFT description of the near horizon region of small black holes both in Type-II and heterotic string theory arising from multiply wound fundamental superstrings. (author)

  19. Computer generated holography with intensity-graded patterns

    Directory of Open Access Journals (Sweden)

    Rossella Conti


Full Text Available Computer-generated holography achieves patterned illumination at the sample plane through phase modulation of the laser beam at the objective back aperture. This is obtained by using liquid-crystal spatial light modulators (LC-SLMs), which modulate the spatial phase of the incident laser beam. A variety of algorithms are employed to calculate the phase-modulation masks addressed to the LC-SLM, ranging from simple gratings-and-lenses algorithms that generate multiple diffraction-limited spots, to iterative Fourier-transform algorithms capable of generating arbitrary illumination shapes tailored to the target contour. Applications of holographic light patterning include multi-trap optical tweezers, patterned voltage imaging and optical control of neuronal excitation using uncaging or optogenetics. These past implementations of computer-generated holography used binary input profiles to generate binary light distributions at the sample plane. Here we demonstrate that using graded input sources enables the generation of intensity-graded light patterns and extends the range of application of holographic illumination. First, we use intensity-graded holograms to compensate for the position-dependent diffraction efficiency of the LC-SLM and for sample fluorescence inhomogeneity. Finally, we show that intensity-graded holography can be used to equalize photo-evoked currents from cells expressing different levels of channelrhodopsin-2 (ChR2), one of the most commonly used optogenetic light-gated channels, taking into account the non-linear dependence of channel opening on incident light.
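The gratings-and-lenses approach with graded weights can be sketched as follows: the phase mask is the argument of a weighted sum of tilted plane waves, so scaling a spot's weight grades its replay intensity. All sizes, carrier frequencies and weights below are illustrative, not taken from the paper:

```python
import numpy as np

N = 256
x, y = np.meshgrid(np.arange(N), np.arange(N))

# Desired spots: (carrier frequency kx, ky in cycles/aperture, relative weight).
# The graded weights set the relative intensities of the replay spots.
spots = [(10, 0, 1.0), (-5, 8, 0.5), (0, -12, 0.25)]

field = np.zeros((N, N), dtype=complex)
for kx, ky, w in spots:
    field += np.sqrt(w) * np.exp(2j * np.pi * (kx * x + ky * y) / N)

phase_mask = np.angle(field)          # phase-only hologram for the LC-SLM
replay = np.fft.fftshift(np.fft.fft2(np.exp(1j * phase_mask)))
intensity = np.abs(replay) ** 2       # graded spot array at the sample plane
```

Compensating the SLM's position-dependent diffraction efficiency then reduces to boosting the weights of spots that land in low-efficiency regions.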

  20. Digital holograms for laser mode multiplexing

    CSIR Research Space (South Africa)

    Mhlanga, T



  1. Very large computer generated holograms for precision metrology of aspheric optical surfaces, Phase I (United States)

    National Aeronautics and Space Administration — Both ground and space telescopes employ aspheric mirrors. A particular example is the X-ray telescope where primary and secondary mirrors have nearly cylindrical...

  2. Phase-only stereoscopic hologram calculation based on Gerchberg–Saxton iterative algorithm

    International Nuclear Information System (INIS)

    Xia Xinyi; Xia Jun


A phase-only computer-generated holography (CGH) calculation method for stereoscopic holography is proposed in this paper. The two-dimensional (2D) perspective projection views of the three-dimensional (3D) object are generated by computer-graphics rendering techniques. Based on these views, a phase-only hologram is calculated using the Gerchberg–Saxton (GS) iterative algorithm. Compared with the non-iterative algorithm of conventional stereoscopic holography, the proposed method improves the holographic image quality, especially for phase-only holograms encoded from complex distributions. Both simulation and optical experiments demonstrate that the proposed method gives higher-quality reconstruction than the traditional method. (special topic)
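The GS loop can be sketched generically as follows: alternate between the hologram plane, where the phase-only (SLM) constraint is imposed, and the replay plane, where the target amplitude is imposed. This is a plain far-field version with a toy target; the paper's stereoscopic-view pipeline is not reproduced:

```python
import numpy as np

def gerchberg_saxton(target_amplitude, iterations=50, seed=0):
    """Find a phase-only hologram whose far field approximates
    |target_amplitude| (Fourier-plane amplitude constraint)."""
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0, 2 * np.pi, target_amplitude.shape)
    for _ in range(iterations):
        far = np.fft.fft2(np.exp(1j * phase))                # propagate to replay plane
        far = target_amplitude * np.exp(1j * np.angle(far))  # impose target amplitude
        near = np.fft.ifft2(far)                             # propagate back
        phase = np.angle(near)                               # keep phase only (SLM constraint)
    return phase

# Toy target: a bright square on a dark background.
target = np.zeros((64, 64))
target[24:40, 24:40] = 1.0
holo = gerchberg_saxton(target)
replay = np.abs(np.fft.fft2(np.exp(1j * holo)))
```

After a few tens of iterations the replay amplitude concentrates in the target region, at the cost of speckle inside it, which is exactly the trade-off iterative phase-only encoding makes.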

  3. Modeling of light-emitting diode wavefronts for the optimization of transmission holograms. (United States)

    Karthaus, Daniela; Giehl, Markus; Sandfuchs, Oliver; Sinzinger, Stefan


The objective of applying transmission holograms in automotive headlamp systems requires adapting the holograms to divergent and polychromatic light sources such as light-emitting diodes (LEDs). In this paper, four different options for describing the scalar light waves emitted by a typical automotive LED are considered, including a new approach that determines the LED's wavefront from interferometric measurements. Computer-generated holograms are designed for each of the LED approximations and recorded into a photopolymer. The holograms are reconstructed with the LED and the resulting images are analyzed to evaluate the quality of the wave descriptions. We show that the new approach leads to better results than the other wave descriptions, as evaluated by the correlation between reconstructed and ideal images. Compared with the next best approximation, a spherical wave, the correlation coefficient increased by 0.18% at 532 nm, 1.69% at 590 nm, and 0.75% at 620 nm.
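For reference, the spherical-wave description (the "next best approximation" above) is just the phase of a point source evaluated at the hologram plane. A minimal sketch with assumed geometry; the distance and aperture below are illustrative, not taken from the paper:

```python
import numpy as np

wavelength = 532e-9   # m, one of the wavelengths evaluated in the paper
z0 = 0.05             # m, assumed LED-to-hologram distance (illustrative)
k = 2 * np.pi / wavelength

x = np.linspace(-5e-3, 5e-3, 512)   # 10 mm hologram aperture (assumed)
X, Y = np.meshgrid(x, x)

# Wrapped phase of a spherical wave from an on-axis point source --
# the simple model the measured LED wavefront is compared against.
phase = np.mod(k * np.sqrt(X**2 + Y**2 + z0**2), 2 * np.pi)
```

The interferometric approach in the paper replaces this idealized phase map with one derived from measurements of the actual LED emission.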

  4. Computer-Generated Feedback on Student Writing (United States)

    Ware, Paige


    A distinction must be made between "computer-generated scoring" and "computer-generated feedback". Computer-generated scoring refers to the provision of automated scores derived from mathematical models built on organizational, syntactic, and mechanical aspects of writing. In contrast, computer-generated feedback, the focus of this article, refers…

  5. Helicity multiplexed broadband metasurface holograms (United States)

    Wen, Dandan; Yue, Fuyong; Li, Guixin; Zheng, Guoxing; Chan, Kinlong; Chen, Shumei; Chen, Ming; Li, King Fai; Wong, Polis Wing Han; Cheah, Kok Wai; Yue Bun Pun, Edwin; Zhang, Shuang; Chen, Xianzhong


Metasurfaces are engineered interfaces that contain a thin layer of plasmonic or dielectric nanostructures capable of manipulating light in a desirable manner. Advances in metasurfaces have led to various practical applications ranging from lensing to holography. Metasurface holograms that can be switched by the polarization state of incident light have been demonstrated for achieving polarization-multiplexed functionalities. However, practical application of these devices has been limited by the difficulty of simultaneously achieving high efficiency and high image quality. Here we experimentally demonstrate a helicity-multiplexed metasurface hologram with high efficiency and good image fidelity over a broad range of frequencies. The metasurface hologram features the combination of two sets of hologram patterns operating with opposite incident helicities. Two symmetrically distributed off-axis images are interchangeable by controlling the helicity of the input light. The demonstrated helicity-multiplexed metasurface hologram with its high performance opens avenues for future applications with functionality-switchable optical devices.

  6. The hologram principles and techniques

    CERN Document Server

    Richardson, Martin J


    Written by Martin Richardson (an acclaimed leader and pioneer in the field) and John Wiltshire, The Hologram: Principles and Techniques is an important book that explores the various types of hologram in their multiple forms and explains how to create and apply the technology. The authors offer an insightful overview of the currently available recording materials, chemical formulas, and laser technology that includes the history of phase imaging and laser science. Accessible and comprehensive, the text contains a step-by-step guide to the production of holograms. In addition, The Hologram outlines the most common problems encountered in producing satisfactory images in the laboratory, as well as dealing with the wide range of optical and chemical techniques used in commercial holography. The Hologram is a well-designed instructive tool, involving three distinct disciplines: physics, chemistry, and graphic arts. This vital resource offers a guide to the development and understanding of the recording of mater...

  7. Holograms for the Masses

    International Nuclear Information System (INIS)

    Newswanger, Craig; Klug, Michael


    Traditional holography subject matter has been generally limited to small dead things (SMD). Pulse lasers and the advent of holographic stereography have made it easier to make holograms of scaled objects and those that live (un-SMD), at a cost of single dimensional parallax or monochromaticity. While stunning results have been produced, all of these required access to a lab, expensive lasers and optics, and infinite patience, care and skill to collect and record content. This complexity has generally kept holography out of reach for the masses. The recent introduction of new 3D data sources, free or inexpensive composition and editing software, and fast, consistent print services may make it possible to finally 'democratize' holography, and enable image makers to focus on message rather than medium. This paper will outline several photogrammetry-based methods for producing 3D content for holograms (with a camera and mouse finger), software applications for editing, positioning and lighting, and production means that are usable by anyone, from novice to professional. We will present step-by-step examples and display results depicting various subject matter, from color holographic portraits made from smart phone input to holographic maps made from movies collected with remote control airplanes. The aim is to inspire image making, spontaneity, and maybe even social media-based collaboration to make EVERYONE a holographer.

  8. Submicron Confocal Raman Microscopy of Optical Holograms in Multicomponent Photopolymers (United States)

    Kagan, C. R.; Harris, T. D.; Harris, A. L.; Schilling, M. L.


    We demonstrate submicron chemical imaging of optical holograms in multicomponent photopolymers using a scanning confocal Raman microscope. Our microscope is sensitive to the submicron, <1 percent concentration variations of the polymeric components that form the refractive index modulation responsible for hologram diffraction. Photopolymers are attractive media for holographic data storage, yet the mechanisms for generating the refractive index modulations responsible for hologram diffraction remain poorly understood. We obtain the first direct chemical evidence showing that these concentration modulations are established both by monomer diffusion and by polymer matrix swelling during hologram writing. Spatial variations in both density and composition contribute to the refractive index modulation. These measurements demonstrate the feasibility of submicron Raman microscopy in chemically imaging photodegradable organic and biological materials.

  9. 48 CFR 53.105 - Computer generation. (United States)


    ... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false Computer generation. 53.105 Section 53.105 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION (CONTINUED) CLAUSES AND FORMS FORMS General 53.105 Computer generation. (a) Agencies may computer-generate the...

  10. Multiplexing 200 spatial modes with a single hologram (United States)

    Rosales-Guzmán, Carmelo; Bhebhe, Nkosiphile; Mahonisi, Nyiku; Forbes, Andrew


    The on-demand tailoring of light's spatial shape is of great relevance in a wide variety of research areas. Computer-controlled devices, such as spatial light modulators (SLMs) or digital micromirror devices, offer a very accurate, flexible and fast holographic means to this end. Remarkably, digital holography affords the simultaneous generation of multiple beams (multiplexing), a tool with numerous applications in many fields. Here, we provide a self-contained tutorial on light beam multiplexing. Through the use of several examples, the readers will be guided step by step in the process of light beam shaping and multiplexing. Additionally, we provide a quantitative analysis on the multiplexing capabilities of SLMs to assess the maximum number of beams that can be multiplexed on a single SLM, showing approximately 200 modes on a single hologram.
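Multiplexing many modes on one hologram amounts to summing each mode's complex field with a distinct angular carrier before phase-only encoding, so the modes separate in the far field. A schematic sketch (the mode shapes, carrier frequencies and sizes are illustrative, not the paper's):

```python
import numpy as np

N = 512
x = np.linspace(-1, 1, N)
X, Y = np.meshgrid(x, x)
gauss = np.exp(-(X**2 + Y**2) / 0.1)

# Three toy "modes" (Hermite-Gaussian-like), each assigned its own carrier
# frequency (in cycles/aperture) that displaces it in the Fourier plane.
modes = [gauss, X * gauss, 4 * X * Y * gauss]
carriers = [(40, 0), (0, 40), (40, 40)]

multiplexed = sum(m * np.exp(1j * np.pi * (kx * X + ky * Y))
                  for m, (kx, ky) in zip(modes, carriers))
phase_hologram = np.angle(multiplexed)   # phase-only encoding, as on an SLM
```

Replaying the phase hologram shows each mode displaced to its own carrier position; the number of modes that can be packed this way is ultimately bounded by the SLM's space-bandwidth product, which is what limits the authors to roughly 200 modes.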

  11. HoloTrap: Interactive hologram design for multiple dynamic optical trapping (United States)

    Pleguezuelos, E.; Carnicer, A.; Andilla, J.; Martín-Badosa, E.; Montes-Usategui, M.


This work presents an application that generates real-time holograms to be displayed on a holographic optical tweezers setup; a technique that allows the manipulation of particles in the range from micrometres to nanometres. The software is written in Java, and uses random binary masks to generate the holograms. It allows customization of several parameters that are dependent on the experimental setup, such as the specific characteristics of the device displaying the hologram, or the presence of aberrations. We evaluate the software's performance and conclude that real-time interaction is achieved. We give our experimental results from manipulating 5 μm microspheres using the program. Program summary: Title of program: HoloTrap. Catalogue identifier: ADZB_v1_0. Program summary URL: Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computer for which the program is designed and others on which it has been tested: general computer. Operating systems or monitors under which the program has been tested: Windows, Linux. Programming language used: Java. Memory required to execute with typical data: up to 34 MB including the Java Virtual Machine. No. of bits in a word: 8. No. of processors used: 1. Has the code been vectorized or parallelized?: no. No. of lines in distributed program, including test data, etc.: 471,145. No. of bytes in distributed program, including test data, etc.: 1,141,457. Distribution format: tar.gz. Nature of physical problem: to calculate and display holograms for generating multiple and dynamic optical tweezers that can be reconfigured interactively. Method of solution: fast random binary mask for the simultaneous codification of multiple phase functions into a phase-modulation device. Typical running time: up to 10 frames per second. Unusual features of the program: none. References: the method for calculating holograms can be found in [M. Montes-Usategui, E. Pleguezuelos, J. Andilla

  12. Sugar holograms with erioglaucine and tartrazine (United States)

    Mejias-Brizuela, N. Y.; Olivares-Pérez, A.; Páez-Trujillo, G.; Fuentes-Tapia, I.


An artificial green colorant composed of erioglaucine (Blue 1) and tartrazine (Yellow 5) was employed in a sugar matrix to improve the material's sensitivity and to allow a comparative analysis of the diffraction efficiency of replicated holograms. The holographic pattern was computer-generated and recorded in sugar films and in modified sugar (sugar–colorant) films using conventional lithography and UV radiation. The results show that the diffraction efficiency of the sugar–colorant films is slightly higher than that of the plain sugar matrix under the same recording conditions.

  13. A Flock of Words: live music performance with holograms and interactive multimedia (United States)

    Vila, Doris K.


This paper describes A Flock of Words, a cross-media music performance realized in collaboration with composer Robert Rowe. To stage the piece, we created an interactive computer system linking large-scale holograms, video projection, real-time animation, robotic lighting effects, and computer music. With a text from Elias Canetti's Crowds and Power, an artificial-life algorithm animates swarming words. Projected onto the large holograms, the text flies in and out of linear readability, set off by computer-music signals. A Flock of Words uses custom software to analyze the music being performed by an ensemble of human players and to guide the simultaneous projection of real-time animation onto the holograms, video, holographic lighting, and computer music. The animation was an adaptation of Craig Reynolds's Boids algorithm, which we dubbed 'woids', used here to animate flocking words.

  14. Computer-generated slide technology. (United States)

    Palmer, D S


    Presentation technology is available, and it does not have to be expensive. This article describes computer hardware and software concepts for graphics use, and recommends principles for making cost-effective buying decisions. Also included is a previously published technique for making custom computer graphic 35-mm slides at minimal expense. This information is vital to anyone lecturing without the support of a custom graphics laboratory.

  15. Imaging properties of Young holograms (United States)

    Polyanskii, Peter V.; Polyanskaya, G. V.


The inequality of imaging by Young holograms recorded from complementary diffraction devices is established using a stationary-phase principle, which leads to a reduced recording distributivity. A holographic method for determining the angular dependence of the secondary-wave amplitude function associated with a diffraction screen edge is proposed, and the Rubinowicz representation of the diffraction integral at the primary illuminated area is verified on this basis.

  16. Computer-Generated Photorealistic Hair


    Lin, Alice J.


    This paper presents an efficient method for generating and rendering photorealistic hair in two dimensional pictures. The method consists of three major steps. Simulating an artist drawing is used to design the rough hair shape. A convolution based filter is then used to generate photorealistic hair patches. A refine procedure is finally used to blend the boundaries of the patches with surrounding areas. This method can be used to create all types of photorealistic human hair (head hair, faci...

  17. Image encryption scheme based on computer generated holography and time-averaged moiré (United States)

    Palevicius, Paulius; Ragulskis, Minvydas; Janusas, Giedrius; Palevicius, Arvydas


A technique of computational image encryption and optical decryption based on computer-generated holography and time-averaged moiré is investigated in this paper. Dynamic visual cryptography (a visual cryptography scheme based on time-averaged geometric moiré), the Gerchberg–Saxton algorithm and 3D microstructure manufacturing techniques are used to construct the optical scheme. The secret is embedded into a cover image by using a stochastic moiré grating and can be visually decoded by the naked eye. The secret is revealed if the amplitude of harmonic oscillations in the Fourier plane corresponds to an accurately preselected value. The production process of the 3D microstructure is described in detail. Computer-generated holography is used in the design step and electron-beam lithography is exploited for physical 3D patterning. The phase data of the complex 3D microstructure are obtained by the Gerchberg–Saxton algorithm and used to produce a computer-generated hologram. The microstructure is physically implemented using a single layer of polymethyl methacrylate as its basis. Numerical simulations demonstrate the efficient applicability of this technique.
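The "accurately preselected value" of the oscillation amplitude comes from the standard time-averaged moiré result in the dynamic visual cryptography literature (not a formula stated in the abstract): a harmonically oscillating grating of pitch λ time-averages to uniform grey except at amplitudes a satisfying J0(2πa/λ) = 0, where J0 is the zeroth-order Bessel function. A sketch of computing these admissible amplitudes, with an illustrative pitch:

```python
import numpy as np
from scipy.special import j0, jn_zeros

pitch = 0.1                          # grating pitch in mm (illustrative)
roots = jn_zeros(0, 3)               # first three zeros of the Bessel function J0
amplitudes = roots * pitch / (2 * np.pi)

# Oscillation amplitudes at which the time-averaged moire fringes form,
# i.e. the preselected values that reveal the embedded secret.
print(amplitudes)
```

Only an observer who drives the structure at one of these amplitudes sees the secret, which is what makes the amplitude itself act as the decryption key.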

  18. Structured grid generator on parallel computers

    International Nuclear Information System (INIS)

    Muramatsu, Kazuhiro; Murakami, Hiroyuki; Higashida, Akihiro; Yanagisawa, Ichiro.


A general-purpose structured grid generator on parallel computers, which generates a large-scale structured grid efficiently, has been developed. The generator is applicable to Cartesian, cylindrical and BFC (Boundary-Fitted Curvilinear) coordinates. For BFC grids, three topologies are available: L-type, O-type and multi-block type, the last of which enables any combination of L- and O-grids. Internal BFC grid points can be automatically generated and smoothed by either an algebraic supplemental method or a partial differential equation method. The partial differential equation solver is implemented on parallel computers because it consumes a large portion of the overall execution time; high-speed processing of large-scale grid generation can therefore be realized on a parallel computer. The generated grid data can be adapted to the domain decomposition used for parallel analysis. (author)
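An "algebraic supplemental method" for filling in interior BFC grid points is typically transfinite interpolation from the four boundary curves. A minimal 2D sketch of that idea (an assumed, generic form, not the generator's actual code):

```python
import numpy as np

def tfi_grid(bottom, top, left, right):
    """2D transfinite interpolation: fill the interior of a structured grid
    from its four boundary curves (arrays of shape (n, 2) and (m, 2))."""
    n, m = len(bottom), len(left)
    u = np.linspace(0, 1, n)[:, None, None]   # parameter along bottom/top
    v = np.linspace(0, 1, m)[None, :, None]   # parameter along left/right
    grid = ((1 - v) * bottom[:, None, :] + v * top[:, None, :]
            + (1 - u) * left[None, :, :] + u * right[None, :, :]
            - (1 - u) * (1 - v) * bottom[0] - u * (1 - v) * bottom[-1]
            - (1 - u) * v * top[0] - u * v * top[-1])   # subtract corner overlap
    return grid   # shape (n, m, 2): physical (x, y) for each logical index

# Unit square with a curved top boundary:
n = m = 11
s = np.linspace(0, 1, n)
bottom = np.stack([s, np.zeros(n)], axis=1)
top = np.stack([s, 1 + 0.1 * np.sin(np.pi * s)], axis=1)
left = np.stack([np.zeros(m), np.linspace(0, 1, m)], axis=1)
right = np.stack([np.ones(m), np.linspace(0, 1, m)], axis=1)
grid = tfi_grid(bottom, top, left, right)
```

Because the formula is purely local per grid point, it parallelizes trivially, unlike the elliptic PDE smoothing step, which is why the PDE solver is the part the authors implement on the parallel machine.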

  19. Holograms and authentication: meeting future demands (United States)

    Lancaster, Ian M.


    The use of holograms as authentication or security devices is the most valuable application of holograms yet devised. In 20 years, this has developed from the first use of a hologram on credit cards, to the situation today where governments turn to holograms as a key security feature on the protected documents they issue, including banknotes, identity documents and tax banderols. At the same time, holograms (and related devices) are the most recognised visible feature used to authenticate and protect branded and OEM products; this sector covers the gamut from CD jewel-box seals to the protection of safety-related items such as medicines and vehicle replacement parts. There has been creative synergy between the commercial suppliers of such holograms and the practical holography community. But new technologies are coming forward to challenge the role of holograms, a challenge that is aided by the counterfeiting of security holograms. What are the characteristics of those technologies and can holograms provide similar resources to users? Examples of collaboration between hologram producers and producers of other technologies to create combination devices suggest a possible route forward for holography to maintain its role in authentication and security. By scrutinising and adapting to needs, often by combination with other techniques, holographers may be able to retain their role in this important application.

  20. Interactive Computation for Undergraduates: The Next Generation (United States)

    Kolan, Amy J.


    A generation ago (29 years ago), Leo Kadanoff and Michael Vinson created the Computers, Chaos, and Physics course. A major pedagogical thrust of this course was to help students form and test hypotheses via computer simulation of small problems in physics. Recently, this aspect of the 1987 course has been revived for use with first year physics undergraduate students at St. Olaf College.

  1. Efficiency of quantum volume hologram (United States)

    Vasilyev, D. V.; Sokolov, I. V.


We discuss the storage and retrieval efficiency of a parallel, spatially multimode quantum memory for light: the quantum volume hologram. The scheme, introduced in [D.V. Vasilyev, I.V. Sokolov, E.S. Polzik, Phys. Rev. A 81, 020302(R) (2010)], is based on counter-propagating (in general, non-collinear) quantum signal and strong classical reference waves in the presence of a Raman-type off-resonant interaction with atomic spins rotating in a magnetic field. For forward-propagating retrieval, the quantum volume hologram is less sensitive to diffraction [D.V. Vasilyev, I.V. Sokolov, E.S. Polzik, Phys. Rev. A 81, 020302(R) (2010)] and is therefore capable of achieving a high density of storage of spatial modes. For forward-propagating retrieval we propose to use the signal temporal eigenmodes of the whole write-in and readout memory cycle. Compared with the approach that uses the eigenmodes optimal only for the write-in stage, our proposal allows better efficiencies for given physical parameters of the scheme and hence a higher quantum capacity of the parallel quantum memory. We also demonstrate that backward-propagating retrieval of the quantum volume hologram requires inversion of the collective spin-wave momentum, which is achieved by means of a π-pulse of stimulated Raman scattering of counter-propagating classical waves.

  2. Image design and replication for image-plane disk-type multiplex holograms (United States)

    Chen, Chih-Hung; Cheng, Yih-Shyang


    The fabrication methods and parameter design for both real-image generation and virtual-image display in image-plane disk-type multiplex holography are introduced in this paper. A theoretical model of a disk-type hologram is also presented and is then used in our two-step holographic processes, including the production of a non-image-plane master hologram and optical replication using a single-beam copying system for the production of duplicated holograms. Experimental results are also presented to verify the possibility of mass production using the one-shot holographic display technology described in this study.

  3. Next generation distributed computing for cancer research. (United States)

    Agarwal, Pankaj; Owzar, Kouros


    Advances in next generation sequencing (NGS) and mass spectrometry (MS) technologies have provided many new opportunities and angles for extending the scope of translational cancer research while creating tremendous challenges in data management and analysis. The resulting informatics challenge is invariably not amenable to the use of traditional computing models. Recent advances in scalable computing and associated infrastructure, particularly distributed computing for Big Data, can provide solutions for addressing these challenges. In this review, the next generation of distributed computing technologies that can address these informatics problems is described from the perspective of three key components of a computational platform, namely computing, data storage and management, and networking. A broad overview of scalable computing is provided to set the context for a detailed description of Hadoop, a technology that is being rapidly adopted for large-scale distributed computing. A proof-of-concept Hadoop cluster, set up for performance benchmarking of NGS read alignment, is described as an example of how to work with Hadoop. Finally, Hadoop is compared with a number of other current technologies for distributed computing.

  4. Volume Hologram Formation in SU-8 Photoresist

    Directory of Open Access Journals (Sweden)

    Tina Sabel


    Full Text Available In order to further understand the mechanism of volume hologram formation in photosensitive polymers, the light-induced material response is analyzed in the commonly used epoxy-based negative photoresist Epon SU-8. For this purpose, time-resolved investigation of volume holographic grating growth is performed in the SU-8 based host–guest system and in the pure SU-8 material, respectively. The comparison of grating growth curves from the doped and undoped systems allows us to draw conclusions on the impact of individual components on the grating formation process. The successive formation of transient absorption as well as phase gratings in SU-8 is observed. The influence of exposure duration and UV flood cure on the grating growth is investigated. The observed volume holographic grating formation in SU-8 can be explained by the generation and subsequent diffusion of photoacid as well as time-delayed polymerization of exposed and unexposed areas.

  5. Far-field and Fresnel Liquid Crystal Geometric Phase Holograms via Direct-Write Photo-Alignment

    Directory of Open Access Journals (Sweden)

    Xiao Xiang


    Full Text Available We study computer-generated geometric-phase holograms (GPHs) realized by photo-aligned liquid crystals, in both simulation and experiment. We demonstrate both far-field and Fresnel holograms capable of producing far-field and near-field images with preserved fidelity for all wavelengths. The GPHs are fabricated by patterning a photo-alignment layer (PAL) using a direct-write laser scanner and coating the surface with a polymerizable liquid crystal (i.e., a reactive mesogen). We study various recording pixel sizes, down to 3 μm, that are easily recorded in the PAL. We characterize the fabricated elements and find good agreement with theory and numerical simulation. Because of the wavelength-independent geometric phase, the (phase) fidelity of the replay images is preserved for all wavelengths, unlike conventional dynamic phase holograms. However, governed by the diffraction equation, the size and location of a reconstructed image depend on the replay wavelength for far-field and near-field GPHs, respectively. These offer interesting opportunities for white-light holography.

  6. Computer Aided Implementation using Xilinx System Generator


    Eriksson, Henrik


    The development in electronics increases the demand for good design methods and design tools in the field of electrical engineering. To improve their design methods, Ericsson Microwave Systems AB is interested in using computer tools to create a link between the specification and the implementation of a digital system in an FPGA. Xilinx System Generator for DSP is a tool for implementing a model of a digital signal-processing algorithm in a Xilinx FPGA. To evaluate Xilinx System Generator two t...

  7. Computer generated movies to display biotelemetry data

    International Nuclear Information System (INIS)

    White, G.C.


    The three-dimensional nature of biotelemetry data (x, y, time) makes them difficult to comprehend. Graphic displays provide a means of extracting information and analyzing biotelemetry data. The extensive computer graphics facilities at Los Alamos Scientific Laboratory have been utilized to analyze elk biotelemetry data. Fixes have been taken weekly for 14 months on 14 animals. The inadequacy of still graphic displays to portray the time dimension of these data has led to the use of computer-generated movies to help grasp time relationships. A computer movie of the data from one animal demonstrates habitat use as a function of time, while a movie of 2 or more animals illustrates the correlations between the animals' movements. About 2 hours of computer time were required to generate the movies for each animal for 1 year of data. The cost of the movies is quite small relative to the cost of collecting the data, so computer-generated movies are a reasonable method to depict biotelemetry data

  8. Speckle noise suppression using part of pixels in a single-exposure digital hologram (United States)

    Leng, Junmin; Zhou, Zhehai; Li, Fubing; Zheng, Qingyu; Liu, Gang


    A method is proposed to suppress speckle noise using only part of the pixels in a single-exposure digital hologram. Different holographic patterns are first generated from a single-exposure digital hologram using specially designed binary masks; these holographic patterns are then reconstructed with the Fresnel transform. The reconstructed images are superposed and their intensities averaged to achieve the suppression of speckle noise. The entire denoising process needs neither additional digital holograms nor specific requirements for recording a hologram. Theoretical simulation and experimental verification were carried out and confirm that the proposed method is a very convenient and effective way to suppress speckle noise in digital holography. The proposed method has wide applications in holographic imaging, holographic storage, and art display.
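
    The masking-and-averaging scheme described above can be sketched as follows. This is a minimal illustration assuming random binary masks and a single-FFT Fresnel reconstruction; the paper uses specially designed masks, and all parameter names here are hypothetical:

    ```python
    import numpy as np

    def fresnel_reconstruct(hologram, wavelength, z, dx):
        """Single-FFT Fresnel reconstruction of a digital hologram
        (hologram: 2D array, z: reconstruction distance, dx: pixel pitch)."""
        n = hologram.shape[0]
        k = 2 * np.pi / wavelength
        x = (np.arange(n) - n / 2) * dx
        X, Y = np.meshgrid(x, x)
        chirp = np.exp(1j * k / (2 * z) * (X**2 + Y**2))
        return np.fft.fftshift(np.fft.fft2(hologram * chirp))

    def masked_average_reconstruction(hologram, num_masks, wavelength, z, dx, seed=0):
        """Reconstruct several masked copies of one hologram and average
        their intensities to suppress speckle (random masks for illustration)."""
        rng = np.random.default_rng(seed)
        acc = np.zeros(hologram.shape)
        for _ in range(num_masks):
            mask = rng.integers(0, 2, size=hologram.shape)  # keep ~half the pixels
            field = fresnel_reconstruct(hologram * mask, wavelength, z, dx)
            acc += np.abs(field)**2  # superpose on intensity
        return acc / num_masks
    ```

    Because every masked pattern comes from the same single exposure, no extra holograms need to be recorded; only the speckle realizations differ between reconstructions.
    
    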

  9. High collimated coherent illumination for reconstruction of digitally calculated holograms: design and experimental realization (United States)

    Morozov, Alexander; Dubinin, German; Dubynin, Sergey; Yanusik, Igor; Kim, Sun Il; Choi, Chil-Sung; Song, Hoon; Lee, Hong-Seok; Putilin, Andrey; Kopenkin, Sergey; Borodin, Yuriy


    Future commercialization of glasses-free holographic real-3D displays requires not only appropriate image quality but also a slim design of the backlight unit and the whole display device to match market needs. While much research has aimed to solve the computational issues of forming computer-generated holograms for 3D holographic displays, less has focused on the development of backlight units suitable for 3D holographic display applications with the form factor of conventional 2D display systems. We therefore report a coherent backlight unit for a 3D holographic display with thickness comparable to commercially available 2D displays (cell phones, tablets, laptops, etc.). The coherent backlight unit forms uniform, highly collimated and effective illumination of the spatial light modulator. Realization of such a backlight unit is possible due to holographic optical elements, based on volume gratings, that construct a coherent collimated beam to illuminate the display plane. Design, recording and measurement of a 5.5 inch coherent backlight unit based on two holographic optical elements are presented in this paper.

  10. Polarization volume holograms in layers of polymethylmethacrylate with phenanthrenequinone (United States)

    Marmysh, D. N.; Mahilny, U. V.


    Polarization volume holograms are recorded in the polymethylmethacrylate layers that contain phenanthrenequinone at a molar content of 2.5-3%. The effect of the polarization of recording beams on the kinetics of diffraction efficiency and properties of holograms is analyzed. Polarization hologram recording in the polymethylmethacrylate layers with phenanthrenequinone and a relatively high optical stability of the holograms are demonstrated.

  11. Application of photopolymer holograms for the anticounterfeit protection of banknotes (United States)

    Markova, Nina V.; Yamnikov, Leonid S.; Turkina, Elena S.; Semenov, Erick G.; Bulygin, Theodor V.; Levin, Gennady G.


    The article analyses hologram counterfeiting techniques and presents grounds for the choice of the type of hologram best protected against counterfeiting: volume holograms. It describes the process of recording these holograms on DuPont OmniDex photopolymer film and analyses the prospects of their use in banknote production.

  12. Computer generated timing diagrams to supplement simulation

    CERN Document Server

    Booth, A W


    The ISPS computer description language has been used in a simulation study to specify the components of a high speed data acquisition system and its protocols. A facility has been developed for automatically generating timing diagrams from the specification of the data acquisition system written in the ISPS description language. Diagrams can be generated for both normal and abnormal working modes of the system. They are particularly useful for design and debugging in the prototyping stage of a project and can be later used for reference by maintenance engineers. (11 refs).

  13. The computer as a generative tool

    Directory of Open Access Journals (Sweden)

    Tomaž Pipan


    Full Text Available The computer is becoming an indispensable tool. It is capable of rapidly processing large quantities of digitized data, so urbanists predominantly use it in analytical work with spatial data. Many useful, user-friendly GIS tools have been developed that enable gathering, reviewing and manipulating various spatial information. The article deals with the CAUP programme, which uses basic spatial information for automatically generating spatial limitations and potentials. The programme enables the generation of spatial solutions important for determining the mutual effects of various spatial data, thus simultaneously allowing different interpretations.

  14. Holograms preparation using commercial fluorescent benzyl

    Energy Technology Data Exchange (ETDEWEB)

    Dorantes-García, V; Olivares-Pérez, A; Ordoñez-Padilla, M J; Mejías-Brizuela, N Y [Instituto Nacional de Astrofísica, Óptica y Electrónica (INAOE), Coordinación de Óptica, Calle Luis Enrique Erro No. 1, Santa María Tonantzintla, Puebla (Mexico)]


    We have been able to make transmission holograms with fluorescent substances using light from a blue laser, with ammonium dichromate as photosensitizer and polyvinyl alcohol (PVA) as the matrix. Ammonium dichromate inhibits the fluorescence of the inks when both are mixed in a PVA matrix, but we avoid this chemical reaction by painting the hologram with fluorescent ink after recording. We describe how the diffraction efficiency changes as a function of the ink absorbed by the emulsion on which the gratings are recorded. We obtained good results, making holographic gratings with blue light from a 470 nm laser diode and later painting them with fluorescent ink, thereby integrating fluorescence characteristics into the hologram.


    Directory of Open Access Journals (Sweden)

    Muhammad Jumal Wahda


    Full Text Available Means of transportation are one of the main human needs, supporting various daily activities. They can be grouped into land, air and sea transportation, including cars, motorcycles, public buses, taxis, bicycles, pedicabs (becak), airplanes, ships and so on. In urban areas there are generally more means of transportation than in rural areas. An enjoyable form of education is therefore needed to increase public knowledge about transportation. "3D Hologram Pengenalan Alat Transportasi" (3D Hologram Introduction to Means of Transportation) is a 3D hologram providing education about means of transportation. It is expected to help users better recognize means of transportation, especially in Indonesia. Test results show that it is able to provide information and knowledge to transportation users in Indonesia.

  16. CERPHASE: Computer-generated phase diagrams

    International Nuclear Information System (INIS)

    Ruys, A.J.; Sorrell, C.C.; Scott, F.H.


    CERPHASE is a collection of computer programs written in the BASIC programming language and developed for the purpose of teaching the principles of phase diagram generation from the ideal solution model of thermodynamics. Two approaches are used in the generation of the phase diagrams: freezing point depression and minimization of the free energy of mixing. Binary and ternary phase diagrams can be generated, as can diagrams showing the ideal solution parameters used to generate the actual phase diagrams. Since the diagrams utilize the ideal solution model, the data input required from the operator is minimal: only the heat of fusion and melting point of each component. CERPHASE is menu-driven and user-friendly, containing simple instructions in the form of screen prompts as well as a HELP file to guide the operator. A second purpose of CERPHASE is the prediction of phase diagrams in systems for which no experimentally determined phase diagrams are available, enabling the estimation of suitable firing or sintering temperatures for otherwise unknown systems. Since CERPHASE utilizes ideal solution theory, there are certain limitations on the types of systems that can be predicted reliably. 6 refs., 13 figs.
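
    The freezing-point-depression approach that CERPHASE uses can be illustrated with the ideal-solution liquidus equation, ln x = -(ΔHf/R)(1/T - 1/Tm). A minimal sketch for a hypothetical binary system; the heats of fusion and melting points below are made up for illustration and are not from the program:

    ```python
    import numpy as np

    R = 8.314  # gas constant, J/(mol K)

    def liquidus_x(T, dHf, Tm):
        """Ideal-solution liquidus: mole fraction of a component in
        equilibrium with its pure solid at temperature T (freezing
        point depression equation)."""
        return np.exp(-(dHf / R) * (1.0 / T - 1.0 / Tm))

    # Hypothetical binary system A-B (illustrative heats of fusion / melting points)
    dHf_A, Tm_A = 30e3, 1500.0
    dHf_B, Tm_B = 25e3, 1200.0

    T = np.linspace(900.0, 1500.0, 601)
    xA_liq = liquidus_x(T, dHf_A, Tm_A)            # x_A along the A-liquidus
    xA_from_B = 1.0 - liquidus_x(T, dHf_B, Tm_B)   # x_A along the B-liquidus

    # Eutectic: temperature where the two liquidus curves intersect
    i = np.argmin(np.abs(xA_liq - xA_from_B))
    print(f"eutectic near T = {T[i]:.0f} K, x_A = {xA_liq[i]:.2f}")
    ```

    Each liquidus needs only the heat of fusion and melting point of its component, which is why the program's required input is so small.
    
    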

  17. Colour vision and computer-generated images

    International Nuclear Information System (INIS)

    Ramek, Michael


    Colour vision deficiencies affect approximately 8% of the male and approximately 0.4% of the female population. In this work, it is demonstrated that computer generated images oftentimes pose unnecessary problems for colour deficient viewers. Three examples, the visualization of molecular structures, graphs of mathematical functions, and colour coded images from numerical data are used to identify problematic colour combinations: red/black, green/black, red/yellow, yellow/white, fuchsia/white, and aqua/white. Alternatives for these combinations are discussed.

  18. Computer-assisted time-averaged holograms of the motion of the surface of the mammalian tympanic membrane with sound stimuli of 0.4 to 25 kHz (United States)

    Rosowski, John J.; Cheng, Jeffrey Tao; Ravicz, Michael E.; Hulli, Nesim; Hernandez-Montes, Maria; Harrington, Ellery; Furlong, Cosme


    Time-averaged holograms describing the sound-induced motion of the tympanic membrane (TM) in cadaveric preparations from three mammalian species and one live ear were measured using opto-electronic holography. This technique allows rapid measurements of the magnitude of motion of the tympanic membrane surface at frequencies as high as 25 kHz. The holograms measured in response to low and middle-frequency sound stimuli are similar to previously reported time-averaged holograms. However, at higher frequencies (f > 4 kHz), our holograms reveal unique TM surface displacement patterns that consist of highly-ordered arrangements of multiple local displacement magnitude maxima, each of which is surrounded by nodal areas of low displacement magnitude. These patterns are similar to modal patterns (two-dimensional standing waves) produced by either the interaction of surface waves traveling in multiple directions or the uniform stimulation of modes of motion that are determined by the structural properties and boundary conditions of the TM. From the ratio of the displacement magnitude peaks to nodal valleys in these apparent surface waves, we estimate a Standing Wave Ratio of at least 4 that is consistent with energy reflection coefficients at the TM boundaries of at least 0.35. It is also consistent with small losses within the uniformly stimulated modal surface waves. We also estimate possible TM surface wave speeds that vary with frequency and species from 20 to 65 m/s, consistent with other estimates in the literature. The presence of standing wave or modal phenomena has previously been intuited from measurements of TM function, but is ignored in some models of tympanic membrane function. Whether these standing waves result either from the interactions of multiple surface waves that travel along the membrane, or by uniformly excited modal displacement patterns of the entire TM surface is still to be determined. PMID:19328841
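
    The quoted reflection estimate follows from the standard standing-wave relation |r| = (SWR - 1)/(SWR + 1), with the energy reflection coefficient being |r|². A quick check of the numbers in the abstract:

    ```python
    def energy_reflection(swr):
        """Energy (intensity) reflection coefficient implied by a
        standing wave ratio, via |r| = (SWR - 1) / (SWR + 1)."""
        r = (swr - 1.0) / (swr + 1.0)  # amplitude reflection coefficient
        return r * r                    # energy reflection coefficient

    # SWR of 4 gives |r| = 0.6 and energy reflection 0.36,
    # consistent with the "at least 0.35" quoted above.
    print(energy_reflection(4.0))
    ```
    
    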

  19. Phase holograms in PMMA with proximity effect correction (United States)

    Maker, Paul D.; Muller, R. E.


    Complex computer-generated phase holograms (CGPHs) have been fabricated in PMMA by partial e-beam exposure and subsequent partial development. The CGPH was encoded as a sequence of phase-delay pixels and written by the JEOL JBX-5D2 e-beam lithography system, a different dose being assigned to each value of phase delay. Following carefully controlled partial development, the pattern appeared rendered in relief in the PMMA, which then acts as the phase-delay medium. The exposure dose was in the range 20-200 μC/cm², and very aggressive development in pure acetone led to low contrast. This enabled etch depth control to better than ±λ_vis/60. That result was obtained by exposing isolated 50 micron square patches and measuring resist removal over the central area where the proximity effect dose was uniform and related only to the local exposure. For complex CGPHs with pixel size of the order of the e-beam proximity effect radius, the patterns must be corrected for the extra exposure caused by electrons scattered back up out of the substrate. This has been accomplished by deconvolving the two-dimensional dose deposition function with the desired dose pattern. The deposition function, which plays much the same role as an instrument response function, was carefully measured under the exact conditions used to expose the samples. The devices fabricated were designed with 16 equal phase steps per retardation cycle, were up to 1 cm square, and consisted of up to 100 million 0.3-2.0 micron square pixels. Data files were up to 500 MB long and exposure times ranged to tens of hours. A Fresnel phase lens was fabricated that had diffraction-limited optical performance with better than 85 percent efficiency.

  20. 2nd Generation QUATARA Flight Computer (United States)

    National Aeronautics and Space Administration — The primary objective of this activity is to develop, design, and test (DD&T) the QUAD-core siTARA (QUATARA) computer to distribute computationally intensive...

  1. Computer holography: 3D digital art based on high-definition CGH

    International Nuclear Information System (INIS)

    Matsushima, K; Arima, Y; Nishi, H; Yamashita, H; Yoshizaki, Y; Ogawa, K; Nakahara, S


    Our recent works on high-definition computer-generated holograms (CGH) and the techniques used in their creation, such as the polygon-based method, silhouette method and digitized holography, are summarized and reviewed in this paper. The concept of computer holography is proposed in terms of integrating and crystallizing these techniques into a novel digital art.

  2. Fragment volume determination in bullet/armor holograms (United States)

    Smith, David L.; Watts, David B.; Marsh, James S.; Gordon, Joseph E.; Anderson, Christopher S.


    This report presents automatic data reduction techniques for determining bullet and fragment volumes, positions, and momenta from holograms of bullets penetrating armor. The holography technique and the computer data reduction methods are described. Initial results are shown and sources of error in the technique are described. 2D digital images of the hologram are computationally combined by running a backprojection algorithm to produce a 3D array that represents the space containing the bullet and fragments. Thresholding the numbers in this space from the backprojection algorithm produces a representation of the bullet and fragments. Methods of automatically counting the voxels (3D picture elements) that occur in separated fragments have been programmed. These programs also find the centroids and shapes of the fragments and determine velocity using timing information. Volume errors are 40% in current results. These errors could be reduced to less than 3% if the described error sources were eliminated. Future work to improve the algorithms and the holographic process is described.

  3. Compensation of Hologram Distortion by Controlling Defocus Component in Reference Beam Wavefront for Angle Multiplexed Holograms (United States)

    Muroi, T.; Kinoshita, N.; Ishii, N.; Kamijo, K.; Kawata, Y.; Kikuchi, H.


    Holographic memory has the potential to function as a recording system with a large capacity and high data-transfer rate. Photopolymer materials are typically used as a write-once recording medium. When holograms are recorded on this medium, they can distort due to shrinkage or expansion of the materials, which degrades the reconstructed image and causes a higher bit error rate (bER) of the reproduced data. We propose optically compensating for hologram distortion by controlling aberration components in the reference beam wavefront while reproducing data, thereby improving the reproduced data quality. First, we investigated the relation between each aberration component of the reference beam and the signal-to-noise ratio (SNR) of the reproduced data using numerical simulation and found that horizontal tilt and the defocus component affect the SNR. Next, we experimentally evaluated the reproduced data by controlling the defocus component in the reference beam and found that the bER of the reproduced data could be decreased by controlling the defocus center with respect to the hologram position and the phase modulation depth of the defocus component. Then, we investigated a practical control method for the defocus component using an evaluation value similar to the definition of the SNR for actual data reproduction from holograms. Using a defocus-controlled wavefront enabled us to decrease the bER from 3.54 × 10^-3 with a plane wave to 3.14 × 10^-4. We also investigated how to reduce the bERs of reproduced data in angle-multiplexed holograms. By using a defocus-controlled wavefront to compensate for hologram distortion on the 40th data page in 80-page angle-multiplexed holograms, the bERs of all pages could be decreased to less than 1 × 10^-3. We showed that controlling the defocus component is an effective way to compensate for hologram distortion and to decrease the bER of reproduced data in holographic memory.


    Directory of Open Access Journals (Sweden)

    Oleg V. Nikanorov


    Full Text Available The paper considers the features of synthesized holograms suitable for practical use. It is established that binary holograms are the most suitable for successful application in practice. In order to select the most suitable (optimal) level of hologram binarization, we propose a criterion for estimating the quality of an image reconstructed with a binary hologram, and an algorithm is developed to find the optimal level. The experiments conducted establish that the introduction of the developed module reduces the search time for the optimal binarization level of the hologram elevenfold compared with manual search.

  5. Laser beam characterization with digital holograms

    CSIR Research Space (South Africa)

    Forbes, A


    Full Text Available We show how laser beam characterization may be done in real-time with digital holograms. We illustrate the power of the techniques by applying them to a variety of laser sources, from fibers to solid-state....

  6. Computer-Controlled Force Generator, Phase II (United States)

    National Aeronautics and Space Administration — TDA Research, Inc. is developing a compact, low power, Next-Generation Exercise Device (NGRED) that can generate any force between 5 and 600 lbf. We use a closed...

  7. Computer-Controlled Force Generator Project (United States)

    National Aeronautics and Space Administration — TDA Research, Inc. is developing a compact, low power, Next-Generation Exercise Device (NGRED) that can generate any force between 5 and 600 lbf. We use a closed...

  8. Computing Homology Group Generators of Images Using Irregular Graph Pyramids


    Peltier, Samuel; Ion, Adrian; Haxhimusa, Yll; Kropatsch, Walter; Damiand, Guillaume


    International audience; We introduce a method for computing homology groups and their generators of a 2D image, using a hierarchical structure, i.e., an irregular graph pyramid. Starting from an image, a hierarchy of the image is built by two operations that preserve the homology of each region. Instead of computing homology generators in the base, where the number of entities (cells) is large, we first reduce the number of cells by a graph pyramid. Then homology generators are computed efficiently on...

  9. Recent developments in computer-generated holography: toward a practical electroholography system for interactive 3D visualization (United States)

    Slinger, Christopher W.; Cameron, Colin D.; Coomber, Stuart D.; Miller, Richard J.; Payne, Doug A.; Smith, Allan P.; Smith, Mark G.; Stanley, Maurice; Watson, Philip J.


    This paper will give an overview of some recent developments in electroholography for applications in interactive 3D visualisation. Arguably the ultimate technology for this task, it is the only approach having the potential to deliver full depth cue, 3D images, having resolutions beyond that which can be perceived by the human eye. Despite significant advances by many researchers, the high pixel counts required by the computer generated hologram (CGH) patterns in these systems remain daunting - in practice, systems able to calculate and display reconfigurable CGH having pixel counts of more than one billion may be required for 300 mm width, 3D images. Advances described include novel Fourier mode variants of diffraction specific algorithms and parallel binarisation techniques for design of the CGH patterns; computer architectures for effective implementation of these algorithms for interactive CGH calculation; the latest developments in the Active Tiling spatial light modulator technology and novel replay optics arrangements including folded mirror geometries, viewer tracking alternatives and new horizontal parallax configurations. Throughout, the emphasis is optimisation towards implementation as an interactive electroholography system having practical utility. Some recent results from demonstrations of aspects of the technology will be shown. These include monochrome and colour, static and dynamic, horizontal parallax only (HPO) and full parallax, 3D images, generated from true CGH systems with up to 24 billion pixels.

  10. The RANDOM computer program: A linear congruential random number generator (United States)

    Miles, R. F., Jr.


    The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters of an LCG for a microcomputer is described. This document describes the following: (1) the RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) the RANCYCLE and ARITH Computer Programs, which provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
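
    The linear congruential form the report discusses is x_{n+1} = (a·x_n + c) mod m. A minimal sketch, using the classic ANSI-C-style constants purely for illustration rather than the parameters selected in the report:

    ```python
    def lcg(seed, a=1103515245, c=12345, m=2**31):
        """Linear congruential generator: x_{n+1} = (a*x_n + c) mod m,
        yielding uniform deviates in [0, 1). The constants here are the
        well-known ANSI-C-style values, used only as an example."""
        x = seed
        while True:
            x = (a * x + c) % m
            yield x / m

    gen = lcg(42)
    sample = [next(gen) for _ in range(5)]  # deterministic for a given seed
    ```

    The quality of such a generator depends entirely on the choice of a, c and m, which is exactly the selection problem RANCYCLE and ARITH assist with.
    
    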

  11. 2nd Generation QUATARA Flight Computer Project (United States)

    Falker, Jay; Keys, Andrew; Fraticelli, Jose Molina; Capo-Iugo, Pedro; Peeples, Steven


    Single core flight computer boards have been designed, developed, and tested (DD&T) to be flown in small satellites for the last few years. In this project, a prototype flight computer will be designed as a distributed multi-core system containing four microprocessors running code in parallel. This flight computer will be capable of performing multiple computationally intensive tasks such as processing digital and/or analog data, controlling actuator systems, managing cameras, operating robotic manipulators and transmitting/receiving from/to a ground station. In addition, this flight computer will be designed to be fault tolerant by creating both a robust physical hardware connection and by using a software voting scheme to determine the processor's performance. This voting scheme will leverage on the work done for the Space Launch System (SLS) flight software. The prototype flight computer will be constructed with Commercial Off-The-Shelf (COTS) components which are estimated to survive for two years in a low-Earth orbit.
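
    The software voting idea can be sketched as a simple majority vote over the redundant processors' outputs. This is a hypothetical illustration of the concept, not the actual SLS-derived voting scheme:

    ```python
    from collections import Counter

    def vote(outputs):
        """Return the majority result among redundant processor outputs;
        raise if no strict majority exists (illustrative sketch only)."""
        winner, count = Counter(outputs).most_common(1)[0]
        if count > len(outputs) / 2:
            return winner
        raise RuntimeError("no majority: processors disagree")

    # One faulty core out of four is outvoted by the other three.
    assert vote([7, 7, 7, 9]) == 7
    ```
    
    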

  12. Fluorescent holograms with albumin-acrylamide (United States)

    Ordóñez-Padilla, M. J.; Olivares-Pérez, A.; Fuentes-Tapia, I.


    We describe fluorescent holograms made with photosensitive films of quail albumin (protein), used as modified matrices. The albumin is mixed with acrylamide and eosin Y to prepare a solid, hydrated photosensitive emulsion capable of recording phase transmission and volume holograms (VPH). Eosin Y is a fluorescent agent that acts as a photosensitizing dye, stimulating the polymerization of the acrylamide. To record the interference pattern produced by two superimposed waves on the modified matrix, we use a He-Cd laser. The diffraction pattern is reconstructed with a He-Ne laser (λ = 632.8 nm); the material is self-developing. We measure the diffraction efficiency of the diffracted orders (η[-1, +1]) as a function of exposure energy. We work with various thicknesses and measure the variation of the refractive index using the coupled wave theory of Kogelnik; the holographic gratings satisfy the Bragg condition.
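
    Kogelnik's coupled wave theory gives, for a lossless unslanted volume phase transmission grating at Bragg incidence, η = sin²(πΔn·d/(λ cos θ)), so a measured efficiency can be inverted to estimate the index modulation Δn. A minimal sketch with assumed parameter values (the thickness, wavelength and angle below are illustrative, not the paper's):

    ```python
    import numpy as np

    def kogelnik_efficiency(delta_n, d, wavelength, theta):
        """Diffraction efficiency of a lossless unslanted volume phase
        transmission grating at Bragg incidence (Kogelnik):
        eta = sin^2(pi * delta_n * d / (lambda * cos(theta)))."""
        return np.sin(np.pi * delta_n * d / (wavelength * np.cos(theta)))**2

    def index_modulation(eta, d, wavelength, theta):
        """Invert the formula to estimate delta_n from a measured
        efficiency (valid on the first branch of the sine)."""
        return wavelength * np.cos(theta) * np.arcsin(np.sqrt(eta)) / (np.pi * d)

    # Example: assumed 30 um thick grating read at 632.8 nm, 15 deg in the medium
    eta = kogelnik_efficiency(5e-4, 30e-6, 632.8e-9, np.deg2rad(15))
    ```
    
    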

  13. Computing specified generators of structured matrix inverses


    Jeannerod, Claude-Pierre; Mouilleron, Christophe


    International audience; The asymptotically fastest known divide-and-conquer methods for inverting dense structured matrices are essentially variations or extensions of the Morf/Bitmead-Anderson algorithm. Most of them must deal with the growth in length of intermediate generators, and this is done by incorporating various generator compression techniques into the algorithms. One exception is an algorithm by Cardinal, which in the particular case of Cauchy-like matrices avoids such growth by f...

  14. Advanced synthetic holograms for security purposes (United States)

    Kotačka, Libor; Vízdal, Petr; Behounek, Tomás


    Our paper deals with recent advances in synthetically written optical security devices (DOVIDs) and holograms. The synthesized holographic security elements are recorded with a resolution reaching 500,000 dpi and are specially developed for the "layman-level" security of the most important state valuables and documents, such as banknotes and identity cards. We pay special attention to holographic features that are impossible to originate through conventional optical holography or matrix-based devices.

  15. Examination of concept of next generation computer. Progress report 1999

    International Nuclear Information System (INIS)

    Higuchi, Kenji; Hasegawa, Yukihiro; Hirayama, Toshio


    The Center for Promotion of Computational Science and Engineering has conducted R&D work on parallel processing technology and started examining the next generation computer in 1999. This report describes behavior analyses of quantum calculation codes, together with considerations drawn from these analyses and the results of examining methods to reduce cache misses. Furthermore, it describes a performance simulator that is being developed to quantitatively examine the concept of the next generation computer. (author)

  16. Hololujah; A One Kilometre Art Hologram (United States)

    Warren, David


    This paper will outline the production of the white-light transmission achromatic image art hologram titled Hololujah, displaying forward-projecting real imagery of text and measuring 100,000 × 3.5 cm. Materials and methods: the paper will cover the use of Slavich VRP-M film, 33.3 × 1.05 metres, that was exposed and processed as thirty-three and a third 100 × 105 cm frames. These frames were subsequently cut into one thousand 100 × 3.5 cm strips, with their ends cold-laminated together to form the kilometre-length hologram. The paper will expand on the use of a Coherent Compass 315M, 532 nm, 150 mW DPSS laser in a lensless setup, using a single beam through diffuse glass, no isolation systems, and a two-minute exposure time with the film lying flat on the floor. Lastly, this paper will illustrate how the hologram was produced in a 220 × 200 × 300 cm confined area of a suburban bedroom. Theme: this artwork is a comment on the social networking site Twitter.

  17. Computer-generated holographic near-eye display system based on LCoS phase only modulator (United States)

    Sun, Peng; Chang, Shengqian; Zhang, Siman; Xie, Ting; Li, Huaye; Liu, Siqi; Wang, Chang; Tao, Xiao; Zheng, Zhenrong


    Augmented reality (AR) technology has been applied in various areas, such as large-scale manufacturing, national defense, healthcare, and mass media. An important way to realize AR display is the computer-generated hologram (CGH), which however suffers from low image quality and heavy computational load. Meanwhile, diffraction from the Liquid Crystal on Silicon (LCoS) panel also degrades image quality. In this paper, a modified algorithm based on the traditional Gerchberg-Saxton (GS) algorithm is proposed to improve image quality, and a new method of establishing the experimental system is used to broaden the field of view (FOV). In the experiment, undesired zero-order diffracted light was eliminated and a high-definition 2D image was acquired with the FOV broadened to 36.1 degrees. We have also done some pilot research in 3D reconstruction with a tomography algorithm based on Fresnel diffraction. With the same experimental system, experimental results demonstrate the feasibility of 3D reconstruction. These modifications are effective and efficient, and may provide a better solution for AR realization.
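The Gerchberg-Saxton iteration that the record above builds on can be sketched in a few lines. This is a generic textbook version, not the paper's modified algorithm: alternate between the hologram plane, where the field is constrained to phase-only (as on an LCoS modulator), and the Fourier image plane, where the target amplitude is imposed.

```python
import numpy as np

def gerchberg_saxton(target_amplitude, iterations=50, seed=0):
    """Classical Gerchberg-Saxton phase retrieval: find a phase-only
    hologram whose far-field (Fourier) intensity approximates the target."""
    rng = np.random.default_rng(seed)
    phase = rng.uniform(0, 2 * np.pi, target_amplitude.shape)
    for _ in range(iterations):
        # Hologram plane: keep phase only (unit amplitude, as on an LCoS SLM)
        holo = np.exp(1j * phase)
        # Propagate to the image plane and impose the target amplitude
        field = np.fft.fft2(holo)
        field = target_amplitude * np.exp(1j * np.angle(field))
        # Back-propagate and read off the new hologram phase
        phase = np.angle(np.fft.ifft2(field))
    return phase

# Usage sketch: a bright square on a dark background as the target image
target = np.zeros((64, 64))
target[24:40, 24:40] = 1.0
holo_phase = gerchberg_saxton(target)
recon = np.abs(np.fft.fft2(np.exp(1j * holo_phase)))
```

With more iterations the reconstructed intensity concentrates in the target region, up to the speckle inherent to phase-only holograms.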

  18. The next generation of command post computing (United States)

    Arnold, Ross D.; Lieb, Aaron J.; Samuel, Jason M.; Burger, Mitchell A.


    The future of command post computing demands an innovative new solution to address a variety of challenging operational needs. The Command Post of the Future is the Army's primary command and control decision support system, providing situational awareness and collaborative tools for tactical decision making, planning, and execution management from Corps to Company level. However, as the U.S. Army moves towards a lightweight, fully networked battalion, disconnected operations, thin client architecture and mobile computing become increasingly essential. The Command Post of the Future is not designed to support these challenges in the coming decade. Therefore, research into a hybrid blend of technologies is in progress to address these issues. This research focuses on a new command and control system utilizing the rich collaboration framework afforded by Command Post of the Future coupled with a new user interface consisting of a variety of innovative workspace designs. This new system is called Tactical Applications. This paper details a brief history of command post computing, presents the challenges facing the modern Army, and explores the concepts under consideration for Tactical Applications that meet these challenges in a variety of innovative ways.

  19. Dynamic array generation and pattern formation for optical tweezers

    DEFF Research Database (Denmark)

    Mogensen, P.C.; Glückstad, J.


    The generalised phase contrast approach is used for the generation of optical arrays of arbitrary beam shape, suitable for applications in optical tweezers for the manipulation of biological specimens. This approach offers numerous advantages over current techniques involving the use of computer-generated holograms or diffractive optical elements. We demonstrate a low-loss system for generating intensity patterns suitable for the trapping and manipulation of small particles or specimens.

  20. Development of high-index optical coating for security holograms (United States)

    Ahmed, Nadir A. G.


    Over the past few years security holograms have grown into a complex business to prevent counterfeiting of security cards, banknotes and the like. Rapid advances in holographic technology have led to a growing requirement for optical materials and coating methods to produce such holograms at reasonable costs. These materials have specific refractive indices and are used to fabricate semi-transparent holograms. The present paper describes a coating process to deposit optical coating on flexible films inside a vacuum web metallizer for the production of high quality semi-transparent holograms.

  1. Computer generation and manipulation of sounds

    DEFF Research Database (Denmark)

    Serafin, Stefania


    Musicians are always quick to adopt and explore new technologies. The fast-paced changes wrought by electrification, from the microphone via the analogue synthesiser to the laptop computer, have led to a wide diversity of new musical styles and techniques. Electronic music has grown into a broad field of investigation, taking in historical movements like musique concrète and elektronische Musik, and contemporary trends such as electronic dance music. A fascinating array of composers and inventors have contributed to a diverse set of technologies, practices and music. This book brings together ... Recent areas of intense activity such as audiovisuals, live electronic music, interactivity and network music are actively promoted ...

  2. Cohesion in Computer Text Generation: Lexical Substitution. (United States)


    can contain any information desired, the rules need not be strictly syntactic, but can reflect semantic and pragmatic information as well. A subset of...its antecedent. Otherwise, unintelligible text may be generated. Investigation into anaphora resolution has been performed in the pursuit of natural...syntactic, semantic, and pragmatic acceptance. The first item in the ranked list that passes these criteria is assumed to be the antecedent for the

  3. ADGEN: ADjoint GENerator for computer models

    Energy Technology Data Exchange (ETDEWEB)

    Worley, B.A.; Pin, F.G.; Horwedel, J.E.; Oblow, E.M.


    This paper presents the development of a FORTRAN compiler and an associated supporting software library called ADGEN. ADGEN reads FORTRAN models as input and produces an enhanced version of the input model. The enhanced version reproduces the original model calculations but also has the capability to calculate derivatives of model results of interest with respect to any and all of the model data and input parameters. The method for calculating the derivatives and sensitivities is the adjoint method. Partial derivatives are calculated analytically using computer calculus and saved as elements of an adjoint matrix on direct-access storage. The total derivatives are calculated by solving an appropriate adjoint equation. ADGEN is applied to a major computer model of interest to the Low-Level Waste Community, the PRESTO-II model. PRESTO-II sample problem results reveal that ADGEN correctly calculates derivatives of responses of interest with respect to 300 parameters. The execution time to create the adjoint matrix is a factor of 45 times the execution time of the reference sample problem. Once this matrix is determined, the derivatives with respect to 3000 parameters are calculated in a factor of 6.8 that of the reference model for each response of interest. For a single response, this compares with a factor of roughly 3000 for determining these derivatives by parameter perturbations. The automation of the implementation of the adjoint technique for calculating derivatives and sensitivities eliminates the costly and manpower-intensive task of direct hand-implementation by reprogramming and thus makes the powerful adjoint technique more amenable for use in sensitivity analysis of existing models. 20 refs., 1 fig., 5 tabs.

  4. ADGEN: ADjoint GENerator for computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Horwedel, J.E.; Oblow, E.M.


    This paper presents the development of a FORTRAN compiler and an associated supporting software library called ADGEN. ADGEN reads FORTRAN models as input and produces an enhanced version of the input model. The enhanced version reproduces the original model calculations but also has the capability to calculate derivatives of model results of interest with respect to any and all of the model data and input parameters. The method for calculating the derivatives and sensitivities is the adjoint method. Partial derivatives are calculated analytically using computer calculus and saved as elements of an adjoint matrix on direct-access storage. The total derivatives are calculated by solving an appropriate adjoint equation. ADGEN is applied to a major computer model of interest to the Low-Level Waste Community, the PRESTO-II model. PRESTO-II sample problem results reveal that ADGEN correctly calculates derivatives of responses of interest with respect to 300 parameters. The execution time to create the adjoint matrix is a factor of 45 times the execution time of the reference sample problem. Once this matrix is determined, the derivatives with respect to 3000 parameters are calculated in a factor of 6.8 that of the reference model for each response of interest. For a single response, this compares with a factor of roughly 3000 for determining these derivatives by parameter perturbations. The automation of the implementation of the adjoint technique for calculating derivatives and sensitivities eliminates the costly and manpower-intensive task of direct hand-implementation by reprogramming and thus makes the powerful adjoint technique more amenable for use in sensitivity analysis of existing models. 20 refs., 1 fig., 5 tabs.

  5. Innovations in Computer Generated Autonomy at the MOVES Institute

    National Research Council Canada - National Science Library

    Hiles, John


    The MOVES Institute's Computer-Generated Autonomy Group has focused on a research goal of modeling intensely complex and adaptive behavior while at the same time making the behavior far easier to create and control...

  6. International Space Station (ISS) Addition of Hardware - Computer Generated Art (United States)


    This computer generated scene of the International Space Station (ISS) represents the first addition of hardware following the completion of Phase II. The 8-A Phase shows the addition of the S-9 truss.

  7. A computer program for the pointwise functions generation

    International Nuclear Information System (INIS)

    Caldeira, Alexandre D.


    A computer program that was developed with the objective of generating pointwise functions, by a combination of tabulated values and/or mathematical expressions, to be used as weighting functions for nuclear data is presented. This simple program can be an important tool for researchers involved in group constants generation. (author). 5 refs, 2 figs
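The combination scheme the record above describes — tabulated values stitched together with mathematical expressions to form a pointwise weighting function — can be illustrated with a small sketch. The function name, the 1/E tail and the switch energy are hypothetical choices for illustration, not taken from the program itself.

```python
import numpy as np

def pointwise_weight(energies, table_e, table_w,
                     analytic=lambda e: 1.0 / e, switch_energy=1.0):
    """Below switch_energy, linearly interpolate the tabulated values;
    above it, evaluate the analytic expression (here a 1/E tail)."""
    energies = np.asarray(energies, dtype=float)
    return np.where(
        energies < switch_energy,
        np.interp(energies, table_e, table_w),  # tabulated segment
        analytic(energies),                     # analytic segment
    )

# Usage: a tabulated thermal shape below 1 eV, a 1/E slowing-down tail above
table_e = np.array([1e-3, 1e-2, 1e-1, 1.0])
table_w = np.array([0.1, 0.8, 1.0, 1.0])
grid = np.logspace(-3, 3, 7)          # energy grid from 1 meV to 1 keV
weights = pointwise_weight(grid, table_e, table_w)
```

A group-constant generator would then use such a pointwise function as the weighting spectrum when collapsing cross sections.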

  8. a computer controlled pulse generator for an st radar system

    African Journals Online (AJOL)

    A computer-controlled pulse generator for an ST radar system is described. It uses highly flexible software and hardware with a small IC count, making the system compact and highly programmable. The parameters of the signals of the pulse generator are initially entered from the keyboard. The computer then generates one period of the set of signals in a ...

  9. New challenges in grid generation and adaptivity for scientific computing

    CERN Document Server

    Formaggia, Luca


    This volume collects selected contributions from the “Fourth Tetrahedron Workshop on Grid Generation for Numerical Computations”, which was held in Verbania, Italy in July 2013. The previous editions of this Workshop were hosted by the Weierstrass Institute in Berlin (2005), by INRIA Rocquencourt in Paris (2007), and by Swansea University (2010). This book covers different, though related, aspects of the field: the generation of quality grids for complex three-dimensional geometries; parallel mesh generation algorithms; mesh adaptation, including both theoretical and implementation aspects; grid generation and adaptation on surfaces – all with an interesting mix of numerical analysis, computer science and strongly application-oriented problems.

  10. Parallel grid generation algorithm for distributed memory computers (United States)

    Moitra, Stuti; Moitra, Anutosh


    A parallel grid-generation algorithm and its implementation on the Intel iPSC/860 computer are described. The grid-generation scheme is based on an algebraic formulation of homotopic relations. Methods for utilizing the inherent parallelism of the grid-generation scheme are described, and the implementation of multiple levels of parallelism on multiple-instruction multiple-data machines is indicated. The algorithm is capable of providing near orthogonality and spacing control at solid boundaries while requiring minimal interprocessor communication. Results obtained on the Intel hypercube for a blended wing-body configuration are used to demonstrate the effectiveness of the algorithm. Fortran implementations based on the native programming model of the iPSC/860 computer and the Express system of software tools are reported. Computational gains in execution time speed-up ratios are given.

  11. Pseudo-random number generator for the Sigma 5 computer (United States)

    Carroll, S. N.


    A technique is presented for developing a pseudo-random number generator based on the linear congruential form. The two numbers used for the generator are a prime number and a corresponding primitive root, where the prime is the largest prime number that can be accurately represented on a particular computer. The primitive root is selected by applying Marsaglia's lattice test. The technique presented was applied to write a random number program for the Sigma 5 computer. The new program, named S:RANDOM1, is judged to be superior to the older program named S:RANDOM. For applications requiring several independent random number generators, a table is included showing several acceptable primitive roots. The technique and programs described can be applied to any computer having word length different from that of the Sigma 5.
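The linear congruential form described above is easy to sketch. The Sigma 5 constants themselves are not given in the record, so this sketch uses the classic "minimal standard" pair: the Mersenne prime 2**31 - 1 as modulus and the primitive root 16807 as multiplier.

```python
# Lehmer / linear congruential generator: x_{n+1} = (A * x_n) mod M,
# with M the largest prime representable in a signed 32-bit word and
# A a primitive root modulo M, so the sequence has full period M - 1.
M = 2**31 - 1   # Mersenne prime 2147483647
A = 16807       # primitive root modulo M ("minimal standard" multiplier)

def lcg(seed):
    """Yield an endless stream of pseudo-random integers in [1, M-1]."""
    x = seed
    while True:
        x = (A * x) % M
        yield x

# Usage: the well-known Park-Miller check value -
# starting from seed 1, the 10,000th value is 1043618065
gen = lcg(1)
for _ in range(10000):
    value = next(gen)
```

Dividing each value by M maps the stream to uniform floats in (0, 1); choosing a different primitive root for A yields an independent generator, as the table in the record suggests.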

  12. Generative Representations for Computer-Automated Evolutionary Design (United States)

    Hornby, Gregory S.


    With the increasing computational power of computers, software design systems are progressing from being tools for architects and designers to express their ideas to tools capable of creating designs under human guidance. One of the main limitations for these computer-automated design systems is the representation with which they encode designs. If the representation cannot encode a certain design, then the design system cannot produce it. To be able to produce new types of designs, and not just optimize pre-defined parameterizations, evolutionary design systems must use generative representations. Generative representations are assembly procedures, or algorithms, for constructing a design thereby allowing for truly novel design solutions to be encoded. In addition, by enabling modularity, regularity and hierarchy, the level of sophistication that can be evolved is increased. We demonstrate the advantages of generative representations on two different design domains: the evolution of spacecraft antennas and the evolution of 3D objects.
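The idea of a generative representation — an assembly procedure whose rules can be reused at every level, giving modularity and hierarchy — can be illustrated with a toy string-rewriting system (an L-system). This is a generic example, not the specific encoding used in the work above.

```python
# A toy generative representation: rewrite rules act as a reusable
# "assembly procedure" for a design, so structure is encoded once
# and reused at every level of the expansion.
def expand(axiom, rules, generations):
    """Apply the rewrite rules to every symbol, `generations` times."""
    s = axiom
    for _ in range(generations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Usage: one branching rule, reused at every level, yields a
# self-similar "design" description
rules = {"F": "F[+F]F[-F]F"}
design = expand("F", rules, 2)
```

Mutating a single rule changes every place it is reused, which is exactly the leverage the record attributes to generative representations over direct parameterizations.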

  13. Generative Representations for Computer-Automated Design Systems (United States)

    Hornby, Gregory S.


    With the increasing computational power of Computers, software design systems are progressing from being tools for architects and designers to express their ideas to tools capable of creating designs under human guidance. One of the main limitations for these computer-automated design programs is the representation with which they encode designs. If the representation cannot encode a certain design, then the design program cannot produce it. Similarly, a poor representation makes some types of designs extremely unlikely to be created. Here we define generative representations as those representations which can create and reuse organizational units within a design and argue that reuse is necessary for design systems to scale to more complex and interesting designs. To support our argument we describe GENRE, an evolutionary design program that uses both a generative and a non-generative representation, and compare the results of evolving designs with both types of representations.

  14. Properties Of Reflection Holograms Recorded In Polaroid's DMP-128 Photopolymer (United States)

    Ingwall, R. T.; Troll, M.; Vetterling, W. T.


    Optical and microstructural properties of reflection holograms recorded in DMP-128 are reported. The optical properties are determined from transmission spectra of the holograms by ignoring the relatively small effects of light scattering and absorption. Microstructure is revealed by light and electron microscopic examination of hologram sections prepared either by diamond grit abrasion or by freeze fracture. Fringe planes are clearly seen with both sectioning procedures. The spacing between adjacent planes is strongly affected by processing conditions. Standard processing produces reflection holograms with fringe plane spacing that decreases continuously from the film:substrate interface to the film:air interface. These holograms have wide spectral bandwidths (100-200nm) and irregular band shapes. Narrow bandwidth holograms are produced from slightly more complicated processing steps. The optical properties of both types of holograms are compared to a theoretical model developed to account for nonuniform fringe plane spacing. Important experimental features such as spectral bandwidth and diffraction efficiency are readily explained by the theory and the observed microstructure.

  15. Submicron confocal Raman imaging of holograms in multicomponent photopolymers (United States)

    Kagan, C. R.; Harris, T. D.; Harris, A. L.; Schilling, M. L.


    We report submicron chemical imaging of optical holograms in multicomponent photopolymers using a confocal scanning Raman microscope. The microscope is sensitive to the submicron, ˜1% concentration variations of the polymeric components that form refractive index modulations (Δn) responsible for hologram diffraction. Concentration variations are established by both small molecule diffusion and polymer matrix swelling during hologram writing. Both density and composition variations contribute to Δn. These measurements demonstrate that submicron Raman microscopy is applicable to multicomponent organic, inorganic, and hybrid materials as a route to correlate materials chemistry/morphology with their physical properties.

  16. Diffraction efficiency and noise analysis of hidden image holograms

    DEFF Research Database (Denmark)

    Tamulevičius, Sigitas; Andrulevičius, Mindaugas; Puodžiukynas, Linas


    The simplified approach for analysis of hidden image holograms is discussed in this paper. Diffraction efficiency and signal to noise ratio of reconstructed images were investigated using direct measurements technique and digitized image analysis employing “ImageJ” software. All holograms were...... energy densities demonstrated improved diffraction efficiency and reduced signal to noise ratio of the reconstructed image. The best diffraction efficiency at sufficient signal to noise ratio was obtained using exposure energy density in the range from 150 to 200 J/m2 during the hologram writing process....

  17. Methods of compression of digital holograms, based on 1-level wavelet transform

    International Nuclear Information System (INIS)

    Kurbatova, E A; Cheremkhin, P A; Evtikhiev, N N


    To reduce the size of memory required for storing information about 3D scenes and to decrease the rate of hologram transmission, digital hologram compression can be used. Compression of digital holograms by wavelet transforms is among the most powerful methods. In the paper the most popular wavelet transforms are considered and applied to digital hologram compression. The obtained values of reconstruction quality and the holograms' diffraction efficiencies are compared. (paper)
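A 1-level wavelet compression step of the kind compared in the record can be sketched with the simplest wavelet, the orthonormal Haar transform (the paper evaluates several wavelets; Haar is used here only for concreteness): transform once, discard small detail coefficients, transform back.

```python
import numpy as np

def haar2d(x):
    """One 2D orthonormal Haar analysis step: (LL, LH, HL, HH) subbands."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # rows: lowpass
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # rows: highpass
    ll = (a[:, 0::2] + a[:, 1::2]) / np.sqrt(2)
    lh = (a[:, 0::2] - a[:, 1::2]) / np.sqrt(2)
    hl = (d[:, 0::2] + d[:, 1::2]) / np.sqrt(2)
    hh = (d[:, 0::2] - d[:, 1::2]) / np.sqrt(2)
    return ll, lh, hl, hh

def ihaar2d(ll, lh, hl, hh):
    """Exact inverse of haar2d (the transform is orthonormal)."""
    a = np.empty((ll.shape[0], 2 * ll.shape[1]))
    a[:, 0::2] = (ll + lh) / np.sqrt(2)
    a[:, 1::2] = (ll - lh) / np.sqrt(2)
    d = np.empty_like(a)
    d[:, 0::2] = (hl + hh) / np.sqrt(2)
    d[:, 1::2] = (hl - hh) / np.sqrt(2)
    x = np.empty((2 * a.shape[0], a.shape[1]))
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

def compress(x, keep=0.1):
    """Zero all but the largest `keep` fraction of detail coefficients."""
    ll, lh, hl, hh = haar2d(x)
    details = np.concatenate([s.ravel() for s in (lh, hl, hh)])
    thr = np.quantile(np.abs(details), 1 - keep)
    lh, hl, hh = (np.where(np.abs(s) >= thr, s, 0.0) for s in (lh, hl, hh))
    return ihaar2d(ll, lh, hl, hh)
```

The reconstruction quality vs. compression ratio trade-off the paper measures then corresponds to sweeping `keep` and comparing `compress(hologram, keep)` against the original.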

  18. Challenges in scaling NLO generators to leadership computers (United States)

    Benjamin, D.; Childers, JT; Hoeche, S.; LeCompte, T.; Uram, T.


    Exascale computing resources are roughly a decade away and will be capable of 100 times more computing than current supercomputers. In the last year, Energy Frontier experiments crossed a milestone of 100 million core-hours used at the Argonne Leadership Computing Facility, Oak Ridge Leadership Computing Facility, and NERSC. The Fortran-based leading-order parton generator called Alpgen was successfully scaled to millions of threads to achieve this level of usage on Mira. Sherpa and MadGraph are next-to-leading order generators used heavily by LHC experiments for simulation. Integration times for high-multiplicity or rare processes can take a week or more on standard Grid machines, even using all 16 cores. We will describe our ongoing work to scale the Sherpa generator to thousands of threads on leadership-class machines and reduce run-times to less than a day. This work allows the experiments to leverage large-scale parallel supercomputers for event generation today, freeing tens of millions of grid hours for other work, and paving the way for future applications (simulation, reconstruction) on these and future supercomputers.

  19. Computer Generated Optical Illusions: A Teaching and Research Tool. (United States)

    Bailey, Bruce; Harman, Wade

    Interactive computer-generated simulations that highlight psychological principles were investigated in this study in which 33 female and 19 male undergraduate college student volunteers of median age 21 matched line and circle sizes in six variations of Ponzo's illusion. Prior to working with the illusions, data were collected based on subjects'…

  20. Student generated assignments about electrical circuits in a computer simulation

    NARCIS (Netherlands)

    Vreman-de Olde, Cornelise; de Jong, Anthonius J.M.


    In this study we investigated the design of assignments by students as a knowledge-generating activity. Students were required to design assignments for 'other students' in a computer simulation environment about electrical circuits. Assignments consisted of a question, alternatives, and feedback on

  1. Color evaluation of computer-generated color rainbow holography

    International Nuclear Information System (INIS)

    Shi, Yile; Wang, Hui; Wu, Qiong


    A color evaluation approach for computer-generated color rainbow holography (CGCRH) is presented. Firstly, the relationship between color quantities of a computer display and a color computer-generated holography (CCGH) colorimetric system is discussed based on color matching theory. An isochromatic transfer relationship of color quantity and amplitude of object light field is proposed. Secondly, the color reproduction mechanism and factors leading to the color difference between the color object and the holographic image that is reconstructed by CGCRH are analyzed in detail. A quantitative color calculation method for the holographic image reconstructed by CGCRH is given. Finally, general color samples are selected as numerical calculation test targets and the color differences between holographic images and test targets are calculated based on our proposed method. (paper)
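The color differences the record quantifies are conventionally expressed in a perceptually motivated space. The paper's exact formula is not given, so as an illustrative assumption this sketch uses the standard CIE76 metric: convert XYZ tristimulus values to CIELAB and take the Euclidean distance Delta-E*ab.

```python
import math

def xyz_to_lab(X, Y, Z, white=(95.047, 100.0, 108.883)):
    """Convert CIE XYZ to CIELAB, normalized to a D65 white point."""
    def f(t):
        # Cube root above the linear-segment cutoff, linear ramp below
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = (f(v / w) for v, w in zip((X, Y, Z), white))
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

def delta_e76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in CIELAB."""
    return math.dist(lab1, lab2)

# Usage: the reference white maps to L* = 100, a* = b* = 0
lab_white = xyz_to_lab(95.047, 100.0, 108.883)
```

A holographic image's reconstructed color and its target would each be converted this way, with `delta_e76` giving the per-sample difference.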

  2. Development of unstructured mesh generator on parallel computers

    International Nuclear Information System (INIS)

    Muramatsu, Kazuhiro; Shimada, Akio; Murakami, Hiroyuki; Higashida, Akihiro; Wakatsuki, Shigeto


    A general-purpose unstructured mesh generator, 'GRID3D/UNST', has been developed on parallel computers. The high-speed operations and large-scale memory capacity of parallel computers enable the system to generate a large-scale mesh at high speed. In fact, the system generates a large-scale mesh composed of 2,400,000 nodes and 14,000,000 elements in about 1.5 hours on a HITACHI SR2201 with 64 PEs (Processing Elements), after a 2.5-hour pre-process on a SUN workstation. The system is also built on standard FORTRAN, C and Motif, and therefore has high portability. The system enables us to solve large-scale problems that have been impossible to solve before, and to break new ground in the field of science and engineering. (author)

  3. LED-based digital hologram reconstruction by compressive sensing (United States)

    Weng, Jiawen; Yang, Chuping; Qin, Yi; Li, Hai


    LED-based digital holography, regarded as low-coherence digital holography, is confined to the in-line configuration because interference fringes can be observed only when the angle between the object and reference waves is small enough. Phase-shifting techniques are therefore usually employed, but they are not suitable for dynamic analysis because they require more than one hologram. A numerical reconstruction method based on compressive sensing theory for a single LED-based digital hologram is proposed to achieve dynamic analysis. With this method, the out-of-focus twin image and the coherent noise can be suppressed to some extent. The theory is presented in detail, and an experiment on LED-based digital holography with a USAF pattern as the test target demonstrates the feasibility and validity of the method.
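The compressive-sensing recovery step underlying methods like the one above can be sketched with a standard solver. This is a generic illustration (iterative soft thresholding, ISTA, on a toy sparse-signal problem), not the paper's reconstruction algorithm: recover a sparse vector x from underdetermined measurements y = A x.

```python
import numpy as np

def ista(A, y, lam=0.05, iters=500):
    """Minimise 0.5*||A x - y||^2 + lam*||x||_1 by proximal gradient steps."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = x - (A.T @ (A @ x - y)) / L    # gradient step on the data term
        x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold
    return x

# Usage: recover a 5-sparse vector in R^200 from 80 random projections
rng = np.random.default_rng(0)
A = rng.standard_normal((80, 200)) / np.sqrt(80)
x_true = np.zeros(200)
support = rng.choice(200, size=5, replace=False)
x_true[support] = np.array([1.0, -1.0, 0.5, 2.0, -1.5])
x_hat = ista(A, A @ x_true)
```

In the holographic setting, A would model Fresnel propagation from the object to the sensor plane and the sparsity would be imposed in an appropriate transform domain.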

  4. Quickly Updatable Hologram Images Using Poly(N-vinyl Carbazole) (PVCz) Photorefractive Polymer Composite

    Directory of Open Access Journals (Sweden)

    Wataru Sakai


    Quickly updatable hologram images using a photorefractive (PR) polymer composite based on poly(N-vinyl carbazole) (PVCz) are presented. PVCz is one of the pioneering photoconductive polymers. The PR polymer composite consists of 44 wt % PVCz, 35 wt % 4-azacycloheptylbenzylidene-malonitrile (7-DCST) as a nonlinear optical dye, 20 wt % carbazolylethylpropionate (CzEPA) as a photoconductive plasticizer and 1 wt % 2,4,7-trinitro-9-fluorenone (TNF) as a sensitizer. The PR composite gives a high diffraction efficiency of 68% at E = 45 V μm−1. Response speed of optical diffraction is the key parameter for a real-time 3D holographic display, and the key to obtaining quickly updatable holographic images is to keep the glass transition temperature low enough to enhance chromophore orientation. An object image of a reflective coin surface, recorded with a reference beam at 532 nm (green) in the PR polymer composite, is simultaneously reconstructed using a red probe beam at 642 nm. Instead of a coin object, an object image produced by a computer was also displayed on a spatial light modulator (SLM) and used for the hologram: the object beam reflected from the SLM was interfered with a reference beam on the PR polymer composite to record a hologram, which was simultaneously reconstructed by the red probe beam.

  5. The Next Generation ARC Middleware and ATLAS Computing Model

    CERN Document Server

    Filipcic, A; The ATLAS collaboration; Smirnova, O; Konstantinov, A; Karpenko, D


    The distributed NDGF Tier-1 and associated Nordugrid clusters are well integrated into the ATLAS computing model but follow a slightly different paradigm than other ATLAS resources. The current strategy does not divide the sites as in the commonly used hierarchical model, but rather treats them as a single storage endpoint and a pool of distributed computing nodes. The next generation ARC middleware with its several new technologies provides new possibilities in development of the ATLAS computing model, such as pilot jobs with pre-cached input files, automatic job migration between the sites, integration of remote sites without connected storage elements, and automatic brokering for jobs with non-standard resource requirements. ARC's data transfer model provides an automatic way for the computing sites to participate in ATLAS' global task management system without requiring centralised brokering or data transfer services. The powerful API combined with Python and Java bindings can easily be used to build new ...

  6. Efficient computation of clipped Voronoi diagram for mesh generation

    KAUST Repository

    Yan, Dongming


    The Voronoi diagram is a fundamental geometric structure widely used in various fields, especially in computer graphics and geometry computing. For a set of points in a compact domain (i.e. a bounded and closed 2D region or a 3D volume), some Voronoi cells of their Voronoi diagram are infinite or partially outside of the domain, but in practice only the parts of the cells inside the domain are needed, as when computing the centroidal Voronoi tessellation. Such a Voronoi diagram confined to a compact domain is called a clipped Voronoi diagram. We present an efficient algorithm to compute the clipped Voronoi diagram for a set of sites with respect to a compact 2D region or a 3D volume. We also apply the proposed method to optimal mesh generation based on the centroidal Voronoi tessellation. Crown Copyright © 2011 Published by Elsevier Ltd. All rights reserved.
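The clipping idea in the record above can be sketched with a simple (not optimised) 2D construction: each site's clipped cell is the domain polygon cut successively by the perpendicular-bisector half-plane against every other site. The function names here are illustrative, not the paper's API.

```python
import numpy as np

def clip_halfplane(poly, p, n):
    """Sutherland-Hodgman clip of polygon `poly` to {x : (x - p) . n <= 0}."""
    out = []
    for i in range(len(poly)):
        a, b = poly[i], poly[(i + 1) % len(poly)]
        da, db = np.dot(a - p, n), np.dot(b - p, n)
        if da <= 0:
            out.append(a)
        if (da <= 0) != (db <= 0):           # edge crosses the boundary
            t = da / (da - db)
            out.append(a + t * (b - a))
    return out

def clipped_voronoi(sites, domain):
    """Return one clipped convex cell (list of vertices) per site."""
    cells = []
    for i, s in enumerate(sites):
        poly = [np.asarray(v, float) for v in domain]
        for j, q in enumerate(sites):
            if i != j:
                mid, n = (s + q) / 2, q - s  # bisector half-plane keeping s
                poly = clip_halfplane(poly, mid, n)
        cells.append(poly)
    return cells

def area(poly):
    """Shoelace area of a convex polygon."""
    x = np.array([v[0] for v in poly])
    y = np.array([v[1] for v in poly])
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

# Usage: two sites split the unit square into two half cells
domain = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
sites = np.array([[0.25, 0.5], [0.75, 0.5]])
cells = clipped_voronoi(sites, domain)
```

This brute-force version costs O(n) clips per site; the efficiency of the paper's algorithm comes from avoiding exactly this all-pairs clipping.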


    Directory of Open Access Journals (Sweden)

    G. G. Rogozinsky


    Problem Statement. The paper deals with a distributed intelligent multi-agent system for computer music generation. A mathematical model for extracting data from the environment and applying them in the music generation process is proposed. Methods. We use the Resource Description Framework for the representation of timbre data. The musical programming language Csound is used for the synthesis and sound-processing subsystem. Sound generation follows the parameters of the compositional model, which receives data from the outside world. Results. We propose the architecture of a prospective distributed system for computer music generation. An example of core sound synthesis is presented. We also propose a method for mapping real-world parameters onto the compositional model, in an attempt to imitate elements and aspects of creative inspiration. The music generation system was exhibited as an artifact in the A.S. Popov Central Museum of Communication in the framework of the «Night of Museums» event. In the course of this public experiment it was found that the system tends to settle quickly into a neutral state in which no musical events are generated, which demonstrates the need for algorithms that keep the agent network in an active state. Practical Relevance. Realization of the proposed system will make it possible to create a technological platform for a whole new class of applications, including augmented acoustic reality and algorithmic composition.

  8. A computational tool to design and generate crystal structures (United States)

    Ferreira, R. C.; Vieira, M. B.; Dantas, S. O.; Lobosco, M.


    The evolution of computers, more specifically regarding the increased storage and data processing capacity, allowed the construction of computational tools for the simulation of physical and chemical phenomena. Thus, practical experiments are being replaced, in some cases, by computational ones. In this context, we can highlight models used to simulate different phenomena on atomic scale. The construction of these simulators requires, by developers, the study and definition of accurate and reliable models. This complexity is often reflected in the construction of complex simulators, which simulate a limited group of structures. Such structures are sometimes expressed in a fixed manner using a limited set of geometric shapes. This work proposes a computational tool that aims to generate a set of crystal structures. The proposed tool consists of a) a programming language, which is used to describe the structures using for this purpose their characteristic functions and CSG (Constructive Solid Geometry) operators, and b) a compiler/interpreter that examines the source code written in the proposed language, and generates the objects accordingly. This tool enables the generation of an unrestricted number of structures, which can be incorporated in simulators such as the Monte Carlo Spin Engine, developed by our group at UFJF.
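The CSG-with-characteristic-functions idea described above can be sketched compactly: a shape is a predicate over points, CSG operators combine predicates, and the generator keeps the lattice points that satisfy the combined predicate. The API below is a hypothetical illustration, not the tool's actual language.

```python
import numpy as np

# Shapes as characteristic functions (point -> bool)
def sphere(center, radius):
    c = np.asarray(center, float)
    return lambda p: bool(np.linalg.norm(p - c) <= radius)

def box(lo, hi):
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    return lambda p: bool(np.all(p >= lo) and np.all(p <= hi))

# CSG operators combine characteristic functions
def union(f, g):      return lambda p: f(p) or g(p)
def intersect(f, g):  return lambda p: f(p) and g(p)
def difference(f, g): return lambda p: f(p) and not g(p)

def generate(shape, spacing=1.0, extent=5):
    """Keep every simple-cubic lattice point whose position satisfies `shape`."""
    pts = []
    r = np.arange(-extent, extent + 1) * spacing
    for x in r:
        for y in r:
            for z in r:
                p = np.array([x, y, z])
                if shape(p):
                    pts.append(p)
    return pts

# Usage: a cube with a spherical void carved out of its centre
shape = difference(box([-3, -3, -3], [3, 3, 3]), sphere([0, 0, 0], 2))
atoms = generate(shape)
```

A compiler for a small shape-description language, as the record proposes, would emit exactly this kind of composed predicate and hand it to the generator.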

  9. Generation of a computer library for discrete ordinates quadrature sets

    International Nuclear Information System (INIS)

    Jenal, J.P.; Erickson, P.J.; Rhoades, W.A.; Simpson, D.B.; Williams, M.L.


    Although various discrete-ordinates quadrature sets are in general use, there exists a need for a standard collection supported by documentation of their origin and derivation. This report attempts to provide this documentation for most of the commonly used sets. Instructions for using DOQDP--the computer code used to generate these quadratures--are provided. A library (in standard interface format) containing many of the quadratures was generated and is documented in this report. Finally, a listing of the fully symmetric, half-symmetric, and several biased sets is provided for XY, RZ, and R THETA geometries. Part I of the report describes the DOQDP code and the generation of S2-S16 fully symmetric quadratures. Part II describes the generation of S4-S10 half-symmetric and several biased sets, as well as a discussion of the quadrature library. 2 figures, 1 table
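
    As a minimal illustration of what such a quadrature library stores, the S2 fully symmetric set places one direction per octant along (±1/√3, ±1/√3, ±1/√3) with equal weights; the sketch below builds that set and checks the normalization and second-moment conditions it must satisfy. This is a generic example, not output from DOQDP.

    ```python
    # S2 fully symmetric quadrature: 8 directions, one per octant, equal weights.
    import itertools
    import math

    mu = 1.0 / math.sqrt(3.0)
    directions = [(sx * mu, sy * mu, sz * mu)
                  for sx, sy, sz in itertools.product((1, -1), repeat=3)]
    weights = [1.0 / len(directions)] * len(directions)

    # Conditions any such set must reproduce (weights normalized to 1 here):
    total = sum(weights)                                  # should be 1
    second_moment = sum(w * d[0]**2                       # should be 1/3
                        for w, d in zip(weights, directions))
    ```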

  10. Pit Distribution Design for Computer-Generated Waveguide Holography (United States)

    Yagi, Shogo; Imai, Tadayuki; Ueno, Masahiro; Ohtani, Yoshimitsu; Endo, Masahiro; Kurokawa, Yoshiaki; Yoshikawa, Hiroshi; Watanabe, Toshifumi; Fukuda, Makoto


    Multilayered waveguide holography (MWH) is one of a number of page-oriented data multiplexing holographies that will be applied to optical data storage and three-dimensional (3D) moving images. While conventional volumetric holography using photopolymer or photorefractive materials requires page-by-page light exposure for recording, MWH media can be made by employing stamping and laminating technologies that are suitable for mass production. This makes devising an economical mastering technique for replicating holograms a key issue. In this paper, we discuss an approach to pit distribution design that enables us to replace expensive electron beam mastering with economical laser beam mastering. We propose an algorithm that avoids the overlapping of even comparatively large adjacent pits when we employ laser beam mastering. We also compensate for the angular dependence of the diffraction power, which strongly depends on pit shape, by introducing an enhancement profile so that a diffracted image has uniform intensity.
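
    The overlap-avoidance idea can be sketched as a greedy acceptance test: a candidate pit is kept only if its disc does not intersect any pit already placed. The paper's actual algorithm and pit geometry are more involved; the circular-pit model below is an assumption.

    ```python
    # Greedy overlap-avoiding pit placement (illustrative sketch only).
    import math

    def place_pits(candidates):
        """candidates: list of (x, y, radius) in priority order; keep a pit
        only if its disc is disjoint from all previously accepted pits."""
        placed = []
        for x, y, r in candidates:
            if all(math.hypot(x - px, y - py) >= r + pr for px, py, pr in placed):
                placed.append((x, y, r))
        return placed

    pits = place_pits([(0, 0, 1.0), (1.5, 0, 1.0), (3.0, 0, 1.0), (5.0, 0, 1.5)])
    ```

    The second and fourth candidates are rejected because they would overlap earlier pits, mirroring the constraint that comparatively large adjacent pits must not merge under laser beam mastering.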

  11. A study on NMI report generation with computer aid diagnosis

    International Nuclear Information System (INIS)

    Yang Xiaona; Li Zhimin; Zhao Xiangjun; Qiu Jinping


    An expert system providing intelligent diagnosis, computer-aided diagnosis, and computerized report generation and management for nuclear medicine imaging (NMI) was developed. A mathematical model based on finite-set mapping was evaluated for the diagnosis. In clinical application, its diagnostic sensitivity and specificity were 85.7% ∼ 93.4% and 92% ∼ 95.6%, respectively. Its application may therefore be extended

  12. Highly-efficient all-dielectric Huygens' surface holograms (Conference Presentation) (United States)

    Chong, Katie; Wang, Lei; Staude, Isabelle; James, Anthony; Dominguez, Jason; Subramania, Ganapathi; Liu, Sheng; Decker, Manuel; Neshev, Dragomir N.; Brener, Igal; Kivshar, Yuri S.


    that lack dissipative losses and also suppress unwanted reflections without relying on cross-polarization schemes, which additionally suffer from polarization-conversion losses. We now use such Huygens' surfaces to create highly efficient phase masks for the generation of optical holograms. By varying only one geometrical parameter, namely the lattice periodicity, which can be controlled easily during the fabrication process, we can effectively generate arbitrary hologram images from a 4-level phase discretization. To design the arrangement of the pixels in the metasurfaces, we calculate the phase mask required for a hologram generating the letters `hv' in the hologram plane. In the next step the Huygens' hologram is fabricated on a back-side polished SOI wafer by electron-beam lithography followed by a reactive-ion etching process. Then, we measure the phase of the generated hologram using a home-built Mach-Zehnder interferometer and perform a phase retrieval process to compare the experimental phase with the designed phase. Finally, we record the holographic image in the hologram plane and demonstrate that the device functionality is completely polarization insensitive, with a transmission efficiency of 82%, in contrast to all earlier works utilizing geometric phase. References [1] Yu et al., Nat. Mater. 13, 139 (2014). [2] Pfeiffer et al., Phys. Rev. Lett. 110, 197401 (2013). [3] Monticone et al., Phys. Rev. Lett. 110, 203903 (2013). [4] Decker et al., Adv. Opt. Mater. 3, 813 (2015).
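
    A common, generic way to compute a 4-level phase mask for a far-field hologram is an iterative Fourier (Gerchberg-Saxton) loop followed by phase quantization, sketched below. This illustrates 4-level phase discretization only; it is not the lattice-periodicity design procedure used in the paper.

    ```python
    # Gerchberg-Saxton phase retrieval plus 4-level quantization (sketch).
    import numpy as np

    rng = np.random.default_rng(0)
    target = np.zeros((64, 64))
    target[24:40, 24:40] = 1.0           # desired far-field intensity pattern

    phase = rng.uniform(0, 2 * np.pi, target.shape)
    for _ in range(50):
        far = np.fft.fft2(np.exp(1j * phase))
        far = target * np.exp(1j * np.angle(far))   # impose target amplitude
        phase = np.angle(np.fft.ifft2(far))         # keep only the phase

    # Quantize to 4 phase levels: 0, pi/2, pi, 3*pi/2.
    levels = np.round((phase % (2 * np.pi)) / (np.pi / 2)) % 4
    mask = levels * (np.pi / 2)
    ```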

  13. Computational power and generative capacity of genetic systems. (United States)

    Igamberdiev, Abir U; Shklovskiy-Kordi, Nikita E


    Semiotic characteristics of genetic sequences are based on the general principles of linguistics formulated by Ferdinand de Saussure, such as the arbitrariness of sign and the linear nature of the signifier. Besides these semiotic features that are attributable to the basic structure of the genetic code, the principle of generativity of genetic language is important for understanding biological transformations. The problem of generativity in genetic systems arises from the possibility of different interpretations of genetic texts, and corresponds to what Alexander von Humboldt called "the infinite use of finite means". These interpretations appear in individual development as the spatiotemporal sequences of realizations of different textual meanings, as well as the emergence of hyper-textual statements about the text itself, which underlies the process of biological evolution. These interpretations are accomplished at the level of the readout of genetic texts by the structures defined by Efim Liberman as "the molecular computer of cell", which includes DNA, RNA and the corresponding enzymes operating with molecular addresses. The molecular computer performs physically manifested mathematical operations and possesses both reading and writing capacities. Generativity paradoxically resides in the biological computational system as a possibility to incorporate meta-statements about the system, and thus establishes the internal capacity for its evolution. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  14. Integrating publicly-available data to generate computationally ... (United States)

    The adverse outcome pathway (AOP) framework provides a way of organizing knowledge related to the key biological events that result in a particular health outcome. For the majority of environmental chemicals, the availability of curated pathways characterizing potential toxicity is limited. Methods are needed to assimilate large amounts of available molecular data and quickly generate putative AOPs for further testing and use in hazard assessment. A graph-based workflow was used to facilitate the integration of multiple data types to generate computationally-predicted (cp) AOPs. Edges between graph entities were identified through direct experimental or literature information or computationally inferred using frequent itemset mining. Data from the TG-GATEs and ToxCast programs were used to channel large-scale toxicogenomics information into a cpAOP network (cpAOPnet) of over 20,000 relationships describing connections between chemical treatments, phenotypes, and perturbed pathways measured by differential gene expression and high-throughput screening targets. Sub-networks of cpAOPs for a reference chemical (carbon tetrachloride, CCl4) and outcome (hepatic steatosis) were extracted using the network topology. Comparison of the cpAOP subnetworks to published mechanistic descriptions for both CCl4 toxicity and hepatic steatosis demonstrate that computational approaches can be used to replicate manually curated AOPs and identify pathway targets that lack genomic markers.
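
    The frequent-itemset step can be illustrated with a toy example: entity pairs that co-occur in enough "transactions" become candidate cpAOP edges. The data below are invented for illustration; the TG-GATEs/ToxCast processing in the abstract operates at a far larger scale.

    ```python
    # Toy frequent-pair mining over invented toxicogenomics "transactions".
    from collections import Counter
    from itertools import combinations

    transactions = [
        {"CCl4", "steatosis", "PPARa"},
        {"CCl4", "steatosis", "SREBP"},
        {"CCl4", "steatosis"},
        {"ethanol", "steatosis"},
    ]

    min_support = 3
    pair_counts = Counter(
        pair for t in transactions for pair in combinations(sorted(t), 2)
    )
    # Pairs meeting the support threshold become candidate network edges.
    edges = [pair for pair, n in pair_counts.items() if n >= min_support]
    ```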

  15. Changing thoughts: a series of digital art holograms (United States)

    Azevedo, Isabel; Richardson, Martin


    Light is the one key essential quality of holography, and as such holograms are morphologically close to 'Optical', 'Kinetic' and 'Light Art'. In my attempt to explore this 'Kinetic' feature I collaborated with Martin Richardson in the Modern Holography Program at De Montfort University in Leicester, between March 2010 and 2011, to produce a series of digital art holograms and lenticulars with an open and experiential reference to light capture as an energetic element. These holograms were filmed using a 35mm digital camera on a moving rail system and are thus stereograms, then printed by GEOLA in Lithuania as reflection holograms measuring 50cm x 60cm. The title of this series of digital stereographic holograms is 'Changing Thoughts'. They allude to the interrelationship the observer has in assuming an understanding of what they see, only for that understanding to change suddenly when they find out that what they are seeing is actually something quite different from what they had understood. Much critical theorization in recent times has focused on the body and relates to the work of Merleau-Ponty. For António Damásio, the word 'images' means mental patterns in any sensory modality: not only visual images of static objects, but also auditory images, or internal bodily images like those described by Einstein when he was trying to solve problems.


    Directory of Open Access Journals (Sweden)

    Ahmad Suroso


    Full Text Available Promotional media continue to evolve, from word of mouth to today's digital era, following current technological development. In this digital era many applications have emerged that are used as promotional media. The medium developed here is a 3D hologram promotional medium: a 3D hologram catalogue of Pesona Bahari furniture, which currently exists only as a book of 2D images, and which will display the details of each furniture item being promoted. The 3D hologram is an example of a still relatively new technology that displays an object resembling its original form. The multimedia development method of Sutopo is used: concept, design, material collecting, assembly, testing, and distribution. The animation is built from the concept stage through to distribution to the general public. The distributed application contains 3D hologram animations of the selected furniture items.

  17. Application of Mathematical Symmetrical Group Theory in the Creation Process of Digital Holograms

    Directory of Open Access Journals (Sweden)

    Agustín Pérez-Ramírez


    Full Text Available This work presents an algorithm to reduce the multiplicative computational complexity in the creation of digital holograms, where an object is considered as a set of point sources, using mathematical symmetry properties of both the kernel of the Fresnel integral and the image. The image is modeled using group theory. The algorithm has multiplicative complexity equal to zero and additive complexity (k−1)N² for the case of sparse matrices or binary images, where k is the number of nonzero pixels and N² is the total number of points in the image.
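
    The zero-multiplication property for binary images can be demonstrated with a simpler (less general) variant of the idea: compute the Fresnel field of a single on-axis point source once, then build the hologram field by summing shifted copies of it, one per nonzero pixel. Shift invariance is assumed here; the paper's group-theoretic formulation is more general.

    ```python
    # Additions-only hologram synthesis for a binary image (illustrative sketch).
    import numpy as np

    N, wavelength, z, pitch = 128, 633e-9, 0.1, 10e-6
    k = 2 * np.pi / wavelength
    x = (np.arange(N) - N // 2) * pitch
    X, Y = np.meshgrid(x, x)
    kernel = np.exp(1j * k * (X**2 + Y**2) / (2 * z))  # Fresnel kernel, computed once

    img = np.zeros((N, N))
    img[40, 40] = img[40, 88] = img[88, 64] = 1.0      # k = 3 point sources

    field = np.zeros((N, N), dtype=complex)
    for r, c in zip(*np.nonzero(img)):                 # only additions per source
        field += np.roll(kernel, (r - N // 2, c - N // 2), axis=(0, 1))
    ```

    The per-source cost is N² complex additions, matching the (k−1)N² additive complexity quoted above once the first copy is counted as free.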

  18. A Generative Computer Model for Preliminary Design of Mass Housing

    Directory of Open Access Journals (Sweden)

    Ahmet Emre DİNÇER


    Full Text Available Today, we live in what we call the “Information Age”, an age in which information technologies are constantly being renewed and developed. Out of this has emerged a new approach called “Computational Design” or “Digital Design”. In addition to significantly influencing all fields of engineering, this approach has come to play a similar role in all stages of the design process in the architectural field. In providing solutions for analytical problems in design such as cost estimation, circulation system evaluation and environmental effects, which are similar to engineering problems, the approach is also used in the evaluation, representation and presentation of traditionally designed buildings. With developments in software and hardware technology, it has evolved through studies on the design of architectural products and their production with digital tools used in the preliminary design stages. This paper presents a digital model which may be used in the preliminary stage of mass housing design with Cellular Automata, one of the generative design systems based on computational design approaches. This computational model, developed as scripts for the 3Ds Max software, has been applied to a site plan design for mass housing, floor plan organizations based on user preferences, and facade designs. With the developed computer model, many alternative housing types can be rapidly produced. The interactive design tool of this computational model allows the user to transfer dimensional and functional housing preferences by means of the interface prepared for the model. The results of the study are discussed in the light of innovative architectural approaches.
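
    The generative principle behind such a model can be sketched with a toy cellular automaton in which built cells seed new housing units subject to a neighbour-count rule. The growth rule and probabilities below are invented; the paper's model runs as 3Ds Max scripts driven by user preferences.

    ```python
    # Toy 2D cellular automaton "growing" a housing layout from a seed cell.
    import random

    random.seed(1)
    N, STEPS = 12, 5
    grid = [[0] * N for _ in range(N)]
    grid[N // 2][N // 2] = 1                       # seed housing unit

    def neighbours(g, r, c):
        return sum(g[r + dr][c + dc]
                   for dr in (-1, 0, 1) for dc in (-1, 0, 1)
                   if (dr, dc) != (0, 0) and 0 <= r + dr < N and 0 <= c + dc < N)

    for _ in range(STEPS):
        nxt = [row[:] for row in grid]
        for r in range(N):
            for c in range(N):
                # An empty cell becomes a unit if it has 1-3 built neighbours
                # (an invented rule standing in for design preferences).
                if grid[r][c] == 0 and 1 <= neighbours(grid, r, c) <= 3 \
                        and random.random() < 0.5:
                    nxt[r][c] = 1
        grid = nxt

    units = sum(map(sum, grid))
    ```

    Re-running with different rules or seeds yields the "many alternative housing types" the abstract describes.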

  19. Nanometric holograms based on a topological insulator material (United States)

    Yue, Zengji; Xue, Gaolei; Liu, Juan; Wang, Yongtian; Gu, Min


    Holography has extremely extensive applications in conventional optical instruments spanning optical microscopy and imaging, three-dimensional displays and metrology. To integrate holography with modern low-dimensional electronic devices, holograms need to be thinned to a nanometric scale. However, to keep a pronounced phase shift modulation, the thickness of holograms has been generally limited to the optical wavelength scale, which hinders their integration with ultrathin electronic devices. Here, we break this limit and achieve 60 nm holograms using a topological insulator material. We discover that nanometric topological insulator thin films act as an intrinsic optical resonant cavity due to the unequal refractive indices in their metallic surfaces and bulk. The resonant cavity leads to enhancement of phase shifts and thus the holographic imaging. Our work paves a way towards integrating holography with flat electronic devices for optical imaging, data storage and information security.

  20. Efficient Verification of Holograms Using Mobile Augmented Reality. (United States)

    Hartl, Andreas Daniel; Arth, Clemens; Grubert, Jens; Schmalstieg, Dieter


    Paper documents such as passports, visas and banknotes are frequently checked by inspection of security elements. In particular, optically variable devices such as holograms are important, but difficult to inspect. Augmented Reality can provide all relevant information on standard mobile devices. However, hologram verification on mobile devices still takes a long time and provides lower accuracy than inspection by humans using appropriate reference information. We aim to address these drawbacks by automatic matching combined with a special parametrization of an efficient goal-oriented user interface which supports constrained navigation. We first evaluate a series of similarity measures for matching hologram patches to provide a sound basis for automatic decisions. Then a re-parametrized user interface is proposed based on observations of typical user behavior during document capture. These measures help to reduce capture time to approximately 15 s, with better decisions on the evaluated samples than untrained users can achieve.
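
    One representative similarity measure for matching image patches is normalized cross-correlation (NCC), sketched below. The paper evaluates a series of measures; NCC is shown here only as a generic example, not necessarily the measure the authors selected.

    ```python
    # Normalized cross-correlation between two equally sized patches.
    import numpy as np

    def ncc(a, b):
        a = a - a.mean()
        b = b - b.mean()
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float((a * b).sum() / denom) if denom else 0.0

    patch = np.array([[0.1, 0.8], [0.4, 0.9]])
    score_same = ncc(patch, patch)           # identical patches
    score_diff = ncc(patch, 1.0 - patch)     # contrast-inverted patch
    ```

    NCC ranges from -1 to 1 and is invariant to brightness and contrast shifts, which matters when hologram patches are captured under varying viewing angles.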

  1. Computer-generated fiscal reports for food cost accounting. (United States)

    Fromm, B; Moore, A N; Hoover, L W


    To optimize resource utilization for the provision of health-care services, well designed food cost accounting systems should facilitate effective decision-making. Fiscal reports reflecting the financial status of an organization at a given time must be current and representative so that managers have adequate data for planning and controlling. The computer-assisted food cost accounting discussed in this article can be integrated with other sub-systems and operations management techniques to provide the information needed to make decisions regarding revenues and expenses. Management information systems must be routinely evaluated and updated to meet the current needs of administrators. Further improvements in the food cost accounting system will be desirable whenever substantial changes occur within the foodservice operation at the University of Missouri-Columbia Medical Center or when advancements in computer technology provide more efficient methods for manipulating data and generating reports. Development of new systems and better applications of present systems could contribute significantly to the efficiency of operations in both health care and commercial foodservices. The computer-assisted food cost accounting system reported here might serve as a prototype for other management cost information systems.

  2. Computational Needs for the Next Generation Electric Grid Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Birman, Kenneth; Ganesh, Lakshmi; Renesse, Robbert van; Ferris, Michael; Hofmann, Andreas; Williams, Brian; Sztipanovits, Janos; Hemingway, Graham; University, Vanderbilt; Bose, Anjan; Srivastava, Anurag; Grijalva, Santiago; Ryan, Sarah M.; McCalley, James D.; Woodruff, David L.; Xiong, Jinjun; Acar, Emrah; Agrawal, Bhavna; Conn, Andrew R.; Ditlow, Gary; Feldmann, Peter; Finkler, Ulrich; Gaucher, Brian; Gupta, Anshul; Heng, Fook-Luen; Kalagnanam, Jayant R; Koc, Ali; Kung, David; Phan, Dung; Singhee, Amith; Smith, Basil


    The April 2011 DOE workshop, 'Computational Needs for the Next Generation Electric Grid', was the culmination of a year-long process to bring together some of the Nation's leading researchers and experts to identify computational challenges associated with the operation and planning of the electric power system. The attached papers provide a journey into these experts' insights, highlighting a class of mathematical and computational problems relevant for potential power systems research. While each paper defines a specific problem area, there were several recurrent themes. First, the breadth and depth of power system data has expanded tremendously over the past decade. This provides the potential for new control approaches and operator tools that can enhance system efficiencies and improve reliability. However, the large volume of data poses its own challenges, and could benefit from application of advances in computer networking and architecture, as well as data base structures. Second, the computational complexity of the underlying system problems is growing. Transmitting electricity from clean, domestic energy resources in remote regions to urban consumers, for example, requires broader, regional planning over multi-decade time horizons. Yet, it may also mean operational focus on local solutions and shorter timescales, as reactive power and system dynamics (including fast switching and controls) play an increasingly critical role in achieving stability and ultimately reliability. The expected growth in reliance on variable renewable sources of electricity generation places an exclamation point on both of these observations, and highlights the need for new focus in areas such as stochastic optimization to accommodate the increased uncertainty that is occurring in both planning and operations. Application of research advances in algorithms (especially related to optimization techniques and uncertainty quantification) could accelerate power

  3. Computer-Generated Experimental Designs for Irregular-Shaped Regions

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, Nam K.; Piepel, Gregory F.


    This paper focuses on the construction of computer-generated designs on irregularly-shaped, constrained regions. Overviews of the Fedorov exchange algorithm (FEA) and other exchange algorithms for the construction of D-optimal designs are given. A faster implementation of the FEA is presented, which is referred to as fast-FEA (denoted FFEA). The FFEA was applied to construct D-optimal designs for several published examples with constrained experimental regions. Designs resulting from the FFEA are more D-efficient than published designs, and provide benchmarks for future comparisons of design construction algorithms. The construction of G-optimal designs for constrained regions is also discussed and illustrated with a published example.
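
    The exchange idea behind the FEA can be sketched naively: starting from a random n-point design drawn from a candidate grid on the constrained region, repeatedly swap a design point for a candidate whenever the swap increases det(X'X). This toy version omits the rank-one update formulas that make implementations like the FFEA fast, and the triangular region and first-order model are illustrative assumptions.

    ```python
    # Naive exchange algorithm for a D-optimal design on a constrained region.
    import itertools
    import numpy as np

    rng = np.random.default_rng(0)

    # Candidate set: grid on the constrained region x + y <= 1, x, y >= 0.
    g = np.linspace(0, 1, 11)
    candidates = np.array([(x, y) for x, y in itertools.product(g, g) if x + y <= 1])

    def model_matrix(pts):                     # first-order model: 1, x, y
        return np.column_stack([np.ones(len(pts)), pts])

    def d_crit(pts):                           # D-criterion: det(X'X)
        X = model_matrix(pts)
        return np.linalg.det(X.T @ X)

    n = 6
    design = candidates[rng.choice(len(candidates), n, replace=False)]
    improved = True
    while improved:                            # sweep until no swap helps
        improved = False
        for i in range(n):
            for cand in candidates:
                trial = design.copy()
                trial[i] = cand
                if d_crit(trial) > d_crit(design) + 1e-12:
                    design = trial
                    improved = True
    ```

    For this model the exchanges drive the points toward the vertices of the triangle, the classic D-optimal support for a first-order model.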

  4. Computational Swarming: A Cultural Technique for Generative Architecture

    Directory of Open Access Journals (Sweden)

    Sebastian Vehlken


    Full Text Available After a first wave of digital architecture in the 1990s, the last decade saw approaches in which agent-based modelling and simulation (ABM was used for generative strategies in architectural design. By taking advantage of the self-organisational capabilities of computational agent collectives, whose global behaviour emerges from the local interaction of a large number of relatively simple individuals (as it does, for instance, in animal swarms), architects are able to understand buildings and urbanscapes in a novel way, as complex spaces constituted by the movement of multiple material and informational elements. As a major, zoo-technological branch of ABM, Computational Swarm Intelligence (SI coalesces all kinds of architectural elements – materials, people, environmental forces, traffic dynamics, etc. – into a collective population. Thereby, SI and ABM initiate a shift from geometric or parametric planning to time-based and less prescriptive software tools. Agent-based applications of this sort are used to model solution strategies in a number of areas where opaque and complex problems present themselves – from epidemiology to logistics, and from market simulations to crowd control. This article seeks to conceptualise SI and ABM as a fundamental and novel cultural technique for governing dynamic processes, taking their employment in generative architectural design as a concrete example. In order to avoid a rather conventional application of philosophical theories to this field, the paper explores how the procedures of such technologies can be understood in relation to the media-historical concept of Cultural Techniques.
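
    The emergence-from-simple-rules principle can be shown with a minimal agent sketch: every agent applies the same rule (move a fraction of the way toward the group's mean position), and the scattered population contracts into a cluster. Architectural SI/ABM tools elaborate far richer, genuinely local rules; this shows only the bare mechanism, and the global-mean rule is a simplification.

    ```python
    # Minimal agent collective: a single shared rule produces global clustering.
    import random

    random.seed(0)
    agents = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(30)]

    def step(agents, rate=0.1):
        cx = sum(a[0] for a in agents) / len(agents)
        cy = sum(a[1] for a in agents) / len(agents)
        return [(x + rate * (cx - x), y + rate * (cy - y)) for x, y in agents]

    spread_before = max(a[0] for a in agents) - min(a[0] for a in agents)
    for _ in range(50):
        agents = step(agents)
    spread_after = max(a[0] for a in agents) - min(a[0] for a in agents)
    ```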

  5. Interferometric key readable security holograms with secrete-codes

    Indian Academy of Sciences (India)

    A new method is described to create secrete-codes in the security holograms for enhancing their anti-counterfeiting characteristics. ... Scientific Instruments Organisation, Sector 30, Chandigarh 160 030, India; Department of Applied Physics, Guru Jambheshwar University of Science & Technology, Hisar 125 001, India ...

  6. Interferometric key readable security holograms with secrete-codes

    Indian Academy of Sciences (India)

    To imitate these codes is difficult as pure phase objects having complex phase distribution function are used to modulate the object beam that is recorded in conjunction with an encoded interferometric reference beam derived from a key hologram. Lloyd's folding mirror interferometer is used to convert phase variations of ...

  7. Interferometric key readable security holograms with secrete-codes

    Indian Academy of Sciences (India)

    Abstract. A new method is described to create secrete-codes in the security holograms for enhancing their anti-counterfeiting characteristics. To imitate these codes is difficult as pure phase objects having complex phase distribution function are used to modulate the object beam that is recorded in conjunction with an ...

  8. Automatic speech recognition for report generation in computed tomography

    International Nuclear Information System (INIS)

    Teichgraeber, U.K.M.; Ehrenstein, T.; Lemke, M.; Liebig, T.; Stobbe, H.; Hosten, N.; Keske, U.; Felix, R.


    Purpose: A study was performed to compare the performance of automatic speech recognition (ASR) with conventional transcription. Materials and Methods: 100 CT reports were generated by using ASR and 100 CT reports were dictated and written by medical transcriptionists. The time for dictation and correction of errors by the radiologist was assessed and the types of mistakes were analysed. The text recognition rate was calculated in both groups and the average time between completion of the imaging study by the technologist and generation of the written report was assessed. A commercially available speech recognition technology (ASKA Software, IBM Via Voice) running on a personal computer was used. Results: The time for dictation using digital voice recognition was 9.4±2.3 min compared to 4.5±3.6 min with an ordinary Dictaphone. The text recognition rate was 97% with digital voice recognition and 99% with medical transcriptionists. The average time from imaging completion to written report finalisation was reduced from 47.3 hours with medical transcriptionists to 12.7 hours with ASR. The analysis of misspellings demonstrated (ASR vs. medical transcriptionists): 3 vs. 4 syntax errors, 0 vs. 37 orthographic mistakes, 16 vs. 22 mistakes in substance and 47 vs. erroneously applied terms. Conclusions: The use of digital voice recognition as a replacement for medical transcription is recommendable when an immediate availability of written reports is necessary. (orig.) [de

  9. The fifth generation computer project state of the art report 111

    CERN Document Server



    The Fifth Generation Computer Project is a two-part book consisting of invited papers and an analysis. The invited papers examine various aspects of the Fifth Generation Computer Project. The analysis assesses the project's major advances and provides a balanced, comprehensive view of the state of the art in Fifth Generation Computer technology. The bibliography compiles the most important published material on the subject.

  10. Networked Microcomputers--The Next Generation in College Computing. (United States)

    Harris, Albert L.

    The evolution of computer hardware for college computing has mirrored the industry's growth. When computers were introduced into the educational environment, they had limited capacity and served one user at a time. Then came large mainframes with many terminals sharing the resource. Next, the use of computers in office automation emerged. As…

  11. Computational design of materials for solar hydrogen generation (United States)

    Umezawa, Naoto

    Photocatalysis has great potential for the production of hydrogen from aqueous solution under solar light. In this talk, two different approaches toward the computational design of materials for solar hydrogen generation will be presented. Tin (Sn), which has two major oxidation states, Sn2+ and Sn4+, is abundant in the earth's crust. Recently, a visible-light responsive photocatalytic H2 evolution reaction was identified over a mixed-valence tin oxide, Sn3O4. We have carried out crystal structure prediction for mixed-valence tin oxides of different atomic compositions under ambient pressure conditions using advanced computational methods based on evolutionary crystal-structure search and density-functional theory. The predicted novel crystal structures realize the desirable band gaps and band edge positions for H2 evolution under visible light irradiation. It is concluded that multivalent tin oxides have great potential as abundant, cheap and environmentally benign photofunctional materials for solar-energy conversion. Transition metal doping is effective for sensitizing SrTiO3 under visible light. We have theoretically investigated the roles of doped Cr in STO based on hybrid density-functional calculations. Cr atoms preferentially substitute for Ti under any equilibrium growth conditions. The lower oxidation state Cr3+, which is stabilized under an n-type condition of STO, is found to be advantageous for the photocatalytic performance. It is further predicted that lanthanum is the best codopant for stabilizing the favorable oxidation state, Cr3+. The prediction was validated by our experiments showing that La and Cr co-doped STO performs best among the examined samples. This work was supported by the Japan Science and Technology Agency (JST) Precursory Research for Embryonic Science and Technology (PRESTO) and the International Research Fellow program of the Japan Society for the Promotion of Science (JSPS) through project P14207.

  12. How Well Do Computer-Generated Faces Tap Face Expertise?

    Directory of Open Access Journals (Sweden)

    Kate Crookes

    Full Text Available The use of computer-generated (CG) stimuli in face processing research is proliferating due to the ease with which faces can be generated, standardised and manipulated. However, there has been surprisingly little research into whether CG faces are processed in the same way as photographs of real faces. The present study assessed how well CG faces tap face identity expertise by investigating whether two indicators of face expertise are reduced for CG faces when compared to face photographs. These indicators were accuracy for identification of own-race faces and the other-race effect (ORE), the well-established finding that own-race faces are recognised more accurately than other-race faces. In Experiment 1, Caucasian and Asian participants completed a recognition memory task for own- and other-race real and CG faces. Overall accuracy for own-race faces was dramatically reduced for CG compared to real faces and the ORE was significantly and substantially attenuated for CG faces. Experiment 2 investigated perceptual discrimination for own- and other-race real and CG faces with Caucasian and Asian participants. Here again, accuracy for own-race faces was significantly reduced for CG compared to real faces. However, the ORE was not affected by format. Together these results signal that CG faces of the type tested here do not fully tap face expertise. Technological advancement may, in the future, produce CG faces that are equivalent to real photographs. Until then, caution is advised when interpreting results obtained using CG faces.

  13. A new generation drilling rig: hydraulically powered and computer controlled

    Energy Technology Data Exchange (ETDEWEB)

    Laurent, M.; Angman, P.; Oveson, D. [Tesco Corp., Calgary, AB, (Canada)


    Development, testing and operation of a new generation of hydraulically powered and computer controlled drilling rig that incorporates a number of features that enhance functionality and productivity, is described. The rig features modular construction, a large heated common drilling machinery room, and permanently-mounted draw works which, along with the permanently installed top drive, significantly reduce rig-up/rig-down time. Also featured are closed and open hydraulic systems and a unique hydraulic distribution manifold. All functions are controlled through a programmable logic controller (PLC), providing almost unlimited interlocks and calculations to increase rig safety and efficiency. Simplified diagnostic routines, remote monitoring and troubleshooting are also part of the system. To date, two rigs are in operation. Performance of both rigs has been rated as 'very good'. Little or no operational problems have been experienced; downtime has averaged 0.61 per cent since August 1998, when the first of the two rigs went into operation. The most important future application for this rig is the casing drilling process, which eliminates the need for drill pipe and tripping. It also reduces the drilling time lost due to unscheduled events such as reaming, fishing and taking kicks while tripping. 1 tab., 6 figs.

  14. Computer generated slides: a need to curb our enthusiasm. (United States)

    Dalal, M D; Daver, B M


    The popular use of computer generated slides for presentations during plastic surgery scientific meetings has opened a fresh chapter in audiovisual techniques. Although the profusion of colours seen during presentations is a visual treat, the information imparted by these slides leaves much to be desired and raises the question of whether such attractive and apparently professionally made slides truly serve as visual aids during presentations. Presentation slides are displayed for a very short time (10-15 seconds) compared with slides displayed during a lecture, and therefore should impart their information very quickly. We conducted a study wherein 36 slides, each having a different colour combination, were displayed to a class of third-year medical students who were asked to judge the efficacy of each slide. The attractiveness, clarity and recall of each slide were graded by every student and, with the information obtained, the most effective format and colour combinations for presentation slides were established. We conclude that the best format for slides is a plain dark coloured background (blue, purple or green) and a separate, contrasting plain dark coloured title text background (red, green or purple), with white letters for the text and yellow letters for the title.

  15. Reconstruction of on-axis lensless Fourier transform digital hologram with the screen division method (United States)

    Jiang, Hongzhen; Liu, Xu; Liu, Yong; Li, Dong; Chen, Zhu; Zheng, Fanglan; Yu, Deqiang


    An effective approach for reconstructing an on-axis lensless Fourier transform digital hologram using the screen division method is proposed. First, the on-axis Fourier transform digital hologram is divided into sub-holograms. Then the reconstruction result of every sub-hologram is obtained by a Fourier transform operation, according to the position of the corresponding sub-hologram in the hologram reconstruction plane. Finally, the reconstruction image of the on-axis Fourier transform digital hologram is acquired by superposition of the reconstruction results of the sub-holograms. Compared with the traditional reconstruction method based on phase-shifting technology, in which multiple digital holograms must be recorded to obtain the reconstruction image, this method obtains the reconstruction image from a single digital hologram and therefore greatly simplifies the recording and reconstruction process of on-axis lensless Fourier transform digital holography. The effectiveness of the proposed method is demonstrated by experimental results, and it has potential applications in holographic measurement and display.
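
    The superposition step can be checked numerically. As a minimal sketch (not the authors' code; array sizes and names are illustrative), linearity of the discrete Fourier transform guarantees that summing the transforms of zero-padded sub-holograms reproduces the transform of the whole hologram:

```python
import numpy as np

def screen_division_reconstruction(hologram, tiles=2):
    """Divide the hologram into tiles x tiles sub-holograms, zero-pad each
    back to full size at its original position, Fourier transform it, and
    superpose the per-tile reconstructions."""
    n = hologram.shape[0]
    step = n // tiles
    recon = np.zeros((n, n), dtype=complex)
    for i in range(tiles):
        for j in range(tiles):
            padded = np.zeros((n, n), dtype=complex)
            padded[i*step:(i+1)*step, j*step:(j+1)*step] = \
                hologram[i*step:(i+1)*step, j*step:(j+1)*step]
            recon += np.fft.fft2(padded)  # reconstruction of one sub-hologram
    return recon

rng = np.random.default_rng(0)
holo = rng.standard_normal((64, 64))           # stand-in for a recorded hologram
direct = np.fft.fft2(holo)                     # single-step reconstruction
split = screen_division_reconstruction(holo)   # screen-division reconstruction
print(np.allclose(direct, split))              # True: the superposition matches
```

    The actual method additionally exploits the position of each sub-hologram in the reconstruction plane; this sketch only verifies the linearity on which the superposition relies.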

  16. Supporting hypothesis generation by learners exploring an interactive computer simulation

    NARCIS (Netherlands)

    van Joolingen, Wouter; de Jong, Anthonius J.M.


    Computer simulations provide environments enabling exploratory learning. Research has shown that these types of learning environments are promising applications of computer assisted learning but also that they introduce complex learning settings, involving a large number of learning processes. This

  17. Generative Computer Assisted Instruction: An Application of Artificial Intelligence to CAI. (United States)

    Koffman, Elliot B.

    Frame-oriented computer-assisted instruction (CAI) systems dominate the field, but these mechanized programed texts utilize the computational power of the computer to a minimal degree and are difficult to modify. Newer, generative CAI systems which are supplied with a knowledge of subject matter can generate their own problems and solutions, can…

  18. Design and visualization of synthetic holograms for security applications

    International Nuclear Information System (INIS)

    Škeren, M; Nývlt, M; Svoboda, J


    In this paper we present software for the design and visualization of holographic elements containing a full range of visual effects. It enables simulation of the observation of holographic elements under general conditions, including different light sources with various spectral and coherence properties and various reconstruction geometries. Furthermore, recent technologies offer interesting possibilities for 3D visualization, such as techniques based on shutter or polarization glasses, anaglyphs, etc. The presented software is compatible with these techniques and enables the use of 3D hardware tools for visualization. The software package can be used not only for visualization of existing designs, but also for fine tuning of the spatial, kinetic and color properties of a hologram. Moreover, holograms containing all types of 3D effects, general color mixing, kinetic behavior, diffractive cryptograms, etc. can be translated by the software directly into a high resolution micro-structure.

  19. Reconstruction dynamics of recorded holograms in photochromic glass. (United States)

    Mihailescu, Mona; Pavel, Eugen; Nicolae, Vasile B


    We have investigated the dynamics of the record-erase process of holograms in photochromic glass using continuous-wave Nd:YVO₄ laser radiation (λ=532 nm). A two-dimensional microgrid pattern was formed and visualized in photochromic glass, and the decay of its diffraction efficiency versus time (during the reconstruction step) gave us information (D, Δn) about the diffusion process inside the material. The recording and reconstruction processes were carried out in an off-axis setup, and the images of the reconstructed object were recorded by a CCD camera. Measurements on reconstructed object images, using holograms recorded at different incident laser powers, have shown a two-stage process involved in silver atom kinetics.
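
    The link between the measured decay and the diffusion coefficient can be illustrated with a generic model (not the authors' analysis; all values are illustrative): a sinusoidal grating with wavevector K decays under one-dimensional diffusion as exp(-D K² t), which a simple finite-difference simulation reproduces:

```python
import numpy as np

# Generic model: a sinusoidal grating u(x, 0) = sin(K x) decaying under
# one-dimensional diffusion u_t = D u_xx.  Analytically the amplitude decays
# as exp(-D K^2 t); the diffraction efficiency (proportional to the squared
# amplitude) therefore decays as exp(-2 D K^2 t).
D = 1e-2                            # diffusion coefficient (arbitrary units)
L, N = 1.0, 256                     # domain length and number of grid points
K = 2 * np.pi * 8 / L               # grating wavevector (8 periods in the box)
x = np.linspace(0.0, L, N, endpoint=False)
u = np.sin(K * x)                   # initial grating

dx = L / N
dt = 0.2 * dx**2 / D                # stable explicit time step
steps = 200
for _ in range(steps):              # explicit finite-difference diffusion
    u = u + D * dt / dx**2 * (np.roll(u, 1) - 2 * u + np.roll(u, -1))

t = steps * dt
amp = 2 * np.abs(np.fft.rfft(u)[8]) / N   # surviving grating amplitude
print(amp, np.exp(-D * K**2 * t))         # numerical vs analytic decay factor
```

    Fitting the measured efficiency decay to this exponential is what allows D to be extracted from the grating period and the observed time constant.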

  20. Phase holograms formed by silver halide (sensitized) gelatin processing. (United States)

    Graver, W R; Gladden, J W; Eastes, J W


    A novel recording process for the formation of phase volume holograms at up to 1500 cycles/mm is described. The term silver halide (sensitized) gelatin, or SHG, denotes an all-gelatin phase material that records the initial image information through photon absorption by the silver halide. Our process uses a reversal bleach that dissolves the developed silver image and cross-links the gelatin molecules in the vicinity of the developed image. Experiments have shown that the stored image consists of refractive-index differences within the remaining gelatin. The major attributes of SHG holograms are (1) panchromatic response, (2) 100:1 greater light sensitivity than dichromate (sensitized) gelatin, and (3) elimination of darkening (printout) effects.

  1. Rotating Detonation Combustion: A Computational Study for Stationary Power Generation (United States)

    Escobar, Sergio

    The increased availability of gaseous fossil fuels in the US has led to the substantial growth of stationary Gas Turbine (GT) usage for electrical power generation. In fact, from 2013 to 2014, out of the 11 terawatt-hours per day produced from fossil fuels, approximately 27% was generated through the combustion of natural gas in stationary GTs. The thermodynamic efficiency of simple-cycle GTs has increased from 20% to 40% during the last six decades, mainly due to research and development in the fields of combustion science, material science and machine design. However, additional improvements have become more costly and more difficult to obtain as the technology is further refined. An alternative way to improve GT thermal efficiency is the implementation of a combustion regime leading to pressure gain, rather than pressure loss, across the combustor. One concept being considered for this purpose is Rotating Detonation Combustion (RDC). RDC refers to a combustion regime in which a detonation wave propagates continuously in the azimuthal direction of a cylindrical annular chamber. In RDC, the fuel and oxidizer, injected from separate streams, are mixed near the injection plane and are then consumed by the detonation front traveling inside the annular gap of the combustion chamber. The detonation products then expand in the azimuthal and axial directions away from the detonation front and exit through the combustion chamber outlet. In the present study, Computational Fluid Dynamics (CFD) is used to predict the performance of RDC at operating conditions relevant to GT applications. As part of this study, a modeling strategy for RDC simulations was developed. The validation of the model was performed using benchmark cases with different levels of complexity. First, 2D simulations of non-reactive shock tubes and detonation tubes were performed. The numerical predictions obtained using different modeling parameters were compared with

  2. Interpreting USANS intensities from computer generated fractal structures

    International Nuclear Information System (INIS)

    Bertram, W.K.


    Full text: Recent developments in the technique of high resolution Ultra Small Angle Neutron Scattering (USANS) have made this an important tool for investigating the microstructure of a wide variety of materials, in particular those that exhibit scale invariance over a range of scale lengths. The USANS spectrum from a material may show scale invariance that is indicative of a fractal structure in the material, but it may also merely reflect the random nature of the sizes and shapes of the scattering entities that make up the material. USANS often allows us to measure the coherent elastic scattering cross sections well into the Guinier region. By analysing the measured scattering intensities using fractal-derived models, values are obtained for certain parameters from which properties of the material may be deduced. In particular, the porosity can be obtained provided the average volume of the constituents of the material can be calculated. One of the parameters in the analysis is the correlation length, which may be interpreted as the scale length beyond which the material ceases to be fractal. However, the relation between this parameter and an average particle size is not at all clear. To throw some light on this, we have used computer simulations to generate a number of fractal-like structures and to obtain their size distributions and porosities. USANS intensities were calculated from these structures and fitted using a standard fractal model to obtain values for the correlation lengths. The relation between porosity, average particle size and correlation length was investigated. Results are presented that show that the porosity of a fractal system is best calculated using the correlation length parameter to estimate the average particle volume.
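
    The power-law regime such fits rely on can be sketched with the commonly used mass-fractal structure factor of the Teixeira form (an assumption here; the record does not name the specific model): for 1/ξ << q << 1/r₀ the scattered intensity falls off as q^(-D):

```python
import math

def fractal_structure_factor(q, D, xi, r0=1.0):
    """Mass-fractal structure factor (Teixeira form).  In the fractal regime
    1/xi << q << 1/r0 it behaves as S(q) - 1 ~ q**(-D)."""
    return 1.0 + (D * math.gamma(D - 1.0)
                  * math.sin((D - 1.0) * math.atan(q * xi))
                  / ((q * r0) ** D
                     * (1.0 + (q * xi) ** -2) ** ((D - 1.0) / 2.0)))

D, xi = 2.5, 1e4        # fractal dimension and correlation length (illustrative)
q1, q2 = 0.1, 0.2       # two momentum transfers inside the fractal regime
s1 = fractal_structure_factor(q1, D, xi) - 1.0
s2 = fractal_structure_factor(q2, D, xi) - 1.0
slope = math.log(s2 / s1) / math.log(q2 / q1)
print(slope)            # log-log slope, close to -D = -2.5
```

    Fitting this form to measured intensities yields D and ξ; as the record notes, relating ξ to an average particle size is the subtle step.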

  3. Volume Holograms in Photopolymers: Comparison between Analytical and Rigorous Theories

    Directory of Open Access Journals (Sweden)

    Augusto Beléndez


    Full Text Available There is no doubt that the concept of volume holography has led to an incredibly great amount of scientific research and technological applications. One of these applications is the use of volume holograms as optical memories, and in particular, the use of a photosensitive medium like a photopolymeric material to record information throughout its volume. In this work we analyze the applicability of Kogelnik's Coupled Wave theory to the study of volume holograms recorded in photopolymers. Some of the theoretical models in the literature describing the mechanism of hologram formation in photopolymer materials use Kogelnik's theory to analyze the gratings recorded in photopolymeric materials. If Kogelnik's theory cannot be applied, it is necessary to use a more general Coupled Wave theory (CW) or the Rigorous Coupled Wave theory (RCW). The RCW incorporates no approximations and thus, since it is rigorous, permits judging the accuracy of the approximations included in Kogelnik's and CW theories. In this article, a comparison between the predictions of the three theories for phase transmission diffraction gratings is carried out. We demonstrate the agreement between the predictions of the CW and RCW theories, and the validity of Kogelnik's theory only for gratings with spatial frequencies higher than 500 lines/mm for the usual values of the refractive index modulations obtained in photopolymers.
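
    Kogelnik's central result, the on-Bragg diffraction efficiency of a lossless volume phase transmission grating, is compact enough to sketch directly (parameter values below are illustrative, not taken from the article):

```python
import math

def kogelnik_efficiency(delta_n, d, wavelength, theta):
    """On-Bragg diffraction efficiency of a lossless volume phase
    transmission grating (Kogelnik's coupled-wave theory):
        eta = sin^2(pi * delta_n * d / (wavelength * cos(theta)))
    theta is the Bragg angle inside the medium, in radians."""
    nu = math.pi * delta_n * d / (wavelength * math.cos(theta))
    return math.sin(nu) ** 2

# Illustrative photopolymer-like values (not taken from the article):
wavelength = 633e-9          # readout wavelength, m
d = 50e-6                    # grating thickness, m
theta = math.radians(10)     # Bragg angle inside the medium

# Efficiency reaches 100% when the argument equals pi/2, i.e. when
# delta_n = wavelength * cos(theta) / (2 * d):
dn_full = wavelength * math.cos(theta) / (2 * d)
print(kogelnik_efficiency(dn_full, d, wavelength, theta))   # ~1.0
```

    The CW and RCW theories replace this closed form with coupled-mode or rigorous numerical solutions; the article's comparison quantifies when the closed form above remains accurate.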

  4. Preliminary study of visual effect of multiplex hologram (United States)

    Fu, Huaiping; Xiong, Bingheng; Yang, Hong; Zhang, Xueguo


    The process of any movement of a real object can be recorded and displayed by a multiplex holographic stereogram. We have made an embossed multiplex holographic stereogram and a multiplex rainbow holographic stereogram: the multiplex rainbow holographic stereogram reconstructs a dynamic 2D line drawing of the speech organs, and the embossed multiplex holographic stereogram reconstructs the process of an old man drinking water. In this paper, we studied the visual result of an embossed multiplex holographic stereogram made from 80 frames of 2D pictures. Forty-eight people aged from 13 to 67 were asked to view the hologram and then to answer some questions about the viewing experience. The results indicate that this kind of hologram can be accepted by the human visual system without any problem. This paper also discusses the visual effect of multiplex holographic stereograms based on visual perceptual psychology. It reveals that planar multiplex holograms can record and present the movement of real animals and objects, and that viewers retain not only perceptual constancy for shape, size, colour, etc., but also perceptual constancy for binocular parallax.

  5. Next Generation Computer Resources: Reference Model for Project Support Environments (Version 2.0)

    National Research Council Canada - National Science Library

    Brown, Alan


    The objective of the Next Generation Computer Resources (NGCR) program is to restructure the Navy's approach to acquisition of standard computing resources to take better advantage of commercial advances and investments...

  6. Three-dimensional pseudo-random number generator for implementing in hybrid computer systems

    International Nuclear Information System (INIS)

    Ivanov, M.A.; Vasil'ev, N.P.; Voronin, A.V.; Kravtsov, M.Yu.; Maksutov, A.A.; Spiridonov, A.A.; Khudyakova, V.I.; Chugunkov, I.V.


    An algorithm for generating pseudo-random numbers oriented toward implementation on hybrid computer systems is considered. The proposed solution is characterized by a high degree of parallel computing.
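
    The abstract does not disclose the algorithm itself; as a generic illustration of why counter-based designs suit hybrid (CPU/GPU) systems, here is a splitmix64-style mixer in which every output is a pure function of its index, so disjoint index ranges can be generated by independent threads with no shared state:

```python
MASK64 = (1 << 64) - 1

def splitmix64(index, seed=0):
    """Counter-based generator: output i is a pure function of (seed, i),
    so disjoint index ranges can be filled by independent threads or
    accelerator kernels with no shared state to synchronize."""
    z = (seed + (index + 1) * 0x9E3779B97F4A7C15) & MASK64
    z = ((z ^ (z >> 30)) * 0xBF58476D1CE4E5B9) & MASK64
    z = ((z ^ (z >> 27)) * 0x94D049BB133111EB) & MASK64
    return (z ^ (z >> 31)) / 2**64          # uniform float in [0, 1)

# Evaluation order does not matter -- the stream is the same either way:
forward = [splitmix64(i) for i in range(8)]
backward = [splitmix64(i) for i in reversed(range(8))]
print(forward == list(reversed(backward)))  # True
```

    This index-addressable property, rather than any particular mixing function, is what makes such generators attractive for massively parallel hardware.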

  7. Computer skills for the next generation of healthcare executives. (United States)

    Côté, Murray J; Van Enyde, Donald F; DelliFraine, Jami L; Tucker, Stephen L


    Students beginning a career in healthcare administration must possess an array of professional and management skills in addition to a strong fundamental understanding of the field of healthcare administration. Proficient computer skills are a prime example of an essential management tool for healthcare administrators. However, it is unclear which computer skills are absolutely necessary for healthcare administrators and the extent of congruency between the computer skills possessed by new graduates and the needs of senior healthcare professionals. Our objectives in this research are to assess which computer skills are the most important to senior healthcare executives and recent healthcare administration graduates and to examine the level of agreement between the two groups. Based on a survey of senior healthcare executives and graduate healthcare administration students, we identify a comprehensive and pragmatic array of computer skills and categorize them into four groups, according to their importance, for making recent health administration graduates valuable in the healthcare administration workplace. Traditional parametric hypothesis tests are used to assess congruency between responses of senior executives and of recent healthcare administration graduates. For each skill, responses of the two groups are averaged to create an overall ranking of the computer skills. Not surprisingly, both groups agreed on the importance of computer skills for recent healthcare administration graduates. In particular, computer skills such as word processing, graphics and presentation, using operating systems, creating and editing databases, spreadsheet analysis, using imported data, e-mail, using electronic bulletin boards, and downloading information were among the highest ranked computer skills necessary for recent graduates. However, there were statistically significant differences in perceptions between senior executives and healthcare administration students as to the extent

  8. Multiplex hologram representations of space-variant optical systems using ground-glass encoded reference beams. (United States)

    Jones, M I; Walkup, J F; Hagler, M O


    Multiplex holograms can accurately represent 2-D space-variant optical systems if hologram-to-hologram cross talk can be reduced to an acceptable level. In the experiments reported here the combination of ground-glass diffusers with chirped wave front illumination proved to be an effective means of suppressing this cross talk. The residual cross talk distribution and intensity depend both on the signal and the optical system. The experiments also show that overlapping outputs from multiplexed holograms add coherently in amplitude. In addition, an accurate holographic representation of an extremely space-variant system is presented.

  9. Multi-wavelengths digital holography: reconstruction, synthesis and display of holograms using adaptive transformation. (United States)

    Memmolo, P; Finizio, A; Paturzo, M; Ferraro, P; Javidi, B


    A method based on spatial transformations of multiwavelength digital holograms and the correlation matching of their numerical reconstructions is proposed, with the aim to improve superimposition of different color reconstructed images. This method is based on an adaptive affine transform of the hologram that permits management of the physical parameters of numerical reconstruction. In addition, we present a procedure to synthesize a single digital hologram in which three different colors are multiplexed. The optical reconstruction of the synthetic hologram by a spatial light modulator at one wavelength allows us to display all color features of the object, avoiding loss of details.
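
    The multiplexing idea can be illustrated with a toy one-dimensional model (a sketch under assumed parameters, not the authors' adaptive-transform procedure): three "color" channels placed on distinct spatial-frequency carriers share one record and are recovered by band-pass filtering:

```python
import numpy as np

N = 1024
x = np.arange(N)
carriers = {"red": 60, "green": 120, "blue": 200}   # well-separated carrier bins
rng = np.random.default_rng(1)

# Three slowly varying "color" envelopes, each riding on its own carrier
envelopes = {c: 1.0 + 0.5 * np.cos(2 * np.pi * int(rng.integers(1, 5)) * x / N)
             for c in carriers}
multiplexed = sum(envelopes[c] * np.cos(2 * np.pi * carriers[c] * x / N)
                  for c in carriers)

def demultiplex(signal, k, half_width=10):
    """Recover one channel by keeping only the band around its carrier
    (and the mirrored negative-frequency band)."""
    spec = np.fft.fft(signal)
    keep = np.zeros(len(signal), dtype=bool)
    keep[k - half_width:k + half_width + 1] = True
    keep[len(signal) - k - half_width:len(signal) - k + half_width + 1] = True
    return np.fft.ifft(np.where(keep, spec, 0)).real

green = demultiplex(multiplexed, carriers["green"])
exact = envelopes["green"] * np.cos(2 * np.pi * carriers["green"] * x / N)
print(np.allclose(green, exact))   # True: the bands do not overlap
```

    As long as the channel bandwidths stay inside their carrier separations, recovery is exact; in the paper's optical setting, readout at a single wavelength plays the role of the filter.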

  10. (abstract) The Next Generation of Space Flight Computers (United States)

    Alkalaj, Leon; Panwar, Ramesh


    To meet new design objectives for drastic reductions in mass, size, and power consumption, the Flight Computer Development Group at JPL is participating in a design study and development of a light-weight, small-sized, low-power 3-D Space Flight Computer. In this paper, we will present a detailed design and tradeoff study of the proposed computer. We will also discuss a complete design of the multichip modules and their size, weight, and power consumption. Preliminary thermal models will also be discussed.

  11. Natural language computing an English generative grammar in Prolog

    CERN Document Server

    Dougherty, Ray C


    This book's main goal is to show readers how to use the linguistic theory of Noam Chomsky, called Universal Grammar, to represent English, French, and German on a computer using the Prolog computer language. In so doing, it presents a follow-the-dots approach to natural language processing, linguistic theory, artificial intelligence, and expert systems. The basic idea is to introduce meaningful answers to significant problems involved in representing human language data on a computer. The book offers a hands-on approach to anyone who wishes to gain a perspective on natural language

  12. Computational Challenges of Next Generation Sequencing Pipelines Using Heterogeneous Systems

    NARCIS (Netherlands)

    Houtgast, E.J.; Sima, V.M.; Bertels, K.L.M.; Al-Ars, Z.


    We are rapidly entering the era of genomics. The dramatic cost reduction of DNA sequencing due to the introduction of Next Generation Sequencing (NGS) techniques has resulted in an exponential growth of genetics data. The amount of data generated, and its associated processing into useful

  13. GenOVa: a computer program to generate orientational variants


    Cayron, Cyril


    A computer program called GenOVa, written in Python, calculates the orientational variants, the operators (special types of misorientations between variants) and the composition table associated with a groupoid structure. The variants can be represented by three-dimensional shapes or by pole figures.

  14. Stereo-hologram in discrete depth of field (Conference Presentation) (United States)

    Lee, Kwanghoon; Park, Min-Chul


    In holographic space, a continuous object space can be divided into several discrete spaces, each satisfying the same depth of field (DoF). In wearable holographic devices in particular, this concept applies to the macroscopic field, in contrast to the microscopic field. Because the perceiving power of the eye in the human visual system distinguishes objects in depth at much lower optical power than is needed in the microscopic field, the macroscopic field does not require high depth resolution. The continuous object space can therefore be represented by a discrete depth of field (DDoF) structure: a number of planes, each covering a sampled region chosen according to its DoF. Each DoF plane has to account for occlusion among object regions, so as to reproduce the occlusion phenomena induced along the visual axis within the eye's field of view. This makes the scene appear natural in the recognition process even though combined, discontinuous DoF regions replace the continuous object space. The DDoF approach thus brings advantages such as reduced computation time in generating and reconstructing the hologram. This work deals mainly with the factors required in a stereo-hologram HMD, such as the stereoscopic DoF according to convergence, the minimum number of DDoF planes under normal viewing conditions (within 10,000 mm), and the time saved in the whole holographic process by our method compared with existing ones. Consequently, this approach could be applied directly to the stereo-hologram HMD field to realize real-time holographic imaging.

  15. Phase hologram formation in highly concentrated phenanthrenequinone PMMA media (United States)

    Mahilny, U. V.; Marmysh, D. N.; Tolstik, A. L.; Matusevich, V.; Kowarschik, R.


    For phase holographic gratings in layers of polymethylmethacrylate, containing phenanthrenequinone in high concentration (nearly 3 mol%), a discrepancy between experimental (up to 9) and estimated (~45) magnitudes of the thermal diffusion amplification coefficient has been revealed. Analysis of plausible reasons of the lower experimental efficiency of the diffusion amplification has been carried out. The influence of material deformations on the reflection grating formation process was investigated experimentally. It is shown that thermoactivated amplification of holograms under high phenanthrenequinone concentration and its profound modulation are depressed by the arising density 'grating'.

  16. GPC-enhanced read-out of holograms

    DEFF Research Database (Denmark)

    Villangca, Mark Jayson; Bañas, Andrew Rafael; Palima, Darwin


    The Generalized Phase Contrast (GPC) method has been demonstrated to reshape light efficiently to match the input beam profile requirement of different illumination targets. A spatially coherent beam can be GPC-shaped into a variety of static and dynamic profiles to match e.g. fixed commercially available modulation systems or more irregular and dynamic shapes such as those found in advanced optogenetic light-excitations of neurons. In this work, we integrate a static GPC light shaper to illuminate a phase-only spatial light modulator encoding dynamic phase holograms. The GPC-enhanced phase ... truncation. The phase flatness of the GPC-enhanced readout beam has also been investigated.

  17. Unexpected death holograms: Animitas urban appeal in Chile

    Directory of Open Access Journals (Sweden)

    Lautaro Ojeda Ledesma


    Full Text Available This paper performs an integral analysis of the relation between popular religiousness and urban space in the Chilean practice of animitas [little shrines]. To do this, we propose a multipurpose analysis scheme built around the concept of the "unexpected death hologram". This scheme puts forward three complementary classifications: the animita as a holographic subject, as a holographic object and as a holographic place. These three classifications, supplemented by interviews and topological analyses, reveal nearly all the socio-spatial factors present in this practice, accounting for the urban importance of this type of popular practice.

  18. A Computational Differential Geometry Approach to Grid Generation

    CERN Document Server

    Liseikin, Vladimir D


    The process of breaking up a physical domain into smaller sub-domains, known as meshing, facilitates the numerical solution of partial differential equations used to simulate physical systems. This monograph gives a detailed treatment of applications of geometric methods to advanced grid technology. It focuses on and describes a comprehensive approach based on the numerical solution of inverted Beltramian and diffusion equations with respect to monitor metrics for generating both structured and unstructured grids in domains and on surfaces. In this second edition the author takes a more detailed and practice-oriented approach towards explaining how to implement the method by: Employing geometric and numerical analyses of monitor metrics as the basis for developing efficient tools for controlling grid properties. Describing new grid generation codes based on finite differences for generating both structured and unstructured surface and domain grids. Providing examples of applications of the codes to the genera...
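
    Not from the monograph itself, but a useful miniature of its monitor-metric machinery: in one dimension, grid generation with a monitor function reduces to the classical equidistribution principle, where each cell carries an equal share of the integral of the monitor w(x):

```python
import numpy as np

def equidistribute(w, a, b, n_cells, samples=10000):
    """1-D grid generation by equidistribution: place nodes so that every
    cell carries an equal share of the integral of the monitor w(x)."""
    x = np.linspace(a, b, samples)
    cell = (w(x[1:]) + w(x[:-1])) / 2 * np.diff(x)     # trapezoid pieces
    W = np.concatenate([[0.0], np.cumsum(cell)])       # cumulative integral
    targets = np.linspace(0.0, W[-1], n_cells + 1)
    return np.interp(targets, W, x)                    # invert W(x)

# Monitor concentrating nodes near a steep feature at x = 0.5 (illustrative)
w = lambda s: 1.0 + 50.0 * np.exp(-200.0 * (s - 0.5) ** 2)
grid = equidistribute(w, 0.0, 1.0, 20)
print(np.diff(grid).min(), np.diff(grid).max())   # small cells near the feature
```

    The monograph's inverted Beltrami and diffusion equations generalize this idea to metrics on surfaces and multi-dimensional domains.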

  19. Resemblance of the properties of superimposed volume holograms to the properties of human memory (United States)

    Orlov, V. V.


    According to current concepts in psychology, a collection of patterns stored in human memory has the property of integrity and contains new information not contained in the individual patterns. It is shown that superimposed volume holograms possess similar properties if the information in them is written by a method that excludes the appearance of crosstalk of the holograms.

  20. The complete book of holograms how they work and how to make them

    CERN Document Server

    Kasper, Joseph Emil


    Clear, thorough account, without complicated mathematics, explains the two models of holography - the geometric and the zone plate - and different types of holograms, including transmission, reflection, phase, projection, rainbow, and multiplex. The book also shows basic setups for making holograms and provides step-by-step instructions so readers can make their own.
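
    The zone-plate model the book builds on can be made concrete: a hologram of a point source is a Fresnel zone plate whose n-th zone boundary sits at r_n = sqrt(n·λ·f), so it focuses wavelength λ at focal length f (the values below are illustrative):

```python
import math

def zone_radius(n, wavelength, focal_length):
    """Radius of the n-th Fresnel zone boundary, r_n = sqrt(n*lambda*f),
    valid in the paraxial regime r_n << f."""
    return math.sqrt(n * wavelength * focal_length)

wavelength = 633e-9     # He-Ne readout wavelength, m (illustrative)
f = 0.25                # desired focal length, m
radii_mm = [1e3 * zone_radius(n, wavelength, f) for n in range(1, 6)]
print(radii_mm)         # zone boundaries grow as sqrt(n)

# Consistency check: the extra path from the n-th zone edge to the focus
# is n * lambda / 2 (up to a tiny paraxial correction)
n = 5
r = zone_radius(n, wavelength, f)
path_diff = math.sqrt(r**2 + f**2) - f
print(path_diff, n * wavelength / 2)
```

    Successive zones thus alternate between constructive and destructive interference at the focus, which is exactly the interference pattern a point-object hologram records.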

  1. A Scanning Hologram Recorded by Phase Conjugate Property of Nonlinear Crystals

    DEFF Research Database (Denmark)

    Zi-Liang, Ping; Dalsgaard, Erik


    A method of recording a scanning hologram using the phase conjugate property of a nonlinear crystal is provided. The principle of recording, the setup and experiments are given.

  2. Condition Driven Adaptive Music Generation for Computer Games (United States)

    Naushad, Alamgir; Muhammad, Tufail


    The video game industry has grown into a multi-billion dollar, worldwide industry. Background music adapts to the specific game content during play. Adaptive music should be further explored by looking at particular conditions in the game; such conditions drive the generation of specific background music that best fits the active game content throughout the length of the gameplay. This research paper outlines the use of condition-driven adaptive music generation to incorporate adaptivity dynamically into the game's audio.

  3. Computer Image Generation: Advanced Visual/Sensor Simulation. (United States)


    levels of subdivision. 3. The cubic surface may not be appropriate for all surfaces. Flat surfaces may be rendered with slight undulations due to adjacent...dimensions. The alternative is to present cultural objects at a subpixel level, which is simpler but not as dependable because subpixel-size objects will... Doctoral dissertation, University of Utah, December 1978. Carpenter, L.G., "Computer Rendering of Fractal Curves and Surfaces," Siggraph '80, Special

  4. Photodeposited diffractive optical elements of computer generated masks

    International Nuclear Information System (INIS)

    Mirchin, N.; Peled, A.; Baal-Zedaka, I.; Margolin, R.; Zagon, M.; Lapsker, I.; Verdyan, A.; Azoulay, J.


    Diffractive optical elements (DOE) were synthesized on plastic substrates using the photodeposition (PD) technique by depositing amorphous selenium (a-Se) films with argon lasers and UV spectra light. The thin films were deposited typically onto polymethylmethacrylate (PMMA) substrates at room temperature. Scanned beam and contact mask modes were employed using computer-designed DOE lenses. Optical and electron micrographs characterize the surface details. The films were typically 200 nm thick

  5. Computer program for automatic generation of BWR control rod patterns

    International Nuclear Information System (INIS)

    Taner, M.S.; Levine, S.H.; Hsia, M.Y.


    A computer program named OCTOPUS has been developed to automatically determine a control rod pattern that approximates some desired target power distribution as closely as possible without violating any thermal safety or reactor criticality constraints. The program OCTOPUS performs a semi-optimization task based on the method of approximation programming (MAP) to develop control rod patterns. The SIMULATE-E code is used to determine the nucleonic characteristics of the reactor core state

  6. Photodeposited diffractive optical elements of computer generated masks

    Energy Technology Data Exchange (ETDEWEB)

    Mirchin, N. [Electrical and Electronics Engineering Department, Holon Academic Institute of Technology, 52 Golomb Street, Holon 58102 (Israel)]. E-mail:; Peled, A. [Electrical and Electronics Engineering Department, Holon Academic Institute of Technology, 52 Golomb Street, Holon 58102 (Israel); Baal-Zedaka, I. [Electrical and Electronics Engineering Department, Holon Academic Institute of Technology, 52 Golomb Street, Holon 58102 (Israel); Margolin, R. [Electrical and Electronics Engineering Department, Holon Academic Institute of Technology, 52 Golomb Street, Holon 58102 (Israel); Zagon, M. [Electrical and Electronics Engineering Department, Holon Academic Institute of Technology, 52 Golomb Street, Holon 58102 (Israel); Lapsker, I. [Physics Department, Holon Academic Institute of Technology, 52 Golomb Street, Holon 58102 (Israel); Verdyan, A. [Physics Department, Holon Academic Institute of Technology, 52 Golomb Street, Holon 58102 (Israel); Azoulay, J. [Physics Department, Holon Academic Institute of Technology, 52 Golomb Street, Holon 58102 (Israel)


    Diffractive optical elements (DOE) were synthesized on plastic substrates using the photodeposition (PD) technique by depositing amorphous selenium (a-Se) films with argon lasers and UV spectra light. The thin films were deposited typically onto polymethylmethacrylate (PMMA) substrates at room temperature. Scanned beam and contact mask modes were employed using computer-designed DOE lenses. Optical and electron micrographs characterize the surface details. The films were typically 200 nm thick.

  7. Recognition of Computer-Generated Pictures on Monochrome Monitors. (United States)

    Baker, Patti R.; And Others


    This study investigated whether second, third, and fourth graders could recognize microcomputer-generated color graphics displayed on monochromatic monitors. It was found that subjects were unable to discern critical features of a color graphic displayed on a monochromatic screen unless it was designed to enhance figure/ground separation.…

  8. Computation of Superconducting Generators for Wind Turbine Applications

    DEFF Research Database (Denmark)

    Rodriguez Zermeno, Victor Manuel

    The idea of introducing a superconducting generator for offshore wind turbine applications has received increasing support. It has been proposed as a way to meet energy market requirements and policies demanding clean energy sources in the near future. However, design considerations have to take ...

  9. All-dielectric meta-holograms with holographic images transforming longitudinally

    KAUST Repository

    Wang, Qiu


    Metasurfaces are unique subwavelength geometries capable of engineering electromagnetic waves at will, delivering new opportunities for holography. Most previous meta-holograms, so-called phase-only meta-holograms, modulate only the amplitude distribution of a virtual object, and require optimizing techniques to improve the image quality. However, the phase distribution of the reconstructed image is usually overlooked in previous studies, leading to inevitable information loss. Here, we demonstrate all-dielectric meta-holograms that allow tailoring of both the phase and amplitude distributions of virtual objects. Several longitudinal manipulations of the holographic images are theoretically and experimentally demonstrated, including shifting, stretching, and rotating, enabling a large depth of focus. Furthermore, a new meta-hologram with a three-dimensional holographic design method is demonstrated with an even enhanced depth of focus. The proposed meta-holograms offer more freedom in holographic design and open new avenues for designing complex three-dimensional holography.

  10. Generation of a Reconfigurable Logical Cell Using Evolutionary Computation

    Directory of Open Access Journals (Sweden)

    I. Campos-Cantón


    Full Text Available In nature, an interesting question is how a cell can be reconfigured in order to achieve a different task. Another is how learning takes place, which appears to be a trial-and-error process. In this work, we present mechanisms for producing a reconfigurable logical cell based on the tent map. The reconfiguration is realized by modifying the cell's internal parameters, generating several logical functions within the same structure. The logical cell is built from three blocks: the initial-condition generating function, the tent map, and the output function. Furthermore, we propose a reconfigurable structure based on a chaotic system, and an evolutionary algorithm is used to tune the parameters of the cell via a trial-and-error process.
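
    The cell described above can be sketched in a few lines: inputs shift the initial condition, one map iteration is applied, and the output is read by thresholding, so "reconfiguration" is just a change of three numbers. The parameter triples below are illustrative choices in the spirit of chaotic-computing schemes, not values taken from the paper.

```python
def tent(x):
    """Tent map on [0, 1]."""
    return 2 * x if x < 0.5 else 2 * (1 - x)

# (x0, delta, threshold) triples realizing each gate; reconfiguring the
# cell means changing only these three parameters, not the structure.
GATES = {
    "AND":  (0.0,   0.25,  0.75),
    "OR":   (0.125, 0.25,  0.5),
    "XOR":  (0.25,  0.25,  0.75),
    "NAND": (0.5,   0.125, 0.6),
}

def logic_cell(gate, i1, i2):
    x0, delta, thr = GATES[gate]
    x = x0 + (i1 + i2) * delta   # encode the two binary inputs
    return int(tent(x) > thr)    # one iteration, then threshold readout

for gate in GATES:
    table = [logic_cell(gate, a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]]
    print(gate, table)
```

    Note that XOR, which is not linearly separable, falls out of the same structure because the tent map is non-monotonic.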

  11. Computer Controlled MHD Power Consolidation and Pulse Generation System (United States)


    applying the PASC technology to the diagonal generator connection. 3.2.1 Modeling the PASC Process Using EMTP; 3.2.2 Discussion of Results. ...The results shown were obtained using the Electromagnetic Transients Program (EMTP) to integrate the given circuit configuration. The results indicate that...

  12. Condition Driven Adaptive Music Generation for Computer Games


    Naushad, Alamgir


    The video game industry has grown into a multi-billion dollar, worldwide industry. The background music adapts to the specific game content over the length of play. Adaptive music should be further explored by looking at particular conditions in the game; such a condition drives the generation of specific background music that best fits the active game content throughout the gameplay. This research paper outlines the use of condi...

  13. Computational Modeling of Meteor-Generated Ground Pressure Signatures (United States)

    Nemec, Marian; Aftosmis, Michael J.; Brown, Peter G.


    We present a thorough validation of a computational approach to predict infrasonic signatures of centimeter-sized meteoroids. We assume that the energy deposition along the meteor trail is dominated by atmospheric drag and simulate the steady, inviscid flow of air in thermochemical equilibrium to compute the meteoroid's near-body pressure signature. This signature is then propagated through a stratified and windy atmosphere to the ground using a methodology adapted from aircraft sonic-boom analysis. An assessment of the numerical accuracy of the near field and the far field solver is presented. The results show that when the source of the signature is the cylindrical Mach-cone, the simulations closely match the observations. The prediction of the shock rise-time, the zero-peak amplitude of the waveform, and the duration of the positive pressure phase are consistently within 10% of the measurements. Uncertainty in the shape of the meteoroid results in a poorer prediction of the trailing part of the waveform. Overall, our results independently verify energy deposition estimates deduced from optical observations.

  14. Dynamic wave field synthesis: enabling the generation of field distributions with a large space-bandwidth product. (United States)

    Kamau, Edwin N; Heine, Julian; Falldorf, Claas; Bergmann, Ralf B


    We present a novel approach for the design and fabrication of multiplexed computer generated volume holograms (CGVH) which allow for a dynamic synthesis of arbitrary wave field distributions. To achieve this goal, we developed a hybrid system that consists of a CGVH as a static element and an electronically addressed spatial light modulator as the dynamic element. We thereby derived a new model for describing the scattering process within the inhomogeneous dielectric material of the hologram. This model is based on the linearization of the scattering process within the Rytov approximation and incorporates physical constraints that account for voxel based laser-lithography using micro-fabrication of the holograms in a nonlinear optical material. In this article we demonstrate that this system basically facilitates a high angular Bragg selectivity on the order of 1°. Additionally, it allows for a qualitatively low cross-talk dynamic synthesis of predefined wave fields with a much larger space-bandwidth product (SBWP ≥ 8.7 × 10^6) as compared to the current state of the art in computer generated holography.

  15. Efficient generation of graph states for quantum computation

    International Nuclear Information System (INIS)

    Clark, S R; Alves, C Moura; Jaksch, D


    We present an entanglement generation scheme which allows arbitrary graph states to be efficiently created in a linear quantum register via an auxiliary entangling bus (EB). The dynamical evolution of the EB is described by an effective non-interacting fermionic system undergoing mirror-inversion in which qubits, encoded as local fermionic modes, become entangled purely by Fermi statistics. We discuss a possible implementation using two species of neutral atoms stored in an optical lattice and find that the scheme is realistic in its requirements even in the presence of noise.
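
    For reference, the target states of such a scheme are easy to write down: a graph state is obtained by applying a controlled-Z gate along each graph edge to qubits prepared in |+⟩. The sketch below builds a small graph state by direct state-vector bookkeeping; it illustrates the states being prepared, not the paper's entangling-bus dynamics.

```python
import numpy as np

def graph_state(n, edges):
    """State vector of the graph state on n qubits with the given edges."""
    psi = np.full(2**n, 2**(-n / 2))         # |+>^n: uniform amplitudes
    for z in range(2**n):
        bits = [(z >> (n - 1 - q)) & 1 for q in range(n)]
        for a, b in edges:                    # CZ flips the sign of |..1..1..>
            if bits[a] and bits[b]:
                psi[z] *= -1
    return psi

# 3-qubit linear cluster state: edges 0-1 and 1-2
psi = graph_state(3, [(0, 1), (1, 2)])
print(np.round(psi, 3))
```

    Every basis amplitude has magnitude 2^(-n/2); only the sign pattern, (-1) raised to the number of edges whose endpoints are both 1, carries the entanglement structure.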

  16. Automatic Model Generation Framework for Computational Simulation of Cochlear Implantation

    DEFF Research Database (Denmark)

    Mangado Lopez, Nerea; Ceresa, Mario; Duchateau, Nicolas


    To address such a challenge, we propose an automatic framework for the generation of patient-specific meshes for finite element modeling of the implanted cochlea. First, a statistical shape model is constructed from high-resolution anatomical μCT images. Then, by fitting the statistical model to a patient's CT image, an accurate model of the patient-specific cochlea anatomy is obtained. An algorithm based on the parallel transport frame is employed to perform the virtual insertion of the cochlear implant. Our automatic framework also incorporates the surrounding bone and nerve fibers and assigns...

  17. Automatic generation of computer programs servicing TFTR console displays

    International Nuclear Information System (INIS)

    Eisenberg, H.


    A number of alternatives were considered in providing programs to support the several hundred displays required for control and monitoring of TFTR equipment. Since similar functions were performed, an automated method of creating programs was suggested. The complexity of a single program servicing as many as thirty consoles militated against that approach. Similarly, creation of a syntactic language, while elegant, was deemed too time-consuming, and had the disadvantage of requiring a working knowledge of the language on a programming level. It was elected to pursue a method of generating an individual program to service a particular display. A feasibility study was conducted and the Control and Monitor Display Generator system (CMDG) was developed. A Control and Monitor Display Service Program (CMDS) provides a means of performing monitor and control functions for devices associated with TFTR subsystems, as well as other user functions, via TFTR Control Consoles. This paper discusses the specific capabilities provided by CMDS in a usage context, as well as the mechanics of implementation

  18. A comprehensive approach to decipher biological computation to achieve next generation high-performance exascale computing.

    Energy Technology Data Exchange (ETDEWEB)

    James, Conrad D.; Schiess, Adrian B.; Howell, Jamie; Baca, Michael J.; Partridge, L. Donald; Finnegan, Patrick Sean; Wolfley, Steven L.; Dagel, Daryl James; Spahn, Olga Blum; Harper, Jason C.; Pohl, Kenneth Roy; Mickel, Patrick R.; Lohn, Andrew; Marinella, Matthew


    The human brain (volume ≈ 1200 cm^3) consumes 20 W and is capable of performing > 10^16 operations/s. Current supercomputer technology has reached 10^15 operations/s, yet it requires 1500 m^3 and 3 MW, giving the brain a 10^12 advantage in operations/s/W/cm^3. Thus, to reach exascale computation, two achievements are required: 1) improved understanding of computation in biological tissue, and 2) a paradigm shift towards neuromorphic computing where hardware circuits mimic properties of neural tissue. To address 1), we will interrogate corticostriatal networks in mouse brain tissue slices, specifically with regard to their frequency filtering capabilities as a function of input stimulus. To address 2), we will instantiate biological computing characteristics such as multi-bit storage into hardware devices with future computational and memory applications. Resistive memory devices will be modeled, designed, and fabricated in the MESA facility in consultation with our internal and external collaborators.
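
    The quoted 10^12 figure of merit (operations per second, per watt, per cm^3) can be checked directly from the numbers in the abstract:

```python
# Back-of-envelope check of the brain-vs-supercomputer figure of merit.
brain_ops, brain_W, brain_cm3 = 1e16, 20.0, 1200.0
super_ops, super_W, super_cm3 = 1e15, 3e6, 1500.0 * 1e6   # 1500 m^3 in cm^3

brain_fom = brain_ops / (brain_W * brain_cm3)   # ops/s/W/cm^3
super_fom = super_ops / (super_W * super_cm3)
advantage = brain_fom / super_fom
print(f"advantage: {advantage:.2e}")            # on the order of 10^12
```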

  19. Cloud Computing Infusion for Generating ESDRs of Visible Spectra Radiances (United States)

    Golpayegani, N.; Halem, M.; Nguyen, P.


    The AIRS and AVHRR instruments have been collecting radiances of the Earth in the visible spectrum for over 25 years. These measurements have been used to develop such useful products as NDVI, snow cover and depth, outgoing longwave radiation, and other products. Yet, no long-term data record of the level 1b visible spectra is available in grid form to researchers for various climate studies. We present here an Earth System Data Record observed in the visible spectrum as gridded radiance fields of 8kmx10km grid resolution for six years in the case of AIRS and from 1981 to the present for AVHRR. The AIRS data has four visible channels from 0.41μm to 0.94μm with an IFOV of 1 km, and AVHRR has two visible channels in the 0.58μm to 1.00μm range, also at 1 km. In order to process such large amounts of data on demand, two components need to be implemented: (i) a processing system capable of gridding TBs of data in a reasonable amount of time and (ii) a download mechanism to access and deliver the data to the processing system. We implemented a cloud computing approach to be able to process such large amounts of data. We use Hadoop, a distributed computation system developed by the Apache Software Foundation. With Hadoop, we are able to store the data in a distributed fashion, taking advantage of Hadoop's distributed file system (DFS). We also take advantage of Hadoop's MapReduce functionality to perform as much computation as possible on the available nodes of the UMBC bluegrit Cell cluster system that contain the data. We make use of the SOAR system developed under the ACCESS program to acquire and process the AIRS and AVHRR observations. Comparisons of the AIRS data with selected periods of MODIS visible spectral channels on the same satellite indicate the two instruments have maintained calibration consistency and continuity of their measurements over the six-year period. Our download mechanism transfers the data from these instruments into Hadoop's DFS. Our
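
    The gridding step naturally fits the MapReduce pattern: map each footprint to a grid cell, then reduce by averaging radiances per cell. The sketch below is an illustrative plain-Python map/reduce over synthetic observations, not the authors' Hadoop job; the cell size is an assumption.

```python
from collections import defaultdict

CELL_DEG = 0.1  # assumed grid spacing in degrees (roughly 10 km); illustrative

def mapper(obs):
    """(lat, lon, radiance) -> (cell_key, radiance)."""
    lat, lon, rad = obs
    key = (int(lat // CELL_DEG), int(lon // CELL_DEG))
    return key, rad

def reducer(pairs):
    """Average all radiances that fall into the same grid cell."""
    sums = defaultdict(lambda: [0.0, 0])
    for key, rad in pairs:
        sums[key][0] += rad
        sums[key][1] += 1
    return {key: s / n for key, (s, n) in sums.items()}

obs = [(10.02, 20.01, 1.0), (10.03, 20.04, 3.0), (10.31, 20.01, 5.0)]
grid = reducer(map(mapper, obs))
print(grid)   # first two observations share a cell and are averaged
```

    In an actual Hadoop job the mapper and reducer run on the nodes holding the data, so the grid is assembled without moving raw footprints across the cluster.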

  20. Thermoelectric cooling of microelectronic circuits and waste heat electrical power generation in a desktop personal computer

    International Nuclear Information System (INIS)

    Gould, C.A.; Shammas, N.Y.A.; Grainger, S.; Taylor, I.


    Thermoelectric cooling and micro-power generation from waste heat within a standard desktop computer has been demonstrated. A thermoelectric test system has been designed and constructed, with typical test results presented for thermoelectric cooling and micro-power generation when the computer is executing a number of different applications. A thermoelectric module, operating as a heat pump, can lower the operating temperature of the computer's microprocessor and graphics processor to temperatures below ambient conditions. A small amount of electrical power, typically in the micro-watt or milli-watt range, can be generated by a thermoelectric module attached to the outside of the computer's standard heat sink assembly, when a secondary heat sink is attached to the other side of the thermoelectric module. Maximum electrical power can be generated by the thermoelectric module when a water cooled heat sink is used as the secondary heat sink, as this produces the greatest temperature difference between both sides of the module.
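
    A rough estimate of why the harvested power lands in the milliwatt range follows from the standard matched-load formula: open-circuit voltage V = S·ΔT and maximum power P = V²/(4R). The module values below are assumed for illustration, not measurements from the paper.

```python
S_module = 0.05   # V/K, assumed effective Seebeck coefficient of the module
dT = 10.0         # K, assumed temperature difference across the module
R_internal = 2.0  # ohm, assumed internal resistance

V_oc = S_module * dT                  # open-circuit voltage
P_max = V_oc**2 / (4 * R_internal)    # power into a matched load
print(f"{P_max * 1e3:.2f} mW")        # tens of milliwatts
```

    A water-cooled secondary heat sink raises ΔT, and since P scales with ΔT², even a modest increase in temperature difference noticeably increases the harvested power.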

  1. Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology (Final Report) (United States)

    EPA announced the release of the final report, Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology. This report describes new approaches that are faster, less resource-intensive, and more robust, and that can help ...

  2. Computing in high-energy physics: facing a new generation of experiments

    International Nuclear Information System (INIS)

    Zanella, P.


    Computing pervades nearly every aspect of activities in contemporary high-energy physics. The paper discusses the range of tasks requiring computing, and reviews the principal ways in which these are handled. Examples are given of typical computing applications with particular reference to activities at CERN, and some attempt is made to identify the main trends. The new generation of experiments, typified by colliding beam facilities, creates new requirements for computing and distributed processing. These are discussed in the light of the new and developing computer technology, which is seen as being essential to satisfy these requirements. (Auth.)

  3. The comparative effect of individually-generated vs. collaboratively-generated computer-based concept mapping on science concept learning (United States)

    Kwon, So Young

    Using a quasi-experimental design, the researcher investigated the comparative effects of individually-generated and collaboratively-generated computer-based concept mapping on middle school science concept learning. Qualitative data were analyzed to explain quantitative findings. One hundred sixty-one students (74 boys and 87 girls) in eight seventh-grade science classes at a middle school in Southeast Texas completed the entire study. Using prior science performance scores to assure equivalence of student achievement across groups, the researcher assigned the teacher's classes to one of the three experimental groups. The independent variable, group, consisted of three levels: 40 students in a control group, 59 students trained to individually generate concept maps on computers, and 62 students trained to collaboratively generate concept maps on computers. The dependent variables were science concept learning as demonstrated by comprehension test scores, and quality of concept maps created by students in experimental groups as demonstrated by rubric scores. Students in the experimental groups received concept mapping training and used their newly acquired concept mapping skills to individually or collaboratively construct computer-based concept maps during study time. The control group, the individually-generated concept mapping group, and the collaboratively-generated concept mapping group had equivalent learning experiences for 50 minutes during five days, except that students in the control group worked independently without concept mapping activities, students in the individual group worked individually to construct concept maps, and students in the collaborative group worked collaboratively to construct concept maps during their study time. Both collaboratively and individually generated computer-based concept mapping had a positive effect on seventh grade middle school science concept learning but neither strategy was more effective than the other.
However

  4. Assessing Human Judgment of Computationally Generated Swarming Behavior

    Directory of Open Access Journals (Sweden)

    John Harvey


    Full Text Available Computer-based swarm systems, aiming to replicate the flocking behavior of birds, were first introduced by Reynolds in 1987. In his initial work, Reynolds noted that while it was difficult to quantify the dynamics of the behavior from the model, observers of his model immediately recognized them as a representation of a natural flock. Considerable analysis has been conducted since then on quantifying the dynamics of flocking/swarming behavior. However, no systematic analysis has been conducted on human identification of swarming. In this paper, we assess subjects’ identification of the behavior of a simplified version of Reynolds’ model. Factors that affect the identification of swarming are discussed and future applications of the resulting models are proposed. Differences in decision times for swarming-related questions asked during the study indicate that different brain mechanisms may be involved in different elements of the behavior assessment task. The relatively simple but finely tunable model used in this study provides a useful methodology for assessing individual human judgment of swarming behavior.
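
    A simplified Reynolds-style model of the kind the study describes combines three per-agent rules (cohesion, alignment, separation) over a local neighborhood. The sketch below is illustrative; the neighborhood radius and rule gains are assumptions, not the study's tuned parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 30
pos = rng.uniform(0, 10, (N, 2))   # positions in a 10x10 arena
vel = rng.uniform(-1, 1, (N, 2))   # initial velocities

def step(pos, vel, dt=0.1, r=2.0):
    """One update of the three flocking rules over neighbors within radius r."""
    new_vel = vel.copy()
    for i in range(N):
        d = np.linalg.norm(pos - pos[i], axis=1)
        nb = (d < r) & (d > 0)
        if nb.any():
            cohesion = pos[nb].mean(axis=0) - pos[i]     # steer toward center
            alignment = vel[nb].mean(axis=0) - vel[i]    # match neighbor velocity
            separation = (pos[i] - pos[nb]).sum(axis=0)  # push away from neighbors
            new_vel[i] += dt * (0.5 * cohesion + 0.5 * alignment + 0.3 * separation)
    return pos + dt * new_vel, new_vel

for _ in range(50):
    pos, vel = step(pos, vel)
print("velocity spread:", np.std(vel, axis=0))
```

    Varying the three gains is exactly the "fine tuning" the abstract refers to: it moves the display between obvious swarming and apparently random motion.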

  5. Brain-computer interface based on generation of visual images.

    Directory of Open Access Journals (Sweden)

    Pavel Bobrov

    Full Text Available This paper examines the task of recognizing EEG patterns that correspond to performing three mental tasks: relaxation and imagining of two types of pictures: faces and houses. The experiments were performed using two EEG headsets: BrainProducts ActiCap and Emotiv EPOC. The Emotiv headset has become widely used in consumer BCI applications, allowing for large-scale EEG experiments in the future. Since classification accuracy significantly exceeded the level of random classification during the first three days of the experiment with the EPOC headset, a control experiment was performed on the fourth day using ActiCap. The control experiment has shown that utilization of high-quality research equipment can enhance classification accuracy (up to 68% in some subjects) and that the accuracy is independent of the presence of EEG artifacts related to blinking and eye movement. This study also shows that a computationally inexpensive Bayesian classifier based on covariance matrix analysis yields similar classification accuracy in this problem as a more sophisticated Multi-class Common Spatial Patterns (MCSP) classifier.
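
    The covariance-based Bayesian idea can be sketched on synthetic data: model each mental state as a zero-mean multivariate Gaussian with its own spatial covariance, then assign a trial to the class with the highest log-likelihood. This illustrates the approach, not the authors' implementation; the data and dimensions are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_cov(n):
    """A random symmetric positive-definite 'spatial covariance'."""
    A = rng.normal(size=(n, n))
    return A @ A.T / n + np.eye(n)

n_ch, n_t = 4, 200                              # channels, samples per trial
covs = [random_cov(n_ch) for _ in range(2)]     # two "mental states"

def sample_trial(c):
    """Draw an n_ch x n_t trial from class c's Gaussian model."""
    L = np.linalg.cholesky(covs[c])
    return L @ rng.normal(size=(n_ch, n_t))

def log_lik(X, S):
    """Log-likelihood of trial X under N(0, S), up to a constant."""
    _, logdet = np.linalg.slogdet(S)
    return -0.5 * (X.shape[1] * logdet + np.trace(np.linalg.solve(S, X @ X.T)))

def classify(X):
    return int(np.argmax([log_lik(X, S) for S in covs]))

trials = [(sample_trial(c), c) for c in [0, 1] for _ in range(20)]
acc = np.mean([classify(X) == c for X, c in trials])
print("accuracy:", acc)
```

    The classifier needs only one covariance estimate per class and a solve per trial, which is why the abstract can call it computationally inexpensive.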

  6. Generation and characterization of a perfect vortex beam with a large topological charge through a digital micromirror device. (United States)

    Chen, Yue; Fang, Zhao-Xiang; Ren, Yu-Xuan; Gong, Lei; Lu, Rong-De


    Optical vortices are associated with a spatial phase singularity. Such a beam with a vortex is valuable in optical microscopy, hyper-entanglement, and optical levitation. In these applications, vortex beams with a perfect circle shape and a large topological charge are highly desirable. But the generation of perfect vortices with high topological charges is challenging. We present a novel method to create perfect vortex beams with large topological charges using a digital micromirror device (DMD) through binary amplitude modulation and a narrow Gaussian approximation. The DMD with binary holograms encoding both the spatial amplitude and the phase could generate fast switchable, reconfigurable optical vortex beams with significantly high quality and fidelity. With either the binary Lee hologram or the superpixel binary encoding technique, we were able to generate the corresponding hologram with high fidelity and create a perfect vortex with topological charge as large as 90. The physical properties of the perfect vortex beam produced were characterized through measurements of propagation dynamics and the focusing fields. The measurements show good consistency with the theoretical simulation. The perfect vortex beam produced satisfies high-demand utilization in optical manipulation and control, momentum transfer, quantum computing, and biophotonics.
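
    The binary-amplitude encoding can be sketched directly: the target phase of a perfect-vortex generator, an azimuthal term of charge l plus an axicon term k_r·r, is folded into a binary carrier grating whose on/off pattern drives the DMD mirrors. Grid size, carrier period, and charge below are illustrative, and this is a bare Lee-type sketch rather than the authors' full amplitude-and-phase encoding.

```python
import numpy as np

N = 512
l, k_r, period = 30, 0.2, 8          # topological charge, axicon slope, carrier period
y, x = np.mgrid[-N // 2:N // 2, -N // 2:N // 2]
r = np.hypot(x, y)
theta = np.arctan2(y, x)

phase = l * theta + k_r * r                       # desired phase profile
carrier = 2 * np.pi * x / period                  # linear carrier grating
hologram = (np.cos(carrier + phase) > 0).astype(np.uint8)  # mirror on/off map
print(hologram.shape, hologram.mean())            # roughly half the mirrors on
```

    The first diffraction order of this binary pattern carries the encoded phase; the axicon term is what turns the ordinary vortex into a "perfect" ring whose radius is independent of l.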

  7. New generation of 3D desktop computer interfaces (United States)

    Skerjanc, Robert; Pastoor, Siegmund


    Today's computer interfaces use 2-D displays showing windows, icons and menus and support mouse interactions for handling programs and data files. The interface metaphor is that of a writing desk with (partly) overlapping sheets of documents placed on its top. Recent advances in the development of 3-D display technology give the opportunity to take the interface concept a radical stage further by breaking the design limits of the desktop metaphor. The major advantage of the envisioned 'application space' is, that it offers an additional, immediately perceptible dimension to clearly and constantly visualize the structure and current state of interrelations between documents, videos, application programs and networked systems. In this context, we describe the development of a visual operating system (VOS). Under VOS, applications appear as objects in 3-D space. Users can (graphically connect selected objects to enable communication between the respective applications. VOS includes a general concept of visual and object oriented programming for tasks ranging from, e.g., low-level programming up to high-level application configuration. In order to enable practical operation in an office or at home for many hours, the system should be very comfortable to use. Since typical 3-D equipment used, e.g., in virtual-reality applications (head-mounted displays, data gloves) is rather cumbersome and straining, we suggest to use off-head displays and contact-free interaction techniques. In this article, we introduce an autostereoscopic 3-D display and connected video based interaction techniques which allow viewpoint-depending imaging (by head tracking) and visually controlled modification of data objects and links (by gaze tracking, e.g., to pick, 3-D objects just by looking at them).

  8. A computationally simple method for cost-efficient generation rescheduling and load shedding for congestion management

    Energy Technology Data Exchange (ETDEWEB)

    Talukdar, B.K.; Sinha, A.K.; Mukhopadhyay, S. [Indian Inst. of Technology, Dept. of Electrical Engineering, Kharagpur (India); Bose, A. [Washington State Univ., Pullman, WA (United States)


    This paper describes a method of congestion management by generation rescheduling and load shedding. The sensitivities of the overloaded lines to bus injections and the costs of generation and load shedding are considered for ranking the generation and load buses. The new generation and load shedding schedule for these buses are then computed based on a simple method considering cost and sensitivity to line currents. The algorithm will help the system operator to generate the contingency plan quickly for secure operation of the system. Test results for three systems are presented. The results show that cost effective generation rescheduling and load shedding plans can be obtained to alleviate overloading of the transmission lines in a computationally efficient manner. (Author)
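
    The ranking logic can be illustrated with a toy greedy pass: buses are ordered by cost per ampere of overload relief (cost divided by the magnitude of the line-current sensitivity) and adjusted in that order until the overload is removed. All numbers are invented, and the paper's actual schedule computation is more involved.

```python
overload = 50.0   # amperes of overload to remove from the congested line

# (bus, sensitivity dI_line/dP_bus in A/MW, cost in $/MW, adjustable MW)
buses = [("G1", -0.8, 20.0, 30.0),
         ("G2", -0.5, 10.0, 40.0),
         ("L5", -0.9, 50.0, 25.0)]   # load shedding is the costliest option

# rank by cost per ampere of relief
ranked = sorted(buses, key=lambda b: b[2] / abs(b[1]))

plan, remaining = [], overload
for name, s, cost, limit in ranked:
    if remaining <= 0:
        break
    dP = min(limit, remaining / abs(s))   # MW adjustment at this bus
    plan.append((name, round(dP, 2)))
    remaining -= abs(s) * dP
print(plan, "residual overload:", round(max(remaining, 0.0), 2))
```

    Cheap generation rescheduling is exhausted first, and load shedding enters only for the residual overload, mirroring the cost ordering described in the abstract.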

  9. Generation and application of bessel beams in electron microscopy

    Energy Technology Data Exchange (ETDEWEB)

    Grillo, Vincenzo, E-mail: [CNR-Istituto Nanoscienze, Centro S3, Via G Campi 213/a, I-41125 Modena (Italy); CNR-IMEM, Parco Area delle Scienze 37/A, I-43124 Parma (Italy); Harris, Jérémie [Department of Physics, University of Ottawa, 25 Templeton St., Ottawa, Ontario, Canada K1N 6N5 (Canada); Gazzadi, Gian Carlo [CNR-Istituto Nanoscienze, Centro S3, Via G Campi 213/a, I-41125 Modena (Italy); Balboni, Roberto [CNR-IMM Bologna, Via P. Gobetti 101, 40129 Bologna (Italy); Mafakheri, Erfan [Dipartimento di Fisica Informatica e Matematica, Università di Modena e Reggio Emilia, via G Campi 213/a, I-41125 Modena (Italy); Dennis, Mark R. [H.H. Wills Physics Laboratory, University of Bristol, Bristol BS8 1TL (United Kingdom); Frabboni, Stefano [CNR-Istituto Nanoscienze, Centro S3, Via G Campi 213/a, I-41125 Modena (Italy); Dipartimento di Fisica Informatica e Matematica, Università di Modena e Reggio Emilia, via G Campi 213/a, I-41125 Modena (Italy); Boyd, Robert W.; Karimi, Ebrahim [Department of Physics, University of Ottawa, 25 Templeton St., Ottawa, Ontario, Canada K1N 6N5 (Canada)


    We report a systematic treatment of the holographic generation of electron Bessel beams, with a view to applications in electron microscopy. We describe in detail the theory underlying hologram patterning, as well as the actual electron-optical configuration used experimentally. We show that by optimizing our nanofabrication recipe, electron Bessel beams can be generated with relative efficiencies reaching 37±3%. We also demonstrate by tuning various hologram parameters that electron Bessel beams can be produced with many visible rings, making them ideal for interferometric applications, or in more highly localized forms with fewer rings, more suitable for imaging. We describe the settings required to tune beam localization in this way, and explore beam and hologram configurations that allow the convergences and topological charges of electron Bessel beams to be controlled. We also characterize the phase structure of the Bessel beams generated with our technique, using a simulation procedure that accounts for imperfections in the hologram manufacturing process. - Highlights: • Bessel beams with different convergence, topological charge, visible fringes are demonstrated. • The relation between the Fresnel hologram and the probe shape is explained by detailed calculations and experiments. • Among the holograms here presented the highest relative efficiency is 37%, the best result ever reached for blazed holograms.

  10. Highly efficient volume hologram multiplexing in thick dye-doped jelly-like gelatin. (United States)

    Katarkevich, Vasili M; Rubinov, Anatoli N; Efendiev, Terlan Sh


    Dye-doped jelly-like gelatin is a thick-layer self-developing photosensitive medium that allows single and multiplexed volume phase holograms to be successfully recorded using pulsed laser radiation. In this Letter, we present a method for multiplexed recording of volume holograms in a dye-doped jelly-like gelatin, which provides significant increase in their diffraction efficiency. The method is based on the recovery of the photobleached dye molecule concentration in the hologram recording zone of gel, thanks to molecule diffusion from other unexposed gel areas. As an example, an optical recording of a multiplexed hologram consisting of three superimposed Bragg gratings with mean values of the diffraction efficiency and angular selectivity of ∼75% and ∼21', respectively, is demonstrated by using the proposed method.

  11. Computing and Comparing Effective Properties for Flow and Transport in Computer-Generated Porous Media

    KAUST Repository

    Allen, Rebecca


    We compute effective properties (i.e., permeability, hydraulic tortuosity, and diffusive tortuosity) of three different digital porous media samples, including in-line array of uniform shapes, staggered-array of squares, and randomly distributed squares. The permeability and hydraulic tortuosity are computed by solving a set of rescaled Stokes equations obtained by homogenization, and the diffusive tortuosity is computed by solving a homogenization problem given for the effective diffusion coefficient that is inversely related to diffusive tortuosity. We find that hydraulic and diffusive tortuosity can be quantitatively different by up to a factor of ten in the same pore geometry, which indicates that these tortuosity terms cannot be used interchangeably. We also find that when a pore geometry is characterized by an anisotropic permeability, the diffusive tortuosity (and correspondingly the effective diffusion coefficient) can also be anisotropic. This finding has important implications for reservoir-scale modeling of flow and transport, as it is more realistic to account for the anisotropy of both the permeability and the effective diffusion coefficient.

  12. Copyright and Computer Generated Materials – Is it Time to Reboot the Discussion About Authorship?

    Directory of Open Access Journals (Sweden)

    Anne Fitzgerald


    Full Text Available Computer generated materials are ubiquitous and we encounter them on a daily basis, even though most people are unaware that this is the case. Blockbuster movies, television weather reports and telephone directories all include material that is produced by utilising computer technologies. Copyright protection for materials generated by a programmed computer was considered by the Federal Court and Full Court of the Federal Court in Telstra Corporation Limited v Phone Directories Company Pty Ltd.  The court held that the White and Yellow pages telephone directories produced by Telstra and its subsidiary, Sensis, were not protected by copyright because they were computer-generated works which lacked the requisite human authorship. The Copyright Act 1968 (Cth) does not contain specific provisions on the subsistence of copyright in computer-generated materials. Although the issue of copyright protection for computer-generated materials has been examined in Australia on two separate occasions by independently-constituted Copyright Law Review Committees over a period of 10 years (1988 to 1998), the Committees’ recommendations for legislative clarification by the enactment of specific amendments to the Copyright Act have not yet been implemented and the legal position remains unclear. In the light of the decision of the Full Federal Court in Telstra v Phone Directories it is timely to consider whether specific provisions should be enacted to clarify the position of computer-generated works under copyright law and, in particular, whether the requirement of human authorship for original works protected under Part III of the Copyright Act should now be reconceptualised to align with the realities of how copyright materials are created in the digital era.

  13. Quality of reconstruction of compressed off-axis digital holograms by frequency filtering and wavelets. (United States)

    Cheremkhin, Pavel A; Kurbatova, Ekaterina A


    Compression of digital holograms can significantly ease the storage, transmission, and reconstruction of object data in 2D and 3D form. Wavelet-based methods compress standard images by factors of 20-50 with minimal loss of quality, but applying wavelets directly to digital holograms does not achieve high compression ratios. Additional preprocessing and postprocessing, however, allow significant compression of holograms with acceptable quality of the reconstructed images. In this paper, the application of wavelet transforms to the compression of off-axis digital holograms is considered. A combined technique is investigated, based on zero- and twin-order elimination, wavelet compression of the amplitude and phase components of the obtained Fourier spectrum, and further compression of the wavelet coefficients by thresholding and quantization. Numerical experiments on the reconstruction of images from the compressed holograms are performed, together with a comparative analysis of the applicability of various wavelets and of the methods for additional compression of wavelet coefficients, from which optimum compression parameters can be estimated. The size of the holographic data was reduced by up to 190 times.
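    The transform-threshold-quantize pipeline described in this record can be sketched in a few lines. The following is a minimal illustration (not the authors' code) using a single-level orthonormal Haar transform on a synthetic amplitude component, with hard thresholding that keeps only the largest coefficients; the function names and the keep fraction are illustrative assumptions.

    ```python
    import numpy as np

    def haar2d(x):
        """One level of the orthonormal 2D Haar transform."""
        a = (x[0::2] + x[1::2]) / np.sqrt(2)          # row averages
        d = (x[0::2] - x[1::2]) / np.sqrt(2)          # row details
        rows = np.vstack([a, d])
        a2 = (rows[:, 0::2] + rows[:, 1::2]) / np.sqrt(2)
        d2 = (rows[:, 0::2] - rows[:, 1::2]) / np.sqrt(2)
        return np.hstack([a2, d2])

    def ihaar2d(c):
        """Exact inverse of haar2d."""
        n = c.shape[1] // 2
        a2, d2 = c[:, :n], c[:, n:]
        rows = np.empty_like(c)
        rows[:, 0::2] = (a2 + d2) / np.sqrt(2)
        rows[:, 1::2] = (a2 - d2) / np.sqrt(2)
        m = rows.shape[0] // 2
        a, d = rows[:m], rows[m:]
        x = np.empty_like(rows)
        x[0::2] = (a + d) / np.sqrt(2)
        x[1::2] = (a - d) / np.sqrt(2)
        return x

    def compress(field, keep=0.25):
        """Hard-threshold all but the largest `keep` fraction of coefficients."""
        c = haar2d(field)
        thresh = np.quantile(np.abs(c), 1.0 - keep)
        c[np.abs(c) < thresh] = 0.0
        return c

    rng = np.random.default_rng(0)
    amplitude = rng.random((64, 64))                  # stand-in amplitude component
    coeffs = compress(amplitude, keep=0.25)
    recon = ihaar2d(coeffs)                           # lossy reconstruction
    ratio = amplitude.size / np.count_nonzero(coeffs) # ~4x for keep=0.25
    ```

    In the paper's scheme this step would act on the Fourier-spectrum amplitude and phase after zero- and twin-order elimination, and the surviving coefficients would then be quantized and entropy coded.
    
    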

  14. Fabrication of digital rainbow holograms and 3-D imaging using SEM based e-beam lithography. (United States)

    Firsov, An; Firsov, A; Loechel, B; Erko, A; Svintsov, A; Zaitsev, S


    Here we present an approach to creating full-color digital rainbow holograms based on mixing three basic colors. Much as in a color TV, where each screen pixel comprises three luminescent points, each color pixel of the initial image is represented by three distinct (R, G, B) diffractive gratings in the hologram structure. Changing either the duty cycle or the area of the gratings provides the proper R, G, B intensities. Special algorithms allow the design of rather complicated 3D images (which may even replace one another as the hologram is rotated). The software developed ("RainBow") stabilizes the colorization of the rotated image by equalizing the angular blur from the gratings responsible for the R, G, B basic colors. This R, G, B color-synthesis approach makes it possible to fabricate gray-tone rainbow holograms containing white color, which is hardly possible in traditional dot-matrix technology. Budget electron-beam lithography based on an SEM column was used to fabricate practical examples of digital rainbow holograms. The results of fabricating large rainbow holograms, from design to imprinting, are presented, and the advantages of EBL over traditional optical (dot-matrix) technology are considered.

  15. Embedding intensity image into a binary hologram with strong noise resistant capability (United States)

    Zhuang, Zhaoyong; Jiao, Shuming; Zou, Wenbin; Li, Xia


    A digital hologram can be employed as a host image in image watermarking applications to protect information security. Past research demonstrates that a gray-level intensity image can be embedded into a binary Fresnel hologram by the error diffusion method or the bit truncation coding method. However, the fidelity of the watermark image retrieved from a binary hologram is generally not satisfactory, especially when the hologram is contaminated with noise. To address this problem, we propose a JPEG-BCH encoding method. First, we employ the JPEG standard to compress the intensity image into a binary bit stream. Next, we encode the bit stream with a BCH code to obtain error-correction capability. Finally, the JPEG-BCH code is embedded into the binary hologram. In this way, the intensity image can be retrieved with high fidelity by a BCH-JPEG decoder even if the binary hologram suffers serious noise contamination. Numerical simulation results show that the image quality of the retrieved intensity image is superior to that of state-of-the-art methods reported previously.
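    A full BCH codec is involved, but the protect-then-embed idea is the same as for any forward-error-correcting code. The toy sketch below uses the much smaller Hamming(7,4) code, which likewise corrects any single flipped bit per code word; it is a hypothetical stand-in, not the authors' BCH implementation.

    ```python
    import numpy as np

    # Systematic Hamming(7,4): G = [I4 | P], H = [P^T | I3], arithmetic mod 2
    G = np.array([[1, 0, 0, 0, 1, 1, 0],
                  [0, 1, 0, 0, 1, 0, 1],
                  [0, 0, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]])
    H = np.array([[1, 1, 0, 1, 1, 0, 0],
                  [1, 0, 1, 1, 0, 1, 0],
                  [0, 1, 1, 1, 0, 0, 1]])

    def encode(data4):
        """4 data bits -> 7-bit code word (these bits would be embedded)."""
        return (data4 @ G) % 2

    def decode(word7):
        """Correct at most one flipped bit, then return the 4 data bits."""
        s = (H @ word7) % 2
        if s.any():
            # the syndrome equals the column of H at the error position
            err = np.where((H.T == s).all(axis=1))[0][0]
            word7 = word7.copy()
            word7[err] ^= 1
        return word7[:4]  # systematic code: data bits come first

    data = np.array([1, 0, 1, 1])
    noisy = encode(data).copy()
    noisy[5] ^= 1                 # one bit corrupted by hologram noise
    recovered = decode(noisy)     # equals `data` again
    ```

    A real BCH code plays the same role at much larger block lengths and with multi-bit correction, which is what makes the JPEG bit stream survive serious noise in the binary hologram.
    
    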

  16. Design, simulation, and optimization of an RGB polarization independent transmission volume hologram (United States)

    Mahamat, Adoum Hassan

    Volume phase holographic (VPH) gratings have been designed for use in many areas of science and technology, such as optical communication, medical imaging, spectroscopy and astronomy. The goal of this dissertation is to design a volume phase holographic grating that provides diffraction efficiencies of at least 70% across the entire visible spectrum, and higher than 90% for red, green, and blue light, when the incident light is unpolarized. First, the complete design, simulation and optimization of the volume hologram are presented. The optimization uses a Monte Carlo analysis to solve for the index modulation needed to provide the higher diffraction efficiencies; the solutions are determined by solving the diffraction efficiency equations of Kogelnik's two-wave coupled-wave theory. The hologram is further optimized using rigorous coupled-wave analysis to correct for the effects of absorption omitted by Kogelnik's method. Second, the fabrication, or recording, process of the volume hologram is described in detail; the active region of the volume hologram is created by the interference of two coherent beams within the thin film. Third, the experimental setup and the measurement of properties including the diffraction efficiencies of the volume hologram and the thickness of the active region are presented. Fourth, the polarimetric response of the volume hologram is investigated; the polarization study provides insight into the effect of the refractive index modulation on the polarization state and diffraction efficiency of the incident light.
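    For the simplest case Kogelnik's theory gives the efficiency in closed form: for a lossless, unslanted transmission grating at Bragg incidence, η = sin²(πΔn·d / (λ cos θ)), which can be inverted to find the index modulation Δn needed for a target efficiency. A minimal numeric sketch (the thickness, wavelength, and angle below are illustrative values, not the dissertation's):

    ```python
    import numpy as np

    def kogelnik_efficiency(dn, d, wavelength, theta):
        """On-Bragg diffraction efficiency of a lossless, unslanted
        transmission volume grating (Kogelnik two-wave theory)."""
        nu = np.pi * dn * d / (wavelength * np.cos(theta))
        return np.sin(nu) ** 2

    def index_modulation_for(eta, d, wavelength, theta):
        """Smallest index modulation dn that reaches efficiency eta."""
        return np.arcsin(np.sqrt(eta)) * wavelength * np.cos(theta) / (np.pi * d)

    # Example: 15-um-thick grating, 532 nm light, 20 deg internal Bragg angle
    d, lam, theta = 15e-6, 532e-9, np.radians(20.0)
    dn90 = index_modulation_for(0.90, d, lam, theta)  # dn for 90% efficiency
    ```

    The dissertation's Monte Carlo search over Δn, and the rigorous coupled-wave correction for absorption, refine exactly this kind of closed-form starting point.
    
    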

  17. Digital adaptive optics confocal microscopy based on iterative retrieval of optical aberration from a guidestar hologram. (United States)

    Liu, Changgeng; Thapa, Damber; Yao, Xincheng


    Guidestar-hologram-based digital adaptive optics (DAO) is a recently emerging active imaging modality. It records each complex distorted line field reflected or scattered from the sample in an off-axis digital hologram, measures the optical aberration from a separate off-axis digital guidestar hologram, and removes the optical aberration from the distorted line fields by numerical processing. In previously demonstrated DAO systems, the optical aberration was retrieved directly from the guidestar hologram by taking its Fourier transform and extracting the phase term. With this direct retrieval method (DRM), when the sample is not coincident with the guidestar focal plane, the accuracy of the retrieved optical aberration decays rapidly, leading to deterioration in the quality of the corrected images. To tackle this problem, we explore here an image-metrics-based iterative method (MIM) to retrieve the optical aberration from the guidestar hologram. Using an aberrated objective lens and scattering samples, we demonstrate that MIM improves the accuracy of the aberrations retrieved from both focused and defocused guidestar holograms compared with DRM, thereby improving the robustness of the DAO.

  18. The 'hidden image' effect in security holograms and its personalization by laser demetallization (United States)

    Bulanovs, Andrejs; Tamanis, Edmunds; Kolbjonoks, Vadims


    This work investigates the principles of recording and calculation, and the security aspects, of the 'hidden image' effect in digital holograms intended for security applications. Dot-matrix and image-matrix optical recording technologies can be widely used for recording protective holograms with this type of security feature. When a collimated laser beam falls on, and is reflected from, the section of the hologram containing a protective 'hidden image' element, a graphic image can be seen in the projection of the diffracted light on a frosted screen. The present work also discusses a method of personalizing the 'hidden image' effect with the help of laser demetallization. In this way the hidden image can be individualized for each hologram sticker and can contain additional information such as a number, text or logotype. The attractiveness of this method lies in the considerable increase it achieves in the protective characteristics of holograms and in the incorporation of additional variable information, as well as in providing both visual and automatic ways of checking the authenticity of a hologram.


    Directory of Open Access Journals (Sweden)

    S. A. Ivanov


    Full Text Available We have calculated the recording conditions for multiplexed holograms in photo-thermo-refractive (PTR) glass. The proposed calculation links parameters such as the angle between the recording beams, the angle of sample rotation, the operating wavelength, the angle of incidence on the element, and the output angle. To study the recording features of multiplexed holograms in PTR glass, several elements were made; six holograms were recorded in each element with various exposures. All samples were heat-treated at a single temperature near the glass transition temperature. It has been demonstrated that, when several gratings are recorded with a total exposure exceeding the optimal value for a given material, the total refractive index modulation amplitude (n1) reaches the maximum attainable magnitude, equivalent to that of a single hologram recorded with optimal exposure. It has also been found that the refractive index dynamic range of the material is distributed among the gratings in accordance with the ratio of the exposure times when the hologram exposures differ significantly. In the present paper, a six-channel multiplexer was recorded for a wavelength of 632.8 nm (He-Ne laser), with diffraction angles corresponding to the calculations above. The n1 value of each grating equals the maximum attainable value of n1 divided by the total number of multiplexed holograms.

  20. Computer-Aided Generation of Result Text for Clinical Laboratory Tests


    Kuzmak, Peter M.; Miller, R. E.


    Efficient processing of non-numeric textual data is a frequent requirement with medical computer applications such as clinical laboratory result reporting. In such instances, it is often desirable that the computer control the generation of the text to ensure that the intended meaning is conveyed. This paper describes a technique for interactively selecting predefined text segments to form complex textual reports for laboratory tests. The approach, which uses algorithms based on augmented tra...

  1. Steam generator transient studies using a simplified two-fluid computer code

    International Nuclear Information System (INIS)

    Munshi, P.; Bhatnagar, R.; Ram, K.S.


    A simplified two-fluid computer code has been used to simulate reactor-side (or primary-side) transients in a PWR steam generator. The disturbances are modelled as ramp inputs for pressure, internal energy and mass flow-rate for the primary fluid. The CPU time for a transient duration of 4 s is approx. 10 min on a DEC-1090 computer system. The results are thermodynamically consistent and encouraging for further studies. (author)

  2. Computational Research Challenges and Opportunities for the Optimization of Fossil Energy Power Generation System

    Energy Technology Data Exchange (ETDEWEB)

    Zitney, S.E.


    Emerging fossil energy power generation systems must operate with unprecedented efficiency and near-zero emissions, while optimizing profitably amid cost fluctuations for raw materials, finished products, and energy. To help address these challenges, the fossil energy industry will have to rely increasingly on the use of advanced computational tools for modeling and simulating complex process systems. In this paper, we present the computational research challenges and opportunities for the optimization of fossil energy power generation systems across the plant lifecycle, from process synthesis and design to plant operations. We also look beyond the plant gates to discuss research challenges and opportunities for enterprise-wide optimization, including planning, scheduling, and supply chain technologies.

  3. Systemic functional grammar in natural language generation linguistic description and computational representation

    CERN Document Server

    Teich, Elke


    This volume deals with the computational application of systemic functional grammar (SFG) for natural language generation. In particular, it describes the implementation of a fragment of the grammar of German in the computational framework of KOMET-PENMAN for multilingual generation. The text also presents a specification of explicit well-formedness constraints on syntagmatic structure which are defined in the form of typed feature structures. It thus achieves a model of systemic functional grammar that unites both the strengths of systemics, such as stratification, functional diversification

  4. Computer-controlled High Resolution Arbitrary Waveform Generator (HRAWG) for Focusing Beamforming Applications (United States)

    Assef, Amauri Amorin; Maia, Joaquim Miguel; Costa, Eduardo Tavares

    In advanced ultrasound imaging systems, expensive high-end integrated analog front-ends have traditionally been used to support generation of arbitrary transmit waveforms, in addition to transmit focusing and apodization control. In this paper, we present a cost-effective computer-controlled reconfigurable high-resolution arbitrary waveform generator (HRAWG) designed for ultrasound research, development and teaching at the Federal University of Technology (UTFPR), Brazil. The 8-channel transmit beamformer is fully controlled by a host computer, on which a Matlab GUI with the Field II simulation program allows easy and accurate control over transmission parameters such as waveform, amplitude, apodization and timing.

  5. From medical images to flow computations without user-generated meshes. (United States)

    Dillard, Seth I; Mousel, John A; Shrestha, Liza; Raghavan, Madhavan L; Vigmostad, Sarah C


    Biomedical flow computations in patient-specific geometries require integrating image acquisition and processing with fluid flow solvers. Typically, image-based modeling processes involve several steps, such as image segmentation, surface mesh generation, volumetric flow mesh generation, and finally, computational simulation. These steps are performed separately, often using separate pieces of software, and each step requires considerable expertise and investment of time on the part of the user. In this paper, an alternative framework is presented in which the entire image-based modeling process is performed on a Cartesian domain where the image is embedded within the domain as an implicit surface. Thus, the framework circumvents the need for generating surface meshes to fit complex geometries and subsequent creation of body-fitted flow meshes. Cartesian mesh pruning, local mesh refinement, and massive parallelization provide computational efficiency; the image-to-computation techniques adopted are chosen to be suitable for distributed memory architectures. The complete framework is demonstrated with flow calculations computed in two 3D image reconstructions of geometrically dissimilar intracranial aneurysms. The flow calculations are performed on multiprocessor computer architectures and are compared against calculations performed with a standard multistep route. Copyright © 2014 John Wiley & Sons, Ltd.

  6. Special purpose computer system for flow visualization using holography technology. (United States)

    Abe, Yukio; Masuda, Nobuyuki; Wakabayashi, Hideaki; Kazo, Yuta; Ito, Tomoyoshi; Satake, Shin-ichi; Kunugi, Tomoaki; Sato, Kazuho


    We have designed a special-purpose computer system for visualizing fluid flow using digital holographic particle tracking velocimetry (DHPTV). This computer contains a Field Programmable Gate Array (FPGA) chip in which a pipeline for calculating the intensity of an object from a hologram by fast Fourier transform is installed. This system can produce 100 reconstructed images from a 1024 x 1024-grid hologram in 3.3 s. It is expected that this system will contribute to fluid flow analysis.
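    The FFT-based reconstruction that such an FPGA pipeline accelerates can be mimicked in software with a single FFT pair using the angular-spectrum propagation kernel. A hedged sketch (grid size, pixel pitch, wavelength, and distance are illustrative, not the paper's parameters):

    ```python
    import numpy as np

    def angular_spectrum(field, wavelength, dx, z):
        """Propagate a complex field a distance z with the angular-spectrum
        method: FFT, multiply by the free-space transfer function, inverse FFT.
        This FFT pair is the kernel a hardware pipeline would accelerate."""
        n = field.shape[0]
        fx = np.fft.fftfreq(n, dx)
        FX, FY = np.meshgrid(fx, fx)
        arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
        kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
        return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * z))

    # A single point scatterer: form its in-line hologram, then refocus it
    n, dx, lam, z = 256, 10e-6, 633e-9, 0.05
    obj = np.zeros((n, n), complex)
    obj[n // 2, n // 2] = 1.0
    hologram = np.abs(1.0 + angular_spectrum(obj, lam, dx, z)) ** 2
    recon = np.abs(angular_spectrum(hologram, lam, dx, -z)) ** 2  # intensity map
    ```

    Because the transfer function has unit modulus for propagating waves, the operation conserves energy and is exactly invertible by propagating back with -z, which is a convenient correctness check.
    
    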

  7. Computer Programs for Generation of NASTRAN and VIBRA-6 Aircraft Models. (United States)


    ...NASTRAN as a combination of DMAP alters for Solutions 3 and 10, and a user module, MODB. The user module, MODB, performs such operations as... (Steven G. Harris, Anamet Laboratories, Inc., Hayward, CA.)

  8. Computer code 'WAGEN' for the generation of artificial earthquake ground motions

    International Nuclear Information System (INIS)

    Fujita, Shigeki; Baba, Osamu; Suzuki, Takehiro.


    WAGEN is a computer code for the generation of artificial earthquake ground motions used in the aseismic design of nuclear power facilities. WAGEN was developed on the basis of the procedure proposed by Dr. Ohsaki. An artificial earthquake ground motion is generated in such a way that its response spectrum fits the design response spectrum well. This report presents the main features, the numerical procedure, and the manual of the code, together with an example of its usage. (author)

  9. DNA strand generation for DNA computing by using a multi-objective differential evolution algorithm. (United States)

    Chaves-González, José M; Vega-Rodríguez, Miguel A


    In this paper, we use an adapted multi-objective version of the differential evolution (DE) metaheuristic for the design and generation of reliable DNA libraries that can be used for computation. DNA sequence design is a very relevant task in many recent research fields, e.g. nanotechnology or DNA computing. Specifically, DNA computing is a new computational model which uses DNA molecules as information storage and their possible biological interactions as processing operators. Therefore, the possible reactions and interactions among molecules must be strictly controlled to prevent incorrect computations. The design of reliable DNA libraries for bio-molecular computing is an NP-hard combinatorial problem which involves many heterogeneous and conflicting design criteria. For this reason, we modelled DNA sequence design as a multi-objective optimization problem and solved it using the adapted multi-objective DE metaheuristic. Seven different bio-chemical design criteria were considered simultaneously to obtain high-quality DNA sequences suitable for molecular computing. Furthermore, we implemented the standard multi-objective fast non-dominated sorting genetic algorithm (NSGA-II) in order to perform a formal comparative study using multi-objective indicators, and we also compared our results with other relevant results published in the literature. We conclude that our proposal is a promising approach which is able to generate reliable real-world DNA sequences that significantly improve on DNA libraries previously published in the literature. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
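    The core loop of a dominance-based multi-objective DE can be sketched generically. The toy below replaces the seven bio-chemical criteria with two conflicting quadratic objectives, and uses a simple acceptance rule (replace a parent only when the trial Pareto-dominates it); names, parameter values, and the acceptance rule are illustrative assumptions, not the paper's exact algorithm.

    ```python
    import numpy as np

    def dominates(f1, f2):
        """True if objective vector f1 Pareto-dominates f2 (minimization)."""
        return np.all(f1 <= f2) and np.any(f1 < f2)

    def de_step(pop, objectives, F=0.5, CR=0.9, rng=None):
        """One generation of a dominance-based DE/rand/1/bin step."""
        if rng is None:
            rng = np.random.default_rng()
        n, dim = pop.shape
        new = pop.copy()
        for i in range(n):
            a, b, c = rng.choice([j for j in range(n) if j != i], 3, replace=False)
            mutant = pop[a] + F * (pop[b] - pop[c])          # differential mutation
            cross = rng.random(dim) < CR                     # binomial crossover
            cross[rng.integers(dim)] = True                  # force one gene
            trial = np.where(cross, mutant, pop[i])
            if dominates(objectives(trial), objectives(pop[i])):
                new[i] = trial
        return new

    def objectives(x):
        """Toy stand-in for the conflicting DNA design criteria."""
        return np.array([np.sum(x ** 2), np.sum((x - 2.0) ** 2)])

    rng = np.random.default_rng(1)
    pop = rng.uniform(-5, 5, size=(20, 4))
    pop0 = pop.copy()
    for _ in range(50):
        pop = de_step(pop, objectives, rng=rng)
    ```

    Because a trial vector is accepted only when it dominates its parent, every individual's objective vector is componentwise non-increasing over the generations, which is easy to verify.
    
    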

  10. Hologram QSAR model for the prediction of human oral bioavailability. (United States)

    Moda, Tiago L; Montanari, Carlos A; Andricopulo, Adriano D


    A drug intended for use in humans should have an ideal balance of pharmacokinetics and safety, as well as potency and selectivity. Unfavorable pharmacokinetics can negatively affect the clinical development of many otherwise promising drug candidates. A variety of in silico ADME (absorption, distribution, metabolism, and excretion) models are receiving increased attention due to a better appreciation that pharmacokinetic properties should be considered in early phases of the drug discovery process. Human oral bioavailability is an important pharmacokinetic property, directly related to the amount of drug available in the systemic circulation to exert pharmacological and therapeutic effects. In the present work, hologram quantitative structure-activity relationship (HQSAR) analyses were performed on a training set of 250 structurally diverse molecules with known human oral bioavailability. The most significant HQSAR model (q² = 0.70, r² = 0.93) was obtained using atoms, bonds, connections, and chirality as fragment distinctions. The predictive ability of the model was evaluated on an external test set of 52 molecules not included in the training set, and the predicted values were in good agreement with the experimental values. The HQSAR model should be useful for the design of new drug candidates with increased bioavailability, as well as in chemical library design, virtual screening, and high-throughput screening.
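    The cross-validated q² quoted for such models is one minus PRESS over the total sum of squares, with PRESS accumulated from leave-one-out refits, while r² is the ordinary fit statistic. A generic illustration on synthetic descriptors (not the actual fragment descriptors or bioavailability data):

    ```python
    import numpy as np

    def q2_loo(X, y):
        """Leave-one-out cross-validated q^2 for an ordinary least-squares
        model: q^2 = 1 - PRESS / SS_total."""
        n = len(y)
        press = 0.0
        for i in range(n):
            mask = np.arange(n) != i
            Xi = np.c_[np.ones(mask.sum()), X[mask]]       # fit without sample i
            beta, *_ = np.linalg.lstsq(Xi, y[mask], rcond=None)
            pred = np.r_[1.0, X[i]] @ beta                 # predict held-out sample
            press += (y[i] - pred) ** 2
        return 1.0 - press / np.sum((y - y.mean()) ** 2)

    rng = np.random.default_rng(2)
    X = rng.normal(size=(40, 3))                            # toy descriptor matrix
    y = X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.normal(size=40)
    q2 = q2_loo(X, y)                                       # close to 1 here
    ```

    A q² well below r² flags an overfitted model, which is why QSAR work reports both.
    
    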

  11. Novel ligands of Choline Acetyltransferase designed by in silico molecular docking, hologram QSAR and lead optimization (United States)

    Kumar, Rajnish; Långström, Bengt; Darreh-Shori, Taher


    Recent reports have brought the acetylcholine-synthesizing enzyme, choline acetyltransferase (ChAT), back into mainstream research on dementia and the cholinergic anti-inflammatory pathway. Here we report a specific strategy for the design of novel ChAT ligands based on molecular docking, Hologram Quantitative Structure-Activity Relationship (HQSAR) modelling and lead optimization. Molecular docking was performed on a series of ChAT inhibitors to decipher the molecular fingerprint of their interaction with the active site of ChAT, and robust statistical fragment-based HQSAR models were then developed. A library of novel ligands was generated on the basis of pharmacophoric and shape-similarity scoring functions and evaluated in silico for their molecular interactions with ChAT. Ten of the top-scoring designed compounds are reported here. We confirmed the activity of α-NETA, the only commercially available ChAT inhibitor and one of the seed compounds in our model, using a new simple colorimetric ChAT assay (IC50 ~ 88 nM). In contrast, α-NETA exhibited an IC50 of ~30 μM for the ACh-degrading cholinesterases. In conclusion, the overall results may provide useful insight for discovering novel ChAT ligands and potential positron emission tomography tracers as in vivo functional biomarkers of the health of the central cholinergic system in neurodegenerative disorders, such as Alzheimer's disease.

  12. Teaching French Transformational Grammar by Means of Computer-Generated Video-Tapes. (United States)

    Adler, Alfred; Thomas, Jean Jacques

    This paper describes a pilot program in an integrated media presentation of foreign languages and the production and usage of seven computer-generated video tapes which demonstrate various aspects of French syntax. This instructional set could form the basis for CAI lessons in which the student is presented images identical to those on the video…

  13. Radio data and computer simulations for shock waves generated by solar flares

    International Nuclear Information System (INIS)

    Maxwell, A.; Dryer, M.


    Solar radio bursts of spectral type II provide a prime diagnostic for the passage of shock waves, generated by solar flares, through the solar corona. In this investigation the authors compare radio data on the shocks with computer simulations for the propagation of fast-mode MHD shocks through the solar corona. (Auth.)

  14. Computer-generated versus nurse-determined strategy for incubator humidity and time to regain birthweight

    NARCIS (Netherlands)

    Helder, Onno K.; Mulder, Paul G. H.; van Goudoever, Johannes B.


    To compare the effects of a computer-generated and a nurse-determined incubator humidity strategy on premature infants' weight gain. An optimal humidity protocol is thought to reduce the time to regain birthweight. Prospective randomized controlled design. Level IIIC neonatal intensive care unit in the

  15. Computational Analysis of Intersubject Variability and Thrombin Generation in Dilutional Coagulopathy (United States)


    Mann,15,30-32 and S. Diamond,33,34 as well as by our own research group.16 We used the computational model to calculate the five thrombin generation... approach to modeling dilution did not take into account the specific type of liquid that dilutes the blood. While it is known that the type of

  16. THREE-PEE SAMPLING THEORY and program 'THRP' for computer generation of selection criteria (United States)

    L. R. Grosenbaugh


    Theory necessary for sampling with probability proportional to prediction ('three-pee', or '3P', sampling) is first developed and then exemplified by numerical comparisons of several estimators. Program 'THRP' for computer generation of appropriate 3P-sample-selection criteria is described, and convenient random integer dispensers are...

  17. Generation of a suite of 3D computer-generated breast phantoms from a limited set of human subject data

    International Nuclear Information System (INIS)

    Hsu, Christina M. L.; Palmeri, Mark L.; Segars, W. Paul; Veress, Alexander I.; Dobbins, James T. III


    Purpose: The authors previously reported on a three-dimensional computer-generated breast phantom, based on empirical human image data, including a realistic finite-element based compression model that was capable of simulating multimodality imaging data. The computerized breast phantoms are a hybrid of two phantom generation techniques, combining empirical breast CT (bCT) data with flexible computer graphics techniques. However, to date, these phantoms have been based on single human subjects. In this paper, the authors report on a new method to generate multiple phantoms, simulating additional subjects from the limited set of original dedicated breast CT data. The authors developed an image morphing technique to construct new phantoms by gradually transitioning between two human subject datasets, with the potential to generate hundreds of additional pseudoindependent phantoms from the limited bCT cases. The authors conducted a preliminary subjective assessment with a limited number of observers (n = 4) to illustrate how realistic the simulated images generated with the pseudoindependent phantoms appeared. Methods: Several mesh-based geometric transformations were developed to generate distorted breast datasets from the original human subject data. Segmented bCT data from two different human subjects were used as the “base” and “target” for morphing. Several combinations of transformations were applied to morph between the “base” and “target” datasets, such as changing the breast shape, rotating the glandular data, and changing the distribution of the glandular tissue. Following the morphing, regions of skin and fat were assigned to the morphed dataset in order to appropriately assign mechanical properties during the compression simulation. The resulting morphed breast was compressed using a finite element algorithm and simulated mammograms were generated using techniques described previously. Sixty-two simulated mammograms, generated from morphing

  18. An efficient method for multiple radiative transfer computations and the lookup table generation

    International Nuclear Information System (INIS)

    Wang Menghua


    An efficient method for multiple radiative-transfer computations is proposed. The method is based on the fact that, in a radiative-transfer computation, most of the CPU time is spent in the numerical integration of the Fourier components of the scattering phase function. With the new method, the lookup tables that are usually needed to convert spaceborne and airborne sensor-measured signals into the desired physical and optical quantities can be generated efficiently. We use the ocean color remote sensor, the Sea-viewing Wide Field-of-view Sensor (SeaWiFS), as an example to show that, with the new approach, the CPU time for generating the lookup tables can be reduced significantly. The new scheme is useful and effective for multiple radiative-transfer computations.
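    Since the runtime is dominated by repeatedly integrating phase-function components, precomputing and caching those integrals is the natural speedup; that is essentially what a lookup table does. A hedged stdlib-only sketch using a Henyey-Greenstein phase function, whose m-th Legendre moment is known analytically to be gᵐ (handy as a correctness check); the function names and parameters are illustrative, not the paper's method.

    ```python
    from functools import lru_cache

    def legendre(m, x):
        """Legendre polynomial P_m(x) by the Bonnet recurrence."""
        p0, p1 = 1.0, x
        if m == 0:
            return p0
        for k in range(1, m):
            p0, p1 = p1, ((2 * k + 1) * x * p1 - k * p0) / (k + 1)
        return p1

    @lru_cache(maxsize=None)
    def phase_moment(m, g=0.5, n=4096):
        """m-th Legendre moment of the Henyey-Greenstein phase function,
        (1/2) * integral over mu in [-1, 1] of p(mu) P_m(mu), by the
        midpoint rule. Cached, so each moment is integrated exactly once
        and then reused by every subsequent lookup-table entry."""
        h = 2.0 / n
        total = 0.0
        for k in range(n):
            mu = -1.0 + (k + 0.5) * h
            p = (1 - g * g) / (1 + g * g - 2 * g * mu) ** 1.5
            total += p * legendre(m, mu) * h
        return 0.5 * total
    ```

    For Henyey-Greenstein the exact moment is gᵐ, so the quadrature and the cache can both be checked against 1, g, g³, and so on.
    
    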

  19. Automated planning target volume generation: an evaluation pitting a computer-based tool against human experts

    International Nuclear Information System (INIS)

    Ketting, Case H.; Austin-Seymour, Mary; Kalet, Ira; Jacky, Jon; Kromhout-Schiro, Sharon; Hummel, Sharon; Unger, Jonathan; Fagan, Lawrence M.; Griffin, Tom


    Purpose: Software tools are seeing increased use in three-dimensional treatment planning. However, the development of these tools frequently omits careful evaluation before placing them in clinical use. This study demonstrates the application of a rigorous evaluation methodology using blinded peer review to an automated software tool that produces ICRU-50 planning target volumes (PTVs). Methods and Materials: Seven physicians from three different institutions involved in three-dimensional treatment planning participated in the evaluation. Four physicians drew partial PTVs on nine test cases, consisting of four nasopharynx and five lung primaries. Using the same information provided to the human experts, the computer tool generated PTVs for comparison. The remaining three physicians, designated evaluators, individually reviewed the PTVs for acceptability. To exclude bias, the evaluators were blinded to the source (human or computer) of the PTVs they reviewed. Their scorings of the PTVs were statistically examined to determine if the computer tool performed as well as the human experts. Results: The computer tool was as successful as the human experts in generating PTVs. Failures were primarily attributable to insufficient margins around the clinical target volume and to encroachment upon critical structures. In a qualitative analysis, the human and computer experts displayed similar types and distributions of errors. Conclusions: Rigorous evaluation of computer-based radiotherapy tools requires comparison to current practice and can reveal areas for improvement before the tool enters clinical practice
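    The central geometric operation in ICRU-50 PTV generation is growing the clinical target volume (CTV) by a margin. A minimal 2D sketch of that one step (a real tool works in 3D, uses anisotropic millimetre margins, and respects critical structures; the 4-neighbourhood dilation and voxel margin here are illustrative assumptions):

    ```python
    import numpy as np

    def expand_margin(ctv, margin_vox):
        """Grow a binary CTV mask by `margin_vox` voxels of Manhattan-distance
        margin, via repeated 4-neighbourhood binary dilation."""
        ptv = ctv.astype(bool).copy()
        for _ in range(margin_vox):
            grown = ptv.copy()
            for axis in (0, 1):
                for shift in (-1, 1):
                    grown |= np.roll(ptv, shift, axis=axis)  # one-voxel dilation
            ptv = grown
        return ptv

    # 4x4 CTV in a 16x16 grid, expanded by a 2-voxel margin
    ctv = np.zeros((16, 16), bool)
    ctv[6:10, 6:10] = True
    ptv = expand_margin(ctv, margin_vox=2)
    ```

    Note that `np.roll` wraps at the array edges, so this sketch assumes the CTV sits away from the image border; a production implementation would pad instead.
    
    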

  20. Experimental and computational study on thermoelectric generators using thermosyphons with phase change as heat exchangers

    International Nuclear Information System (INIS)

    Araiz, M.; Martínez, A.; Astrain, D.; Aranguren, P.


    Highlights: • Computational model of a thermosyphon heat exchanger with phase change. • Construction and experimentation of a prototype. • ±9% maximum deviation from the experimental values of the main outputs. • Influence of the auxiliary equipment on the net power generation. - Abstract: An important issue in thermoelectric generators is the thermal design of the heat exchangers, since it can improve their performance by increasing the heat absorbed or dissipated by the thermoelectric modules. Owing to its several advantages compared with conventional dissipation systems, a thermosyphon heat exchanger with phase change is proposed for the cold side of thermoelectric generators. Some of these advantages are high heat-transfer rates; the absence of moving parts and of auxiliary consumption (because fans or pumps are not required); and the fact that these systems are wickless. A computational model is developed to design and predict the behaviour of these heat exchangers. Furthermore, a prototype has been built and tested in order to demonstrate its performance and validate the computational model. The model predicts the thermal resistance of the heat exchanger with a relative error in the interval [−8.09, 7.83]% in 95% of the cases. Finally, the use of thermosyphons with phase change in thermoelectric generators has been studied in a waste-heat recovery application, showing that including them on the cold side of the generators improves the net thermoelectric production by 36% compared with that obtained with finned dissipators under forced convection.

  1. An Evaluation Of Holograms In Training And As Job Performance Aids (United States)

    Frey, Allan H.


    Experimentation was carried out to evaluate holograms for use in training and as job aids. Holograms were compared against line drawings and photographs as methods of presenting visual information needed to accomplish a number of tasks. The dependent variables were assembly speed and assembly errors with people unstressed, assembly speed and assembly errors with people stressed, the percentage of discovered errors in assemblies, the number of correct assemblies misidentified as erroneous, and information extraction. Holograms generally were as good as or better visual aids than either photographs or line drawings. The use of holograms tends to reduce errors rather than speed assembly time in the assembly tasks used in these experiments. They also enhance the discovery of errors when the subject is attempting to locate assembly errors in a construction. The results of this experimentation suggest that serious consideration should be given to the use of holography in the development of job aids and in training. Beyond these advantages for job aids, we found that when page-formatted information is stored in human-readable holograms, it remains usable when scratched or damaged, even when similarly damaged microfilm is unusable. Holography can also be used to store human- and machine-readable data simultaneously. Such storage would provide simplified backup in the event of machine failure, and it would permit the development of compatible machine and manual systems for job aid applications.

  2. Towards pattern generation and chaotic series prediction with photonic reservoir computers (United States)

    Antonik, Piotr; Hermans, Michiel; Duport, François; Haelterman, Marc; Massar, Serge


    Reservoir Computing is a bio-inspired computing paradigm for processing time dependent signals that is particularly well suited for analog implementations. Our team has demonstrated several photonic reservoir computers with performance comparable to digital algorithms on a series of benchmark tasks such as channel equalisation and speech recognition. Recently, we showed that our opto-electronic reservoir computer could be trained online with a simple gradient descent algorithm programmed on an FPGA chip. This setup makes it in principle possible to feed the output signal back into the reservoir, and thus greatly enrich the dynamics of the system. This will make it possible to tackle complex prediction tasks in hardware, such as pattern generation and chaotic and financial series prediction, which have so far only been studied in digital implementations. Here we report simulation results of our opto-electronic setup with an FPGA chip and output feedback applied to pattern generation and Mackey-Glass chaotic series prediction. The simulations take into account the major aspects of our experimental setup. We find that pattern generation can be easily implemented on the current setup with very good results. The Mackey-Glass series prediction task is more complex and requires a large reservoir and a more elaborate training algorithm. With these adjustments promising results are obtained, and we now know what improvements are needed to match previously reported numerical results. These simulation results will serve as a basis of comparison for the experiments we will carry out in the coming months.
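    The output-feedback training loop described above can be sketched in software. The following is a minimal, hypothetical stand-in for the opto-electronic setup: a small random reservoir with a linear readout trained online by simple gradient descent (LMS) under teacher forcing on a periodic target. The reservoir size, learning rate and sine pattern are illustrative assumptions, not the authors' parameters.

```python
import math
import random

random.seed(0)
N = 60      # reservoir size (assumed)
lr = 0.01   # LMS learning rate (assumed)

# Random sparse reservoir weights and feedback weights; the readout starts at zero.
W = [[(random.uniform(-0.5, 0.5) if random.random() < 0.1 else 0.0)
      for _ in range(N)] for _ in range(N)]
w_fb = [random.uniform(-1, 1) for _ in range(N)]
w_out = [0.0] * N

def step(x, fb):
    # One reservoir update driven only by the fed-back output signal.
    return [math.tanh(sum(W[i][j] * x[j] for j in range(N)) + w_fb[i] * fb)
            for i in range(N)]

# Teacher forcing: during training the feedback is the previous target sample,
# and the readout learns to emit the next sample of the periodic pattern.
target = [math.sin(2 * math.pi * t / 20) for t in range(400)]
x = [0.0] * N
errs = []
for t in range(1, len(target)):
    x = step(x, target[t - 1])
    y = sum(w_out[i] * x[i] for i in range(N))
    e = target[t] - y
    errs.append(e * e)
    for i in range(N):
        w_out[i] += lr * e * x[i]   # online gradient descent step

early = sum(errs[:50]) / 50   # mean squared error at the start of training
late = sum(errs[-50:]) / 50   # mean squared error at the end of training
```

After training, closing the loop (feeding y back instead of the teacher signal) turns the system into an autonomous pattern generator, which is the regime studied in the abstract.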

  3. Fracture resistance of computer-aided design/computer-aided manufacturing-generated composite resin-based molar crowns. (United States)

    Harada, Akio; Nakamura, Keisuke; Kanno, Taro; Inagaki, Ryoichi; Örtengren, Ulf; Niwano, Yoshimi; Sasaki, Keiichi; Egusa, Hiroshi


    The aim of this study was to investigate whether different fabrication processes, such as the computer-aided design/computer-aided manufacturing (CAD/CAM) system or the manual build-up technique, affect the fracture resistance of composite resin-based crowns. Lava Ultimate (LU), Estenia C&B (EC&B), and lithium disilicate glass-ceramic IPS e.max press (EMP) were used. Four types of molar crowns were fabricated: CAD/CAM-generated composite resin-based crowns (LU crowns); manually built-up monolayer composite resin-based crowns (EC&B-monolayer crowns); manually built-up layered composite resin-based crowns (EC&B-layered crowns); and EMP crowns. Each type of crown was cemented to dies and the fracture resistance was tested. EC&B-layered crowns showed significantly lower fracture resistance compared with LU and EMP crowns, although there was no significant difference in flexural strength or fracture toughness between LU and EC&B materials. Micro-computed tomography and fractographic analysis showed that decreased strength probably resulted from internal voids in the EC&B-layered crowns introduced by the layering process. There was no significant difference in fracture resistance among LU, EC&B-monolayer, and EMP crowns. Both types of composite resin-based crowns showed fracture loads of >2000 N, which is higher than the molar bite force. Therefore, CAD/CAM-generated crowns, without internal defects, may be applied to molar regions with sufficient fracture resistance. © 2015 Eur J Oral Sci.

  4. An analytical computation of magnetic field generated from a cylinder ferromagnet (United States)

    Taniguchi, Tomohiro


    An analytical formulation to compute the magnetic field generated by a uniformly magnetized cylinder ferromagnet is developed. Exact solutions for the magnetic field generated by a magnetization pointing in an arbitrary direction are derived, which are applicable both inside and outside the ferromagnet. The validity of the present formulas is confirmed by comparing them with demagnetization coefficients estimated in earlier works. The results will be useful for designing practical applications, such as high-density magnetic recording and microwave generators, where nanostructured ferromagnets are coupled to each other through dipole interactions and show cooperative phenomena such as synchronization. As an example, the magnetic field generated by a spin torque oscillator for magnetic recording based on microwave-assisted magnetization reversal is studied.
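    The paper's general formulas are not reproduced in this abstract, but the on-axis special case for a uniformly axially magnetized cylinder is a standard result and can be cross-checked against the far-field dipole limit. A sketch with arbitrary (hypothetical) magnet dimensions:

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability (SI)

def bz_on_axis(M, R, L, zc):
    """On-axis field of a uniformly axially magnetized cylinder of radius R,
    length L and magnetization M, at distance zc from its centre (outside the
    magnet, zc > L/2). This is the standard textbook special case, stated
    here as a cross-check, not the paper's general arbitrary-direction result."""
    zt = zc - L / 2  # distance from the top face
    return (MU0 * M / 2) * ((zt + L) / math.sqrt((zt + L) ** 2 + R ** 2)
                            - zt / math.sqrt(zt ** 2 + R ** 2))

# Hypothetical NdFeB-like magnet: M ~ 8e5 A/m, radius 1 mm, length 2 mm.
M, R, L = 8.0e5, 1e-3, 2e-3
zc = 50e-3  # observation point far from the magnet

b_exact = bz_on_axis(M, R, L, zc)
m_dip = M * math.pi * R ** 2 * L               # total dipole moment
b_dipole = MU0 * m_dip / (2 * math.pi * zc ** 3)  # on-axis dipole field
```

Far from the magnet the exact expression must converge to the dipole field of the total moment, which is the kind of consistency check the paper performs against known demagnetization coefficients.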

  5. X-ray fluorescence hologram data collection with a cooled avalanche photodiode

    CERN Document Server

    Hayashi, K; Matsubara, E I; Kishimoto, S; Mori, T; Tanaka, M


    A high counting rate X-ray detector with an appropriate energy resolution is desired for high quality X-ray fluorescence hologram measurements, because a holographic pattern is detected as extremely small intensity variations of X-ray fluorescence on a large intensity background. A cooled avalanche photodiode (APD), which has about 10% energy resolution and is designed for a high counting rate, fits the above requirements. Reconstructed atomic images from experimental holograms using the APD system provide a clear view of the first and second neighbor atoms around an emitter. The present result proved that a combination of this APD system and a synchrotron X-ray source enables us to measure a high quality hologram in a reasonable measurement time.

  6. Evaluation of polycarbonate substrate hologram recording medium regarding implication of birefringence and thermal expansion (United States)

    Toishi, Mitsuru; Tanaka, Tomiji; Fukumoto, Atsushi; Sugiki, Mikio; Watanabe, Kenjiro


    In this paper, we evaluate photopolymer media using a polycarbonate (PC) substrate. In a holographic data storage medium, substrates that sandwich the photopolymer material are needed to protect it against exogenous shock and open air. An optical glass such as BK-7 is normally used as a substrate, but a PC substrate has a cost advantage and is easier to fabricate than optical glass. For holographic recording and reading, however, the high birefringence and high thermal expansion of a PC substrate are significant problems. First, we analyze the degree of degradation of output power caused by the polarization change and estimate the threshold value of birefringence below which holograms can be recorded normally. Next, we estimate the temperature tolerance of hologram readout with a polycarbonate substrate hologram medium. The results of these analyses indicate that the PC substrate can be used in holographic recording media.

  7. Symmetric and asymmetric hybrid cryptosystem based on compressive sensing and computer generated holography (United States)

    Ma, Lihong; Jin, Weimin


    A novel symmetric and asymmetric hybrid optical cryptosystem is proposed based on compressive sensing combined with computer generated holography. In this method there are six encryption keys, among which the two decryption phase masks are different from the two random phase masks used in the encryption process. Therefore, the encryption system has features of both symmetric and asymmetric cryptography. On the other hand, because computer generated holography can flexibly digitalize the encrypted information, compressive sensing can significantly reduce the data volume, and the final encrypted image is a real-valued function owing to phase truncation, the method favors the storage and transmission of the encrypted data. The experimental results demonstrate that the proposed encryption scheme boosts security and has high robustness against noise and occlusion attacks.
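    The random-phase-mask ingredient of such optical cryptosystems can be sketched in a few lines. The toy example below (1-D signal, naive DFT, no compressive sensing or phase truncation, so it is only one building block of the paper's scheme) shows why decryption requires the conjugate masks applied in reverse order:

```python
import cmath
import random

random.seed(1)
N = 8

def dft(v, inverse=False):
    # Naive O(N^2) discrete Fourier transform, adequate for a toy example.
    s = 1 if inverse else -1
    out = [sum(v[n] * cmath.exp(s * 2j * cmath.pi * k * n / N) for n in range(N))
           for k in range(N)]
    return [x / N for x in out] if inverse else out

# Plain "image" (1-D for brevity) and two unit-modulus random phase masks.
img = [1.0, 2.0, 3.0, 4.0, 3.0, 2.0, 1.0, 0.0]
phi1 = [cmath.exp(2j * cmath.pi * random.random()) for _ in range(N)]
phi2 = [cmath.exp(2j * cmath.pi * random.random()) for _ in range(N)]

# Encrypt: mask in the space domain, transform, mask in the frequency
# domain, inverse transform (double random phase encoding).
enc = dft([a * b for a, b in zip(dft([v * p for v, p in zip(img, phi1)]), phi2)],
          inverse=True)

# Decrypt: undo each step with the conjugate masks in reverse order.
tmp = [a * b.conjugate() for a, b in zip(dft(enc), phi2)]
dec = [abs(a * b.conjugate()) for a, b in zip(dft(tmp, inverse=True), phi1)]
```

Without the correct conjugate masks the decrypted field stays noise-like, which is the symmetric-key aspect; the paper's phase-truncation step is what breaks this symmetry.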

  8. Computer-Generated Abstract Paintings Oriented by the Color Composition of Images

    Directory of Open Access Journals (Sweden)

    Mao Li


    Full Text Available Designers and artists often require reference images at authoring time. The emergence of computer technology has provided new conditions and possibilities for artistic creation and research. It has also expanded the forms of artistic expression and attracted many artists, designers and computer experts to explore different artistic directions and collaborate with one another. In this paper, we present an efficient k-means-based method that segments the colors of an original picture to analyze the composition ratio of the color information and calculates the individual color areas together with their sizes. This information is transformed into regular geometries that reconstruct the colors of the picture to generate abstract images. Furthermore, we designed an application system using the proposed method and generated many works; some artists and designers have used it as an auxiliary tool for art and design creation. Experimental results on our datasets demonstrate the effectiveness of the method and its ability to provide inspiration for creative work.
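    The color-segmentation step can be sketched with a plain k-means on RGB triples; the toy pixel data and cluster count below are illustrative assumptions, not the paper's datasets:

```python
import random

random.seed(2)

def kmeans(pixels, k, iters=20):
    # Plain k-means on RGB tuples: assign each pixel to the nearest centre,
    # then recompute each centre as the mean of its cluster.
    centers = random.sample(pixels, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in pixels:
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        for i, cl in enumerate(clusters):
            if cl:
                centers[i] = tuple(sum(ch) / len(cl) for ch in zip(*cl))
    return centers, clusters

# Toy "image": pixels scattered around three dominant colours (hypothetical data).
base = [(200, 30, 30), (30, 200, 30), (30, 30, 200)]
pixels = [tuple(min(255, max(0, c + random.randint(-20, 20))) for c in base[i % 3])
          for i in range(300)]

centers, clusters = kmeans(pixels, 3)
# Composition ratio of each dominant colour = cluster size / total pixel count;
# in the paper these ratios size the geometric regions of the abstract image.
ratios = [len(cl) / len(pixels) for cl in clusters]
```

Each (center, ratio) pair is what would be mapped to a coloured geometric region when composing the abstract output image.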

  9. Computation of Locational and Hourly Maximum Output of a Distributed Generator Connected to a Distribution Feeder (United States)

    Hayashi, Yasuhiro; Matsuki, Junya; Hanai, Yuji; Hosokawa, Shinpei; Kobayashi, Naoki

    Recently, the number of distributed generators, such as photovoltaic and wind turbine generation systems, connected to distribution networks has increased drastically. Distributed generation utilizing renewable energy can reduce distribution losses and CO2 emissions. However, a distribution network with distributed generators must be operated while maintaining reliability of power supply and power quality. In this paper, the authors propose a computation method to determine the maximum output of a distributed generator under the operational constraints ((1) voltage limit, (2) line current capacity, and (3) no reverse flow to the bank) at an arbitrary connection point and hourly period. In the proposed method, three-phase iterative load flow calculation is applied to evaluate the above operational constraints. The three-phase iterative load flow calculation has two simple procedures: (Procedure 1) addition of load currents from the terminal node of the feeder to the root one, and (Procedure 2) subtraction of voltage drops from the root node of the feeder to the terminal one. In order to check the validity of the proposed method, numerical simulations are carried out for a distribution system model. Furthermore, characteristics of the locational and hourly maximum output of a distributed generator connected to a distribution feeder are analyzed through several numerical examples.
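    The two sweep procedures can be sketched on a toy single-phase radial feeder (the paper uses a three-phase formulation; the impedances and loads below are arbitrary per-unit values):

```python
# Radial feeder: node 0 (root, fixed voltage) -- line -- node 1 -- line -- node 2.
# Procedure 1 (backward sweep) accumulates load currents from the terminal
# node toward the root; Procedure 2 (forward sweep) subtracts the line
# voltage drops from the root outward. Iterating the two sweeps converges
# to the load flow solution.
V0 = 1.0 + 0j                        # slack (root) voltage, p.u.
z = [0.01 + 0.02j, 0.01 + 0.02j]     # line impedances, p.u. (assumed)
s_load = [0.1 + 0.05j, 0.1 + 0.05j]  # complex loads at nodes 1 and 2, p.u. (assumed)

V = [V0, V0, V0]
for _ in range(20):
    # Procedure 1: load currents I = (S/V)*, summed from terminal node to root.
    i_load = [(s / V[n + 1]).conjugate() for n, s in enumerate(s_load)]
    i_line = [i_load[0] + i_load[1],  # root-side line carries both loads
              i_load[1]]              # last line carries only the terminal load
    # Procedure 2: voltage drops applied from root node to terminal node.
    V[1] = V0 - z[0] * i_line[0]
    V[2] = V[1] - z[1] * i_line[1]
```

On top of such a solver, the proposed method would raise the generator output at the candidate node until a voltage limit, line current capacity, or the no-reverse-flow condition binds.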

  10. Computer Simulations Support a Morphological Contribution to BDNF Enhancement of Action Potential Generation

    Directory of Open Access Journals (Sweden)

    Domenico F Galati


    Full Text Available Abstract Brain-derived neurotrophic factor (BDNF regulates both action potential (AP generation and neuron morphology. However, whether BDNF-induced changes in neuron morphology directly impact AP generation is unclear. We quantified BDNF’s effect on cultured cortical neuron morphological parameters and found that BDNF stimulates dendrite growth and addition of dendrites while increasing both excitatory and inhibitory presynaptic inputs in a spatially restricted manner. To gain insight into how these combined changes in neuron structure and synaptic input impact AP generation, we used the morphological parameters we gathered to generate computational models. Simulations suggest that BDNF-induced neuron morphologies generate more APs under a wide variety of conditions. Synapse and dendrite addition have the greatest impact on AP generation. However, subtle alterations in excitatory/inhibitory synapse ratio and strength have a significant impact on AP generation when synaptic activity is low. Consistent with these simulations, BDNF rapidly enhances spontaneous activity in cortical cultures. We propose that BDNF promotes neuron morphologies that are intrinsically more efficient at translating barrages of synaptic activity into APs, which is a previously unexplored aspect of BDNF’s function.

  11. WIMSTAR-4: a computer program for generating WIMS library data from ENDF/B

    International Nuclear Information System (INIS)

    Wilkin, G.B.


    WIMSTAR (Version 4) is a FORTRAN-IV computer program developed to generate data files for the WIMS lattice code library from the ENDF/B data base. The program must be used in conjunction with the AMPX-II system and has been designed for implementation as a module of that system. This report describes the structure, implementation and use of the AMPX/WIMSTAR system

  12. Human-computer interaction for the generation of image processing applications


    Clouard, Régis; Renouf, Arnaud; Revenu, Marinette


    The development of customized image processing applications is time consuming and requires high level skills. This paper describes the design of an interactive application generation system oriented towards producing image processing software programs. The description is focused on two models which constitute the core of the human-computer interaction. First, the formulation model identifies and organizes information that is assumed necessary and sufficient for develop...

  13. Integration of distributed plant process computer systems to nuclear power generation facilities

    International Nuclear Information System (INIS)

    Bogard, T.; Finlay, K.


    Many operating nuclear power generation facilities are replacing their plant process computer. Such replacement projects are driven by equipment obsolescence issues and associated objectives to improve plant operability, increase plant information access, improve man-machine interface characteristics, and reduce operation and maintenance costs. This paper describes a few recently completed and on-going replacement projects with emphasis upon the application of integrated distributed plant process computer systems. By presenting a few recent projects, the variations of distributed systems design show how various configurations can address needs for flexibility, open architecture, and integration of technological advancements in instrumentation and control technology. Architectural considerations for optimal integration of the plant process computer and plant process instrumentation and control are evident from the variations of design features.

  14. Applications of automatic mesh generation and adaptive methods in computational medicine

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, J.A.; Macleod, R.S. [Univ. of Utah, Salt Lake City, UT (United States); Johnson, C.R.; Eason, J.C. [Duke Univ., Durham, NC (United States)


    Important problems in Computational Medicine exist that can benefit from the implementation of adaptive mesh refinement techniques. Biological systems are so inherently complex that only efficient models running on state of the art hardware can begin to simulate reality. To tackle the complex geometries associated with medical applications we present a general purpose mesh generation scheme based upon the Delaunay tessellation algorithm and an iterative point generator. In addition, automatic, two- and three-dimensional adaptive mesh refinement methods are presented that are derived from local and global estimates of the finite element error. Mesh generation and adaptive refinement techniques are utilized to obtain accurate approximations of bioelectric fields within anatomically correct models of the heart and human thorax. Specifically, we explore the simulation of cardiac defibrillation and the general forward and inverse problems in electrocardiography (ECG). Comparisons between uniform and adaptive refinement techniques are made to highlight the computational efficiency and accuracy of adaptive methods in the solution of field problems in computational medicine.

  15. Assessing the validity of a computer-generated cognitive screening instrument for patients with multiple sclerosis. (United States)

    Lapshin, Helen; Lanctôt, Krista L; O'Connor, Paul; Feinstein, Anthony


    Neuropsychological testing requires considerable time, expense, and expertise to administer. These factors can limit patient access. Computerized cognitive testing has been proposed as an alternative. The objective of this paper is to validate a brief, simple-to-use computer-generated cognitive assessment screening battery for multiple sclerosis (MS) patients that has minimal motor involvement. A sample of 96 MS patients and 98 healthy controls completed a computer-generated battery that included the Stroop, Symbol Digit Modalities Test (C-SDMT), a two- and four-second visual analog of the Paced Auditory Serial Addition Test (PVSAT-2, PVSAT-4), and simple and choice reaction time tests. The Minimal Assessment of Cognitive Function in MS was used to define cognitive impairment in the MS sample. Each newly developed test successfully distinguished between cognitively impaired patients and healthy controls as well as cognitively intact patients. A combination of three computerized tests (C-SDMT, PVSAT-2, PVSAT-4) with a mean administration time of 10 minutes had a sensitivity of 82.5% and specificity of 87.5% in detecting cognitive impairment. Good test-retest reliability was obtained for each measure. Good sensitivity and specificity, brevity, ease of administration, and a limited motor component highlight the feasibility of introducing this computer-generated cognitive screening instrument in a busy MS clinic.

  16. Generating rate equations for complex enzyme systems by a computer-assisted systematic method

    Directory of Open Access Journals (Sweden)

    Beard Daniel A


    Full Text Available Abstract Background While the theory of enzyme kinetics is fundamental to analyzing and simulating biochemical systems, the derivation of rate equations for complex mechanisms of enzyme-catalyzed reactions is cumbersome and error prone. Therefore, a number of algorithms and related computer programs have been developed to assist in such derivations. Yet although a number of algorithms, programs, and software packages are reported in the literature, one or more significant limitations are associated with each of these tools. Furthermore, none is freely available for download and use by the community. Results We have implemented an algorithm based on the schematic method of King and Altman (KA), which employs the topological theory of linear graphs for systematic generation of valid reaction patterns, in a GUI-based stand-alone computer program called KAPattern. The underlying algorithm allows for the assumption of steady state, rapid equilibrium binding, and/or irreversibility for individual steps in catalytic mechanisms. The program can automatically generate MathML and MATLAB output files that users can easily incorporate into simulation programs. Conclusion A computer program, called KAPattern, for generating rate equations for complex enzyme systems is freely available and can be accessed at
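    The graph-theoretic core of the KA schematic method is the enumeration of spanning trees directed toward each enzyme state (equivalently, the Markov-chain tree theorem): the steady-state weight of a state is the sum over such trees of the products of the rate constants along their edges. A brute-force sketch for a small mechanism, with hypothetical rate constants (this is an illustration of the principle, not the KAPattern implementation):

```python
from itertools import combinations

def steady_state(rates):
    # rates[i][j] is the rate constant for the transition state i -> state j.
    n = len(rates)
    edges = [(i, j) for i in range(n) for j in range(n)
             if i != j and rates[i][j] > 0]
    w = [0.0] * n
    for r in range(n):
        # A spanning tree directed toward root r assigns exactly one outgoing
        # edge to every non-root state, with no cycles.
        for combo in combinations(edges, n - 1):
            tails = [e[0] for e in combo]
            if sorted(tails) != sorted(set(range(n)) - {r}):
                continue
            nxt = dict(combo)
            ok = True
            for s in tails:  # every state must reach the root without cycling
                seen, cur = set(), s
                while cur != r:
                    if cur in seen:
                        ok = False
                        break
                    seen.add(cur)
                    cur = nxt[cur]
                if not ok:
                    break
            if ok:
                p = 1.0
                for i, j in combo:
                    p *= rates[i][j]
                w[r] += p
    total = sum(w)
    return [x / total for x in w]

# Toy 3-state cyclic mechanism, e.g. E -> ES -> EP -> E (hypothetical constants).
k3 = [[0.0, 2.0, 0.5],
      [1.0, 0.0, 3.0],
      [4.0, 0.5, 0.0]]
pi = steady_state(k3)
```

The resulting fractional occupancies are exactly what the KA patterns encode; a rate equation follows by weighting each state's catalytic flux by its occupancy.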

  17. Compression of digital hologram for three-dimensional object using Wavelet-Bandelets transform. (United States)

    Bang, Le Thanh; Ali, Zulfiqar; Quang, Pham Duc; Park, Jae-Hyeung; Kim, Nam


    In transformation-based compression algorithms for digital holograms of three-dimensional objects, the balance between compression ratio and normalized root mean square (NRMS) error is always the core of algorithm development. The Wavelet transform method is efficient at achieving a high compression ratio, but its NRMS error is also high. In order to solve this issue, we propose a hologram compression method using the Wavelet-Bandelets transform. Our simulation and experimental results show that the Wavelet-Bandelets method has a higher compression ratio than Wavelet methods and all the other methods investigated in this paper, while it still maintains a low NRMS error.
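    The trade-off between compression ratio and NRMS error can be illustrated with a plain Haar wavelet (not the Bandelet refinement the paper adds): transform, hard-threshold the small coefficients, invert, and measure both quantities. The 1-D signal and threshold below are illustrative choices:

```python
import math

def haar(v):
    # Forward 1-D Haar transform (orthonormal), length must be a power of two.
    out = list(v)
    n = len(out)
    while n > 1:
        half = n // 2
        tmp = [(out[2 * i] + out[2 * i + 1]) / math.sqrt(2) for i in range(half)] \
            + [(out[2 * i] - out[2 * i + 1]) / math.sqrt(2) for i in range(half)]
        out[:n] = tmp
        n = half
    return out

def ihaar(v):
    # Inverse 1-D Haar transform.
    out = list(v)
    n = 2
    while n <= len(out):
        half = n // 2
        tmp = []
        for i in range(half):
            a, d = out[i], out[half + i]
            tmp += [(a + d) / math.sqrt(2), (a - d) / math.sqrt(2)]
        out[:n] = tmp
        n *= 2
    return out

signal = [math.sin(2 * math.pi * i / 64) for i in range(64)]
coeffs = haar(signal)
kept = [c if abs(c) > 0.05 else 0.0 for c in coeffs]  # hard threshold (assumed)
rec = ihaar(kept)

# Crude compression ratio: total coefficients / nonzero coefficients kept.
ratio = len(coeffs) / sum(1 for c in kept if c != 0.0)
# NRMS error between the original and the reconstruction.
nrms = math.sqrt(sum((a - b) ** 2 for a, b in zip(signal, rec))
                 / sum(a ** 2 for a in signal))
```

Raising the threshold increases the ratio but also the NRMS error; the Bandelet stage aims to push that trade-off curve further by adapting to geometric structure.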

  18. Speckle reduction of reconstructions of digital holograms using three dimensional filtering (United States)

    Maycock, Jonathan; McDonald, John B.; Hennelly, Bryan M.


    We report on a new digital signal processing technique that reduces speckle in reconstructions of digital holograms. This is achieved by convolving the three dimensional intensity pattern (the intensity of the propagated DH at a series of different distances) with a 3D point spread function in all three dimensions (x,y,z). It is based on the fact that the addition of different independent speckle images on an intensity basis reduces the speckle content. We provide quantitative results in terms of speckle index and resolution, and show that filtering in the z direction has the added benefit of an increase in the depth of focus of the digital hologram reconstruction.
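    The intensity-domain averaging over neighbouring reconstruction depths can be sketched with a 3D mean filter on a synthetic speckle-like stack. Exponentially distributed intensities stand in for fully developed speckle, and std/mean serves as the speckle-index metric; the stack size and kernel are illustrative assumptions:

```python
import random

random.seed(3)
D = H = W = 8  # small synthetic stack: D depths of H x W intensity planes

# Constant underlying intensity corrupted by multiplicative speckle-like noise,
# independent in each reconstruction depth (exponential intensity statistics).
stack = [[[random.expovariate(1.0) for _ in range(W)] for _ in range(H)]
         for _ in range(D)]

def mean_filter3d(s):
    # 3x3x3 mean filter over (z, y, x): averaging independent speckle
    # realizations on an intensity basis reduces the speckle contrast.
    out = [[[0.0] * W for _ in range(H)] for _ in range(D)]
    for z in range(D):
        for y in range(H):
            for x in range(W):
                vals = [s[z + dz][y + dy][x + dx]
                        for dz in (-1, 0, 1)
                        for dy in (-1, 0, 1)
                        for dx in (-1, 0, 1)
                        if 0 <= z + dz < D and 0 <= y + dy < H and 0 <= x + dx < W]
                out[z][y][x] = sum(vals) / len(vals)
    return out

def speckle_index(s):
    # Standard-deviation-to-mean ratio of the intensity values.
    flat = [v for plane in s for row in plane for v in row]
    m = sum(flat) / len(flat)
    var = sum((v - m) ** 2 for v in flat) / len(flat)
    return var ** 0.5 / m

si_before = speckle_index(stack)
si_after = speckle_index(mean_filter3d(stack))
```

Averaging K independent speckle realizations reduces the speckle index roughly by 1/sqrt(K), which is why including the z direction (more independent reconstructions) helps beyond purely lateral filtering.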

  19. Fast CGH computation using S-LUT on GPU. (United States)

    Pan, Yuechao; Xu, Xuewu; Solanki, Sanjeev; Liang, Xinan; Tanjung, Ridwan Bin Adrian; Tan, Chiwei; Chong, Tow-Chong


    In the computation of full-parallax computer-generated holograms (CGH), the balance between speed and memory usage is always the core of algorithm development. To solve the speed problem of the coherent ray trace (CRT) algorithm and the memory problem of the look-up table (LUT) algorithm without sacrificing reconstructed object quality, we develop a novel algorithm with split look-up tables (S-LUT) and implement it on a graphics processing unit (GPU). Our results show that S-LUT on GPU has the fastest speed among all the algorithms investigated in this paper, while still maintaining low memory usage. We also demonstrate high quality objects reconstructed from CGHs computed with S-LUT on GPU. The GPU implementation of our new algorithm may enable real-time and interactive holographic 3D display in the future.
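    The split-LUT idea rests on the Fresnel-approximation fringe of a point source factorizing into horizontal and vertical terms, exp(ik((x-xo)^2+(y-yo)^2)/2z) = H(x-xo)·V(y-yo), so two 1-D tables replace a full 2-D table. A CPU sketch with assumed optical parameters (the paper's contribution is the GPU implementation, not reproduced here):

```python
import cmath

wavelength = 633e-9  # He-Ne wavelength (assumed)
z = 0.1              # object plane depth in metres (assumed)
pitch = 10e-6        # hologram pixel pitch in metres (assumed)
k = 2 * cmath.pi / wavelength
Nx = Ny = 32         # small hologram for the sketch

def factor(d_pixels):
    # 1-D Fresnel modulation factor for a pixel offset; under the Fresnel
    # approximation the same form serves for horizontal and vertical offsets.
    d = d_pixels * pitch
    return cmath.exp(1j * k * d * d / (2 * z))

# Split look-up table: a single 1-D table indexed by signed pixel offset.
lut = [factor(d) for d in range(-Nx, Nx)]

xo, yo = 5, 9  # object point position in pixel units (hypothetical)

# Fringe via S-LUT: two table look-ups and one multiply per hologram pixel.
fringe_slut = [[lut[(x - xo) + Nx] * lut[(y - yo) + Nx]
                for x in range(Nx)] for y in range(Ny)]

# Direct Fresnel computation of the same fringe, for comparison.
fringe_direct = [[cmath.exp(1j * k * (((x - xo) * pitch) ** 2
                                      + ((y - yo) * pitch) ** 2) / (2 * z))
                  for x in range(Nx)] for y in range(Ny)]
```

Memory drops from O(Nx·Ny) per depth for a full 2-D LUT to O(Nx+Ny) for the split tables, which is what makes the approach attractive on memory-limited GPUs.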

  20. Generation of three-dimensional prototype models based on cone beam computed tomography

    International Nuclear Information System (INIS)

    Lambrecht, J.T.; Berndt, D.C.; Zehnder, M.; Schumacher, R.


    The purpose of this study was to generate three-dimensional models based on digital volumetric data that can be used in basic and advanced education. Four sets of digital volumetric data were established by cone beam computed tomography (CBCT) (Accuitomo, J. Morita, Kyoto, Japan). Datasets were exported as Dicom formats and imported into Mimics and Magic software programs to separate the different tissues such as nerve, tooth and bone. These data were transferred to a Polyjet 3D Printing machine (Eden 330, Object, Israel) to generate the models. Three-dimensional prototype models of certain limited anatomical structures as acquired volumetrically were fabricated. Generating three-dimensional models based on CBCT datasets is possible. Automated routine fabrication of these models, with the given infrastructure, is too time-consuming and therefore too expensive. (orig.)

  1. Generation of three-dimensional prototype models based on cone beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Lambrecht, J.T.; Berndt, D.C.; Zehnder, M. [University of Basel, Department of Oral Surgery, University Hospital for Oral Surgery, Oral Radiology and Oral Medicine, Basel (Switzerland); Schumacher, R. [University of Applied Sciences Northwestern Switzerland, School of Life Sciences, Institute for Medical and Analytical Technologies, Muttenz (Switzerland)


    The purpose of this study was to generate three-dimensional models based on digital volumetric data that can be used in basic and advanced education. Four sets of digital volumetric data were established by cone beam computed tomography (CBCT) (Accuitomo, J. Morita, Kyoto, Japan). Datasets were exported as Dicom formats and imported into Mimics and Magic software programs to separate the different tissues such as nerve, tooth and bone. These data were transferred to a Polyjet 3D Printing machine (Eden 330, Object, Israel) to generate the models. Three-dimensional prototype models of certain limited anatomical structures as acquired volumetrically were fabricated. Generating three-dimensional models based on CBCT datasets is possible. Automated routine fabrication of these models, with the given infrastructure, is too time-consuming and therefore too expensive. (orig.)

  2. System-level tools and reconfigurable computing for next-generation HWIL systems (United States)

    Stark, Derek; McAulay, Derek; Cantle, Allan J.; Devlin, Malachy


    Previous work has been presented on the creation of computing architectures called DIME, which addressed the particular computing demands of hardware in the loop systems. These demands include low latency, high data rates and interfacing. While it is essential to have a capable platform for handling and processing of the data streams, the tools must also complement this so that a system's engineer is able to construct their final system. The paper will present the work in the area of integration of system level design tools, such as MATLAB and SIMULINK, with a reconfigurable computing platform. This will demonstrate how algorithms can be implemented and simulated in a familiar rapid application development environment before they are automatically transposed for downloading directly to the computing platform. This complements the established control tools, which handle the configuration and control of the processing systems leading to a tool suite for system development and implementation. As the development tools have evolved the core-processing platform has also been enhanced. These improved platforms are based on dynamically reconfigurable computing, utilizing FPGA technologies, and parallel processing methods that more than double the performance and data bandwidth capabilities. This offers support for the processing of images in Infrared Scene Projectors with 1024 × 1024 resolutions at 400 Hz frame rates. The processing elements will be using the latest generation of FPGAs, which implies that the presented systems will be rated in terms of tera (10^12) operations per second.

  3. Application of computer generated color graphic techniques to the processing and display of three dimensional fluid dynamic data (United States)

    Anderson, B. H.; Putt, C. W.; Giamati, C. C.


    Color coding techniques used in the processing of remote sensing imagery were adapted and applied to the fluid dynamics problems associated with turbofan mixer nozzles. The computer generated color graphics were found to be useful in reconstructing the measured flow field from low resolution experimental data to give more physical meaning to this information and in scanning and interpreting the large volume of computer generated data from the three dimensional viscous computer code used in the analysis.

  4. Fan interaction noise reduction using a wake generator: experiments and computational aeroacoustics (United States)

    Polacsek, C.; Desbois-Lavergne, F.


    A control grid (wake generator) aimed at reducing rotor-stator interaction modes in fan engines when mounted upstream of the rotor has been studied here. This device complements other active noise control systems currently proposed. The compressor model of the instrumented ONERA CERF-rig is used to simulate suitable conditions. The design of the grid is drafted out using semi-empirical models for wake and potential flow, and experimentally achieved. Cylindrical rods are able to generate a spinning mode of the same order and similar level as the interaction mode. Mounting the rods on a rotating ring allows for adjusting the phase of the control mode so that an 8 dB sound pressure level (SPL) reduction at the blade passing frequency is achieved when the two modes are out of phase. Experimental results are assessed by a numerical approach using computational fluid dynamics (CFD). A Reynolds averaged Navier-Stokes 2-D solver, developed at ONERA, is used to provide the unsteady force components on blades and vanes required for acoustics. The loading noise source term of the Ffowcs Williams and Hawkings equation is used to model the interaction noise between the sources, and an original coupling to a boundary element method (BEM) code is realized to take account of the inlet geometry effects on acoustic in-duct propagation. Calculations using the classical analytical Green function of an infinite annular duct are also addressed. Simple formulations written in the frequency domain and expanded into modes are addressed and used to compute an in-duct interaction mode and to compare with the noise reduction obtained during the tests. A fairly good agreement between predicted and measured SPL is found when the inlet geometry effects are part of the solution (by coupling with the BEM). Furthermore, computed aerodynamic penalties due to the rods are found to be negligible. These results partly validate the computation chain and highlight the potential of the wake generator.

  5. A method of computer aided design with self-generative models in NX Siemens environment (United States)

    Grabowik, C.; Kalinowski, K.; Kempa, W.; Paprocka, I.


    Currently in CAD/CAE/CAM systems it is possible to create 3D virtual design models which are able to capture a certain amount of knowledge. These models are especially useful in the automation of routine design tasks. Such models are known as self-generative or auto-generative, and they can behave in an intelligent way. The main difference between auto-generative and fully parametric models consists in the auto-generative models' ability to self-organize. In this case, design model self-organizing means that, aside from the possibility of making automatic changes to the model's quantitative features, these models possess knowledge of how these changes should be made. Moreover, they are able to change qualitative features according to specific knowledge. In spite of the undoubted good points of self-generative models, they are not often used in the constructional design process, which is mainly caused by the usually great complexity of these models. This complexity makes the creation of self-generative models time- and labour-consuming. It also requires quite large investment outlays. The creation process of a self-generative model consists of three stages: knowledge and information acquisition, model type selection and model implementation. In this paper, methods of computer aided design with self-generative models in NX Siemens CAD/CAE/CAM software are presented. Five methods of self-generative model preparation in NX are presented, using: parametric relations models, part families, GRIP language applications, knowledge fusion and the OPEN API mechanism. In the paper, examples of each type of self-generative model are presented. These methods make the constructional design process much faster. It is suggested to prepare this kind of self-generative model when there is a need to create design variants. The conducted research on assessing the usefulness of the elaborated models showed that they are highly recommended for the automation of routine tasks.

  6. Computer soundcard as an AC signal generator and oscilloscope for the physics laboratory (United States)

    Sinlapanuntakul, Jinda; Kijamnajsuk, Puchong; Jetjamnong, Chanthawut; Chotikaprakhan, Sutharat


    The purpose of this paper is to develop both an AC signal generator and a dual-channel oscilloscope based on a standard personal computer equipped with a sound card, as part of the laboratory for fundamental physics and introductory electronics classes. The setup turns the computer into a two-channel measuring device that provides the sample rate, simultaneous sampling, frequency range, filtering and other capabilities required to perform amplitude, phase and frequency measurements of AC signals. AC signals are simultaneously generated from the same sound card output in a variety of waveforms, such as sine, square, triangle, sawtooth, pulse, swept sine and white noise. This converts an inexpensive PC sound card into a powerful device that allows students to measure physical phenomena with their own PCs, either at home or at the university. Graphical user interface software was developed for control and analysis, including facilities for data recording, signal processing and real-time measurement display. The result expands the students' capacity for self-directed learning in electronics, covering both AC and DC circuits as well as sound and vibration experiments.
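The waveform-synthesis side of such a setup can be sketched in a few lines: the function below generates sample buffers that a sound-card output could play back. The function name and the default sample rate are assumptions for illustration, not taken from the paper:

```python
import numpy as np

def synthesize(waveform, freq_hz, duration_s, sample_rate=44100, amplitude=0.8):
    """Return one channel of samples that a sound-card DAC could play back."""
    t = np.arange(int(duration_s * sample_rate)) / sample_rate
    phase = 2.0 * np.pi * freq_hz * t
    if waveform == "sine":
        return amplitude * np.sin(phase)
    if waveform == "square":
        return amplitude * np.sign(np.sin(phase))
    if waveform == "sawtooth":
        return amplitude * (2.0 * (freq_hz * t % 1.0) - 1.0)
    raise ValueError(f"unknown waveform: {waveform}")
```

Feeding such buffers to the output device, and reading input buffers for the oscilloscope side, would be done with whatever audio API is available; that part is hardware-dependent and omitted here.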

  7. Structural variation discovery in the cancer genome using next generation sequencing: Computational solutions and perspectives (United States)

    Liu, Biao; Conroy, Jeffrey M.; Morrison, Carl D.; Odunsi, Adekunle O.; Qin, Maochun; Wei, Lei; Trump, Donald L.; Johnson, Candace S.; Liu, Song; Wang, Jianmin


    Somatic Structural Variations (SVs) are a complex collection of chromosomal mutations that could directly contribute to carcinogenesis. Next Generation Sequencing (NGS) technology has emerged as the primary means of interrogating the SVs of the cancer genome in recent investigations. Sophisticated computational methods are required to accurately identify the SV events and delineate their breakpoints from the massive amounts of reads generated by an NGS experiment. In this review, we provide an overview of current analytic tools used for SV detection in NGS-based cancer studies. We summarize the features of common SV groups and the primary types of NGS signatures that can be used in SV detection methods. We discuss the principles and key similarities and differences of existing computational programs and comment on unresolved issues related to this research field. The aim of this article is to provide a practical guide to relevant concepts, computational methods, software tools and important factors for analyzing and interpreting NGS data for the detection of SVs in the cancer genome. PMID:25849937
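One of the classic NGS signatures mentioned above, the discordant read pair, can be illustrated with a minimal sketch: pairs whose mapped insert size is far above the library mean suggest a deletion between the reads. The threshold rule and all names below are illustrative only, not the method of any specific tool:

```python
import statistics

def flag_discordant_pairs(pairs, n_sigma=3.0):
    """pairs: list of (read_id, insert_size).  Returns the ids whose insert
    size exceeds mean + n_sigma * stdev -- a deletion-type SV signature."""
    sizes = [s for _, s in pairs]
    mu = statistics.fmean(sizes)
    sd = statistics.stdev(sizes)
    cutoff = mu + n_sigma * sd
    return [rid for rid, s in pairs if s > cutoff]
```

Real callers combine several such signatures (read pairs, split reads, read depth) and model the insert-size distribution far more carefully; this only shows the core idea of one signature.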

  8. Computing Aerodynamic Performance of a 2D Iced Airfoil: Blocking Topology and Grid Generation (United States)

    Chi, X.; Zhu, B.; Shih, T. I.-P.; Slater, J. W.; Addy, H. E.; Choo, Yung K.; Lee, Chi-Ming (Technical Monitor)


    The ice accreted on airfoils can have enormously complicated shapes with multiple protruding horns and feathers. In this paper, several blocking topologies are proposed and evaluated on their ability to produce high-quality structured multi-block grid systems. A transition-layer grid is introduced to ensure that jaggedness in the ice-surface geometry does not propagate into the domain. This is important for grid-generation methods based on hyperbolic PDEs (Partial Differential Equations) and algebraic transfinite interpolation. A 'thick' wrap-around grid is introduced to ensure that grid lines clustered next to solid walls do not propagate as streaks of tightly packed grid lines into the interior of the domain along block boundaries. For ice shapes that are not too complicated, a method is presented for generating high-quality single-block grids. To demonstrate the usefulness of the methods developed, grids and CFD solutions were generated for two iced airfoils: the NLF0414 airfoil with and without the 623-ice shape and the B575/767 airfoil with and without the 145m-ice shape. To validate the computations, the computed lift coefficients as a function of angle of attack were compared with available experimental data. The ice shapes and the blocking topologies were prepared by NASA Glenn's SmaggIce software. The grid systems were generated by using a four-boundary method based on Hermite interpolation with controls on clustering, orthogonality next to walls, and C continuity across block boundaries. The flow was modeled by the ensemble-averaged compressible Navier-Stokes equations, closed by the shear-stress transport turbulence model in which the integration is to the wall. All solutions were generated by using the NPARC WIND code.
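The algebraic transfinite interpolation mentioned above can be sketched for a single 2D block: the interior grid is blended from the four boundary curves. This is the standard bilinear TFI formula, not NASA's production method; the function and array names are my own:

```python
import numpy as np

def tfi_grid(bottom, top, left, right):
    """Bilinear transfinite interpolation for one structured block.
    bottom/top: (ni, 2) boundary curves; left/right: (nj, 2) boundary
    curves; the four curves must share their corner points."""
    ni, nj = len(bottom), len(left)
    xi = np.linspace(0.0, 1.0, ni)[:, None, None]   # (ni, 1, 1)
    eta = np.linspace(0.0, 1.0, nj)[None, :, None]  # (1, nj, 1)
    b = bottom[:, None, :]
    t = top[:, None, :]
    l = left[None, :, :]
    r = right[None, :, :]
    # Sum of the two 1D interpolations minus the doubly counted corner term.
    corners = ((1 - xi) * (1 - eta) * bottom[0] + xi * (1 - eta) * bottom[-1]
               + (1 - xi) * eta * top[0] + xi * eta * top[-1])
    return (1 - eta) * b + eta * t + (1 - xi) * l + xi * r - corners
```

On a unit square the formula reproduces a uniform Cartesian grid; on curved boundaries it propagates the boundary shape smoothly into the interior, which is exactly why boundary jaggedness must be smoothed first.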

  9. The extinct animal show: the paleoimagery tradition and computer generated imagery in factual television programs. (United States)

    Campbell, Vincent


    Extinct animals have always been popular subjects for the media, in both fiction and factual output. In recent years, a distinctive new type of factual television program has emerged in which computer generated imagery is used extensively to bring extinct animals back to life. Such has been the commercial and audience success of these programs that they have generated public and academic debate about their relative status as science, documentary, and entertainment, as well as about their reflection of trends in factual television production and the aesthetic tensions in the application of new media technologies. Such discussions ignore a crucial contextual feature of computer generated extinct animal programs, namely the established tradition of paleoimagery. This paper examines a selection of extinct animal shows in terms of the dominant frames of the paleoimagery genre. The paper suggests that such an examination has two consequences. First, it allows for a more context-sensitive evaluation of extinct animal programs, acknowledging rather than ignoring relevant representational traditions. Second, it allows for an appraisal and evaluation of the public and critical reception of extinct animal programs above and beyond the traditional debates about tensions between science, documentary, entertainment, and public understanding.

  10. Framework for generating expert systems to perform computer security risk analysis

    International Nuclear Information System (INIS)

    Smith, S.T.; Lim, J.J.


    At Los Alamos we are developing a framework to generate knowledge-based expert systems for performing automated risk analyses upon a subject system. The expert system is a computer program that models experts' knowledge about a topic, including facts, assumptions, insights, and decision rationale. The subject system, defined as the collection of information, procedures, devices, and real property upon which the risk analysis is to be performed, is a member of the class of systems that have three identifying characteristics: a set of desirable assets (or targets), a set of adversaries (or threats) desiring to obtain or to do harm to the assets, and a set of protective mechanisms to safeguard the assets from the adversaries. Risk analysis evaluates both vulnerability to and the impact of successful threats against the targets by determining the overall effectiveness of the subject system safeguards, identifying vulnerabilities in that set of safeguards, and determining cost-effective improvements to the safeguards. As a testbed, we evaluate the inherent vulnerabilities and risks in a system of computer security safeguards. The method considers safeguards protecting four generic targets (physical plant of the computer installation, its hardware, its software, and its documents and displays) against three generic threats (natural hazards, direct human actions requiring the presence of the adversary, and indirect human actions wherein the adversary is not on the premises, perhaps using such access tools as wiretaps, dialup lines, and so forth). Our automated procedure to assess the effectiveness of computer security safeguards differs from traditional risk analysis methods
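A heavily simplified sketch of the targets-versus-threats evaluation might look as follows. The multiplicative residual-risk formula and all names are illustrative assumptions, not the Los Alamos framework's actual scoring:

```python
def residual_risks(threats, targets, effectiveness):
    """threats: {name: likelihood}; targets: {name: consequence};
    effectiveness: {(threat, target): safeguard effectiveness in 0..1}.
    Returns {(threat, target): residual risk}, highest risk first."""
    risks = {}
    for th, likelihood in threats.items():
        for tg, consequence in targets.items():
            e = effectiveness.get((th, tg), 0.0)   # no safeguard -> full risk
            risks[(th, tg)] = likelihood * consequence * (1.0 - e)
    return dict(sorted(risks.items(), key=lambda kv: -kv[1]))
```

Ranking the residual risks immediately points at the cheapest place to add a safeguard, which is the "cost-effective improvements" step the abstract describes.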

  11. Phase transition and computational complexity in a stochastic prime number generator

    Energy Technology Data Exchange (ETDEWEB)

    Lacasa, L; Luque, B [Departamento de Matematica Aplicada y EstadIstica, ETSI Aeronauticos, Universidad Politecnica de Madrid, Plaza Cardenal Cisneros 3, Madrid 28040 (Spain); Miramontes, O [Departamento de Sistemas Complejos, Instituto de FIsica, Universidad Nacional Autonoma de Mexico, Mexico 01415 DF (Mexico)], E-mail:


    We introduce a prime number generator in the form of a stochastic algorithm. The character of this algorithm gives rise to a continuous phase transition which distinguishes a phase where the algorithm is able to reduce the whole system of numbers into primes and a phase where the system reaches a frozen state with low prime density. In this paper, we first present a broader characterization of this phase transition, both in analytical and numerical terms. Critical exponents are calculated, and data collapse is provided. Further on, we redefine the model as a search problem, placing it within the framework of computational complexity theory. We suggest that the system belongs to the class NP. The computational cost is maximal around the threshold, as is common in many algorithmic phase transitions, revealing the presence of an easy-hard-easy pattern. We finally relate the nature of the phase transition to an average-case classification of the problem.
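One simplified reading of such a stochastic division dynamic can be sketched as follows; the exact update rule of the published algorithm may differ, so treat this as an illustration of the idea only (a random pool of integers driven toward primes by repeated divisions):

```python
import random

def stochastic_prime_pool(n, m, steps, seed=0):
    """Pool of n integers drawn from [2, m]; repeatedly pick an ordered pair
    and, when one member divides the other with a quotient >= 2, replace the
    dividend by that quotient.  Divisions drive the pool toward primes."""
    rng = random.Random(seed)
    pool = [rng.randint(2, m) for _ in range(n)]
    for _ in range(steps):
        i, j = rng.randrange(n), rng.randrange(n)
        if i != j and pool[j] % pool[i] == 0 and pool[j] // pool[i] >= 2:
            pool[j] //= pool[i]
    return pool

def prime_density(pool):
    """Fraction of the pool that is prime (trial division)."""
    def is_prime(x):
        return x >= 2 and all(x % d for d in range(2, int(x ** 0.5) + 1))
    return sum(map(is_prime, pool)) / len(pool)
```

Sweeping the ratio of pool size n to number range m and plotting the final prime density is what exposes the two phases the abstract describes.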

  12. The benefits of computer-generated feedback for mathematics problem solving. (United States)

    Fyfe, Emily R; Rittle-Johnson, Bethany


    The goal of the current research was to better understand when and why feedback has positive effects on learning and to identify features of feedback that may improve its efficacy. In a randomized experiment, second-grade children received instruction on a correct problem-solving strategy and then solved a set of relevant problems. Children were assigned to receive no feedback, immediate feedback, or summative feedback from the computer. On a posttest the following day, feedback resulted in higher scores relative to no feedback for children who started with low prior knowledge. Immediate feedback was particularly effective, facilitating mastery of the material for children with both low and high prior knowledge. Results suggest that minimal computer-generated feedback can be a powerful form of guidance during problem solving. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Data acquisition with the personal computer to the microwaves generator of the microtron MT-25

    International Nuclear Information System (INIS)

    Rivero Ramirez, D.; Benavides Benitez, J. I.; Quiles Latorre, F. J.; Pahor, J.; Ponikvar, D.; Lago, G.


    This paper describes the design, construction and commissioning of a data acquisition system. The system is intended for sampling the working parameters of the microwave generator of the Microtron MT-25 that will be installed at the Higher Institute of Nuclear Sciences and Technology, Havana, Cuba. To guarantee suitable operation of the system, a monitor program has been developed in assembly language. This program allows the system to communicate with a personal computer through an RS-232 interface and executes the commands received through it. The development of a program for servicing the system from a personal computer using virtual instrumentation methods is also included in this paper.

  14. Identifying Computer-Generated Portraits: The Importance of Training and Incentives. (United States)

    Mader, Brandon; Banks, Martin S; Farid, Hany


    The past two decades have seen remarkable advances in photo-realistic rendering of everything from inanimate objects to landscapes, animals, and humans. We previously showed that despite these tremendous advances, human observers remain fairly good at distinguishing computer-generated from photographic images. Building on these results, we describe a series of follow-up experiments that reveal how to improve observer performance. Of general interest to anyone performing psychophysical studies on Mechanical Turk or similar platforms, we find that observer performance can be significantly improved with the proper incentives.

  15. Self-Motion Perception: Assessment by Real-Time Computer Generated Animations (United States)

    Parker, Donald E.


    Our overall goal is to develop materials and procedures for assessing vestibular contributions to spatial cognition. The specific objective of the research described in this paper is to evaluate computer-generated animations as potential tools for studying self-orientation and self-motion perception. Specific questions addressed in this study included the following. First, does a non-verbal perceptual reporting procedure using real-time animations improve assessment of spatial orientation? Are reports reliable? Second, do reports confirm expectations based on stimuli to the vestibular apparatus? Third, can reliable reports be obtained when self-motion description vocabulary training is omitted?

  16. Computer code to simulate transients in a steam generator of PWR nuclear power plants

    International Nuclear Information System (INIS)

    Silva, J.M. da.


    A digital computer code, KIBE, was developed to simulate the transient behaviour of a steam generator used in pressurized water reactor power plants. The equations of conservation of mass, energy and momentum were numerically integrated by an implicit method, progressing through the several axial sections into which the steam generator was divided. Forced-convection heat transfer was assumed on the primary side, while on the secondary side all the different modes of heat transfer were permitted and determined from the corresponding correlations. The stability of the stationary state was verified by its reproducibility during integration of the conservation equations without any perturbation. Transient behaviour resulting from perturbations in the flow and the internal energy (temperature) at the inlet of the primary side was simulated. The results obtained exhibited satisfactory behaviour. (author) [pt
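The implicit time-integration idea can be illustrated on a single lumped energy balance: a backward-Euler step for dT/dt = (T_in - T)/tau is unconditionally stable, which is why implicit schemes suit stiff thermal transients. This one-node sketch is, of course, far simpler than the sectioned two-sided model in the code itself:

```python
def implicit_euler_temperature(t_end, dt, tau, t_inlet, t0):
    """Backward-Euler integration of dT/dt = (T_in - T)/tau.
    The implicit update T[k+1] = (T[k] + (dt/tau)*T_in) / (1 + dt/tau)
    follows from evaluating the right-hand side at the new time level."""
    T = [t0]
    for _ in range(int(round(t_end / dt))):
        T.append((T[-1] + dt / tau * t_inlet) / (1.0 + dt / tau))
    return T
```

The error relative to the inlet temperature shrinks by a factor 1/(1 + dt/tau) per step for any dt, so large time steps never blow up, only lose accuracy.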

  17. Computational Approach to Annotating Variants of Unknown Significance in Clinical Next Generation Sequencing. (United States)

    Schulz, Wade L; Tormey, Christopher A; Torres, Richard


    Next generation sequencing (NGS) has become a common technology in the clinical laboratory, particularly for the analysis of malignant neoplasms. However, most mutations identified by NGS are variants of unknown clinical significance (VOUS). Although the approach to define these variants differs by institution, software algorithms that predict variant effect on protein function may be used. However, these algorithms commonly generate conflicting results, potentially adding uncertainty to interpretation. In this review, we examine several computational tools used to predict whether a variant has clinical significance. In addition to describing the role of these tools in clinical diagnostics, we assess their efficacy in analyzing known pathogenic and benign variants in hematologic malignancies. Copyright© by the American Society for Clinical Pathology (ASCP).
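A common way to reconcile conflicting predictor outputs is simple consensus voting. The sketch below is illustrative only and is not the workflow of any particular clinical laboratory or of the tools reviewed:

```python
def consensus_call(predictions, min_agreement=0.5):
    """predictions: {tool_name: 'damaging' | 'benign' | None}.
    Returns (call, fraction of non-missing votes supporting it); the call
    is 'uncertain' unless one label has a strict majority."""
    votes = [p for p in predictions.values() if p is not None]
    if not votes:
        return "uncertain", 0.0
    for label in ("damaging", "benign"):
        frac = votes.count(label) / len(votes)
        if frac > min_agreement:
            return label, frac
    best = max(votes.count(lbl) for lbl in ("damaging", "benign"))
    return "uncertain", best / len(votes)
```

Reporting the agreement fraction alongside the call keeps the disagreement between tools visible instead of hiding it, which matters when the result feeds a clinical interpretation.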

  18. Generating high gray-level resolution monochrome displays with conventional computer graphics cards and color monitors. (United States)

    Li, Xiangrui; Lu, Zhong-Lin; Xu, Pengjing; Jin, Jianzhong; Zhou, Yifeng


    Display systems based on conventional computer graphics cards are capable of generating images with about 8-bit luminance resolution. However, most vision experiments require more than 12 bits of luminance resolution. Pelli and Zhang [Spatial Vis. 10 (1997) 443] described a video attenuator for generating high-luminance-resolution displays on a monochrome monitor, or for driving just the green gun of a color monitor. Here we show how to achieve a white display by adding video amplifiers to duplicate the monochrome signal and drive all three guns of any color monitor. Given the scarcity of high-quality monochrome monitors, our method provides an inexpensive way to achieve high-resolution monochromatic displays using conventional, readily available equipment. We describe the design principles, test results, and a few additional functionalities.
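The attenuator principle, summing a coarse channel with a strongly attenuated fine channel, can be sketched numerically. The 1:256 attenuation factor here is an assumption for illustration, not the exact resistor ratio of the published circuit:

```python
def split_luminance(v, attenuation=256):
    """Split a high-resolution luminance code v (0 .. 256*attenuation - 1)
    into a coarse 8-bit value and a fine 8-bit value for the attenuated
    channel."""
    if not 0 <= v < 256 * attenuation:
        raise ValueError("luminance code out of range")
    hi, lo = divmod(v, attenuation)
    return hi, lo

def combined_output(hi, lo, attenuation=256):
    """Analog sum the attenuator would produce, in coarse-step units."""
    return hi + lo / attenuation
```

With an ideal 1:256 attenuator the two 8-bit DACs combine into a 16-bit monotone luminance scale; real hardware needs calibration because the resistor ratio is never exact.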

  19. Next Generation Workload Management System For Big Data on Heterogeneous Distributed Computing

    International Nuclear Information System (INIS)

    Klimentov, A; Maeno, T; Nilsson, P; Panitkin, S; Wenaus, T; Buncic, P; De, K; Oleynik, D; Petrosyan, A; Jha, S; Mount, R; Porter, R J; Read, K F; Wells, J C; Vaniachine, A


    The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS and ALICE are the largest collaborations ever assembled in the sciences and are at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, both experiments rely on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses PanDA (Production and Data Analysis) Workload Management System (WMS) for managing the workflow for all data processing on hundreds of data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. The scale is demonstrated by the following numbers: PanDA manages O(10^2) sites, O(10^5) cores, O(10^8) jobs per year, O(10^3) users, and ATLAS data volume is O(10^17) bytes. In 2013 we started an ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF). The project titled ‘Next Generation Workload Management and Analysis System for Big Data’ (BigPanDA) is funded by DOE ASCR and HEP. Extending PanDA to clouds and LCF presents new challenges in managing heterogeneity and supporting workflow. The BigPanDA project is underway to setup and tailor PanDA at the Oak Ridge Leadership Computing Facility (OLCF) and at the National Research Center 'Kurchatov Institute' together with ALICE distributed computing and ORNL computing professionals. Our approach to integration of HPC platforms at the OLCF and elsewhere is to reuse, as much as possible, existing components of the

  20. Development of a computer program to predict structural integrity against fretting wear of steam generator tubes: PIAT (program for integrity assessment of steam generator tubes)

    International Nuclear Information System (INIS)

    Park, Chi-Yong; Ryu, Ki-Wahn; Rhee, Huinam


    Highlights: ► We develop a computer code to assess the structural integrity of steam generator tubes. ► Flow-induced vibration of whole steam generator tubes can be analyzed systematically. ► The wear map is obtained to predict the wear depth of whole steam generator tubes. ► The structural integrity of steam generator tubes can be improved significantly. -- Abstract: Flow induced vibration of steam generator tubes potentially causes excessive fretting wear at the supports such as anti-vibration bars and tube support plates. For a reliable design of tubes against the flow-induced vibration related failure, the prediction of vibration and wear of tubes should be performed through complicated steps including the thermal-hydraulic analysis, dynamic modal analysis, evaluation of fluid-elastic instability, prediction of turbulence-induced vibration and wear depth for thousands of tubes. However, entire tubes cannot be evaluated within a limited time of design engineering by the conventional analysis methodology. In this paper, we describe an efficient computer program to assess the structural integrity of steam generator tubes against the flow-induced vibration related failure in a very systematic way. The program contains all the necessary thermal-hydraulic database of typical steam generators. It has a very special function to perform modal analysis for all thousands of tubes of a steam generator much faster than the conventional method. The program also performs fluid-elastic instability analysis and calculates the vibrational response to the turbulent flow excitation, and then can predict the wear depth for all tubes of a steam generator. Finally, we can generate the wear prediction map for whole tubes so that an efficient and practical steam generator maintenance management program is feasible. The utilization of the developed computer program for the design and maintenance of steam generators can significantly increase the structural integrity of steam
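Wear-depth prediction from a computed tube work-rate is often done with an Archard-type relation; the sketch below shows that arithmetic only and is not PIAT's actual correlation (the wear coefficient, units and names are assumptions):

```python
def fretting_wear_depth(wear_coeff, work_rate_w, time_s, contact_area_m2):
    """Archard-type work-rate model: wear volume = K * normal work-rate *
    time, with K in m^3/J; the depth follows from dividing the volume by
    the nominal contact area at the support."""
    volume_m3 = wear_coeff * work_rate_w * time_s
    return volume_m3 / contact_area_m2
```

Repeating this evaluation over every tube/support pair, using the work-rates from the turbulence and instability analyses, is what produces the wear map the abstract describes.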

  1. Computational Models Describing Possible Mechanisms for Generation of Excessive Beta Oscillations in Parkinson's Disease.

    Directory of Open Access Journals (Sweden)

    Alex Pavlides


    Full Text Available In Parkinson's disease, an increase in beta oscillations within the basal ganglia nuclei has been shown to be associated with difficulty in movement initiation. An important role in the generation of these oscillations is thought to be played by the motor cortex and by a network composed of the subthalamic nucleus (STN and the external segment of globus pallidus (GPe. Several alternative models have been proposed to describe the mechanisms for generation of the Parkinsonian beta oscillations. However, a recent experimental study of Tachibana and colleagues yielded results which are challenging for all published computational models of beta generation. That study investigated how the presence of beta oscillations in a primate model of Parkinson's disease is affected by blocking different connections of the STN-GPe circuit. Due to a large number of experimental conditions, the study provides strong constraints that any mechanistic model of beta generation should satisfy. In this paper we present two models consistent with the data of Tachibana et al. The first model assumes that Parkinsonian beta oscillations are generated in the cortex and that the STN-GPe circuit resonates at this frequency. The second model additionally assumes that the feedback from the STN-GPe circuit to cortex is important for maintaining the oscillations in the network. Predictions are made about experimental evidence that is required to differentiate between the two models, both of which are able to reproduce firing rates, oscillation frequency and effects of lesions carried out by Tachibana and colleagues. Furthermore, an analysis of the models reveals how the amplitude and frequency of the generated oscillations depend on parameters.
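The flavour of such models can be conveyed by a minimal delayed firing-rate loop between an excitatory (STN-like) population and an inhibitory (GPe-like) population. All parameter values and the sigmoid below are invented for illustration and are not those of either published model:

```python
import numpy as np

def stn_gpe_rates(t_end=1.0, dt=1e-4, delay=6e-3,
                  w_gs=8.0, w_sg=12.0, w_gg=2.0, ctx=10.0, str_in=8.0,
                  tau_s=6e-3, tau_g=14e-3):
    """Euler integration of two delayed firing-rate equations: an STN-like
    population S driven by cortex and inhibited by G, and a GPe-like
    population G excited by S and inhibited by striatum and by itself."""
    def f(x):                           # sigmoid output nonlinearity, in Hz
        return 300.0 / (1.0 + np.exp(-0.1 * (x - 15.0)))
    n, d = int(t_end / dt), int(delay / dt)
    S, G = np.zeros(n), np.zeros(n)
    S[0], G[0] = 20.0, 60.0
    for k in range(1, n):
        kd = max(k - 1 - d, 0)          # delayed index for both connections
        S[k] = S[k-1] + dt / tau_s * (-S[k-1] + f(ctx - w_gs * G[kd]))
        G[k] = G[k-1] + dt / tau_g * (-G[k-1] + f(w_sg * S[kd] - w_gg * G[kd] - str_in))
    return S, G
```

Whether such a loop settles to a fixed point or sustains a beta-band oscillation depends on the connection weights and the delay, which is exactly the parameter dependence the abstract's analysis addresses.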

  2. Avatars alive! The integration of physiology models and computer generated avatars in a multiplayer online simulation. (United States)

    Kusumoto, Laura; Heinrichs, Wm Leroy; Dev, Parvati; Youngblood, Patricia


    In a mass casualty incident, injured and at-risk patients will pass through a continuum of care from many different providers acting as a team in a clinical environment. As presented at MMVR 14 [Kaufman, et al 2006], formative evaluations have shown that simulation practice is nearly as good as, and in some cases better than, live exercises for stimulating learners to integrate their procedural knowledge in new circumstances through experiential practice. However, to date, multiplayer game technologies have given limited physiological fidelity to their characters, thus limiting the realism and complexity of the scenarios that can be practiced by medical professionals. This paper describes the status of a follow-on program to merge medical and gaming technologies so that computer generated, but human-controlled, avatars used in a simulated, mass casualty training environment will exhibit realistic life signs. This advance introduces a new level of medical fidelity to simulated mass casualty scenarios that can represent thousands of injuries. The program is identifying the critical instructional challenges and related system engineering issues associated with the incorporation of multiple state-of-the-art physiological models into the computer generated synthetic representation of patients. The work is a collaboration between Forterra Systems and the SUMMIT group of Stanford University Medical School, and is sponsored by the US Army Medical Command's Telemedicine and Advanced Technologies Research Center (TATRC).

  3. Computer electric design of synchronous generators; Diseno electrico de generadores sincronos por computadora

    Energy Technology Data Exchange (ETDEWEB)

    Patlan Frausto, Jose O.; Acosta Aradillas, Juan [Instituto de Investigaciones Electricas, Cuernavaca (Mexico)


    This article presents, in a conceptual manner, the main elements that participate in the electric design of synchronous generators. Other options for using computers as a design aid are also discussed. Afterwards, a computer program developed by the Instituto de Investigaciones Electricas (IIE) for the electric design of salient-pole generators is described, oriented towards obtaining their electromagnetic dimensioning. Finally, with the purpose of validating this program, the results obtained in a specific case are presented.

  4. Computer Software Configuration Item-Specific Flight Software Image Transfer Script Generator (United States)

    Bolen, Kenny; Greenlaw, Ronald


    A K-shell UNIX script enables the International Space Station (ISS) Flight Control Team (FCT) operators in NASA's Mission Control Center (MCC) in Houston to transfer an entire or partial computer software configuration item (CSCI) from a flight software compact disk (CD) to the onboard Portable Computer System (PCS). The tool is designed to read the content stored on a flight software CD and generate individual CSCI transfer scripts that are capable of transferring the flight software content in a given subdirectory on the CD to the scratch directory on the PCS. The flight control team can then transfer the flight software from the PCS scratch directory to the Electronically Erasable Programmable Read Only Memory (EEPROM) of an ISS Multiplexer/Demultiplexer (MDM) via the Indirect File Transfer capability. The individual CSCI scripts and the CSCI Specific Flight Software Image Transfer Script Generator (CFITSG), when executed a second time, will remove all components from their original execution. The tool will identify errors in the transfer process and create logs of the transferred software for the purposes of configuration management.

  5. Using Python to generate AHPS-based precipitation simulations over CONUS using Amazon distributed computing (United States)

    Machalek, P.; Kim, S. M.; Berry, R. D.; Liang, A.; Small, T.; Brevdo, E.; Kuznetsova, A.


    We describe how the Climate Corporation uses Python and Clojure, a language implemented on top of Java, to generate climatological forecasts for precipitation based on the Advanced Hydrologic Prediction Service (AHPS) radar-based daily precipitation measurements. A 2-year-long forecast is generated on each of the ~650,000 CONUS land-based 4-km AHPS grids by constructing 10,000 ensembles sampled from a 30-year reconstructed AHPS history for each grid. The spatial and temporal correlations between neighboring AHPS grids and the sampling of the analogues are handled by Python. The parallelization across all 650,000 CONUS stations is further achieved by utilizing the MapReduce framework. Each full-scale computational run requires hundreds of nodes with up to 8 processors each on the Amazon Elastic MapReduce distributed computing service, resulting in 3-terabyte datasets. We further describe how we have productionized a monthly run of the simulation process at the full scale of the 4-km AHPS grids and how the resultant terabyte-sized datasets are handled.
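The per-grid analogue resampling can be sketched as follows. The whole-year resampling scheme and all names here are assumptions for illustration; the paper's actual sampler additionally handles spatial and temporal correlation between neighboring grids:

```python
import numpy as np

def sample_ensembles(history, n_ensembles, horizon_days, rng=None):
    """history: (n_years, 365) daily precipitation for one grid cell.
    Each ensemble member strings together whole randomly chosen historical
    years and truncates to the forecast horizon."""
    rng = np.random.default_rng(rng)
    n_years, days = history.shape
    n_full = -(-horizon_days // days)            # years needed, rounded up
    out = np.empty((n_ensembles, horizon_days))
    for e in range(n_ensembles):
        years = rng.integers(0, n_years, size=n_full)
        out[e] = np.concatenate([history[y] for y in years])[:horizon_days]
    return out
```

Because each grid cell resamples independently, the work maps naturally onto a map-reduce job: one map task per cell, each emitting its own ensemble block.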

  6. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road. (United States)

    Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka


    Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on "on-demand payment" for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to redress this mismatch, insofar as possible.
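A diagnosis-by-questionnaire tool of this kind reduces, at its core, to weighted scoring of answers. The thresholds, wording and names below are invented for illustration, not those of the published tool:

```python
def cloud_readiness(answers, weights=None):
    """answers: {question_id: score in 0..4}; optional per-question weights.
    Returns (normalized score in 0..1, coarse recommendation)."""
    weights = weights or {q: 1.0 for q in answers}
    total = sum(weights[q] * a for q, a in answers.items())
    score = total / (4.0 * sum(weights.values()))
    if score >= 0.75:
        rec = "adopt SaaS broadly"
    elif score >= 0.4:
        rec = "pilot selected business areas"
    else:
        rec = "build cloud knowledge first"
    return score, rec
```

Per-business-area scoring with area-specific weights would then turn the single number into the staged "Cloud Road" the abstract describes.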

  8. Two-phase flow steam generator simulations on parallel computers using domain decomposition method

    International Nuclear Information System (INIS)

    Belliard, M.


    Within the framework of the Domain Decomposition Method (DDM), we present industrial steady-state two-phase flow simulations of PWR Steam Generators (SG) using iteration-by-sub-domain methods: standard and Adaptive Dirichlet/Neumann methods (ADN). The averaged mixture balance equations are solved by a Fractional-Step algorithm, jointly with the Crank-Nicolson scheme and the Finite Element Method. The algorithm works with overlapping or non-overlapping sub-domains and with conforming or nonconforming meshing. Computations are run on PC networks or on massively parallel mainframe computers. A CEA code-linker and the PVM package are used (master-slave context). SG mock-up simulations, involving up to 32 sub-domains, highlight the efficiency (speed-up, scalability) and the robustness of the chosen approach. With the DDM, the computational problem size is easily increased to about 1,000,000 cells and the CPU time is significantly reduced. The difficulties related to industrial use are also discussed. (author)
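The iteration-by-sub-domain idea can be illustrated on a toy problem. The sketch below is an assumption-laden simplification (1-D Poisson on two equal, non-overlapping subdomains, nothing like the full two-phase mixture equations): a relaxed Dirichlet/Neumann iteration that converges to the single-domain finite-difference solution.

```python
import numpy as np

def solve_interior(n, h, f, left, right):
    """Solve -u'' = f on n interior nodes with Dirichlet values at both ends."""
    A = (np.diag(np.full(n, 2.0))
         + np.diag(np.full(n - 1, -1.0), 1)
         + np.diag(np.full(n - 1, -1.0), -1))
    b = np.full(n, h * h * f)
    b[0] += left
    b[-1] += right
    return np.linalg.solve(A, b)

def dirichlet_neumann(f=1.0, n_half=8, theta=0.5, iters=20):
    """Dirichlet/Neumann iteration for -u'' = f on [0,1], u(0)=u(1)=0,
    split at x = 0.5; returns the converged interface value."""
    h = 0.5 / n_half
    lam = 0.0                                   # interface value guess
    for _ in range(iters):
        # Dirichlet subdomain [0, 0.5]: impose the current interface value
        u1 = solve_interior(n_half - 1, h, f, 0.0, lam)
        # second-order one-sided flux of the Dirichlet solution at the interface
        g = (lam - u1[-1]) / h - 0.5 * h * f
        # Neumann subdomain [0.5, 1]: flux g at the interface, u(1) = 0
        m = n_half                              # interface node + interior nodes
        A = (np.diag(np.full(m, 2.0))
             + np.diag(np.full(m - 1, -1.0), 1)
             + np.diag(np.full(m - 1, -1.0), -1))
        A[0, 1] = -2.0                          # ghost-node elimination at the interface
        b = np.full(m, h * h * f)
        b[0] -= 2.0 * h * g
        u2 = np.linalg.solve(A, b)
        lam = theta * u2[0] + (1.0 - theta) * lam   # relaxed interface update
    return lam
```

For f = 1 the exact solution is u = x(1-x)/2, so the interface value converges to 0.125; with symmetric subdomains and theta = 0.5 the iteration settles in very few sweeps, which is the appeal of the ADN-style relaxation.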

  9. PVT: an efficient computational procedure to speed up next-generation sequence analysis. (United States)

    Maji, Ranjan Kumar; Sarkar, Arijita; Khatua, Sunirmal; Dasgupta, Subhasis; Ghosh, Zhumur


    High-throughput Next-Generation Sequencing (NGS) techniques are advancing genomics and molecular biology research. This technology generates substantially large data, which poses a major challenge to scientists seeking an efficient, cost- and time-effective solution to analyse such data. Further, for the different types of NGS data, certain common challenging steps are involved in analysing those data. Spliced alignment is one such fundamental step in NGS data analysis which is extremely computationally intensive as well as time consuming. Serious problems exist even with the most widely used spliced alignment tools. TopHat is one such widely used spliced alignment tool which, although it supports multithreading, does not efficiently utilize computational resources in terms of CPU utilization and memory. Here we have introduced PVT (Pipelined Version of TopHat), in which we take a modular approach by breaking TopHat's serial execution into a pipeline of multiple stages, thereby increasing the degree of parallelization and computational resource utilization. Thus we address the discrepancies in TopHat so as to analyze large NGS data efficiently. We analysed the SRA dataset (SRX026839 and SRX026838) consisting of single-end reads and SRA data SRR1027730 consisting of paired-end reads. We used TopHat v2.0.8 to analyse these datasets and noted the CPU usage, memory footprint and execution time during spliced alignment. With this basic information, we designed PVT, a pipelined version of TopHat that removes the redundant computational steps during spliced alignment and breaks the job into a pipeline of multiple stages (each comprising different step(s)) to improve its resource utilization, thus reducing the execution time. PVT provides an improvement over TopHat for spliced alignment of NGS data analysis. PVT thus resulted in the reduction of the execution time to ~23% for the single-end read dataset. Further, PVT designed for paired-end reads showed an
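PVT's core idea, breaking a serial run into pipeline stages so that downstream stages overlap upstream ones, can be sketched generically. The snippet below is not PVT itself; it is a minimal thread-per-stage pipeline with FIFO queues and hypothetical stage functions, chosen only to illustrate the overlap.

```python
import queue
import threading

def stage_worker(func, q_in, q_out):
    """Pull chunks until the sentinel, apply the stage function, push downstream."""
    while True:
        item = q_in.get()
        if item is None:            # sentinel: propagate and stop
            q_out.put(None)
            return
        q_out.put(func(item))

def run_pipeline(chunks, stages):
    """Run each stage in its own thread so stage i+1 starts on chunk k
    while stage i is already processing chunk k+1."""
    queues = [queue.Queue() for _ in range(len(stages) + 1)]
    threads = [threading.Thread(target=stage_worker, args=(f, queues[i], queues[i + 1]))
               for i, f in enumerate(stages)]
    for t in threads:
        t.start()
    for c in chunks:
        queues[0].put(c)
    queues[0].put(None)
    results = []
    while (r := queues[-1].get()) is not None:
        results.append(r)
    for t in threads:
        t.join()
    return results
```

Because each stage runs in a single worker over FIFO queues, chunk order is preserved while CPU-bound and I/O-bound stages overlap, which is the resource-utilization gain PVT targets.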

  10. Enhancing Next-Generation Sequencing-Guided Cancer Care Through Cognitive Computing. (United States)

    Patel, Nirali M; Michelini, Vanessa V; Snell, Jeff M; Balu, Saianand; Hoyle, Alan P; Parker, Joel S; Hayward, Michele C; Eberhard, David A; Salazar, Ashley H; McNeillie, Patrick; Xu, Jia; Huettner, Claudia S; Koyama, Takahiko; Utro, Filippo; Rhrissorrakrai, Kahn; Norel, Raquel; Bilal, Erhan; Royyuru, Ajay; Parida, Laxmi; Earp, H Shelton; Grilley-Olson, Juneko E; Hayes, D Neil; Harvey, Stephen J; Sharpless, Norman E; Kim, William Y


    Using next-generation sequencing (NGS) to guide cancer therapy has created challenges in analyzing and reporting large volumes of genomic data to patients and caregivers. Specifically, providing current, accurate information on newly approved therapies and open clinical trials requires considerable manual curation performed mainly by human "molecular tumor boards" (MTBs). The purpose of this study was to determine the utility of cognitive computing as performed by Watson for Genomics (WfG) compared with a human MTB. One thousand eighteen patient cases that previously underwent targeted exon sequencing at the University of North Carolina (UNC) and subsequent analysis by the UNCseq informatics pipeline and the UNC MTB between November 7, 2011, and May 12, 2015, were analyzed with WfG, a cognitive computing technology for genomic analysis. Using a WfG-curated actionable gene list, we identified additional genomic events of potential significance (not discovered by traditional MTB curation) in 323 (32%) patients. The majority of these additional genomic events were considered actionable based upon their ability to qualify patients for biomarker-selected clinical trials. Indeed, the opening of a relevant clinical trial within 1 month prior to WfG analysis provided the rationale for identification of a new actionable event in nearly a quarter of the 323 patients. This automated analysis could potentially improve patient care by providing a rapid, comprehensive approach for data analysis and consideration of up-to-date availability of clinical trials. The results of this study demonstrate that the interpretation and actionability of somatic next-generation sequencing results are evolving too rapidly to rely solely on human curation. Molecular tumor boards empowered by cognitive computing can significantly improve patient care by providing a fast, cost-effective, and comprehensive approach for data analysis in the delivery of precision medicine.
Patients and physicians who

  11. Evaluation and characterization of fetal exposures to low frequency magnetic fields generated by laptop computers. (United States)

    Zoppetti, Nicola; Andreuccetti, Daniele; Bellieni, Carlo; Bogi, Andrea; Pinto, Iole


    Portable - or "laptop" - computers (LCs) are widely and increasingly used all over the world. Since LCs are often used in tight contact with the body, even by pregnant women, fetal exposures to low frequency magnetic fields generated by these units can occur. LC emissions are usually characterized by complex waveforms and are often generated by the main AC power supply (when connected) and by the display power supply sub-system. In the present study, low frequency magnetic field emissions were measured for a set of five models of portable computers. For each of them, the magnetic flux density was characterized in terms not just of field amplitude, but also of the so-called "weighted peak" (WP) index, introduced in the 2003 ICNIRP Statement on complex waveforms and confirmed in the 2010 ICNIRP Guidelines for low frequency fields. For the model of LC presenting the highest emission, a deeper analysis was also carried out, using numerical dosimetry techniques to calculate internal quantities (current density and in-situ electric field) with reference to a digital body model of a pregnant woman. Since internal quantities have complex waveforms too, the concept of WP index was extended to them, considering the ICNIRP basic restrictions defined in the 1998 Guidelines for the current density and in the 2010 Guidelines for the in-situ electric field. Induced quantities and WP indexes were computed using an appropriate original formulation of the well-known Scalar Potential Finite Difference (SPFD) numerical method for electromagnetic dosimetry in quasi-static conditions. Copyright © 2011 Elsevier Ltd. All rights reserved.
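The weighted-peak evaluation can be sketched as frequency-domain filtering: each spectral component of the measured waveform is multiplied by a weight W(f) and the maximum of the filtered signal is taken. The weight below is a flat placeholder, not the piecewise ICNIRP filter defined in the guidelines, and all numbers are illustrative.

```python
import numpy as np

def weighted_peak(b, fs, weight):
    """Weighted-peak evaluation of a waveform: filter every spectral component
    with the (generally complex) weight W(f), then return the maximum absolute
    value of the filtered time-domain signal."""
    spec = np.fft.rfft(b)
    f = np.fft.rfftfreq(len(b), 1.0 / fs)
    filtered = np.fft.irfft(spec * weight(f), n=len(b))
    return np.max(np.abs(filtered))

# demo: a 50 Hz flux-density waveform, amplitude 2 (arbitrary units)
fs = 1000.0
t = np.arange(1000) / fs
b50 = 2.0 * np.cos(2.0 * np.pi * 50.0 * t)
flat_weight = lambda f: np.full_like(f, 0.5)   # placeholder weight function
```

For the single sinusoid above the filtered peak is simply amplitude times |W(50 Hz)| = 2 x 0.5 = 1; in a compliance assessment the weight would be the reciprocal of the frequency-dependent reference level, including its phase.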

  12. Quantitative measurement of mean inner potential and specimen thickness from high-resolution off-axis electron holograms of ultra-thin layered WSe{sub 2}

    Energy Technology Data Exchange (ETDEWEB)

    Winkler, Florian, E-mail: [Ernst Ruska-Centre for Microscopy and Spectroscopy with Electrons (ER-C), Forschungszentrum Jülich, D-52425 Jülich (Germany); Peter Grünberg Institute 5 (PGI-5), Forschungszentrum Jülich, D-52425 Jülich (Germany); Tavabi, Amir H. [Ernst Ruska-Centre for Microscopy and Spectroscopy with Electrons (ER-C), Forschungszentrum Jülich, D-52425 Jülich (Germany); Peter Grünberg Institute 5 (PGI-5), Forschungszentrum Jülich, D-52425 Jülich (Germany); Barthel, Juri [Ernst Ruska-Centre for Microscopy and Spectroscopy with Electrons (ER-C), Forschungszentrum Jülich, D-52425 Jülich (Germany); Gemeinschaftslabor für Elektronenmikroskopie (GFE), RWTH Aachen University, D-52074 Aachen (Germany); Duchamp, Martial [Ernst Ruska-Centre for Microscopy and Spectroscopy with Electrons (ER-C), Forschungszentrum Jülich, D-52425 Jülich (Germany); Peter Grünberg Institute 5 (PGI-5), Forschungszentrum Jülich, D-52425 Jülich (Germany); Yucelen, Emrah [FEI Company, Achtseweg Noord 5, Eindhoven 5600 KA (Netherlands); Borghardt, Sven; Kardynal, Beata E. [Peter Grünberg Institute 9 (PGI-9), Forschungszentrum Jülich, D-52425 Jülich (Germany); and others


    The phase and amplitude of the electron wavefunction that has passed through ultra-thin flakes of WSe{sub 2} is measured from high-resolution off-axis electron holograms. Both the experimental measurements and corresponding computer simulations are used to show that, as a result of dynamical diffraction, the spatially averaged phase does not increase linearly with specimen thickness close to an [001] zone axis orientation even when the specimen has a thickness of only a few layers. It is then not possible to infer the local specimen thickness of the WSe{sub 2} from either the phase or the amplitude alone. Instead, we show that the combined analysis of phase and amplitude from experimental measurements and simulations allows an accurate determination of the local specimen thickness. The relationship between phase and projected potential is shown to be approximately linear for extremely thin specimens that are tilted by several degrees in certain directions from the [001] zone axis. A knowledge of the specimen thickness then allows the electrostatic potential to be determined from the measured phase. By using this combined approach, we determine a value for the mean inner potential of WSe{sub 2} of 18.9±0.8 V, which is 12% lower than the value calculated from neutral atom scattering factors. - Highlights: • Quantitative analysis of high resolution electron holograms of WSe{sub 2}. • Local specimen thickness determination and estimation of tilt angle. • Mean inner potential evaluation of WSe{sub 2} avoiding dynamical diffraction.
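The final extraction step rests on the kinematic relation phase = C_E x V_mip x t, so once the thickness is known the mean inner potential follows from a linear fit. A minimal sketch, using an illustrative per-layer thickness and the commonly tabulated 300 kV interaction constant (both assumptions, not values taken from the paper):

```python
import numpy as np

# interaction constant at 300 kV acceleration voltage, rad / (V nm)
# (commonly tabulated value, used here for illustration)
C_E = 6.53e-3

def mean_inner_potential(thickness_nm, phase_rad):
    """In the kinematic (thin-specimen) limit, phase = C_E * V_mip * t, so the
    least-squares slope of phase versus thickness yields the mean inner potential."""
    slope = np.polyfit(thickness_nm, phase_rad, 1)[0]
    return slope / C_E

# synthetic check: 1-8 layers, ~0.65 nm per layer (illustrative numbers)
t_nm = np.arange(1, 9) * 0.65
phase = C_E * 18.9 * t_nm
```

With noise-free synthetic data the fit returns the 18.9 V value quoted above; the paper's point is precisely that near [001] the phase is not this linear, so the fit is only valid for the tilted-specimen geometry they describe.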

  13. Advanced airborne 3D computer image generation systems technologies for the year 2000 (United States)

    Bridges, Alan L.


    An airborne 3-D computer image generation system (CIGS) is a modular avionics box that receives commands from and sends status information to other avionics units. The CIGS maintains a large amount of data in secondary storage systems and simultaneously drives several display units. Emerging requirements for CIGS include: advanced avionics system architecture requirements and BIT/fault tolerance; real-time operating systems and graphic interface languages in Ada; and geometric/pixel processing functions, rendering system, and frame buffers/display controllers for pictorial displays. In addition, podded sensors (FLIR, LLTV, radar, etc.) will require multiplexing of high-resolution sensor video with graphics overlays. A combination of head-down AMLCD flat panels, helmet-mounted display (HMD), and Head-Up Display (HUD) will require highly parallel graphics generation technology. Generation of high-resolution, real-time 2-D/3-D displays with anti-aliasing, transparency, shading, and motion, however, emphasizes compute-intensive processing. High-performance graphics engines, powerful floating point processors, and parallel architectures are needed to increase the rendering speed, functionality and reliability, while reducing power, space requirements, and cost. The CIGS of the future will feature special high speed busses geared toward real-time graphics processing. The CIG system will be multi-channel, will have a high addressable resolution to drive HUD, 3-D displays in 4-pi-steradian virtual space, and 3-D panoramic displays; and will include fiber optics video distribution between CIG and display units. The head-down display (HDD) is by far the most complex display in that both background and overlay display elements are required. The background is usually generated from terrain/cultural features data. Terrain data is used to generate 2-D map backgrounds or 3-D perspective views duplicating or substituting for the pilot's out-the-window view. Performance of 150

  14. Brushless DC motor control system responsive to control signals generated by a computer or the like (United States)

    Packard, D. T.


    A control system for a brushless DC motor responsive to digital control signals is disclosed. The motor includes a multiphase wound stator and a permanent magnet rotor. The motor is arranged so that each phase winding, when energized from a DC source, will drive the rotor through a predetermined angular position or step. A commutation signal generator responsive to the shaft position provides a commutation signal for each winding. A programmable control signal generator such as a computer or microprocessor produces individual digital control signals for each phase winding. The control signals and commutation signals associated with each winding are applied to an AND gate for that phase winding. Each gate controls a switch connected in series with the associated phase winding and the DC source so that each phase winding is energized only when the commutation signal and the control signal associated with that phase winding are present. The motor shaft may be advanced one step at a time to a desired position by applying a predetermined number of control signals in the proper sequence to the AND gates, and the torque generated by the motor may be regulated by applying a separate control signal to each AND gate which is pulse width modulated to control the total time that each switch connects its associated winding to the DC source during each commutation period.

  15. Brushless DC motor control system responsive to control signals generated by a computer or the like (United States)

    Packard, Douglas T. (Inventor); Schmitt, Donald E. (Inventor)


    A control system for a brushless DC motor responsive to digital control signals is disclosed. The motor includes a multiphase wound stator and a permanent magnet rotor. The rotor is arranged so that each phase winding, when energized from a DC source, will drive the rotor through a predetermined angular position or step. A commutation signal generator responsive to the shaft position provides a commutation signal for each winding. A programmable control signal generator such as a computer or microprocessor produces individual digital control signals for each phase winding. The control signals and commutation signals associated with each winding are applied to an AND gate for that phase winding. Each gate controls a switch connected in series with the associated phase winding and the DC source so that each phase winding is energized only when the commutation signal and the control signal associated with that phase winding are present. The motor shaft may be advanced one step at a time to a desired position by applying a predetermined number of control signals in the proper sequence to the AND gates and the torque generated by the motor may be regulated by applying a separate control signal to each AND gate which is pulse width modulated to control the total time that each switch connects its associated winding to the DC source during each commutation period.
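The per-winding AND gating described in these claims can be sketched as pure logic. The toy model below (a hypothetical three-phase full-step sequence, with no power electronics) shows the shaft advancing one step per control pulse when every gate is enabled:

```python
def energized(commutation, control):
    """Per-winding AND gate: a phase conducts only when its commutation
    signal and its control signal are both high."""
    return [c and g for c, g in zip(commutation, control)]

def step_motor(position, n_steps, phases=3):
    """Advance the shaft one step per control pulse (all gates enabled)."""
    for _ in range(n_steps):
        # shaft-position sensor selects the next winding in the sequence
        comm = [i == (position + 1) % phases for i in range(phases)]
        ctrl = [True] * phases                    # controller pulse on every gate
        position = energized(comm, ctrl).index(True)  # rotor follows the live winding
    return position
```

Torque regulation would enter this picture by pulse-width modulating the `ctrl` inputs, so that each switch connects its winding to the DC source for only part of each commutation period, exactly as the claim describes.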

  16. Computation of the bluff-body sound generation by a self-consistent mean flow formulation (United States)

    Fani, A.; Citro, V.; Giannetti, F.; Auteri, F.


    The sound generated by the flow around a circular cylinder is numerically investigated by using a finite-element method. In particular, we study the acoustic emissions generated by the flow past the bluff body at low Mach and Reynolds numbers. We perform a global stability analysis by using the compressible linearized Navier-Stokes equations. The resulting direct global mode provides detailed information related to the underlying hydrodynamic instability and data on the acoustic field generated. In order to recover the intensity of the produced sound, we apply the self-consistent model for non-linear saturation proposed by Mantič-Lugo, Arratia, and Gallaire ["Self-consistent mean flow description of the nonlinear saturation of the vortex shedding in the cylinder wake," Phys. Rev. Lett. 113, 084501 (2014)]. The application of this model allows us to compute the amplitude of the resulting linear mode and the effects of saturation on the mode structure and acoustic field. Our results show excellent agreement with those obtained by a full compressible direct numerical simulation and those derived by the application of classical acoustic analogy formulations.

  17. A fast point-cloud computing method based on spatial symmetry of Fresnel field (United States)

    Wang, Xiangxiang; Zhang, Kai; Shen, Chuan; Zhu, Wenliang; Wei, Sui


    Computer-generated holography (CGH) faces a great challenge in real-time holographic video display systems, because holograms with a high space-bandwidth product (SBP) must be produced. This paper is based on the point-cloud method and takes advantage of the propagation reversibility of Fresnel diffraction along the propagating direction, together with the spatial symmetry of the fringe pattern of a point source (the Gabor zone plate), which can therefore serve as a basis for fast calculation of the diffraction field in CGH. A fast Fresnel CGH method based on the novel look-up table (N-LUT) method is proposed: first, the principal fringe patterns (PFPs) at the virtual plane are pre-calculated by the acceleration algorithm and stored; secondly, the Fresnel diffraction fringe pattern at the dummy plane is obtained; finally, the field is Fresnel-propagated from the dummy plane to the hologram plane. Simulation experiments and optical experiments based on a Liquid Crystal on Silicon (LCOS) device are set up to demonstrate the validity of the proposed method. Under the premise of ensuring the quality of the 3D reconstruction, the proposed method shortens the computational time and improves computational efficiency.
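The shift-and-accumulate idea behind look-up-table CGH can be sketched as follows. This is a simplified single-depth illustration, not the paper's accelerated N-LUT algorithm; the wrap-around of `np.roll` stands in for proper zero-padding.

```python
import numpy as np

def principal_fringe(n, wavelength, z, pitch):
    """Fringe pattern (real part) of an on-axis point source at depth z:
    a Gabor zone plate in the paraxial (Fresnel) approximation."""
    c = (np.arange(n) - n // 2) * pitch
    x, y = np.meshgrid(c, c)
    return np.cos(np.pi * (x ** 2 + y ** 2) / (wavelength * z))

def cgh_from_points(points, n, wavelength, z, pitch):
    """Exploit the spatial symmetry of the zone plate: compute one principal
    fringe pattern (PFP), then shift-and-accumulate it for every object point.
    points: iterable of (ix, iy, amplitude) pixel offsets."""
    pfp = principal_fringe(n, wavelength, z, pitch)
    holo = np.zeros((n, n))
    for ix, iy, amp in points:
        # np.roll wraps around; production code would zero-pad instead
        holo += amp * np.roll(pfp, (iy, ix), axis=(0, 1))
    return holo
```

The saving is that the quadratic-phase evaluation happens once per depth plane rather than once per point, which is exactly the SBP bottleneck the N-LUT approach attacks.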

  18. Cascaded adaptive-mask algorithm for twin-image removal and its application to digital holograms of ice crystals. (United States)

    Raupach, Sebastian M F


    An iterative Gerchberg-Saxton-type algorithm with a support constraint for twin-image removal from reconstructed Gabor inline holograms of single-plane objects is described. It is applied to simulated holograms and to holograms of ice crystals recorded in the laboratory and in atmospheric clouds in situ. The algorithm is characterized by a distinction between object and background region and an iterative adaption of the object mask. Applying the algorithm to recorded inline holograms of atmospheric objects, the twin-image artifacts are removed successfully, for the first time allowing proper access to the in situ phase information on atmospheric ice crystals. It is also demonstrated that, after application of the algorithm, previously indiscernible internal object features can become visible for large Fresnel numbers.
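A minimal sketch of a Gerchberg-Saxton-type twin-image suppression loop, assuming a fixed support mask rather than the paper's iteratively adapted one, might look like this:

```python
import numpy as np

def angular_spectrum(field, wavelength, z, pitch):
    """Propagate a complex field over distance z (angular-spectrum method)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, pitch)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 / wavelength ** 2 - FX ** 2 - FY ** 2
    kz = 2.0 * np.pi * np.sqrt(np.maximum(arg, 0.0))  # clamp evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * z))

def remove_twin(hologram, wavelength, z, pitch, support, iters=20):
    """Gerchberg-Saxton-type twin-image suppression: back-propagate, force the
    background (outside the object support) to a unit plane wave, propagate
    forward, and restore the measured hologram modulus."""
    field = np.sqrt(hologram).astype(complex)
    obj = field
    for _ in range(iters):
        obj = angular_spectrum(field, wavelength, -z, pitch)
        obj = np.where(support, obj, 1.0)                         # support constraint
        field = angular_spectrum(obj, wavelength, z, pitch)
        field = np.sqrt(hologram) * np.exp(1j * np.angle(field))  # data constraint
    return obj
```

The adaptive-mask refinement of the paper would re-estimate `support` from the current reconstruction at each iteration instead of keeping it fixed.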

  19. A strong adaptable autofocusing approach of off-axis infrared digital holography under different quality conditions of holograms (United States)

    Liu, Ning; Yang, Chao


    In this paper, we present an innovative autofocusing criterion for the reconstruction of infrared digital holograms. The criterion is fast, efficient and precise when determining the reconstruction distance in off-axis digital holography. It is a mean-free high-frequency calculation process: we focus on the problem of mean-value drifting found in previously published methods and design our new approach to solve it. Unlike previous methods, which perform well only with high-quality holograms, our method is effective for both high- and low-quality holograms; even when the hologram is degraded by destructive interference, it still performs well. This method helps to automatically determine the precise reconstruction distance, and we believe this technology can be applied in industrial applications in the future.
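The mean-free idea can be sketched with a generic focus metric: normalize out the mean intensity so that mean-value drift between reconstruction distances cannot bias the score, then measure high-frequency (gradient) energy. This is an illustration, not the paper's exact criterion.

```python
import numpy as np

def sharpness(img):
    """Mean-free high-frequency metric: divide by the mean intensity first
    (so mean-value drift across distances cannot bias the score), then
    accumulate gradient energy as a high-frequency measure."""
    a = img / img.mean()
    gx = np.diff(a, axis=1)
    gy = np.diff(a, axis=0)
    return (gx ** 2).sum() + (gy ** 2).sum()

def autofocus(hologram, distances, reconstruct):
    """Reconstruct at every candidate distance and keep the sharpest one.
    reconstruct(hologram, d) is a caller-supplied propagation routine."""
    scores = [sharpness(np.abs(reconstruct(hologram, d))) for d in distances]
    return distances[int(np.argmax(scores))]
```

In practice the candidate distances would be refined by a coarse-to-fine search around the score maximum rather than scanned on a single fixed grid.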

  20. Computational model of neuron-astrocyte interactions during focal seizure generation

    Directory of Open Access Journals (Sweden)

    Davide eReato


    Full Text Available Empirical research in the last decade revealed that astrocytes can respond to neurotransmitters with Ca2+ elevations and generate feedback signals to neurons which modulate synaptic transmission and neuronal excitability. This discovery changed our basic understanding of brain function and provided new perspectives on how astrocytes can participate not only in information processing, but also in the genesis of brain disorders, such as epilepsy. Epilepsy is a neurological disorder characterized by recurrent seizures that can arise focally at restricted areas and propagate throughout the brain. Studies in brain slice models suggest that astrocytes contribute to epileptiform activity by increasing neuronal excitability through a Ca2+-dependent release of glutamate. The underlying mechanism remains, however, unclear. In this study, we implemented a parsimonious network model of neurons and astrocytes. The model consists of excitatory and inhibitory neurons described by Izhikevich's neuron dynamics. The experimentally observed Ca2+ change in astrocytes in response to neuronal activity was modeled with linear equations. We considered that glutamate is released from astrocytes above certain intracellular Ca2+ concentrations, thus providing a non-linear positive feedback signal to neurons. Propagating seizure-like ictal discharges (IDs) were reliably evoked in our computational model by repeatedly exciting a small area of the network, which replicates experimental results in a slice model of focal ID in entorhinal cortex. We found that the threshold of focal ID generation was lowered when an excitatory feedback-loop between astrocytes and neurons was included. Simulations show that astrocytes can contribute to ID generation by directly affecting the excitatory/inhibitory balance of the neuronal network. Our model can be used to obtain mechanistic insights into the distinct contributions of the different signaling pathways to the generation and
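The model's building blocks, Izhikevich neurons plus a linear Ca2+ variable with thresholded glutamate feedback, can be sketched for a single neuron. All parameters below are illustrative, not those of the study.

```python
def izhikevich_step(v, u, I, dt=0.5, a=0.02, b=0.2, c=-65.0, d=8.0):
    """One Euler step of the Izhikevich model (regular-spiking parameters);
    returns (v, u, spiked)."""
    v_new = v + dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
    u_new = u + dt * a * (b * v - u)
    if v_new >= 30.0:                 # spike: reset membrane, bump recovery
        return c, u_new + d, True
    return v_new, u_new, False

def simulate(steps=2000, dt=0.5, I_ext=10.0, tau_ca=100.0, k_ca=0.05,
             ca_thresh=0.2, g_astro=5.0):
    """Neuronal spiking drives astrocytic Ca2+ (linear rise per spike,
    exponential decay); above ca_thresh the astrocyte releases glutamate,
    modeled as an extra depolarizing current. Returns the spike count."""
    v, u, ca, spikes = -65.0, -13.0, 0.0, 0
    for _ in range(steps):
        feedback = g_astro if ca > ca_thresh else 0.0   # thresholded gliotransmission
        v, u, s = izhikevich_step(v, u, I_ext + feedback, dt=dt)
        spikes += s
        ca += dt * (-ca / tau_ca) + (k_ca if s else 0.0)
    return spikes
```

The positive feedback loop means the neuron with astrocytic coupling should fire at least as often as the uncoupled one, a single-cell analogue of the lowered ictal-discharge threshold reported above.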

  1. Scaffolding High School Students' Divergent Idea Generation in a Computer-Mediated Design and Technology Learning Environment (United States)

    Yeo, Tiong-Meng; Quek, Choon-Lang


    This comparative study investigates how two groups of design and technology students generated ideas in an asynchronous computer-mediated communication setting. The generated ideas were design ideas in the form of sketches. Each group comprised five students who were all 15 years of age. All the students were from the same secondary school but…

  2. Computer generated structures of grain boundaries in L1{sub 2}-type ordered alloys

    International Nuclear Information System (INIS)

    DeHosson, J.Th.M.; Pestman, B.J.; Schapink, F.W.; Tichelaar, F.D.


    In recent years, the influence of the establishment of long-range order in cubic alloys on the structure of grain boundaries in L1{sub 2} alloys has been considered. For the Σ = 5 (310) tilt boundary, for example, the various possible structures generated upon ordering have been investigated, starting from plausible structures in the disordered state. However, apart from some rough energy estimates based upon nearest-neighbor interactions, no reliable energy calculations have been performed for these different possible structures. In this paper, computer calculations based upon interatomic pair potentials, constructed in such a way that the L1{sub 2} structure is stable with respect to disordering, are reported for the Σ = 5 (310) boundary. The relative stability of various possible structures, with associated different boundary compositions, has been investigated

  3. Judgement heuristics and bias in evidence interpretation: The effects of computer generated exhibits. (United States)

    Norris, Gareth


    The increasing use of multi-media applications, trial presentation software and computer generated exhibits (CGE) has raised questions as to the potential impact of the use of presentation technology on juror decision making. A significant amount of the commentary on the manner in which CGE exerts legal influence is largely anecdotal; empirical examinations too are often devoid of established theoretical rationalisations. This paper will examine a range of established judgement heuristics (for example, the attribution error, representativeness, simulation), in order to establish their appropriate application for comprehending legal decisions. Analysis of both past cases and empirical studies will highlight the potential for heuristics and biases to be restricted or confounded by the use of CGE. The paper will conclude with some wider discussion on admissibility, access to justice, and emerging issues in the use of multi-media in court. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Potentially Low Cost Solution to Extend Use of Early Generation Computed Tomography

    Directory of Open Access Journals (Sweden)

    Tonna, Joseph E


    Full Text Available In preparing a case report on Brown-Séquard syndrome for publication, we made the incidental finding that the inexpensive, commercially available three-dimensional (3D) rendering software we were using could produce high quality 3D spinal cord reconstructions from any series of two-dimensional (2D) computed tomography (CT) images. This finding raises the possibility that spinal cord imaging capabilities can be expanded where bundled 2D multi-planar reformats and 3D reconstruction software for CT are not available and in situations where magnetic resonance imaging (MRI) is either not available or appropriate (e.g. metallic implants). Given the worldwide burden of trauma and considering the limited availability of MRI and advanced generation CT scanners, we propose an alternative, potentially useful approach to imaging spinal cord that might be useful in areas where technical capabilities and support are limited. [West J Emerg Med. 2010; 11(5):463-466.]

  5. Computational Intelligence and Wavelet Transform Based Metamodel for Efficient Generation of Not-Yet Simulated Waveforms.

    Directory of Open Access Journals (Sweden)

    Gabriel Oltean

    Full Text Available The design and verification of complex electronic systems, especially the analog and mixed-signal ones, prove to be extremely time consuming tasks, if only circuit-level simulations are involved. A significant amount of time can be saved if a cost effective solution is used for the extensive analysis of the system, under all conceivable conditions. This paper proposes a data-driven method to build fast to evaluate, but also accurate metamodels capable of generating not-yet simulated waveforms as a function of different combinations of the parameters of the system. The necessary data are obtained by early-stage simulation of an electronic control system from the automotive industry. The metamodel development is based on three key elements: a wavelet transform for waveform characterization, a genetic algorithm optimization to detect the optimal wavelet transform and to identify the most relevant decomposition coefficients, and an artificial neural network to derive the relevant coefficients of the wavelet transform for any new parameters combination. The resulting metamodels for three different waveform families are fully reliable. They satisfy the required key points: high accuracy (a maximum mean squared error of 7.1x10-5 for the unity-based normalized waveforms), efficiency (fully affordable computational effort for metamodel build-up: maximum 18 minutes on a general purpose computer), and simplicity (less than 1 second for running the metamodel, the user only provides the parameters combination). The metamodels can be used for very efficient generation of new waveforms, for any possible combination of dependent parameters, offering the possibility to explore the entire design space.
A wide range of possibilities becomes achievable for the user, such as: all design corners can be analyzed, possible worst-case situations can be investigated, extreme values of waveforms can be discovered, sensitivity analyses can be performed (the influence of each
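The waveform-characterization step can be illustrated with the simplest orthonormal wavelet. The sketch below implements a full Haar decomposition and its inverse, standing in for the paper's GA-selected wavelet; in the described pipeline, a neural network would then be trained to predict the most relevant coefficients for new parameter combinations.

```python
import numpy as np

def haar_dwt(x):
    """Full Haar decomposition of a length-2^k signal into one coefficient
    vector: [scaling, level-1 details, level-2 details, ...]."""
    x = np.asarray(x, dtype=float)
    levels = []
    while len(x) > 1:
        s = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # smooth (approximation)
        d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail
        levels.append(d)
        x = s
    levels.append(x)                              # final scaling coefficient
    return np.concatenate(levels[::-1])

def haar_idwt(c):
    """Inverse of haar_dwt: rebuild the signal level by level."""
    c = np.asarray(c, dtype=float)
    s = c[:1]
    pos = 1
    while pos < len(c):
        d = c[pos:pos + len(s)]
        x = np.empty(2 * len(s))
        x[0::2] = (s + d) / np.sqrt(2.0)
        x[1::2] = (s - d) / np.sqrt(2.0)
        pos += len(s)
        s = x
    return s
```

Because the transform is orthonormal it preserves signal energy, so discarding the small coefficients (the metamodel keeps only the most relevant ones) bounds the reconstruction error directly.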

  6. Computational Intelligence and Wavelet Transform Based Metamodel for Efficient Generation of Not-Yet Simulated Waveforms (United States)

    Oltean, Gabriel; Ivanciu, Laura-Nicoleta


    The design and verification of complex electronic systems, especially the analog and mixed-signal ones, prove to be extremely time consuming tasks, if only circuit-level simulations are involved. A significant amount of time can be saved if a cost effective solution is used for the extensive analysis of the system, under all conceivable conditions. This paper proposes a data-driven method to build fast to evaluate, but also accurate metamodels capable of generating not-yet simulated waveforms as a function of different combinations of the parameters of the system. The necessary data are obtained by early-stage simulation of an electronic control system from the automotive industry. The metamodel development is based on three key elements: a wavelet transform for waveform characterization, a genetic algorithm optimization to detect the optimal wavelet transform and to identify the most relevant decomposition coefficients, and an artificial neural network to derive the relevant coefficients of the wavelet transform for any new parameters combination. The resulting metamodels for three different waveform families are fully reliable. They satisfy the required key points: high accuracy (a maximum mean squared error of 7.1x10-5 for the unity-based normalized waveforms), efficiency (fully affordable computational effort for metamodel build-up: maximum 18 minutes on a general purpose computer), and simplicity (less than 1 second for running the metamodel, the user only provides the parameters combination). The metamodels can be used for very efficient generation of new waveforms, for any possible combination of dependent parameters, offering the possibility to explore the entire design space.
A wide range of possibilities becomes achievable for the user, such as: all design corners can be analyzed, possible worst-case situations can be investigated, extreme values of waveforms can be discovered, sensitivity analyses can be performed (the influence of each parameter on the

  7. Experimental and computational studies of thermal mixing in next generation nuclear reactors (United States)

    Landfried, Douglas Tyler

    The Very High Temperature Reactor (VHTR) is a proposed next-generation nuclear power plant. The VHTR utilizes helium as the coolant in the primary loop of the reactor. Helium traveling through the reactor mixes below the core in a region known as the lower plenum, where large temperature and velocity gradients exist due to non-uniform heat generation in the reactor core. Because of these gradients, attention must be given to reducing thermal striping in the lower plenum. Thermal striping is the phenomenon by which temperature fluctuations in the fluid are transferred to and attenuated by surrounding structures; it is a known cause of long-term material failure. To better understand and predict thermal striping in the lower plenum, two separate bodies of work were conducted. First, an experimental facility capable of predictably recreating some aspects of flow in the lower plenum was designed according to a scaling analysis of the VHTR; namely, the facility reproduces jets issuing into a crossflow past a tube bundle. Secondly, extensive studies of the mixing of a non-isothermal parallel round triple jet at two jet-to-jet spacings were conducted. Experimental results were validated against an open-source computational fluid dynamics package, OpenFOAM. Additional care is given to understanding the implementation of the realizable k-epsilon and Launder-Gibson RSM turbulence models in OpenFOAM. In order to measure velocity and temperature in the triple-jet experiment, a detailed investigation of temperature-compensated hotwire anemometry was carried out, with special concern given to quantifying the measurement error. Finally, qualitative comparisons of trends in the experimental and computational results were conducted. A new and unexpected physical behavior was observed in the center jet, which appeared to spread unexpectedly for close spacings (S/Djet = 1.41).

  8. Computational Intelligence and Wavelet Transform Based Metamodel for Efficient Generation of Not-Yet Simulated Waveforms. (United States)

    Oltean, Gabriel; Ivanciu, Laura-Nicoleta


The design and verification of complex electronic systems, especially analog and mixed-signal ones, prove to be extremely time-consuming tasks if only circuit-level simulations are involved. A significant amount of time can be saved if a cost-effective solution is used for the extensive analysis of the system under all conceivable conditions. This paper proposes a data-driven method to build fast-to-evaluate yet accurate metamodels capable of generating not-yet simulated waveforms as a function of different combinations of the system's parameters. The necessary data are obtained by early-stage simulation of an electronic control system from the automotive industry. The metamodel development is based on three key elements: a wavelet transform for waveform characterization, a genetic-algorithm optimization to detect the optimal wavelet transform and identify the most relevant decomposition coefficients, and an artificial neural network to derive the relevant coefficients of the wavelet transform for any new parameter combination. The resulting metamodels for three different waveform families are fully reliable. They satisfy the required key points: high accuracy (a maximum mean squared error of 7.1×10⁻⁵ for the unity-based normalized waveforms), efficiency (fully affordable computational effort for metamodel build-up: at most 18 minutes on a general-purpose computer), and simplicity (less than 1 second to run the metamodel, with the user providing only the parameter combination). The metamodels can be used for very efficient generation of new waveforms, for any possible combination of dependent parameters, offering the possibility to explore the entire design space.
A wide range of possibilities becomes achievable for the user, such as: all design corners can be analyzed, possible worst-case situations can be investigated, extreme values of waveforms can be discovered, sensitivity analyses can be performed (the influence of each parameter on the
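As a rough illustration of the waveform-characterization idea above, the sketch below decomposes a waveform with a plain Haar wavelet transform (not the GA-optimized transform of the paper) and keeps only the largest-magnitude coefficients; the signal and all names are hypothetical.

```python
import numpy as np

def haar_decompose(signal):
    """Full Haar wavelet decomposition (length must be a power of two)."""
    coeffs = []
    approx = np.asarray(signal, dtype=float)
    while len(approx) > 1:
        a = (approx[0::2] + approx[1::2]) / np.sqrt(2)  # approximation
        d = (approx[0::2] - approx[1::2]) / np.sqrt(2)  # detail
        coeffs.append(d)
        approx = a
    coeffs.append(approx)
    return coeffs[::-1]  # coarsest level first

def haar_reconstruct(coeffs):
    """Invert haar_decompose."""
    approx = coeffs[0]
    for d in coeffs[1:]:
        out = np.empty(2 * len(approx))
        out[0::2] = (approx + d) / np.sqrt(2)
        out[1::2] = (approx - d) / np.sqrt(2)
        approx = out
    return approx

def compress(signal, keep):
    """Keep only the `keep` largest-magnitude coefficients, zero the rest."""
    coeffs = haar_decompose(signal)
    flat = np.concatenate(coeffs)
    top = np.argsort(np.abs(flat))[::-1][:keep]
    mask = np.zeros_like(flat)
    mask[top] = 1.0
    flat *= mask
    # split the flat array back into per-level coefficient arrays
    out, pos = [], 0
    for c in coeffs:
        out.append(flat[pos:pos + len(c)])
        pos += len(c)
    return haar_reconstruct(out)

t = np.linspace(0, 1, 64)
wave = np.exp(-20 * (t - 0.3) ** 2)   # a smooth synthetic "waveform"
approx = compress(wave, keep=10)      # 10 of 64 coefficients retained
mse = np.mean((wave - approx) ** 2)
```

A metamodel along the paper's lines would then predict those few retained coefficients, rather than every sample of the waveform.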

  9. Applying Ancestry and Sex Computation as a Quality Control Tool in Targeted Next-Generation Sequencing. (United States)

    Mathias, Patrick C; Turner, Emily H; Scroggins, Sheena M; Salipante, Stephen J; Hoffman, Noah G; Pritchard, Colin C; Shirts, Brian H


    To apply techniques for ancestry and sex computation from next-generation sequencing (NGS) data as an approach to confirm sample identity and detect sample processing errors. We combined a principal component analysis method with k-nearest neighbors classification to compute the ancestry of patients undergoing NGS testing. By combining this calculation with X chromosome copy number data, we determined the sex and ancestry of patients for comparison with self-report. We also modeled the sensitivity of this technique in detecting sample processing errors. We applied this technique to 859 patient samples with reliable self-report data. Our k-nearest neighbors ancestry screen had an accuracy of 98.7% for patients reporting a single ancestry. Visual inspection of principal component plots was consistent with self-report in 99.6% of single-ancestry and mixed-ancestry patients. Our model demonstrates that approximately two-thirds of potential sample swaps could be detected in our patient population using this technique. Patient ancestry can be estimated from NGS data incidentally sequenced in targeted panels, enabling an inexpensive quality control method when coupled with patient self-report. © American Society for Clinical Pathology, 2016. All rights reserved. For permissions, please e-mail:
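The ancestry screen described above combines principal components with a k-nearest-neighbors vote. A minimal sketch of that combination, using synthetic two-population data rather than real NGS variant calls (all values hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "variant" features for two well-separated reference populations.
pop_a = rng.normal(loc=0.0, scale=1.0, size=(50, 20))
pop_b = rng.normal(loc=3.0, scale=1.0, size=(50, 20))
X = np.vstack([pop_a, pop_b])
labels = np.array(["A"] * 50 + ["B"] * 50)

# Principal component analysis via SVD of the centered data matrix.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pcs = Xc @ Vt[:2].T   # project onto the first two principal components

def knn_predict(query, points, labels, k=5):
    """Classify `query` by majority vote among its k nearest neighbors."""
    d = np.linalg.norm(points - query, axis=1)
    nearest = labels[np.argsort(d)[:k]]
    vals, counts = np.unique(nearest, return_counts=True)
    return vals[np.argmax(counts)]

# A new sample drawn from population B should be classified as "B".
new = rng.normal(loc=3.0, scale=1.0, size=20)
new_pc = (new - X.mean(axis=0)) @ Vt[:2].T
pred = knn_predict(new_pc, pcs, labels, k=5)
```

In the quality-control setting, `pred` would be compared against the patient's self-reported ancestry, with a mismatch flagging a possible sample swap.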

  10. A New Computational Technique for the Generation of Optimised Aircraft Trajectories (United States)

    Chircop, Kenneth; Gardi, Alessandro; Zammit-Mangion, David; Sabatini, Roberto


A new computational technique based on Pseudospectral Discretisation (PSD) and adaptive bisection ɛ-constraint methods is proposed to solve multi-objective aircraft trajectory optimisation problems formulated as nonlinear optimal control problems. This technique is applicable to a variety of next-generation avionics and Air Traffic Management (ATM) Decision Support Systems (DSS) for strategic and tactical replanning operations. These include the future Flight Management Systems (FMS) and the 4-Dimensional Trajectory (4DT) planning and intent negotiation/validation tools envisaged by SESAR and NextGen for global implementation. In particular, after describing the PSD method, the adaptive bisection ɛ-constraint method is presented to allow an efficient solution of problems in which two or more performance indices are to be minimised simultaneously. Initial simulation case studies were performed adopting suitable aircraft dynamics models and addressing a classical vertical trajectory optimisation problem with two simultaneous objectives. Subsequently, a more advanced 4DT simulation case study is presented with a focus on representative ATM optimisation objectives in the Terminal Manoeuvring Area (TMA). The simulation results are analysed in depth and corroborated by flight performance analysis, supporting the validity of the proposed computational techniques.
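The ɛ-constraint idea can be sketched on a toy discrete problem: minimize one objective subject to a bisected budget on the other. This is only an illustration with made-up trajectory scores, not the paper's PSD-based optimal control formulation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical candidate trajectories scored on two competing objectives,
# e.g. f1 = flight time [min] and f2 = fuel burn [kg], both minimized.
f1 = rng.uniform(50.0, 120.0, size=200)
f2 = 3000.0 / f1 + rng.uniform(0.0, 5.0, size=200)

def eps_constraint_min(f1, f2, eps):
    """Scalarized subproblem: minimize f1 subject to f2 <= eps."""
    feasible = np.where(f2 <= eps)[0]
    if feasible.size == 0:
        return None
    return feasible[np.argmin(f1[feasible])]

# Bisect the fuel budget between the two single-objective extremes,
# collecting one (weakly) Pareto-optimal point per bisection level.
lo, hi = f2.min(), f2[np.argmin(f1)]
front = []
for _ in range(8):
    eps = 0.5 * (lo + hi)
    idx = eps_constraint_min(f1, f2, eps)
    if idx is not None:
        front.append((f1[idx], f2[idx]))
    hi = eps  # tighten the budget toward the f2 optimum
```

As the budget tightens, the minimal achievable f1 can only grow, so the collected points trace out a trade-off curve between the two objectives.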

  11. A computer program for quantification of SH groups generated after reduction of monoclonal antibodies

    Energy Technology Data Exchange (ETDEWEB)

    Escobar, Normando Iznaga; Morales, Alejo; Nunez, Gilda


Reduction of disulfide bonds to sulfhydryl (SH) groups for direct radiolabeling of antibodies for immunoscintigraphic studies of colorectal and other cancers continues to be of considerable research interest. We have developed a general strategy and a versatile computer program for quantifying the number of SH groups per molecule of antibody (Ab) generated after the treatment of monoclonal antibodies (MAbs) with reducing agents such as 2-mercaptoethanol (2-ME), stannous chloride (SnCl{sub 2}), dithiothreitol (DTT), dithioerythritol (DTE), ascorbic acid (AA), and the like. The program described here performs an unweighted least-squares regression analysis of the cysteine standard curve and interpolates the cysteine concentration of the samples. The number of SH groups per molecule of antibody in 2-mercaptoethanol and in the other reducing agents was calculated from the cysteine standard curve, using Ellman's reagent to develop the yellow color. The linear least-squares method fit the standard data with a high degree of accuracy, with a correlation coefficient r of 0.999. The program was written for IBM PC compatible computers and interacts with the user through a friendly menu. The package allows the user to change assay parameters, calculate the regression coefficients (slope and intercept) and their standard errors, perform statistical analysis with a detailed analysis of variance, and produce a printed output of the results.
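A minimal sketch of the program's core computation, assuming hypothetical absorbance and concentration values: fit the cysteine standard curve by unweighted linear least squares, interpolate a sample's SH concentration, and express it per antibody molecule.

```python
import numpy as np

# Hypothetical cysteine standard curve: absorbance (Ellman's assay)
# versus known cysteine concentration (micromolar).
conc = np.array([0.0, 25.0, 50.0, 100.0, 150.0, 200.0])
absorbance = np.array([0.002, 0.140, 0.281, 0.556, 0.838, 1.112])

# Unweighted linear least-squares fit: A = slope * C + intercept.
slope, intercept = np.polyfit(conc, absorbance, 1)
r = np.corrcoef(conc, absorbance)[0, 1]   # correlation coefficient

def sh_per_antibody(sample_abs, mab_conc_um):
    """Interpolate the SH concentration from the standard curve and
    express it as SH groups per antibody molecule (toy example values)."""
    sh_conc = (sample_abs - intercept) / slope
    return sh_conc / mab_conc_um

# A reduced MAb sample at 10 uM reading 0.45 absorbance units:
ratio = sh_per_antibody(0.45, 10.0)
```

With these made-up numbers the curve is almost perfectly linear (r > 0.999, matching the figure quoted in the abstract) and the sample works out to roughly eight SH groups per antibody.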

  12. Specificity of Correlation Pattern Recognition Methods Application in Security Holograms Identity Control Apparatus (United States)

    Zlokazov, E. Yu.; Starikov, R. S.; Odinokov, S. B.; Tsyganov, I. K.; Talalaev, V. E.; Koluchkin, V. V.

Automatic inspection of security hologram (SH) identity is in high demand, owing to the wide distribution of SHs worldwide to protect documents such as passports, driving licenses, and banknotes. While most known approaches inspect features of the SH design, none of them inspects the features of its surface relief, which derive directly from the original master matrix used to produce these holograms. In our previous works we presented a device developed to identify SHs by processing the coherent responses of their surface elements. Most of the algorithms used in this device are based on correlation pattern recognition methods. The main subject of the present article is a description of the specifics of applying these methods.

  13. Double-resolution electron holography with simple Fourier transform of fringe-shifted holograms. (United States)

    Volkov, V V; Han, M G; Zhu, Y


We propose a fringe-shifting holographic method with an appropriate image-wave recovery algorithm that leads to an exact solution of the holographic equations. With this new method, the complex object image wave recovered from holograms exhibits far fewer of the traditional artifacts caused by the autocorrelation band present in practically all Fourier-transformed holograms. The new analytical solutions make possible a double-resolution electron holography free from autocorrelation-band artifacts and thus push the limits of phase resolution. The new image-wave recovery algorithm uses the popular Fourier solution of the side-band-pass filter technique, while the fringe-shifting holographic method is simple to implement in practice. Published by Elsevier B.V.
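For context, the classical single-hologram side-band recovery that the fringe-shifting method builds on can be sketched as follows, with a synthetic object wave, a tilted reference wave, and hypothetical carrier and mask sizes:

```python
import numpy as np

N = 256
x = np.arange(N) - N // 2
X, Y = np.meshgrid(x, x)

# Synthetic pure-phase object wave and a tilted plane reference wave
# (the carrier runs along the x axis, 40 fringes across the frame).
obj = np.exp(1j * 0.5 * np.exp(-(X**2 + Y**2) / (2 * 30.0**2)))
ref = np.exp(1j * 2 * np.pi * 40 * X / N)
hologram = np.abs(obj + ref) ** 2          # recorded intensity

# Side-band filtering: isolate the cross term O*conj(R) in Fourier space
# with a circular band-pass mask and shift it back to zero frequency.
F = np.fft.fftshift(np.fft.fft2(hologram))
mask = (X + 40) ** 2 + Y ** 2 < 15 ** 2    # window around the -1 order
side = np.where(mask, F, 0.0)
recovered = np.fft.ifft2(np.fft.ifftshift(np.roll(side, 40, axis=1)))
```

The recovered field approximates the object wave only as long as the side band stays clear of the central autocorrelation band; removing that band's influence exactly is precisely what the fringe-shifting method is for.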

  14. Computing confidence intervals on solution costs for stochastic grid generation expansion problems.

    Energy Technology Data Exchange (ETDEWEB)

Woodruff, David L.; Watson, Jean-Paul


A range of core operations and planning problems for the national electrical grid are naturally formulated and solved as stochastic programming problems, which minimize expected costs subject to a range of uncertain outcomes relating to, for example, uncertain demands or generator output. A critical decision issue relating to such stochastic programs is: How many scenarios are required to ensure a specific error bound on the solution cost? Scenarios are the key mechanism used to sample from the uncertainty space, and the number of scenarios drives computational difficulty. We explore this question in the context of a long-term grid generation expansion problem, using a bounding procedure introduced by Mak, Morton, and Wood. We discuss experimental results using problem formulations independently minimizing expected cost and downside risk. Our results indicate that we can use a surprisingly small number of scenarios to yield tight error bounds in the case of expected cost minimization, which has key practical implications. In contrast, error bounds in the case of risk minimization are significantly larger, suggesting more research is required in this area in order to achieve rigorous solutions for decision makers.
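A much-simplified illustration of how scenario count drives the width of a confidence interval on expected cost, using a toy capacity-expansion cost model rather than the authors' grid formulation (all constants hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)

def plan_cost(capacity, demand, build_cost=1.0, shortfall_penalty=4.0):
    """Toy expansion model: pay for built capacity plus a penalty on
    unserved demand in each scenario (hypothetical constants)."""
    return build_cost * capacity + shortfall_penalty * np.maximum(demand - capacity, 0.0)

def cost_ci(capacity, n_scenarios, z=1.96):
    """Sample-mean cost estimate and its normal-approximation half-width."""
    demand = rng.lognormal(mean=3.0, sigma=0.4, size=n_scenarios)
    costs = plan_cost(capacity, demand)
    half = z * costs.std(ddof=1) / np.sqrt(n_scenarios)
    return costs.mean(), half

# The interval half-width shrinks roughly like 1/sqrt(n) with more scenarios.
m_small, h_small = cost_ci(capacity=25.0, n_scenarios=100)
m_large, h_large = cost_ci(capacity=25.0, n_scenarios=10000)
```

The Mak-Morton-Wood procedure referenced in the abstract is more refined (it bounds the optimality gap, not just the cost of one fixed plan), but the scenario-count trade-off it quantifies has this same square-root character.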

  15. Computing confidence intervals on solution costs for stochastic grid generation expansion problems

    International Nuclear Information System (INIS)

    Woodruff, David L.; Watson, Jean-Paul


    A range of core operations and planning problems for the national electrical grid are naturally formulated and solved as stochastic programming problems, which minimize expected costs subject to a range of uncertain outcomes relating to, for example, uncertain demands or generator output. A critical decision issue relating to such stochastic programs is: How many scenarios are required to ensure a specific error bound on the solution cost? Scenarios are the key mechanism used to sample from the uncertainty space, and the number of scenarios drives computational difficultly. We explore this question in the context of a long-term grid generation expansion problem, using a bounding procedure introduced by Mak, Morton, and Wood. We discuss experimental results using problem formulations independently minimizing expected cost and down-side risk. Our results indicate that we can use a surprisingly small number of scenarios to yield tight error bounds in the case of expected cost minimization, which has key practical implications. In contrast, error bounds in the case of risk minimization are significantly larger, suggesting more research is required in this area in order to achieve rigorous solutions for decision makers.

  16. A Computational Model of Torque Generation: Neural, Contractile, Metabolic and Musculoskeletal Components (United States)

    Callahan, Damien M.; Umberger, Brian R.; Kent-Braun, Jane A.


    The pathway of voluntary joint torque production includes motor neuron recruitment and rate-coding, sarcolemmal depolarization and calcium release by the sarcoplasmic reticulum, force generation by motor proteins within skeletal muscle, and force transmission by tendon across the joint. The direct source of energetic support for this process is ATP hydrolysis. It is possible to examine portions of this physiologic pathway using various in vivo and in vitro techniques, but an integrated view of the multiple processes that ultimately impact joint torque remains elusive. To address this gap, we present a comprehensive computational model of the combined neuromuscular and musculoskeletal systems that includes novel components related to intracellular bioenergetics function. Components representing excitatory drive, muscle activation, force generation, metabolic perturbations, and torque production during voluntary human ankle dorsiflexion were constructed, using a combination of experimentally-derived data and literature values. Simulation results were validated by comparison with torque and metabolic data obtained in vivo. The model successfully predicted peak and submaximal voluntary and electrically-elicited torque output, and accurately simulated the metabolic perturbations associated with voluntary contractions. This novel, comprehensive model could be used to better understand impact of global effectors such as age and disease on various components of the neuromuscular system, and ultimately, voluntary torque output. PMID:23405245

  17. Reconstruction of a three-dimensional object from its conoscopic hologram (United States)

    Mugnier, Laurent M.; Sirat, Gabriel Y.


    Conoscopic holography is a method for recording holograms with incoherent light, first presented in 1985. Its applications range from 3D microscopy to 3D satellite imaging and include robotics. The Point Spread Function (PSF) is a Gabor Zone Pattern, which is known to have zeros in Fourier space. We present an experimental technique to obtain an invertible PSF with an experimental image reconstruction, and an original algorithm to find the object shape, validated with both simulations and first experimental results.

  18. Wavelet processing and digital interferometric contrast to improve reconstructions from X-ray Gabor holograms. (United States)

    Aguilar, Juan C; Misawa, Masaki; Matsuda, Kiyofumi; Suzuki, Yoshio; Takeuchi, Akihisa; Yasumoto, Masato


    In this work, the application of an undecimated wavelet transformation together with digital interferometric contrast to improve the resulting reconstructions in a digital hard X-ray Gabor holographic microscope is shown. Specifically, the starlet transform is used together with digital Zernike contrast. With this contrast, the results show that only a small set of scales from the hologram are, in effect, useful, and it is possible to enhance the details of the reconstruction.

  19. Hologram representation of design data in an expert system knowledge base (United States)

    Shiva, S. G.; Klon, Peter F.


    A novel representational scheme for design object descriptions is presented. An abstract notion of modules and signals is developed as a conceptual foundation for the scheme. This abstraction relates the objects to the meaning of system descriptions. Anchored on this abstraction, a representational model which incorporates dynamic semantics for these objects is presented. This representational model is called a hologram scheme since it represents dual level information, namely, structural and semantic. The benefits of this scheme are presented.

  20. Consensus hologram QSAR modeling for the prediction of human intestinal absorption. (United States)

    Moda, Tiago L; Andricopulo, Adriano D


    Consistent in silico models for ADME properties are useful tools in early drug discovery. Here, we report the hologram QSAR modeling of human intestinal absorption using a dataset of 638 compounds with experimental data associated. The final validated models are consistent and robust for the consensus prediction of this important pharmacokinetic property and are suitable for virtual screening applications. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. Analysis of the Holart Report project: recording and publishing sales data for fine art holograms (United States)

    Zellerbach, Gary A.


    At Lake Forest College's Fourth International Symposium on Display Holography (July 1991), the author first formulated an idea to promote fine art holography by recording and publishing sale prices for art holograms. The idea was mentioned to several prominent artists in attendance, and the response was enthusiastic. The author formed a new company to publish the world's first journal of international art hologram sales, the Holart Report. Holart Report published four quarterly issues, beginning in May 1992. During that time, the publisher created a significant database of hologram art sales and reported tens of thousands of dollars in holographic art transactions. In February 1993 the author's new job obligations and a general lack of support for the project forced him to suspend publication of Holart Report. This paper attempts to answer serious questions surrounding the experience. What problems were encountered? What benefits, if any, did Holart provide during its short lifetime? Why were many in the holographic art community reluctant to support the project? In retrospect, what should have been done differently to ensure greater success? Lastly, the author states his belief that the idea remains feasible and valuable. The database is intact and the publishing template established. The lessons learned can be used to produce a much improved new version of Holart Report or a similar publication.

  2. A comparison of computer- and hand-generated clinical dental notes with statutory regulations in record keeping. (United States)

    McAndrew, R; Ban, J; Playle, R


    Dental patient records should be of high quality, contain information to allow for good continuity of care and clinical defence (should the need ever arise) and, ideally, facilitate clinical audit. Handwritten dental records have been assessed for their compliance to statutory regulations, but the same cannot be levelled at computer-generated notes. This study aimed to compare and analyse the compliance of both methods of data recording with statutory regulations. Fifty consecutive sets of handwritten notes and 50 sets of computer-generated notes were audited for compliance with a number of legal requirements and desirable characteristics for dental records and the results compared. The standard set for compliance with all characteristics was 100%. The computer-generated notes satisfied the set standard for 8 of the 11 legal requirements and three of six desirable characteristics. The handwritten notes satisfied the set standard for 1 of 11 legal requirements and none of the desirable characteristics. A statistical difference (using a 95% confidence interval) between the two methods was observed in 5 of 11 legal characteristics and three of six desirable characteristics, all of which were in favour of computer-generated notes. Within the limitations of this study, computer-generated notes achieved a much higher compliance rate with the set parameters, making defence in cases of litigation, continuity of care and clinical audit easier and more efficient. © 2011 John Wiley & Sons A/S.
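The study's comparison of compliance rates can be illustrated with a standard two-proportion confidence interval; the counts below are hypothetical, not the study's data:

```python
import math

def two_proportion_ci(x1, n1, x2, n2, z=1.96):
    """Normal-approximation 95% CI for the difference p1 - p2."""
    p1, p2 = x1 / n1, x2 / n2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    diff = p1 - p2
    return diff - z * se, diff + z * se

# Hypothetical counts: 48/50 computer-generated vs 31/50 handwritten
# records containing a given legally required item.
lo, hi = two_proportion_ci(48, 50, 31, 50)
significant = lo > 0 or hi < 0   # CI excluding zero at the 95% level
```

An interval lying entirely above zero, as here, corresponds to the paper's finding of a statistically significant difference in favour of the computer-generated notes.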

  3. Interferometric key readable security holograms with secrete-codes

    Indian Academy of Sciences (India)

recording step. Reading of SH through KH generates a specific Moiré pattern at the observation plane OP. Proper repositioning of SH results in the disappearance of these Moiré fringes. The secrete-code information is still invisible and a demodulating process is required to convert the invisible phase information into a ...

  4. Surgical technique: Computer-generated custom jigs improve accuracy of wide resection of bone tumors. (United States)

    Khan, Fazel A; Lipman, Joseph D; Pearle, Andrew D; Boland, Patrick J; Healey, John H


    Manual techniques of reproducing a preoperative plan for primary bone tumor resection using rudimentary devices and imprecise localization techniques can result in compromised margins or unnecessary removal of unaffected tissue. We examined whether a novel technique using computer-generated custom jigs more accurately reproduces a preoperative resection plan than a standard manual technique. Using CT images and advanced imaging, reverse engineering, and computer-assisted design software, custom jigs were designed to precisely conform to a specific location on the surface of partially skeletonized cadaveric femurs. The jigs were used to perform a hemimetaphyseal resection. We performed CT scans on six matched pairs of cadaveric femurs. Based on a primary bone sarcoma model, a joint-sparing, hemimetaphyseal wide resection was precisely outlined on each femur. For each pair, the resection was performed using the standard manual technique on one specimen and the custom jig-assisted technique on the other. Superimposition of preoperative and postresection images enabled quantitative analysis of resection accuracy. The mean maximum deviation from the preoperative plan was 9.0 mm for the manual group and 2.0 mm for the custom-jig group. The percentages of times the maximum deviation was greater than 3 mm and greater than 4 mm was 100% and 72% for the manual group and 5.6% and 0.0% for the custom-jig group, respectively. Our findings suggest that custom-jig technology substantially improves the accuracy of primary bone tumor resection, enabling a surgeon to reproduce a given preoperative plan reliably and consistently.

  5. Predicting Complex Organic Mixture Atmospheric Chemistry Using Computer-Generated Reaction Models (United States)

    Klein, M. T.; Broadbelt, L. J.; Mazurek, M. A.


matrix transforms the reactants to products. The computer-generated reaction models provide mechanistic details and predict the product spectrum of atmospheric organic reaction families. Precursor compounds for discrete reaction families will have predicted product spectra. This capability is useful, for example, in modeling the atmospheric chemistry of certain classes of chemical emissions from specific source categories. One other possible benefit of computer-assisted model building, although more challenging, would be the enhanced ability to attribute plausible emission-source chemistry to observed atmospheric organic chemistry ("reverse" interpretation) using the same reaction matrix and rules.

  6. FORIG: a computer code for calculating radionuclide generation and depletion in fusion and fission reactors. User's manual

    International Nuclear Information System (INIS)

    Blink, J.A.


In this manual we describe the use of the FORIG computer code to solve isotope generation and depletion problems in fusion and fission reactors. FORIG runs on a Cray-1 computer and accepts more extensive activation cross sections than ORIGEN2, from which it was adapted. This report is an updated and combined version of the previous ORIGEN2 and FORIG manuals. 7 refs., 15 figs., 13 tabs

  7. Computational efficiency study of a semi-analytical technique for the angular integral estimation found in transfer matrix generation

    International Nuclear Information System (INIS)

    Garcia, R.D.M.


    The computational efficiency of a semi-analytic technique recently proposed for the evaluation of certain angular integrals encountered in the generation of the isotropic and linearly anisotropic components of elastic and discrete inelastic transfer matrices is studied. It is concluded from a comparison with results obtained with the use of numerical quadratures that the technique has certain computational advantages that recommend its implementation. (Author) [pt

  8. Fracture resistance of computer-aided design/computer-aided manufacturing-generated composite resin-based molar crowns


    Harada, A; Nakamura, Keisuke; Kanno, Taro; Inagaki, R.; Ørtengren, Ulf Thore; Niwano, Y.; Sasaki, Keiichi; Egusa, Hiroshi


The aim of this study was to investigate whether different fabrication processes, such as the computer-aided design/computer-aided manufacturing (CAD/CAM) system or the manual build-up technique, affect the fracture resistance of composite resin-based crowns. Lava Ultimate (LU), Estenia C&B (EC&B), and lithium disilicate glass-ceramic IPS e.max press (EMP) were used. Four types of molar ...

  9. The Computer Generated Art/Contemporary Cinematography And The Remainder Of The Art History. A Critical Approach

    Directory of Open Access Journals (Sweden)

    Modesta Lupașcu


The paper analyses the re-conceptualization of the intermedial trope of computer-generated images/VFX in recent 3D works and cinema scenes through several examples from art history to which they are connected. The obvious connections between art history and images are conceived not primarily as an embodiment of a painting, the introduction of the real into the image, but as proof of the reconstructive tendencies of contemporary post-postmodern art. The intellectual, casual, or obsessive interaction with art history shown by the new film culture is already celebrated through 3D computer-generated art, focused on a consistently pictorialist cinematography.

  10. An Organization for High-Level Interactive Programmatic Control of Computer-Generated Sound. (United States)

    Das, Sumit

    The state of computer generated sound has advanced rapidly, and there exist many different ways of conceptualizing the abstract sound structures that comprise music and other complex organizations of sound. Many of these methods are radically different from one another, and so are not usually used within the same system. One problem that almost all methods share is one of control, as large amounts of data are needed to specify sounds. How do we create, examine, and modify these complex structures? The problem is exacerbated if we consider the realm of interactively controlled sound. This paper presents an organization which, rather than forcing a particular way of thinking about sound, allows multiple arbitrarily high-level views to coexist, all sharing a common interface. The methods or algorithms are abstracted into objects called auditory actors. This encapsulation allows different algorithms to be used concurrently. All communication with and between these actors is carried out through message-passing, which allows arbitrary types of information (such as other messages) to be easily communicated. This standardizes control without limiting it to a particular type of data. A prototype system was implemented using this model. This system was used by a number of different developers to create audio interfaces for interactive virtual reality applications, which were demonstrated at the SIGGRAPH 94 conference in Orlando, Florida. Compared to earlier systems, developers were able to create more complex audio interfaces in a shorter time.
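The actor abstraction might be sketched as follows: each auditory actor encapsulates its own sound-generation algorithm, and all control flows through a uniform message interface. This is a minimal, hypothetical rendering, not the thesis's actual API:

```python
import queue

class AuditoryActor:
    """Minimal actor: wraps one algorithm and communicates only
    through messages (plain dicts here)."""
    def __init__(self, name):
        self.name = name
        self.inbox = queue.Queue()
        self.handlers = {}

    def on(self, msg_type, handler):
        """Register a handler for one message type."""
        self.handlers[msg_type] = handler

    def send(self, msg):
        self.inbox.put(msg)

    def process(self):
        """Drain the inbox, dispatching each message to its handler."""
        log = []
        while not self.inbox.empty():
            msg = self.inbox.get()
            handler = self.handlers.get(msg["type"])
            if handler:
                log.append(handler(msg))
        return log

# Two actors wrapping different "algorithms" share one message interface.
synth = AuditoryActor("synth")
synth.on("note", lambda m: f"synth plays {m['pitch']}")

sampler = AuditoryActor("sampler")
sampler.on("note", lambda m: f"sampler triggers {m['pitch']}")

for actor in (synth, sampler):
    actor.send({"type": "note", "pitch": "A4"})
results = [a.process() for a in (synth, sampler)]
```

Because messages carry arbitrary data, radically different synthesis methods can coexist behind the same control surface, which is the design point the abstract emphasizes.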

  11. Computer software program for monitoring the availability of systems and components of electric power generating systems

    International Nuclear Information System (INIS)

    Petersen, T.A.; Hilsmeier, T.A.; Kapinus, D.M.


As the availability of electric power generating station systems and components becomes more and more important from a financial, personnel safety, and regulatory standpoint, it is evident that a comprehensive yet simple and user-friendly program for system and component tracking and monitoring is needed to assist in effectively managing the large volume of systems and components with their large numbers of associated maintenance/availability records. A user-friendly computer software program for system and component availability monitoring has been developed that calculates, displays, and monitors selected component and system availabilities. This is a Windows™-based graphical-user-interface program that uses a system flow diagram as the data input screen, which also provides a visual representation of availability values and limits for the individual components and associated systems. The program can be customized to the user's plant-specific system and component selections and configurations. As discussed herein, this software program is well suited for availability monitoring and, ultimately, for providing valuable information for improving plant performance and reducing operating costs.

  12. Imaging membrane potential changes from dendritic spines using computer-generated holography. (United States)

    Tanese, Dimitrii; Weng, Ju-Yun; Zampini, Valeria; De Sars, Vincent; Canepari, Marco; Rozsa, Balazs; Emiliani, Valentina; Zecevic, Dejan


    Electrical properties of neuronal processes are extraordinarily complex, dynamic, and, in the general case, impossible to predict in the absence of detailed measurements. To obtain such a measurement one would, ideally, like to be able to monitor electrical subthreshold events as they travel from synapses on distal dendrites and summate at particular locations to initiate action potentials. It is now possible to carry out these measurements at the scale of individual dendritic spines using voltage imaging. In these measurements, the voltage-sensitive probes can be thought of as transmembrane voltmeters with a linear scale, which directly monitor electrical signals. Grinvald et al. were important early contributors to the methodology of voltage imaging, and they pioneered some of its significant results. We combined voltage imaging and glutamate uncaging using computer-generated holography. The results demonstrated that patterned illumination, by reducing the surface area of illuminated membrane, reduces photodynamic damage. Additionally, region-specific illumination practically eliminated the contamination of optical signals from individual spines by the scattered light from the parent dendrite. Finally, patterned illumination allowed one-photon uncaging of glutamate on multiple spines to be carried out in parallel with voltage imaging from the parent dendrite and neighboring spines.

  13. Computer generation of cobalt-60 single beam dose distribution using an analytical beam model

    International Nuclear Information System (INIS)

    Jayaraman, S.


A beam dose calculation model based on evaluation of tissue-air ratios (TAR) and scatter-air ratios (SAR) for cobalt-60 beams of rectangular cross section has been developed. Off-central-axis fall-off of primary radiation intensity is derived by an empirical formulation involving an arctangent function, with the slope of the geometrical penumbra acting as an essential constant. Central-axis TAR and SAR values are assessed by semi-empirical polynomial expressions employing the two sides of the rectangular field as the variables. The model uses a minimum number of parametric constants and is useful for computer generation of isodose curves. The model is capable of accounting for situations where wedge filters or split-field shielding blocks are encountered. Further, it could be widely applied, with minor modifications, to several makes of the currently available cobalt-60 units. The paper explains the model and shows examples of the results obtained in comparison with the corresponding experimentally determined dose distributions. (orig.) [de
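The arctangent fall-off of off-axis primary intensity can be sketched as below; the field size and penumbra slope are hypothetical illustration values, not the paper's fitted constants:

```python
import numpy as np

def off_axis_ratio(x, half_width, slope):
    """Relative primary intensity off the central axis, modeled as an
    arctangent roll-off at each field edge (hypothetical constants)."""
    def edge(u):
        # smooth step: ~1 well inside the field, ~0 well outside
        return 0.5 + np.arctan(u / slope) / np.pi
    return edge(half_width - x) * edge(half_width + x)

x = np.linspace(-10.0, 10.0, 401)                  # off-axis distance [cm]
profile = off_axis_ratio(x, half_width=5.0, slope=0.4)
```

The `slope` parameter plays the role of the geometrical-penumbra slope in the abstract: a smaller value gives a sharper beam edge, and the profile drops to roughly half its central value exactly at the geometric field border.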

  14. Self-motion perception: assessment by real-time computer-generated animations (United States)

    Parker, D. E.; Phillips, J. O.


    We report a new procedure for assessing complex self-motion perception. In three experiments, subjects manipulated a 6 degree-of-freedom magnetic-field tracker which controlled the motion of a virtual avatar so that its motion corresponded to the subjects' perceived self-motion. The real-time animation created by this procedure was stored using a virtual video recorder for subsequent analysis. Combined real and illusory self-motion and vestibulo-ocular reflex eye movements were evoked by cross-coupled angular accelerations produced by roll and pitch head movements during passive yaw rotation in a chair. Contrary to previous reports, illusory self-motion did not correspond to expectations based on semicircular canal stimulation. Illusory pitch head-motion directions were as predicted for only 37% of trials; whereas, slow-phase eye movements were in the predicted direction for 98% of the trials. The real-time computer-generated animations procedure permits use of naive, untrained subjects who lack a vocabulary for reporting motion perception and is applicable to basic self-motion perception studies, evaluation of motion simulators, assessment of balance disorders and so on.

  15. Temporal analysis of laser beam propagation in the atmosphere using computer-generated long phase screens. (United States)

    Dios, Federico; Recolons, Jaume; Rodríguez, Alejandro; Batet, Oscar


    Temporal analysis of the irradiance at the detector plane is intended as the first step in the study of the mean fade time in a free optical communication system. In the present work this analysis has been performed for a Gaussian laser beam propagating in the atmospheric turbulence by means of computer simulation. To this end, we have adapted a previously known numerical method to the generation of long phase screens. The screens are displaced in a transverse direction as the wave is propagated, in order to simulate the wind effect. The amplitude of the temporal covariance and its power spectrum have been obtained at the optical axis, at the beam centroid and at a certain distance from these two points. Results have been worked out for weak, moderate and strong turbulence regimes and when possible they have been compared with theoretical models. These results show a significant contribution of beam wander to the temporal behaviour of the irradiance, even in the case of weak turbulence. We have also found that the spectral bandwidth of the covariance is hardly dependent on the Rytov variance.
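The covariance and spectrum analysis at a given point can be sketched as generic post-processing of an irradiance time series; this is not the authors' simulation code, and the synthetic random series below merely stands in for irradiance data produced by the phase-screen propagation.

```python
import numpy as np

def temporal_covariance(i_t):
    """Autocovariance of an irradiance time series and the magnitude of
    its one-sided power spectrum (generic post-processing step)."""
    x = np.asarray(i_t, dtype=float) - np.mean(i_t)
    n = len(x)
    # full cross-correlation of x with itself; keep non-negative lags
    cov = np.correlate(x, x, mode='full')[n - 1:] / n
    spectrum = np.abs(np.fft.rfft(cov))
    return cov, spectrum

rng = np.random.default_rng(1)
cov, spec = temporal_covariance(rng.random(1024))
```

The zero-lag value of the autocovariance equals the irradiance variance, which in scintillation work is the quantity normalised to give the scintillation index.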

  16. Optics in neural computation (United States)

    Levene, Michael John

    In all attempts to emulate the considerable powers of the brain, one is struck by its immense size, parallelism, and complexity. While the fields of neural networks, artificial intelligence, and neuromorphic engineering have all attempted to simplify that considerable complexity, all three can benefit from the inherent scalability and parallelism of optics. This thesis looks at specific aspects of three modes in which optics, and particularly volume holography, can play a part in neural computation. First, holography serves as the basis of highly parallel correlators, which are the foundation of optical neural networks. The huge input capability of optical neural networks makes them most useful for image processing and for image recognition and tracking. These tasks benefit from the shift invariance of optical correlators. In this thesis, I analyze the capacity of correlators, and then present several techniques for controlling the amount of shift invariance. Of particular interest is the Fresnel correlator, in which the hologram is displaced from the Fourier plane. In this case, the amount of shift invariance is limited not just by the thickness of the hologram, but by the distance of the hologram from the Fourier plane. Second, volume holography can provide the huge storage capacity and high-speed, parallel read-out necessary to support large artificial intelligence systems. However, previous methods for storing data in volume holograms have relied on awkward beam-steering or on as-yet non-existent cheap, wide-bandwidth, tunable laser sources. This thesis presents a new technique, shift multiplexing, which is capable of very high densities but has the advantage of a very simple implementation. In shift multiplexing, the reference wave consists of a focused spot a few millimeters in front of the hologram. Multiplexing is achieved by simply translating the hologram a few tens of microns or less. This thesis describes the theory for how shift

  17. ASIC chipset design to generate block-based complex holographic video. (United States)

    Seo, Young-Ho; Lee, Yoon-Hyuk; Kim, Dong-Wook


    In this paper, we propose a new hardware architecture implemented as a very large scaled integrated circuit by using an application-specific integrated circuit technology, where block-based calculations are used to generate holograms. The proposed hardware is structured to produce a part of a hologram in the block units in parallel. A block of a hologram is calculated using an object point, and then the calculation is repeated for all object points to obtain intermediate results that are accumulated to produce the final block of a hologram. This structure can be used to produce holograms of various sizes in real time with optimized memory access. The proposed hardware was implemented using the Hynix 0.18 μm CMOS technology of Magna Chip, Inc., and it has about 448 K gate counts and a silicon size of 3.592  mm×3.592  mm. It can generate complex holograms and operate in a stable manner at a clock frequency of 200 MHz.
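The block-accumulation scheme described above can be sketched in software as follows; the block size, the spherical-wave kernel, and all parameter values are illustrative assumptions rather than details of the proposed ASIC.

```python
import numpy as np

def hologram_block(points, y0, x0, block=32, wavelength=633e-9, pitch=8e-6):
    """Compute one block of a complex hologram: for each object point,
    accumulate its spherical-wave contribution over the block's pixels."""
    k = 2 * np.pi / wavelength
    ys, xs = np.mgrid[y0:y0 + block, x0:x0 + block]
    acc = np.zeros((block, block), dtype=complex)
    for px, py, pz, amp in points:
        # distance from each hologram pixel to the object point
        r = np.sqrt((xs * pitch - px) ** 2 + (ys * pitch - py) ** 2 + pz ** 2)
        # accumulate intermediate results, as in the block-wise hardware
        acc += amp * np.exp(1j * k * r) / r
    return acc

# one object point 0.1 m in front of the hologram plane
blk = hologram_block([(0.0, 0.0, 0.1, 1.0)], 0, 0)
```

A full hologram would be tiled from such blocks, which is what makes the scheme amenable to parallel hardware with localized memory access.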

  18. Advanced Computational Materials Science: Application to Fusion and Generation IV Fission Reactors (Workshop Report)

    Energy Technology Data Exchange (ETDEWEB)

    Stoller, RE


    The ''Workshop on Advanced Computational Materials Science: Application to Fusion and Generation IV Fission Reactors'' was convened to determine the degree to which an increased effort in modeling and simulation could help bridge the gap between the data that is needed to support the implementation of these advanced nuclear technologies and the data that can be obtained in available experimental facilities. The need to develop materials capable of performing in the severe operating environments expected in fusion and fission (Generation IV) reactors represents a significant challenge in materials science. There is a range of potential Gen-IV fission reactor design concepts and each concept has its own unique demands. Improved economic performance is a major goal of the Gen-IV designs. As a result, most designs call for significantly higher operating temperatures than the current generation of LWRs to obtain higher thermal efficiency. In many cases, the desired operating temperatures rule out the use of the structural alloys employed today. The very high operating temperature (up to 1000°C) associated with the NGNP is a prime example of an attractive new system that will require the development of new structural materials. Fusion power plants represent an even greater challenge to structural materials development and application. The operating temperatures, neutron exposure levels and thermo-mechanical stresses are comparable to or greater than those for proposed Gen-IV fission reactors. In addition, the transmutation products created in the structural materials by the high-energy neutrons produced in the DT plasma can profoundly influence the microstructural evolution and mechanical behavior of these materials. Although the workshop addressed issues relevant to both Gen-IV and fusion reactor materials, much of the discussion focused on fusion; the same focus is reflected in this report. Most of the physical models and computational methods

  19. Changing a Generation's Way of Thinking: Teaching Computational Thinking through Programming (United States)

    Buitrago Flórez, Francisco; Casallas, Rubby; Hernández, Marcela; Reyes, Alejandro; Restrepo, Silvia; Danies, Giovanna


    Computational thinking (CT) uses concepts that are essential to computing and information science to solve problems, design and evaluate complex systems, and understand human reasoning and behavior. This way of thinking has important implications in computer sciences as well as in almost every other field. Therefore, we contend that CT should be…

  20. Idea Generation in Student Writing: Computational Assessments and Links to Successful Writing (United States)

    Crossley, Scott A.; Muldner, Kasia; McNamara, Danielle S.


    Idea generation is an important component of most major theories of writing. However, few studies have linked idea generation in writing samples to assessments of writing quality or examined links between linguistic features in a text and idea generation. This study uses human ratings of idea generation, such as "idea fluency, idea…


    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  2. Perceived image quality for autostereoscopic holograms in healthcare training (United States)

    Goldiez, Brian; Abich, Julian; Carter, Austin; Hackett, Matthew


    The current state of dynamic light field holography requires further empirical investigation to advance this developing technology. This paper describes a user-centered design approach for gaining insight into the features most important to clinical personnel using emerging dynamic holographic displays. The approach describes the generation of a high-quality holographic model of a simulated traumatic amputation above the knee using 3D scanning. Using that model, a set of static holographic prints will be created, varying in color versus monochrome, contrast ratio, and polygon density. Leveraging methods from image-quality research, this paper describes an experimental approach wherein participants are asked to provide feedback regarding the elements previously mentioned, in order to guide the ongoing evolution of holographic displays.

  3. SPLPKG WFCMPR WFAPPX, Wilson-Fowler Spline Generator for Computer Aided Design And Manufacturing (CAD/CAM) Systems

    International Nuclear Information System (INIS)

    Fletcher, S.K.


    1 - Description of program or function: The three programs SPLPKG, WFCMPR, and WFAPPX provide the capability for interactively generating, comparing and approximating Wilson-Fowler splines. The Wilson-Fowler spline is widely used in Computer Aided Design and Manufacturing (CAD/CAM) systems. It is favored for many applications because it produces a smooth, low curvature fit to planar data points. Program SPLPKG generates a Wilson-Fowler spline passing through given nodes (with given end conditions) and also generates a piecewise linear approximation to that spline within a user-defined tolerance. The program may be used to generate a 'desired' spline against which to compare other splines generated by CAD/CAM systems. It may also be used to generate an acceptable approximation to a desired spline in the event that an acceptable spline cannot be generated by the receiving CAD/CAM system. SPLPKG writes an IGES file of points evaluated on the spline and/or a file containing the spline description. Program WFCMPR computes the maximum difference between two Wilson-Fowler splines and may be used to verify the spline recomputed by a receiving system. It compares two Wilson-Fowler splines with common nodes and reports the maximum distance between curves (measured perpendicular to segments) and the maximum difference of their tangents (or normals), both computed along the entire length of the splines. Program WFAPPX computes the maximum difference between a Wilson-Fowler spline and a piecewise linear curve. It may be used to accept or reject a proposed approximation to a desired Wilson-Fowler spline, even if the origin of the approximation is unknown. The maximum deviation between these two curves, and the parameter value on the spline where it occurs are reported. 2 - Restrictions on the complexity of the problem - Maxima of: 1600 evaluation points (SPLPKG), 1000 evaluation points (WFAPPX), 1000 linear curve breakpoints (WFAPPX), 100 spline nodes
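The tolerance-bounded piecewise linear approximation step performed by SPLPKG can be illustrated generically; the recursive midpoint-subdivision strategy below is an assumed simplification (it checks deviation at segment midpoints rather than true perpendicular distance), and an ordinary function stands in for a Wilson-Fowler spline.

```python
import numpy as np

def piecewise_linear(f, a, b, tol):
    """Approximate the curve y = f(x) on [a, b] by a list of breakpoints
    such that, at each segment midpoint, the chord deviates from the
    curve by at most tol (recursive bisection)."""
    xm = 0.5 * (a + b)
    chord_mid = 0.5 * (f(a) + f(b))
    if abs(f(xm) - chord_mid) <= tol:
        return [a, b]                     # segment is within tolerance
    left = piecewise_linear(f, a, xm, tol)
    return left[:-1] + piecewise_linear(f, xm, b, tol)

breaks = piecewise_linear(np.sin, 0.0, np.pi, 1e-3)
```

Tightening the tolerance increases the number of breakpoints roughly as the inverse square root of the tolerance for a smooth curve, which is why programs like SPLPKG impose fixed maxima on evaluation points.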

  4. Chemistry for Kids: Generating Carbon Dioxide in Elementary School Chemistry and Using a Computer To Write about It. (United States)

    Schlenker, Richard M.; Yoshida, Sarah

    This material describes an activity using vinegar and baking soda to generate carbon dioxide, and writing a report using the Appleworks word processing program for grades 3 to 8 students. Time requirement, relevant process skills, vocabulary, mathematics skills, computer skills, and materials are listed. Activity procedures including class…


    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  6. Cognitive map in patients with mild Alzheimer's disease: a computer-generated arena study. (United States)

    Jheng, Sheng-Siang; Pai, Ming-Chyi


    In addition to memory impairment, a tendency to get lost is among the initial symptoms in patients with Alzheimer disease (AD). At least two kinds of wayfinding strategies, egocentric and allocentric, have been proposed. It is believed that people may form a cognitive map after repeated movement in a specific environment, and are able to use it as an aid to navigation. In the present study, we investigated the cognitive maps in early AD patients and their application in a computer-generated arena (CGA). We invited very mild AD (CDR 0.5) patients and normal controls (NCs) to participate in the current study. Hand-drawing tests were used to assess their supposedly previously formed cognitive maps of familiar environments, and CGA was used to measure their new environment learning as well as the application of the old map. Nineteen patients (8 females, mean age 67.6 years old, education 9.7 years, and MMSE 24) and 18 NCs (10 females, mean age 66.4 years old, education 8.8 years, and MMSE 27) completed the study. In the hand-drawn map part, both groups did equally well. In new environment learning, NCs did better than the AD group on the third of six trials. As for the old environment navigation experiment, the AD group spent more time than the NCs in finding the target, but showed no difference to NCs regarding the path traveled in the target quadrant. Early AD patients maintain their ability to use a cognitive map and retain fairly good allocentric representations of their familiar environments, as NCs do, but probably neither group routinely uses a cognitive map to navigate properly in everyday life.

  7. Hologram QSAR Studies of Antiprotozoal Activities of Sesquiterpene Lactones

    Directory of Open Access Journals (Sweden)

    Gustavo H. G. Trossini


    Full Text Available Infectious diseases such as trypanosomiasis and leishmaniasis are considered neglected tropical diseases due to many years of lacking research and development into new drug treatments, besides the high incidence of mortality and the lack of current safe and effective drug therapies. Natural products such as sesquiterpene lactones have shown activity against T. brucei and L. donovani, the parasites responsible for these neglected diseases. To evaluate structure-activity relationships, HQSAR models were constructed to relate a series of 40 sesquiterpene lactones (STLs with activity against T. brucei, T. cruzi, L. donovani and P. falciparum and also with their cytotoxicity. All constructed models showed good internal (leave-one-out q2 values ranging from 0.637 to 0.775 and external validation coefficients (r2test values ranging from 0.653 to 0.944. From HQSAR contribution maps, several differences between the most and least potent compounds were found. The fragment contribution of PLS-generated models confirmed the results of previous QSAR studies that the presence of α,β-unsaturated carbonyl groups is fundamental to biological activity. QSAR models for the activity of these compounds against T. cruzi, L. donovani and P. falciparum are reported here for the first time. The constructed HQSAR models are suitable to predict the activity of untested STLs.

  8. Laser memory (hologram) and coincident redundant multiplex memory (CRM-memory)

    International Nuclear Information System (INIS)

    Ostojic, Branko


    It is shown that, besides the memory which remembers an object by memorising the phases of the interfering waves of light (i.e., the hologram), it is possible to construct a memory which remembers an object by memorising the phases of the interfering impulses (CRM-memory). A mathematical description of the memory, based on the experimental model, is given. Although the paper covers only the technical aspect of the CRM memory, the possibility is mentioned that human memory operates on the same principle, and that the invention of the CRM memory is due to a cybernetical analysis of the human eye-visual cortex system.

  9. Studying the Recent Improvements in Holograms for Three-Dimensional Display

    Directory of Open Access Journals (Sweden)

    Hamed Abbasi


    Full Text Available Displays tend to become three-dimensional. The main advantage of holographic 3D displays is the possibility of observing 3D images without using glasses. The quality of images created by this method has surprised everyone. In this paper, the experimental steps of making a transmission hologram are described. In what follows, current advances of this science-art are discussed. The aim of this paper is to study the recent improvements in creating three-dimensional images and videos by means of holographic techniques. The last section discusses the potential of holography for future applications.

  10. Signal intensity enhancement of laser ablated volume holograms (United States)

    Versnel, J. M.; Williams, C.; Davidson, C. A. B.; Wilkinson, T. D.; Lowe, C. R.


    Conventional volume holographic gratings (VHGs) fabricated in photosensitive emulsions such as gelatin containing silver salts enable the facile visualization of the holographic image in ambient lighting. However, for the fabrication of holographic sensors, which require more defined and chemically-functionalised polymer matrices, laser ablation has been introduced to create the VHGs and thereby broaden their applications, although the replay signal can be challenging to detect in ambient lighting. When traditional photochemical bleaching solutions used to reduce light scattering and modulate refractive index within the VHG are applied to laser ablated volume holographic gratings, these procedures decrease the holographic peak intensity. This is postulated to occur because both light and dark fringes contain a proportion of metal particles, which upon solubilisation are converted immediately to silver iodide, yielding no net refractive index modulation. This research advances a hypothesis that the reduced intensity of holographic replay signals is linked to a gradient of different sized metal particles within the emulsion, which reduces the holographic signal and may explain why traditional bleaching processes result in a reduction in intensity. In this report, a novel experimental protocol is provided, along with simulations based on an effective medium periodic 1D stack, that offers a solution to increase peak signal intensity of holographic sensors by greater than 200%. Nitric acid is used to etch the silver nanoparticles within the polymer matrix and is thought to remove the smaller particles to generate more defined metal fringes containing a soluble metal salt. Once the grating efficiency has been increased, this salt can be converted to a silver halide, to modulate the refractive index and increase the intensity of the holographic signal. This new protocol has been tested in a range of polymer chemistries; those containing functional groups that help to

  11. High-Quality Random Number Generation Software for High-Performance Computing Project (United States)

    National Aeronautics and Space Administration — Random number (RN) generation is the key software component that permits random sampling. Software for parallel RN generation (RNG) should be based on RNGs that are...

  12. Getting the story right: making computer-generated stories more entertaining

    NARCIS (Netherlands)

    Oinonen, K.M.; Theune, Mariet; Nijholt, Antinus; Heylen, Dirk K.J.; Maybury, Mark; Stock, Oliviero; Wahlster, Wolfgang


    In this paper we describe our efforts to increase the entertainment value of the stories generated by our story generation system, the Virtual Storyteller, at the levels of plot creation, discourse generation and spoken language presentation. We also discuss the construction of a story database that

  13. Computational modeling of spike generation in serotonergic neurons of the dorsal raphe nucleus. (United States)

    Tuckwell, Henry C; Penington, Nicholas J


    Serotonergic neurons of the dorsal raphe nucleus, with their extensive innervation of limbic and higher brain regions and interactions with the endocrine system have important modulatory or regulatory effects on many cognitive, emotional and physiological processes. They have been strongly implicated in responses to stress and in the occurrence of major depressive disorder and other psychiatric disorders. In order to quantify some of these effects, detailed mathematical models of the activity of such cells are required which describe their complex neurochemistry and neurophysiology. We consider here a single-compartment model of these neurons which is capable of describing many of the known features of spike generation, particularly the slow rhythmic pacemaking activity often observed in these cells in a variety of species. Included in the model are 11 kinds of ion channels: a fast sodium current INa, a delayed rectifier potassium current IKDR, a transient potassium current IA, a slow non-inactivating potassium current IM, a low-threshold calcium current IT, two high threshold calcium currents IL and IN, small and large conductance potassium currents ISK and IBK, a hyperpolarization-activated cation current IH and a leak current ILeak. In Sections 3-8, each current type is considered in detail and parameters estimated from voltage clamp data where possible. Three kinds of model are considered for the BK current and two for the leak current. Intracellular calcium ion concentration Cai is an additional component and calcium dynamics along with buffering and pumping is discussed in Section 9. The remainder of the article contains descriptions of computed solutions which reveal both spontaneous and driven spiking with several parameter sets. Attention is focused on the properties usually associated with these neurons, particularly long duration of action potential, steep upslope on the leading edge of spikes, pacemaker-like spiking, long-lasting afterhyperpolarization
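A single-compartment spiking model of this general type can be sketched with just fast sodium, delayed-rectifier potassium and leak currents; the kinetics and parameters below are the classic Hodgkin-Huxley squid-axon values, an illustrative stand-in for the paper's 11-current serotonergic model, not its actual parameter set.

```python
import math

def simulate_hh(i_inj=10.0, t_max=50.0, dt=0.01):
    """Minimal single-compartment model with INa, IKDR and ILeak using
    classic Hodgkin-Huxley kinetics. Forward-Euler integration; returns
    the spike count (upward crossings of 0 mV)."""
    g_na, g_k, g_l = 120.0, 36.0, 0.3      # mS/cm^2
    e_na, e_k, e_l = 50.0, -77.0, -54.4    # mV
    c_m = 1.0                              # uF/cm^2

    def exprel(x):                          # x / (1 - exp(-x)), safe near 0
        return 1.0 if abs(x) < 1e-9 else x / (1.0 - math.exp(-x))

    v, m, h, n, spikes = -65.0, 0.05, 0.6, 0.32, 0
    for _ in range(int(t_max / dt)):
        a_m = 1.0 * exprel((v + 40.0) / 10.0)
        b_m = 4.0 * math.exp(-(v + 65.0) / 18.0)
        a_h = 0.07 * math.exp(-(v + 65.0) / 20.0)
        b_h = 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
        a_n = 0.1 * exprel((v + 55.0) / 10.0)
        b_n = 0.125 * math.exp(-(v + 65.0) / 80.0)
        i_ion = (g_na * m**3 * h * (v - e_na)
                 + g_k * n**4 * (v - e_k)
                 + g_l * (v - e_l))
        v_new = v + dt * (i_inj - i_ion) / c_m
        m += dt * (a_m * (1 - m) - b_m * m)
        h += dt * (a_h * (1 - h) - b_h * h)
        n += dt * (a_n * (1 - n) - b_n * n)
        if v < 0.0 <= v_new:
            spikes += 1
        v = v_new
    return spikes
```

With a sustained depolarizing current the model fires repetitively, the same qualitative pacemaker-like behaviour the full 11-current model is built to reproduce quantitatively.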

  14. Dual-wavelength phase-shifting digital holography selectively extracting wavelength information from wavelength-multiplexed holograms. (United States)

    Tahara, Tatsuki; Mori, Ryota; Kikunaga, Shuhei; Arai, Yasuhiko; Takaki, Yasuhiro


    Dual-wavelength phase-shifting digital holography that selectively extracts wavelength information from five wavelength-multiplexed holograms is presented. Specific phase shifts for respective wavelengths are introduced to remove the crosstalk components and extract only the object wave at the desired wavelength from the holograms. Object waves in multiple wavelengths are selectively extracted by utilizing 2π ambiguity and the subtraction procedures based on phase-shifting interferometry. Numerical results show the validity of the proposed technique. The proposed technique is also experimentally demonstrated.
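The subtraction principle underlying phase-shifting reconstruction can be sketched for the standard single-wavelength four-step case (the paper's five-hologram, dual-wavelength scheme is more involved, so this is only the base technique it builds on):

```python
import numpy as np

# synthetic complex object wave and a unit-amplitude plane reference wave
rng = np.random.default_rng(0)
obj = rng.random((64, 64)) * np.exp(1j * 2 * np.pi * rng.random((64, 64)))
ref = 1.0

# record four intensity holograms with reference phase shifts 0, pi/2, pi, 3pi/2
holo = [np.abs(obj + ref * np.exp(1j * d)) ** 2
        for d in (0.0, np.pi / 2, np.pi, 3 * np.pi / 2)]

# four-step reconstruction: the subtractions cancel the zero-order and
# twin-image terms, leaving only the complex object wave
recovered = ((holo[0] - holo[2]) + 1j * (holo[1] - holo[3])) / 4.0
```

Because the unwanted terms cancel exactly in the subtractions, the recovered field equals the original object wave up to numerical precision; the wavelength-multiplexed method extends this idea by choosing phase shifts that also cancel crosstalk between wavelengths.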

  15. Now and next-generation sequencing techniques: future of sequence analysis using cloud computing. (United States)

    Thakur, Radhe Shyam; Bandopadhyay, Rajib; Chaudhary, Bratati; Chatterjee, Sourav


    Advances in the field of sequencing techniques have resulted in the greatly accelerated production of huge sequence datasets. This presents immediate challenges in database maintenance at datacenters. It provides additional computational challenges in data mining and sequence analysis. Together these represent a significant overburden on traditional stand-alone computer resources, and to reach effective conclusions quickly and efficiently, the virtualization of the resources and computation on a pay-as-you-go concept (together termed "cloud computing") has recently appeared. The collective resources of the datacenter, including both hardware and software, can be available publicly, being then termed a public cloud, the resources being provided in a virtual mode to the clients who pay according to the resources they employ. Examples of public companies providing these resources include Amazon, Google, and Joyent. The computational workload is shifted to the provider, which also implements required hardware and software upgrades over time. A virtual environment is created in the cloud corresponding to the computational and data storage needs of the user via the internet. The task is then performed, the results transmitted to the user, and the environment finally deleted after all tasks are completed. In this discussion, we focus on the basics of cloud computing, and go on to analyze the prerequisites and overall working of clouds. Finally, the applications of cloud computing in biological systems, particularly in comparative genomics, genome informatics, and SNP detection are discussed with reference to traditional workflows.

  16. The GLOBE-Consortium: The Erasmus Computing Grid and The Next Generation Genome Viewer

    NARCIS (Netherlands)

    T.A. Knoch (Tobias)


    The Set-Up of the 20 Teraflop Erasmus Computing Grid: To meet the enormous computational needs of life-science research as well as clinical diagnostics and treatment, the Hogeschool Rotterdam and the Erasmus Medical Center are currently setting up one of the largest desktop

  17. Measurement and Evidence of Computer-Based Task Switching and Multitasking by "Net Generation" Students (United States)

    Judd, Terry; Kennedy, Gregor


    Logs of on-campus computer and Internet usage were used to conduct a study of computer-based task switching and multitasking by undergraduate medical students. A detailed analysis of over 6000 individual sessions revealed that while a majority of students engaged in both task switching and multitasking behaviours, they did so less frequently than…

  18. A comparative study between xerographic, computer-assisted overlay generation and animated-superimposition methods in bite mark analyses. (United States)

    Tai, Meng Wei; Chong, Zhen Feng; Asif, Muhammad Khan; Rahmat, Rabiah A; Nambiar, Phrabhakaran


    This study aimed to compare the suitability and precision of xerographic and computer-assisted methods for bite mark investigations. Eleven subjects were asked to bite on their forearm and the bite marks were photographically recorded. Alginate impressions of the subjects' dentition were taken and their casts were made using dental stone. The overlays generated by the xerographic method were obtained by photocopying the subjects' casts, and the incisal edge outlines were then transferred onto a transparent sheet. The bite mark images were imported into Adobe Photoshop® software and printed to life-size. The bite mark analyses using xerographically generated overlays were done by manually comparing an overlay to the corresponding printed bite mark images. In the computer-assisted method, the subjects' casts were scanned into Adobe Photoshop®. The bite mark analyses using computer-assisted overlay generation were done by matching an overlay and the corresponding bite mark images digitally using Adobe Photoshop®. Another comparison method was superimposing the cast images on the corresponding bite mark images employing Adobe Photoshop® CS6 and GIF-Animator©. A score with a range of 0-3 was given during analysis to each precision-determining criterion, the score increasing with better matching. The Kruskal-Wallis H test showed a significant difference between the three sets of data (H=18.761, p<0.05). In conclusion, bite mark analysis using the computer-assisted animated-superimposition method was the most accurate, followed by the computer-assisted overlay generation and lastly the xerographic method. The superior precision contributed by the digital method is discernible despite the human skin being a poor recording medium of bite marks. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  19. Development of a computer-generated model for the coronary arterial tree based on multislice CT and morphometric data (United States)

    Fung, George S. K.; Segars, W. Paul; Taguchi, Katsuyuki; Fishman, Elliot K.; Tsui, Benjamin M. W.


    A detailed four-dimensional model of the coronary artery tree has great potential in a wide variety of applications especially in biomedical imaging. We developed a computer generated three-dimensional model for the coronary arterial tree based on two datasets: (1) gated multi-slice computed tomography (MSCT) angiographic data obtained from a normal human subject and (2) statistical morphometric data obtained from porcine hearts. The main coronary arteries and heart structures were segmented from the MSCT data to define the initial segments of the vasculature and geometrical details of the boundaries. An iterative rule-based computer generation algorithm was then developed to extend the coronary artery tree beyond the initial segmented branches. The algorithm was governed by the following factors: (1) the statistical morphometric measurements of the connectivities, lengths, and diameters of the arterial segments, (2) repelling forces from other segments and boundaries, and (3) optimality principles to minimize the drag force at each bifurcation in the generated tree. Using this algorithm, the segmented coronary artery tree from the MSCT data was optimally extended to create a 3D computational model of the largest six orders of the coronary arterial tree. The new method for generating the 3D model is effective in imposing the constraints of anatomical and physiological characteristics of coronary vasculature. When combined with the 4D NCAT phantom, a computer model for the human anatomy and cardiac and respiratory motions, the new model will provide a unique tool to study cardiovascular characteristics and diseases through direct and medical imaging simulation studies.

  20. Synergistic Computational and Microstructural Design of Next- Generation High-Temperature Austenitic Stainless Steels

    Energy Technology Data Exchange (ETDEWEB)

    Karaman, Ibrahim [Texas A& M Engineering Experiment Station, College Station, TX (United States); Arroyave, Raymundo [Texas A& M Engineering Experiment Station, College Station, TX (United States)


    The purpose of this project was to: 1) study deformation twinning, its evolution, thermal stability, and its contribution to the mechanical response of the new advanced stainless steels, especially at elevated temperatures; 2) study alumina-scale formation on the surface, as an alternative to conventional chromium oxide, that shows better oxidation resistance, through alloy design; and 3) design a new generation of high-temperature stainless steels that form an alumina scale and have thermally stable nano-twins. The work involved a few baseline alloys for investigating twin formation under tensile loading, the thermal stability of these twins, and the role of deformation twins in the mechanical response of the alloys. These baseline alloys included Hadfield steel (Fe-13Mn-1C) and 316, 316L and 316N stainless steels. Another baseline alloy was studied for alumina-scale formation investigations. Hadfield steel showed twinning, but undesired second phases formed at higher temperatures. 316N stainless steel did not show signs of deformation twinning. Conventional 316 stainless steel demonstrated extensive deformation twinning at room temperature. Investigations on this alloy, both in single crystalline and polycrystalline forms, showed that deformation twins evolve in a hierarchical manner, consisting of micron-sized bundles of nano-twins. The width of the nano-twins stays almost constant as the extent of strain increases, but the width and number of the bundles increase with increasing strain. A systematic thermomechanical cycling study showed that the twins were stable at temperatures as high as 900°C, after the dislocations are annealed out. Using such cycles, the volume fraction of the thermally stable deformation twins was increased up to 40% in 316 stainless steel. Using computational thermodynamics and kinetics calculations, we designed two generations of advanced austenitic stainless steels. In the first generation, Alloy 1, which had been proposed as an alumina

  1. Detection of viability of micro-algae cells by optofluidic hologram pattern. (United States)

    Wang, Junsheng; Yu, Xiaomei; Wang, Yanjuan; Pan, Xinxiang; Li, Dongqing


    Rapid detection of micro-algae activity is critical for the analysis of ship ballast water. A new method for detecting micro-algae activity based on lens-free optofluidic holographic imaging is presented in this paper. A compact lens-free optofluidic holographic imaging device was developed. The device is mainly composed of a light source, a small through-hole, a light propagation module, a microfluidic chip, and an image acquisition and processing module. Light from the source passes through the small hole to reach the surface of the micro-algae cells in the microfluidic chip, and a holographic image is formed by the light diffracted from the cell surfaces. The relation between the characteristics of the hologram pattern and the activity of the micro-algae cells was investigated with this device, and those characteristics were extracted to represent cell activity. To demonstrate the accuracy of the presented method and device, four species of micro-algae cells were employed as test samples, and comparison experiments between live and dead cells of the four species were conducted. The results show that the developed method and device can distinguish live and dead micro-algae cells accurately.
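
    Reconstructing such a lens-free in-line hologram typically means numerically back-propagating the recorded pattern to the object plane. The abstract does not name the algorithm used, but a common choice is the angular spectrum method; the sketch below is illustrative, with hypothetical function and parameter names:

    ```python
    import numpy as np

    def angular_spectrum_propagate(hologram, wavelength, dx, z):
        """Back-propagate a recorded in-line hologram by distance z using the
        angular spectrum method (a standard lens-free reconstruction step).
        dx is the pixel pitch; evanescent components are suppressed."""
        n, m = hologram.shape
        fx = np.fft.fftfreq(m, dx)                       # spatial frequencies, x
        fy = np.fft.fftfreq(n, dx)                       # spatial frequencies, y
        FX, FY = np.meshgrid(fx, fy)
        arg = 1.0 / wavelength ** 2 - FX ** 2 - FY ** 2
        kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))   # clamp evanescent waves
        transfer = np.exp(1j * kz * z)                   # free-space transfer function
        return np.fft.ifft2(np.fft.fft2(hologram) * transfer)
    ```

    Propagating to the focus distance yields a complex field whose amplitude and phase features could then be used as the activity indicators the paper describes.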

  2. The computer vision in the service of safety and reliability in steam generators inspection services

    International Nuclear Information System (INIS)

    Pineiro Fernandez, P.; Garcia Bueno, A.; Cabrera Jordan, E.


    Computer vision has matured very quickly in the last ten years, facilitating new developments in various areas of nuclear application and allowing processes and tasks to be automated and simplified, either in place of or in collaboration with people and equipment, efficiently. Current computer vision (a more appropriate term than artificial vision) also offers great possibilities for improving the reliability and safety of NPP inspection systems.

  3. The Next Generation of Lab and Classroom Computing - The Silver Lining (United States)


    more physical memory, processors, and more storage space as compared to a personal computer. Servers may also be equipped with additional... Some popular on-demand computing services utilized on a daily basis from the cloud are on-line data storage (including music, videos, photos)...include storage, processing, memory, and network bandwidth. Rapid elasticity. Capabilities can be elastically provisioned and released, in some cases

  4. Computational and Experimental Evaluation of a Complex Inlet Swirl Pattern Generation System (POSTPRINT) (United States)


    pattern generation system composed of continuous patterns of turning vanes. Successful demonstration of the evaluation methods required acceptable...CFD analysis to evaluate swirling flow downstream of the swirl pattern generation system...through a constant diameter spacer duct before it passes the flow measurement plane. The configuration in Fig. 4 includes the flow measurement

  5. Generational Learning Style Preferences Based on Computer-Based Healthcare Training (United States)

    Knight, Michaelle H.


    Purpose. The purpose of this mixed-method study was to determine the degree of perceived differences for auditory, visual and kinesthetic learning styles of Traditionalist, Baby Boomers, Generation X and Millennial generational healthcare workers participating in technology-assisted healthcare training. Methodology. This mixed-method research…

  6. Using Computer-Generated Random Numbers to Calculate the Lifetime of a Comet. (United States)

    Danesh, Iraj


    An educational technique for calculating the lifetime of a comet using software-generated random numbers is introduced to undergraduate physics and astronomy students. Discussed are the generation and suitability of the required random numbers, background literature related to the problem, and the solution to the problem using random numbers.…
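
    The classic classroom version of this problem treats the comet's orbital energy as a random walk: each perihelion passage adds a random planetary perturbation to z = 1/a, and the comet is lost once z drops to zero. The sketch below is a minimal illustration of that idea; the Gaussian kick, its size, and the units are assumptions, not the article's exact scheme:

    ```python
    import random

    def comet_lifetime(z0=1.0, sigma=0.3, n_orbits_max=10_000, rng=None):
        """One random-walk trial: z = 1/a (inverse semi-major axis, arbitrary
        units) receives a random kick at each perihelion passage; the comet
        escapes when z <= 0. Orbital period scales as z**-1.5 (Kepler III)."""
        rng = rng or random.Random()
        z, lifetime = z0, 0.0
        for _ in range(n_orbits_max):
            lifetime += z ** -1.5          # add this orbit's period
            z += rng.gauss(0.0, sigma)     # planetary perturbation
            if z <= 0:                     # hyperbolic orbit: comet is lost
                return lifetime
        return lifetime                    # survived the whole simulation

    def mean_lifetime(trials=2000, seed=42):
        """Average many trials to estimate the expected lifetime."""
        rng = random.Random(seed)
        return sum(comet_lifetime(rng=rng) for _ in range(trials)) / trials
    ```

    Averaging over many trials gives the Monte Carlo lifetime estimate, and seeding the generator makes the student exercise reproducible.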

  7. Next-generation sequence assembly: four stages of data processing and computational challenges.

    Directory of Open Access Journals (Sweden)

    Sara El-Metwally

    Full Text Available Decoding DNA symbols using next-generation sequencers was a major breakthrough in genomic research. Despite the many advantages of next-generation sequencers, e.g., the high-throughput sequencing rate and relatively low cost of sequencing, the assembly of the reads produced by these sequencers still remains a major challenge. In this review, we address the basic framework of next-generation genome sequence assemblers, which comprises four basic stages: preprocessing filtering, a graph construction process, a graph simplification process, and postprocessing filtering. We discuss these as a four-stage framework for data analysis and processing and survey the variety of techniques, algorithms, and software tools used during each stage. We also discuss the challenges facing current assemblers in the next-generation environment in order to determine the state of the art. We recommend a layered-architecture approach for constructing a general assembler that can handle the sequences generated by different sequencing platforms.
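
    The graph construction and graph simplification stages can be illustrated with a toy de Bruijn graph assembler. This is a teaching sketch, not the implementation of any particular tool:

    ```python
    from collections import defaultdict

    def de_bruijn_graph(reads, k):
        """Graph construction stage: nodes are (k-1)-mers, and each k-mer
        observed in the reads contributes one directed edge."""
        graph = defaultdict(set)
        for read in reads:
            for i in range(len(read) - k + 1):
                kmer = read[i:i + k]
                graph[kmer[:-1]].add(kmer[1:])
        return graph

    def compact_path(graph, start):
        """Graph simplification stage (sketch): follow unambiguous edges,
        merging a simple path into a single contig."""
        contig, node = start, start
        while len(graph.get(node, ())) == 1:
            (node,) = graph[node]
            contig += node[-1]
            if node == start:          # guard against walking a cycle forever
                break
        return contig
    ```

    Real assemblers add error correction (preprocessing) and contig filtering (postprocessing) around these two stages, as the review's four-stage framework describes.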

  8. The Computer Generated Art/Contemporary Cinematography And The Remainder Of The Art History. A Critical Approach


    Modesta Lupașcu


    The paper analyses the re-conceptualization of the intermedial trope of computer-generated images/VFX in recent 3D works/cinema scenes through several examples from art history with which they are connected. The obvious connections between art history and these images are not conceived primarily as an embodiment of a painting, the introduction of the real into the image, but prove the reconstructive tendencies of contemporary post-postmodern art. The intellectual, the casual, or the obsessive interacti...

  9. A computer program for estimating the power-density spectrum of advanced continuous simulation language generated time histories (United States)

    Dunn, H. J.


    A computer program for performing frequency analysis of time history data is presented. The program uses circular convolution and the fast Fourier transform to calculate the power density spectrum (PDS) of time history data. The program interfaces with the advanced continuous simulation language (ACSL) so that a frequency analysis may be performed on ACSL-generated simulation variables. An example of the calculation of the PDS of a Van der Pol oscillator is presented.
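
    The core computation, a one-sided periodogram of a uniformly sampled time history obtained via the FFT, can be sketched as follows. NumPy stands in for the original implementation, and the scaling convention shown is one common choice rather than necessarily the program's:

    ```python
    import numpy as np

    def power_density_spectrum(x, dt):
        """Estimate the one-sided power density spectrum of a uniformly
        sampled time history x with sample interval dt."""
        n = len(x)
        X = np.fft.rfft(x - np.mean(x))          # remove the DC offset first
        psd = (2.0 * dt / n) * np.abs(X) ** 2    # one-sided periodogram scaling
        psd[0] /= 2.0                            # DC bin is not doubled
        if n % 2 == 0:
            psd[-1] /= 2.0                       # Nyquist bin is not doubled
        freqs = np.fft.rfftfreq(n, dt)
        return freqs, psd
    ```

    Feeding in a simulated oscillator trajectory (for example, a Van der Pol solution sampled at fixed dt) and locating the spectral peaks reproduces the kind of frequency analysis the program performs on ACSL variables.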

  10. Three-phase short circuit calculation method based on pre-computed surface for doubly fed induction generator (United States)

    Ma, J.; Liu, Q.


    This paper presents an improved short-circuit calculation method, based on a pre-computed surface, to determine the short-circuit current of a distribution system with multiple doubly fed induction generators (DFIGs). The short-circuit current injected into the power grid by a DFIG is determined by its low voltage ride through (LVRT) control and protection under grid fault. However, existing methods are too complex to calculate the short-circuit current of a DFIG conveniently in engineering practice. A short-circuit calculation method based on a pre-computed surface was therefore proposed, built on the surface of short-circuit current as a function of the calculating impedance and the open-circuit voltage. The short-circuit currents were derived by taking into account the rotor excitation and the crowbar activation time. Finally, the pre-computed surfaces of short-circuit current at different times were established, and the procedure of DFIG short-circuit calculation considering LVRT was designed. The correctness of the proposed method was verified by simulation.
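
    At run time, such a method reduces to looking up the pre-computed surface at a given calculating impedance and open-circuit voltage. The paper does not specify its interpolation scheme; the bilinear lookup below is an illustrative assumption, with hypothetical grid and variable names:

    ```python
    import numpy as np

    def bilinear_lookup(z_grid, v_grid, surface, z, v):
        """Interpolate a pre-computed short-circuit-current surface at
        calculating impedance z and open-circuit voltage v (bilinear)."""
        i = int(np.clip(np.searchsorted(z_grid, z) - 1, 0, len(z_grid) - 2))
        j = int(np.clip(np.searchsorted(v_grid, v) - 1, 0, len(v_grid) - 2))
        tz = (z - z_grid[i]) / (z_grid[i + 1] - z_grid[i])   # cell-local coords
        tv = (v - v_grid[j]) / (v_grid[j + 1] - v_grid[j])
        return ((1 - tz) * (1 - tv) * surface[i, j]
                + tz * (1 - tv) * surface[i + 1, j]
                + (1 - tz) * tv * surface[i, j + 1]
                + tz * tv * surface[i + 1, j + 1])
    ```

    One such surface per time step (to capture rotor excitation decay and crowbar activation) turns the short-circuit calculation into a fast table lookup, which is the engineering appeal of the method.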

  11. Towards the Education of the Future: Computational Thinking as a Generative Learning Mechanism

    Directory of Open Access Journals (Sweden)

    Eduardo SEGREDO


    Full Text Available The transformation of traditional education into Sensitive, Manageable, Adaptable, Responsive and Timely (SMART) education involves the comprehensive modernisation of all educational processes. For such a transformation, smart pedagogies are needed as the methodological issue while smart learning environments represent the technological issue, both having as an ultimate goal to cultivate smart learners. Smart learners need to develop 21st-century skills so that they can become smart citizens of our changing world. Technology and computers are an essential aspect of this modernisation, not only in terms of technological support for smart environments but also in terms of offering new methodologies for smart pedagogy and the development of smart skills. In this context, computational thinking appears as a promising mechanism for encouraging core skills, since it offers tools that fit learners' interests and gives them the possibility to better understand the foundations of our ICT-based society and environments. In this work, we argue for an effort to encourage the development of computational thinking as an opportunity to transform traditional pedagogies into smarter methodologies. We provide a general background on computational thinking and analyse the current state of the art of smart education, emphasising that there is a lack of smart methodologies that can support the training of 21st-century smart skills. Finally, we provide, for those educators interested in pursuing the philosophy of smart education, information about initiatives devoted to the dissemination or promotion of computational thinking; existing tools or materials that support educators in developing computational thinking among students; and previous experiences and results concerning the application of computational thinking in educational environments.

  12. Computer simulation of airflow through a multi-generation tracheobronchial conducting airway

    Energy Technology Data Exchange (ETDEWEB)

    Fan, B.; Cheng, Yung-Sung; Yeh, Hsu-Chi


    Knowledge of airflow patterns in the human lung is important for the analysis of lung diseases and the delivery of aerosolized medicine for medical treatment. However, very little systematic information is available on the pattern of airflow in the lung, on how this pattern affects the deposition of toxicants in the lung, and on the efficacy of aerosol drug therapy. Most previous studies have considered only the airflow through a single bifurcating airway. However, the flow in a network of more than one bifurcation is more complicated due to the effects of interrelated lung generations. Because of the variation of airway geometry and flow conditions from generation to generation, a single bifurcating airway cannot be taken as representative of the others in different generations. The flow in the network varies significantly with airway generation because of a redistribution of axial momentum by the secondary flow motions. The influence of this redistribution of flow is expected in every generation. Therefore, systematic information on the airflow through a multi-generation tracheobronchial conducting airway is needed; providing it is the purpose of this study. The study supplies information on airflow in a lung model that is necessary for studying the deposition of toxicants and therapeutic aerosols.

  13. Now And Next Generation Sequencing Techniques: Future of Sequence Analysis using Cloud Computing

    Directory of Open Access Journals (Sweden)

    Radhe Shyam Thakur


    Full Text Available Advancements in the field of sequencing techniques have resulted in huge amounts of sequenced data being produced at a very fast rate. It is becoming cumbersome for data centers to maintain the databases. Data mining and sequence analysis approaches need to analyze the databases several times to reach any efficient conclusion. To cope with such an overburden on computer resources and to reach efficient and effective conclusions quickly, the virtualization of resources and computation on a pay-as-you-go basis were introduced, termed cloud computing. A data center's hardware and software are collectively known as a cloud, which, when available publicly, is termed a public cloud. The data center's resources are provided in a virtual mode to clients via a service provider such as Amazon, Google or Joyent, which charges in a pay-as-you-go manner. The workload is shifted to the provider, which maintains the required hardware and software upgrades in the virtual mode. Essentially, a virtual environment is created according to the needs of the user by obtaining permission from the data center via the internet; the task is performed, and the environment is deleted after the task is over. In this discussion, we focus on the basics of cloud computing, the prerequisites, and the overall working of clouds. Furthermore, the applications of cloud computing in biological systems are briefly discussed, especially in comparative genomics, genome informatics and SNP detection, with reference to the traditional workflow.

  14. Determination of refractive index and absorbance modulation amplitudes from angular selectivity of holograms in polymer material with phenanthrenequinone (United States)

    Borisov, Vladimir; Veniaminov, Andrey


    Amplitude and phase contributions to mixed volume holographic gratings were extracted from measured contours of angular selectivity. Holograms for the investigation were recorded in a glassy polymer material with phenanthrenequinone (PQ) using a CW DPSS laser (532 nm) and then self-developed via molecular diffusion of PQ, reaching a diffraction efficiency of about 40%. The refractive-index and absorbance modulation amplitudes of these holograms were obtained as adjustable parameters of theoretical equations by fitting the angular dependencies of the zero- and first-order diffraction efficiencies measured at 450, 473, 532, and 633 nm at different stages of hologram development. Mixed gratings manifest themselves in asymmetrical transmittance selectivity contours with one minimum and one maximum shifted with respect to the Bragg angle, while symmetrical contours with a minimum or a maximum at the Bragg angle are characteristic of pure phase and amplitude gratings, respectively. In the course of development, a hologram converts from a predominantly mixed amplitude-phase one to an almost purely phase one when read out with light within the absorption band of PQ, and maintains its phase nature outside that band. The refractive-index modulation amplitude ranges from 5×10-6 to 10-4, and the absorbance modulation amplitude is up to 140 m-1.

  15. 3D real holographic image movies are projected into a volumetric display using dynamic digital micromirror device (DMD) holograms. (United States)

    Huebschman, Michael L.; Hunt, Jeremy; Garner, Harold R.


    The Texas Instruments Digital Micromirror Device (DMD) is being used as the recording medium for display of pre-calculated digital holograms. The high intensity throughput of the reflected laser light from DMD holograms enables volumetric display of projected real images as well as virtual images. A single DMD and single laser projector system has been designed to reconstruct projected images in a 6''x 6''x 4.5'' volumetric display. The volumetric display is composed of twenty-four, 6''-square, PSCT liquid crystal plates which are each cycled on and off to reduce unnecessary scatter in the volume. The DMD is an XGA format array, 1024x768, with 13.6 micron pitch mirrors. This holographic projection system has been used in the assessment of hologram image resolution, maximum image size, optical focusing of the real image, image look-around, and physiological depth cues. Dynamic movement images are projected by transferring the appropriately sequenced holograms to the DMD at movie frame rates.
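
    Pre-calculating a hologram for a DMD means producing a binary pattern, since each micromirror is simply on or off. The sketch below shows one common simplification (a random diffuser phase followed by back-propagation and sign thresholding); it is illustrative and not necessarily the authors' encoding:

    ```python
    import numpy as np

    def binary_dmd_hologram(target, seed=0):
        """Compute a binary amplitude hologram for a DMD by back-propagating
        the target field with an inverse FFT and thresholding the real part.
        A random phase spreads the target's energy across the aperture."""
        rng = np.random.default_rng(seed)
        field = target * np.exp(2j * np.pi * rng.random(target.shape))
        holo = np.fft.ifft2(np.fft.ifftshift(field))
        return (holo.real > 0).astype(np.uint8)   # 1 = mirror on, 0 = mirror off
    ```

    Streaming a sequence of such binary frames to the DMD at video rates is what enables the dynamic movie projection the abstract describes.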

  16. HOLRED - a machine to reproduce and photograph real images from holograms taken in the 15 foot bubble chamber at Fermilab

    International Nuclear Information System (INIS)

    Nailor, P.R.


    The aim of this paper is to describe the design criteria and philosophy behind a machine to reproject and photograph the real images of neutrino interactions from holograms taken in the 15 foot bubble chamber at Fermilab in the coming run. Detailed analysis of the vertex region of these events will be done from the photographs. (orig./HSI)

  17. Digital hologram transformations for RGB color holographic display with independent image magnification and translation in 3D. (United States)

    Makowski, Piotr L; Zaperty, Weronika; Kozacki, Tomasz


    A new framework for in-plane transformations of digital holograms (DHs) is proposed, which provides improved control over basic geometrical features of holographic images reconstructed optically in full color. The method is based on a Fourier hologram equivalent of the adaptive affine transformation technique [Opt. Express18, 8806 (2010)OPEXFF1094-408710.1364/OE.18.008806]. The solution includes four elementary geometrical transformations that can be performed independently on a full-color 3D image reconstructed from an RGB hologram: (i) transverse magnification; (ii) axial translation with minimized distortion; (iii) transverse translation; and (iv) viewing angle rotation. The independent character of transformations (i) and (ii) constitutes the main result of the work and plays a double role: (1) it simplifies synchronization of color components of the RGB image in the presence of mismatch between capture and display parameters; (2) provides improved control over position and size of the projected image, particularly the axial position, which opens new possibilities for efficient animation of holographic content. The approximate character of the operations (i) and (ii) is examined both analytically and experimentally using an RGB circular holographic display system. Additionally, a complex animation built from a single wide-aperture RGB Fourier hologram is presented to demonstrate full capabilities of the developed toolset.

  18. A low-cost alternative to the optical experiment commercially known as “Magic Hologram – 3D Mirage”

    Directory of Open Access Journals (Sweden)

    Osmar Henrique Moura Silva


    Full Text Available This work presents a low-cost alternative to the commercialized experiment called “Magic Hologram – Mirage 3D”, which reproduces the real image of an object that is seen three-dimensionally in the air. An analysis of this alternative is carried out, indicating educational aspects of its use in the classroom in quantitative terms.

  19. Influence of Softening Temperature of Azobenzene Polymers and External Electric Field on Diffraction Efficiency of Polarization Holograms

    Directory of Open Access Journals (Sweden)

    Nicolay Davidenko


    Full Text Available Growth of the diffraction efficiency and recording velocity was found, during room-temperature holographic recording, in films of copolymers of 4-((2-nitrophenyl)diazenyl)phenyl methacrylate with octyl methacrylate for the copolymer with the lower softening temperature. A strengthening of the diffraction efficiency was observed when the surface of films with a recorded hologram was charged in a corona discharge.

  20. Animated computer graphics models of space and earth sciences data generated via the massively parallel processor (United States)

    Treinish, Lloyd A.; Gough, Michael L.; Wildenhain, W. David


    A capability was developed for rapidly producing visual representations of large, complex, multi-dimensional space and earth sciences data sets by implementing computer graphics modeling techniques on the Massively Parallel Processor (MPP), employing techniques recently developed for typically non-scientific applications. Such capabilities can provide a new and valuable tool for understanding complex scientific data, and a new application of parallel computing via the MPP. A prototype system with these capabilities was developed and integrated into the National Space Science Data Center's (NSSDC) Pilot Climate Data System (PCDS), a data-independent environment for computer-graphics data display, to provide easy access for users. While developing these capabilities, several problems had to be solved independently of the actual use of the MPP, all of which are outlined.

  1. Mesh Generation and Adaption for High Reynolds Number RANS Computations, Phase I (United States)

    National Aeronautics and Space Administration — This proposal offers to provide NASA with an automatic mesh generator for the simulation of aerodynamic flows using Reynolds-Averaged Navier-Stokes (RANS) models....

  2. Mesh Generation and Adaption for High Reynolds Number RANS Computations, Phase II (United States)

    National Aeronautics and Space Administration — This proposal offers to provide NASA with an automatic mesh generator for the simulation of aerodynamic flows using Reynolds-Averaged Navier-Stokes (RANS) models....

  3. Mesh Generation and Adaption for High Reynolds Number RANS Computations Project (United States)

    National Aeronautics and Space Administration — This proposal offers to provide NASA with an automatic mesh generator for the simulation of aerodynamic flows using Reynolds-Averaged Navier-Stokes (RANS) models....

  4. Analysis of steam generator loss-of-feedwater experiments with APROS and RELAP5/MOD3.1 computer codes

    International Nuclear Information System (INIS)

    Virtanen, E.; Haapalehto, T.; Kouhia, J.


    Three experiments were conducted to study the behaviour of the new horizontal steam generator construction of the PACTEL test facility. In the experiments the secondary side coolant level was reduced stepwise. The experiments were calculated with two computer codes, RELAP5/MOD3.1 and APROS version 2.11. A similar nodalization scheme was used for both codes so that the results may be compared. Only the steam generator was modeled and the rest of the facility was given as a boundary condition. The results show that both codes calculate the behaviour of the primary side of the steam generator well. On the secondary side both codes calculate lower steam temperatures in the upper part of the heat exchange tube bundle than were measured in the experiments. (orig.)

  5. Analysis of steam generator loss-of-feedwater experiments with APROS and RELAP5/MOD3.1 computer codes

    Energy Technology Data Exchange (ETDEWEB)

    Virtanen, E.; Haapalehto, T. [Lappeenranta Univ. of Technology, Lappeenranta (Finland); Kouhia, J. [VTT Energy, Nuclear Energy, Lappeenranta (Finland)


    Three experiments were conducted to study the behavior of the new horizontal steam generator construction of the PACTEL test facility. In the experiments the secondary side coolant level was reduced stepwise. The experiments were calculated with two computer codes, RELAP5/MOD3.1 and APROS version 2.11. A similar nodalization scheme was used for both codes so that the results may be compared. Only the steam generator was modelled and the rest of the facility was given as a boundary condition. The results show that both codes calculate the behaviour of the primary side of the steam generator well. On the secondary side both codes calculate lower steam temperatures in the upper part of the heat exchange tube bundle than were measured in the experiments.

  6. A human-assisted computer generated LA-grammar for simple ...

    African Journals Online (AJOL)

    An example LAG for simple Southern Sotho sentences is shown. Hausser's LAGs are extended with variables and conditionals, making the grammar rules much more powerful while maintaining ease of use for computer parsing. This paper presents the results of the first experiment in a series leading up to a translators' aid ...

  7. Pseudo-random Trees: Multiple Independent Sequence Generators for Parallel and Branching Computations (United States)

    Halton, John H.


    A class of families of linear congruential pseudo-random sequences is defined, for which it is possible to branch at any event without changing the sequence of random numbers used in the original random walk and for which the sequences in different branches show properties analogous to mutual statistical independence. This is a hitherto unavailable, and computationally desirable, tool.
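
    Halton's idea can be illustrated with a minimal linear congruential generator whose branch() operation spawns a child stream from the current state without disturbing the parent's sequence. The constants below are the familiar Lehmer/MINSTD choices, used purely for illustration rather than taken from the paper:

    ```python
    class LCG:
        """Minimal Lehmer-style generator with a branch() operation, in the
        spirit of pseudo-random trees: branching at any event yields a
        statistically decoupled stream while the parent continues unchanged."""
        M = 2 ** 31 - 1

        def __init__(self, state, mult=16807):
            self.state = state % self.M
            self.mult = mult

        def next(self):
            self.state = (self.mult * self.state) % self.M
            return self.state / self.M          # uniform in (0, 1)

        def branch(self):
            # Seed the child from the current state but with a different
            # multiplier (another primitive root mod M), so the two branch
            # sequences behave as if mutually independent.
            return LCG(self.state, mult=48271)
    ```

    Because branch() only reads the parent's state, a random walk can fork at any event and the original walk replays the exact same numbers whether or not branches were taken, which is the property the abstract highlights.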

  8. Wave-front reconstruction without twin-image blurring by two arbitrary step digital holograms. (United States)

    Chen, Gu L; Lin, Ching Yang; Yau, Hon Fai; Kuo, Ming Kuei; Chang, Chi Ching


    We discuss a novel approach to numerical wave-front reconstruction that utilizes digital holography with arbitrary phase steps. Our experimental results demonstrate that only two digital holograms and a simple estimation procedure are required for twin-image suppression and numerical reconstruction. One advantage of this approach is its simplicity: only one estimation equation needs to be applied. In addition, the optical system can be constructed from inexpensive, generally available elements. Another advantage is the effectiveness of the method: the estimated value differs from the actual value by less than 1%, so the quality of the reconstructed image is superior. This approach should make the application of digital holography easier and more widely available.

  9. Quantum hologram of macroscopically entangled light via the mechanism of diffuse light storage

    International Nuclear Information System (INIS)

    Gerasimov, L V; Sokolov, I M; Kupriyanov, D V; Havey, M D


    In this paper, we consider a quantum memory scheme for light diffusely propagating through a spatially disordered atomic gas. A unique characteristic is the enhanced trapping of the signal light pulse by quantum multiple scattering, which can be naturally integrated with the mechanism of stimulated Raman conversion into a long-lived spin coherence. The quantum state of the light can then be mapped onto the disordered atomic spin subsystem and stored in it for a relatively long time. The proposed memory scheme is applicable to storage of the macroscopic analogue of the Ψ (−) Bell state, and the prepared entangled atomic state forms its quantum hologram, which suggests the possibility of further quantum information processing. (paper)

  10. Reconstruction of Double-Exposed Terahertz Hologram of Non-isolated Object (United States)

    Hu, Jiaqi; Li, Qi; Chen, Guanghao


    When the non-isolated imaging objects are complex or critical imaging precision is required, the single-exposure digital hologram technique may be insufficient. Therefore, in this paper, the double-exposure method is adopted in 2.52-THz in-line digital holography simulations and experiments. Experimental results indicate that, compared with the results reconstructed by the single-exposure amplitude-constrained phase retrieval algorithm (S-APRA), the double-exposure phase-constrained phase retrieval algorithm (D-PPRA) increases the contrast of the reconstructed image by 0.146 and the double-exposure amplitude-constrained phase retrieval algorithm (D-APRA) increases the contrast by 0.225. In addition, when applied to non-isolated object reconstruction, phase retrieval algorithms with only an amplitude constraint on the object plane work better than those with both amplitude and phase constraints.

  11. Digital holography super-resolution for accurate three-dimensional reconstruction of particle holograms. (United States)

    Verrier, Nicolas; Fournier, Corinne


    In-line digital holography (DH) is used in many fields to locate and size micro- or nano-objects spread in a volume. To reconstruct simple-shaped objects, the optimal approach is to fit an imaging model to accurately estimate their position and their characteristic parameters. Increasing the accuracy of the reconstruction is a big issue in DH, particularly when the pixel is large or the signal-to-noise ratio is low. We suggest exploiting the information redundancy of videos to improve the reconstruction of the holograms by jointly estimating the position of the objects and the characteristic parameters. Using synthetic and experimental data, we verified that this approach can improve the accuracy of the reconstruction by a factor greater than the square root of the number of images.
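
    The gain from exploiting redundancy across frames can be illustrated with the simplest joint estimator, averaging independent per-frame estimates, whose error falls as the square root of the number of images (the paper's joint fit does better, hence "a factor greater than"). The quantities below are illustrative, not the paper's data:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    TRUE_Z = 10.0                 # hypothetical particle depth (arbitrary units)
    SIGMA = 0.5                   # per-hologram measurement noise

    def rmse_of_joint_estimate(n_frames, trials=5000):
        """Averaging per-frame estimates is the simplest joint estimation:
        its RMSE shrinks as 1/sqrt(n_frames)."""
        estimates = TRUE_Z + SIGMA * rng.standard_normal((trials, n_frames))
        joint = estimates.mean(axis=1)               # combine the frames
        return np.sqrt(np.mean((joint - TRUE_Z) ** 2))

    # rmse_of_joint_estimate(16) is roughly rmse_of_joint_estimate(1) / 4
    ```

    A joint model fit over all frames shares parameters (e.g. particle size) across observations, which is how the paper exceeds this baseline.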

  12. Facilitation and Coherence Between the Dynamic and Retrospective Perception of Segmentation in Computer-Generated Music

    Directory of Open Access Journals (Sweden)

    Freya Bailes


    Full Text Available We examined the impact of listening context (sound duration and prior presentation on the human perception of segmentation in sequences of computer music. This research extends previous work by the authors (Bailes & Dean, 2005, which concluded that context-dependent effects such as the asymmetrical detection of an increase in timbre compared to a decrease of the same magnitude have a significant bearing on the cognition of sound structure. The current study replicated this effect, and demonstrated that listeners (N = 14 are coherent in their detection of segmentation between real-time and retrospective tasks. In addition, response lag was reduced from a first hearing to a second hearing, and following long (7 s rather than short (1 or 3 s segments. These findings point to the role of short-term memory in dynamic structural perception of computer music.

  13. Uncertainties in source term calculations generated by the ORIGEN2 computer code for Hanford Production Reactors

    International Nuclear Information System (INIS)

    Heeb, C.M.


    The ORIGEN2 computer code is the primary calculational tool for computing isotopic source terms for the Hanford Environmental Dose Reconstruction (HEDR) Project. The ORIGEN2 code computes the amounts of radionuclides that are created or remain in spent nuclear fuel after neutron irradiation and radioactive decay have occurred as a result of nuclear reactor operation. ORIGEN2 was chosen as the primary code for these calculations because it is widely used and accepted by the nuclear industry, both in the United States and the rest of the world. Its comprehensive library of over 1,600 nuclides includes any possible isotope of interest to the HEDR Project. It is important to evaluate the uncertainties expected from use of ORIGEN2 in the HEDR Project because these uncertainties may have a pivotal impact on the final accuracy and credibility of the results of the project. There are three primary sources of uncertainty in an ORIGEN2 calculation: basic nuclear data uncertainty in neutron cross sections, radioactive decay constants, energy per fission, and fission product yields; calculational uncertainty due to input data; and code uncertainties (i.e., numerical approximations, and neutron spectrum-averaged cross-section values from the code library). 15 refs., 5 figs., 5 tabs

  14. On convergence generation in computing the electro-magnetic Casimir force

    International Nuclear Information System (INIS)

    Schuller, F.


    We tackle the very fundamental problem of zero-point energy divergence in the context of the Casimir effect. We calculate the Casimir force due to field fluctuations by using standard cavity radiation modes. The validity of convergence generation by means of an exponential energy cut-off factor is discussed in detail. (orig.)

  15. Using character valence in computer generated music to produce variation aligned to a storyline

    CSIR Research Space (South Africa)

    Featherstone, Coral


    Full Text Available that sentiment in the text of a novel can be used to automatically generate simple piano music that reflects the same sentiment as the novel. This study wished to establish a method whereby, if after aligning the text with the melody, the sentiment in the words...

  16. Evaluation of Computer Tools for Idea Generation and Team Formation in Project-Based Learning (United States)

    Ardaiz-Villanueva, Oscar; Nicuesa-Chacon, Xabier; Brene-Artazcoz, Oscar; Sanz de Acedo Lizarraga, Maria Luisa; Sanz de Acedo Baquedano, Maria Teresa


    The main objective of this research was to validate the effectiveness of Wikideas and Creativity Connector tools to stimulate the generation of ideas and originality by university students organized into groups according to their indexes of creativity and affinity. Another goal of the study was to evaluate the classroom climate created by these…

  17. Development of a methodology to generate materials constant for the FLARE-G computer code

    International Nuclear Information System (INIS)

    Martinez, A.S.; Rosier, C.J.; Schirru, R.; Silva, F.C. da; Thome Filho, Z.D.


    A calculational methodology is presented for determining the parametrization constants of the multiplication factor and the migration area. These physical parameters are needed in the solution of the diffusion equation with the nodal method, and they represent the appropriate form of the macrogroup constants from the cell calculation. An automatic system was developed to generate the parametrization constants. (E.G.) [pt

  18. Designing Computer-Supported Complex Systems Curricula for the Next Generation Science Standards in High School Science Classrooms

    Directory of Open Access Journals (Sweden)

    Susan A. Yoon


    Full Text Available We present a curriculum and instruction framework for computer-supported teaching and learning about complex systems in high school science classrooms. This work responds to a need in K-12 science education research and practice for the articulation of design features for classroom instruction that can address the Next Generation Science Standards (NGSS) recently launched in the USA. We outline the features of the framework, including curricular relevance, cognitively rich pedagogies, computational tools for teaching and learning, and the development of content expertise, and provide examples of how the framework is translated into practice. We follow this up with evidence from a preliminary study conducted with 10 teachers and 361 students, aimed at understanding the extent to which students learned from the activities. Results demonstrated gains in students’ complex systems understanding and biology content knowledge. In interviews, students identified influences of various aspects of the curriculum and instruction framework on their learning.

  19. Application of computer-generated functional (parametric) maps in radionuclide renography

    International Nuclear Information System (INIS)

    Agress, H. Jr.; Levenson, S.M.; Gelfand, M.J.; Green, M.V.; Bailey, J.J.; Johnston, G.S.


    A functional (parametric) map is a single visual display of regional dynamic phenomena which facilitates interpretation of the nature of focal abnormalities in renal function. Methods for producing several kinds of functional maps based on computer calculations of radionuclide scan data are briefly described. Three abnormal cases are presented to illustrate the use of functional maps to separate focal lesions and to specify the dynamic nature of the abnormalities in a way which is difficult to achieve with conventional sequential renal scans and renograms alone

  20. A general-purpose computer program for studying ultrasonic beam patterns generated with acoustic lenses (United States)

    Roberti, Dino; Ludwig, Reinhold; Looft, Fred J.


    A 3-D computer model of a piston radiator with lenses for focusing and defocusing is presented. To achieve high-resolution imaging, the frequency of the transmitted and received ultrasound must be as high as 10 MHz. Current ultrasonic transducers produce an extremely narrow beam at these high frequencies and thus are not appropriate for imaging schemes such as synthetic-aperture focus techniques (SAFT). Consequently, a numerical analysis program has been developed to determine field intensity patterns that are radiated from ultrasonic transducers with lenses. Lens shapes are described and the field intensities are numerically predicted and compared with experimental results.
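For the baffled circular piston underlying such models, the on-axis field has a simple closed form that a numerical beam-pattern program can be checked against. The sketch below uses illustrative parameter values (5 mm radius, 10 MHz, water-like sound speed) and is not the authors' 3-D lens code.

```python
import math

def on_axis_pressure(z, a, f, c=1500.0):
    """Normalized on-axis pressure magnitude |p|/(2*rho*c*u0) of a
    baffled circular piston of radius a driven at frequency f
    (exact closed form for the axial field)."""
    k = 2.0 * math.pi * f / c
    return abs(math.sin(0.5 * k * (math.sqrt(z * z + a * a) - z)))

a, f, c = 5e-3, 10e6, 1500.0          # 5 mm radius, 10 MHz, water-like c
wavelength = c / f                    # 0.15 mm at 10 MHz
rayleigh = a * a / wavelength         # near-field length, ~0.167 m
far = on_axis_pressure(10.0 * rayleigh, a, f, c)   # well into the far field
```

The last axial maximum sits near the Rayleigh distance a²/λ; beyond it the beam spreads and the on-axis amplitude decays monotonically, which is the narrow-beam regime the abstract says lenses are used to modify.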

  1. Development of a new generation solid rocket motor ignition computer code (United States)

    Foster, Winfred A., Jr.; Jenkins, Rhonald M.; Ciucci, Alessandro; Johnson, Shelby D.


    This report presents the results of experimental and numerical investigations of the flow field in the head-end star grain slots of the Space Shuttle Solid Rocket Motor. This work provided the basis for the development of an improved solid rocket motor ignition transient code which is also described in this report. The correlation between the experimental and numerical results is excellent and provides a firm basis for the development of a fully three-dimensional solid rocket motor ignition transient computer code.

  2. Computing the flow past Vortex Generators: Comparison between RANS Simulations and Experiments

    DEFF Research Database (Denmark)

    Manolesos, M.; Sørensen, Niels N.; Troldborg, Niels


    The flow around a wind turbine airfoil equipped with Vortex Generators (VGs) is examined. Predictions from three different Reynolds Averaged Navier Stokes (RANS) solvers with two different turbulence models and two different VG modelling approaches are compared with each other and with experimental ...... data. The best results are obtained with the more expensive fully resolved VG approach. The cost efficient BAY model can also provide acceptable results, if grid related numerical diffusion is minimized and only force coefficient polars are considered....

  3. Mood Expression in Real-Time Computer Generated Music using Pure Data


    Scirea, Marco; Nelson, Mark; Cheong, Yun-Gyung; Bae, Byung Chull


    This paper presents an empirical study that investigated if procedurally generated music based on a set of musical features can elicit a target mood in the music listener. Drawn from the two-dimensional affect model proposed by Russell, the musical features that we have chosen to express moods are intensity, timbre, rhythm, and dissonances. The eight types of mood investigated in this study are being bored, content, happy, miserable, tired, fearful, peaceful, and alarmed. We created 8 short m...

  4. An automated tetrahedral mesh generator for computer simulation in Odontology based on the Delaunay's algorithm

    Directory of Open Access Journals (Sweden)

    Mauro Massayoshi Sakamoto


    Full Text Available In this work, a software package based on Delaunay's algorithm is described. The main feature of this package is its capability of discretizing geometric domains of teeth while taking into account their complex inner structures and materials of different hardness. The mesh generators reported in the literature usually treat molars and other teeth with simplified geometric models, or even consider the teeth as homogeneous structures.


    CERN Document Server

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  6. Shortfall online: The development of an educational computer game for teaching sustainable engineering to Millennial Generation students (United States)

    Gennett, Zachary Andrew

    Millennial Generation students bring significant learning and teaching challenges to the classroom, because of their unique learning styles, breadth of interests related to social and environmental issues, and intimate experiences with technology. As a result, there has been an increased willingness at many universities to experiment with pedagogical strategies that depart from a traditional "learning by listening" model, and move toward more innovative methods involving active learning through computer games. In particular, current students typically express a strong interest in sustainability in which economic concerns must be weighed relative to environmental and social responsibilities. A game-based setting could prove very effective for fostering an operational understanding of these tradeoffs, and especially the social dimension which remains largely underdeveloped relative to the economic and environmental aspects. Through an examination of the educational potential of computer games, this study hypothesizes that to acquire the skills necessary to manage and understand the complexities of sustainability, Millennial Generation students must be engaged in active learning exercises that present dynamic problems and foster a high level of social interaction. This has led to the development of an educational computer game, entitled Shortfall, which simulates a business milieu for testing alternative paths regarding the principles of sustainability. This study examines the evolution of Shortfall from an educational board game that teaches the principles of environmentally benign manufacturing, to a completely networked computer game, entitled Shortfall Online that teaches the principles of sustainability. A capital-based theory of sustainability is adopted to more accurately convey the tradeoffs and opportunity costs among economic prosperity, environmental preservation, and societal responsibilities. 
While the economic and environmental aspects of sustainability

  7. Features generated for computational splice-site prediction correspond to functional elements

    Directory of Open Access Journals (Sweden)

    Wilbur W John


    Full Text Available Abstract Background Accurate selection of splice sites during the splicing of precursors to messenger RNA requires both relatively well-characterized signals at the splice sites and auxiliary signals in the adjacent exons and introns. We previously described a feature generation algorithm (FGA that is capable of achieving high classification accuracy on human 3' splice sites. In this paper, we extend the splice-site prediction to 5' splice sites and explore the generated features for biologically meaningful splicing signals. Results We present examples from the observed features that correspond to known signals, both core signals (including the branch site and pyrimidine tract and auxiliary signals (including GGG triplets and exon splicing enhancers. We present evidence that features identified by FGA include splicing signals not found by other methods. Conclusion Our generated features capture known biological signals in the expected sequence interval flanking splice sites. The method can be easily applied to other species and to similar classification problems, such as tissue-specific regulatory elements, polyadenylation sites, promoters, etc.
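The idea of generating features from separate windows flanking a candidate site can be sketched in a few lines. The `kmer_features` helper below is hypothetical and only illustrates the exon-side/intron-side windowing around a 5' splice site; the actual FGA builds far richer positional and compositional features.

```python
from collections import Counter

def kmer_features(seq, site, k=3, flank=6):
    """Count k-mers in the exon-side and intron-side windows flanking a
    candidate 5' splice site at 0-based position `site` (hypothetical
    helper illustrating the windowing idea, not the actual FGA)."""
    exon = seq[max(0, site - flank):site]      # upstream (exonic) window
    intron = seq[site:site + flank]            # downstream (intronic) window
    feats = Counter()
    for name, region in (("exon", exon), ("intron", intron)):
        for i in range(len(region) - k + 1):
            feats[(name, region[i:i + k])] += 1
    return feats

feats = kmer_features("AAGGTAAGTCCC", 3)   # ...AAG | GTAAGT... (canonical GT)
```

A classifier would then learn which (window, k-mer) counts discriminate true sites from decoys, e.g. the near-invariant GT at the intron start.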

  8. Parallel paving: An algorithm for generating distributed, adaptive, all-quadrilateral meshes on parallel computers

    Energy Technology Data Exchange (ETDEWEB)

    Lober, R.R.; Tautges, T.J.; Vaughan, C.T.


    Paving is an automated mesh generation algorithm which produces all-quadrilateral elements. It can additionally generate these elements in varying sizes such that the resulting mesh adapts to a function distribution, such as an error function. While powerful, conventional paving is a very serial algorithm in its operation. Parallel paving is the extension of serial paving into parallel environments to perform the same meshing functions as conventional paving only on distributed, discretized models. This extension allows large, adaptive, parallel finite element simulations to take advantage of paving's meshing capabilities for h-remap remeshing. A significantly modified version of the CUBIT mesh generation code has been developed to host the parallel paving algorithm and demonstrate its capabilities on both two dimensional and three dimensional surface geometries and compare the resulting parallel produced meshes to conventionally paved meshes for mesh quality and algorithm performance. Sandia's "tiling" dynamic load balancing code has also been extended to work with the paving algorithm to retain parallel efficiency as subdomains undergo iterative mesh refinement.

  9. An automated procedure for calculating system matrices from perturbation data generated by an EAI Pacer 100 hybrid computer system (United States)

    Milner, E. J.; Krosel, S. M.


    Techniques are presented for determining the elements of the A, B, C, and D state variable matrices for systems simulated on an EAI Pacer 100 hybrid computer. An automated procedure systematically generates disturbance data necessary to linearize the simulation model and stores these data on a floppy disk. A separate digital program verifies this data, calculates the elements of the system matrices, and prints these matrices appropriately labeled. The partial derivatives forming the elements of the state variable matrices are approximated by finite difference calculations.
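The finite-difference approximation of the state-variable matrices described above can be sketched as a generic central-difference linearizer. This is illustrative only, not the actual EAI Pacer procedure, and the damped-oscillator model is an invented example.

```python
def linearize(f, x0, u0, eps=1e-6):
    """Central-difference estimates of A = df/dx and B = df/du at an
    operating point, from perturbations of one variable at a time."""
    n, m = len(x0), len(u0)

    def deriv_col(g, vec, j):
        # perturb component j up and down, difference the responses
        hi, lo = list(vec), list(vec)
        hi[j] += eps
        lo[j] -= eps
        fh, fl = g(hi), g(lo)
        return [(fh[i] - fl[i]) / (2.0 * eps) for i in range(n)]

    a_cols = [deriv_col(lambda x: f(x, u0), x0, j) for j in range(n)]
    b_cols = [deriv_col(lambda u: f(x0, u), u0, j) for j in range(m)]
    A = [[a_cols[j][i] for j in range(n)] for i in range(n)]  # column -> row-major
    B = [[b_cols[j][i] for j in range(m)] for i in range(n)]
    return A, B

# invented example: damped oscillator with force input
model = lambda x, u: [x[1], -2.0 * x[0] - 0.5 * x[1] + u[0]]
A, B = linearize(model, [0.0, 0.0], [0.0])
```

For this linear model the central differences recover A = [[0, 1], [-2, -0.5]] and B = [[0], [1]] essentially exactly, which is the kind of consistency check the verification program in the abstract performs.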

  10. Preliminary calculation for fission products generation and accumulation in different types of fuel rods by computer code FPRM-1

    International Nuclear Information System (INIS)

    Ishiwatari, Nasumi


    The computer code "FPRM-1" has been developed for calculation of the quantities of fission product gases released from pellets into the plenum of a fuel rod. On the assumption that the irradiation tests of plutonium fuel and others under development in an in-pile water loop were performed, FP generation and accumulation in the fuel rods were calculated by the code. The result of measurement of ¹³¹I released from a fuel rod (UO₂ pellets, 1.5% enriched ²³⁵U) with an artificial hole through the cladding in an in-pile water loop was compared with that of calculation by the code; both were in good agreement. (author)

  11. Measurement of T1 by echo-planar imaging and the construction of computer-generated images

    International Nuclear Information System (INIS)

    Mansfield, P.; Guilfoyle, D.N.; Ordidge, R.J.; Coupland, R.E.


    The high-speed echo-planar imaging (EPI) technique is used to obtain rapid T1 and spin density measurements by a two-point method. It is shown that neglect of edge effects in the slice selection procedure leads to significant systematic errors in T1. T1 maps for two young patients, obtained at 4.0 MHz, are presented. The T1 and spin density values obtained are used to produce computer-generated images in inversion recovery simulations. These results demonstrate marked improvement in image contrast without paying the time penalty incurred in real experiments, thereby greatly increasing patient throughput potential. (author)
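A two-point T1 estimate of the kind mentioned above can be sketched by root-finding on the ideal inversion-recovery model. This sketch ignores the slice-selection edge effects the abstract identifies as a significant systematic error, and all numbers are synthetic.

```python
import math

def t1_two_point(s1, ti1, s2, ti2, lo=1e-3, hi=10.0):
    """T1 from two inversion-recovery samples, assuming the ideal model
    S(TI) = S0*(1 - 2*exp(-TI/T1)). Bisection on the S0-free ratio
    equation s1*M(ti2) - s2*M(ti1) = 0, where M(t) = 1 - 2*exp(-t/T1)."""
    def g(t1):
        return (s1 * (1.0 - 2.0 * math.exp(-ti2 / t1))
                - s2 * (1.0 - 2.0 * math.exp(-ti1 / t1)))
    a, b = lo, hi
    fa = g(a)
    for _ in range(100):
        mid = 0.5 * (a + b)
        fm = g(mid)
        if fa * fm <= 0.0:
            b = mid
        else:
            a, fa = mid, fm
    return 0.5 * (a + b)

T1_true, S0 = 0.8, 100.0                     # synthetic tissue values (s, a.u.)
signal = lambda ti: S0 * (1.0 - 2.0 * math.exp(-ti / T1_true))
est = t1_two_point(signal(0.3), 0.3, signal(1.2), 1.2)
```

Eliminating S0 by taking the ratio of the two samples is what makes two acquisitions sufficient; once T1 is found, S0 (the spin density map) follows from either sample.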

  12. Color Helmet Mounted Display System with Real Time Computer Generated and Video Imagery for In-Flight Simulation (United States)

    Sawyer, Kevin; Jacobsen, Robert; Aiken, Edwin W. (Technical Monitor)


    NASA Ames Research Center and the US Army are developing the Rotorcraft Aircrew Systems Concepts Airborne Laboratory (RASCAL) using a Sikorsky UH-60 helicopter for the purpose of flight systems research. A primary use of the RASCAL is in-flight simulation, for which the visual scene will use computer generated imagery and synthetic vision. This research is made possible in part by a full color, wide field of view Helmet Mounted Display (HMD) system that provides high performance color imagery suitable for daytime operations in a flight-rated package. This paper describes the design and performance characteristics of the HMD system. Emphasis is placed on the design specifications, testing, and integration into the aircraft of Kaiser Electronics' RASCAL HMD system that was designed and built under contract for NASA. The optical performance and design of the helmet mounted display unit will be discussed, as well as the unique capabilities provided by the system's Programmable Display Generator (PDG).


    CERN Multimedia

    I. Fisk


    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...


    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...


    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  16. Influence of the operational parameters on bioelectricity generation in continuous microbial fuel cell, experimental and computational fluid dynamics modelling (United States)

    Sobieszuk, Paweł; Zamojska-Jaroszewicz, Anna; Makowski, Łukasz


    The influence of the organic loading rate (also known as active anodic chamber volume) on bioelectricity generation in a continuous, two-chamber microbial fuel cell for the treatment of synthetic wastewater, with glucose as the only carbon source, was examined. Ten sets of experiments with different combinations of hydraulic retention times (0.24-1.14 d) and influent chemical oxygen demand concentrations were performed to verify the impact of organic loading rate on the voltage generation capacity of a simple dual-chamber microbial fuel cell working in continuous mode. We found that there is an optimal hydraulic retention time value at which the maximum voltage is generated: 0.41 d. However, there were no similar effects, in terms of voltage generation, when a constant hydraulic retention time with different influent chemical oxygen demand of wastewater was used. The obtained maximal voltage value (600 mV) has also been compared to literature data. Computational fluid dynamics (CFD) was used to calculate the fluid flow and the exit age distribution of fluid elements in the reactor to explain the obtained experimental results and identify the crucial parameters for the design of bioreactors on an industrial scale.
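The quantities varied in the study relate by simple definitions: hydraulic retention time is the active chamber volume divided by the influent flow rate, and the organic loading rate scales the influent COD by that time. The sketch below uses illustrative volume, flow, and COD values; only the ~0.41 d optimum comes from the abstract.

```python
def hydraulic_retention_time(volume_l, flow_l_per_day):
    """HRT (d) = active anodic chamber volume / influent flow rate."""
    return volume_l / flow_l_per_day

def organic_loading_rate(cod_mg_per_l, hrt_d):
    """OLR (mg COD per litre of reactor per day) = influent COD / HRT."""
    return cod_mg_per_l / hrt_d

# hypothetical 0.25 L chamber fed at 0.61 L/d with 500 mg/L COD
hrt = hydraulic_retention_time(0.25, 0.61)   # ~0.41 d, the reported optimum
olr = organic_loading_rate(500.0, hrt)
```

These relations explain why the two experimental knobs (flow rate and influent COD) are interchangeable routes to the same organic loading rate, even though the study found only HRT, not influent COD, affected the voltage.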

  17. Mood Expression in Real-Time Computer Generated Music using Pure Data

    DEFF Research Database (Denmark)

    Scirea, Marco; Nelson, Mark; Cheong, Yun-Gyung


    This paper presents an empirical study that investigated if procedurally generated music based on a set of musical features can elicit a target mood in the music listener. Drawn from the two-dimensional affect model proposed by Russell, the musical features that we have chosen to express moods...... are intensity, timbre, rhythm, and dissonances. The eight types of mood investigated in this study are being bored, content, happy, miserable, tired, fearful, peaceful, and alarmed. We created 8 short music clips using PD (Pure Data) programming language, each of them represents a particular mood. We carried...... out a pilot study and present a preliminary result....

  18. Three-dimensional weight-accumulation algorithm for generating multiple excitation spots in fast optical stimulation (United States)

    Takiguchi, Yu; Toyoda, Haruyoshi


    We report here an algorithm for calculating a hologram to be employed in a high-access speed microscope for observing sensory-driven synaptic activity across all inputs to single living neurons in an intact cerebral cortex. The system is based on holographic multi-beam generation using a two-dimensional phase-only spatial light modulator to excite multiple locations in three dimensions with a single hologram. The hologram was calculated with a three-dimensional weighted iterative Fourier transform method using the Ewald sphere restriction to increase the calculation speed. Our algorithm achieved good uniformity of three dimensionally generated excitation spots; the standard deviation of the spot intensities was reduced by a factor of two compared with a conventional algorithm.
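The weighting idea, boosting target spots that come out too dim between Fourier-transform iterations, can be sketched in one dimension with a plain DFT. The paper's method is three-dimensional and uses the Ewald-sphere restriction; this toy shows only the weighted iterative loop.

```python
import cmath
import math
import random

def dft(x, inverse=False):
    """Naive DFT, adequate for a 32-sample demonstration."""
    n = len(x)
    sgn = 1 if inverse else -1
    out = [sum(x[j] * cmath.exp(sgn * 2j * math.pi * j * k / n) for j in range(n))
           for k in range(n)]
    return [v / n for v in out] if inverse else out

def weighted_gs(n, spots, iters=30):
    """Phase-only hologram whose far field (DFT) carries equal-intensity
    spots at the given bins; weights are boosted where a spot is too dim."""
    random.seed(0)
    phase = [random.uniform(0.0, 2.0 * math.pi) for _ in range(n)]
    w = {k: 1.0 for k in spots}
    for _ in range(iters):
        field = dft([cmath.exp(1j * p) for p in phase])
        amps = {k: abs(field[k]) for k in spots}
        mean = sum(amps.values()) / len(spots)
        for k in spots:
            w[k] *= mean / max(amps[k], 1e-12)   # strengthen weak spots
        target = [0j] * n
        for k in spots:
            target[k] = w[k] * cmath.exp(1j * cmath.phase(field[k]))
        phase = [cmath.phase(v) for v in dft(target, inverse=True)]
    field = dft([cmath.exp(1j * p) for p in phase])
    return [abs(field[k]) for k in spots]

amps = weighted_gs(32, [3, 10, 20])   # three excitation "spots" in 1-D
```

The weight update is what drives the uniformity improvement the authors report: a spot whose achieved amplitude falls below the mean gets a proportionally larger target amplitude on the next iteration.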

  19. Environmental conditions influence for real-time hologram formation on dichromated polyvinyl alcohol NiCl2·6H2O doped films

    International Nuclear Information System (INIS)

    Fontanilla-Urdaneta, R C; Olivares-Perez, A; Fuentes-Tapia, I; Rios-Velasco, M A


    Real-time recording of holographic gratings is studied in the presence of a metallic salt. The experimental process analyses the diffraction efficiency as influenced by humidity in the coating solution during hologram formation under an applied electrical potential. The diffraction efficiency is measured as a function of the exposure energy until saturation is reached. The influence of the hologram parameters on the achievable diffraction efficiency is studied at room conditions.

  20. Automated generation of patient-tailored electronic care pathways by translating computer-interpretable guidelines into hierarchical task networks. (United States)

    González-Ferrer, Arturo; ten Teije, Annette; Fdez-Olivares, Juan; Milian, Krystyna


    This paper describes a methodology which enables computer-aided support for the planning, visualization and execution of personalized patient treatments in a specific healthcare process, taking into account complex temporal constraints and the allocation of institutional resources. To this end, a translation from a time-annotated computer-interpretable guideline (CIG) model of a clinical protocol into a temporal hierarchical task network (HTN) planning domain is presented. The proposed method uses a knowledge-driven reasoning process to translate knowledge previously described in a CIG into a corresponding HTN planning and scheduling domain, taking advantage of HTNs' known ability to (i) dynamically cope with temporal and resource constraints, and (ii) automatically generate customized plans. The proposed method, focusing on the representation of temporal knowledge and based on the identification of workflow and temporal patterns in a CIG, makes it possible to automatically generate time-annotated and resource-based care pathways tailored to the needs of any possible patient profile. The proposed translation is illustrated through a case study based on a 70-page clinical protocol to manage Hodgkin's disease, developed by the Spanish Society of Pediatric Oncology. We show that an HTN planning domain can be generated from the corresponding specification of the protocol in the Asbru language, providing a running example of this translation. Furthermore, the correctness of the translation is checked, as is the management of ten different types of temporal patterns represented in the protocol. By interpreting the automatically generated domain with a state-of-the-art HTN planner, a time-annotated care pathway is automatically obtained, customized for the patient's and institutional needs. The generated care pathway can then be used by clinicians to plan and manage the patient's long-term care. The described methodology makes it possible to automatically generate
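The decomposition side of an HTN domain can be sketched with a toy ordered-method planner. The task names below are hypothetical, and the sketch omits the temporal and resource annotations that are the paper's main contribution.

```python
def htn_plan(task, methods, primitives):
    """Expand a compound task via its ordered methods until only
    primitive actions remain; returns the action sequence or None."""
    if task in primitives:
        return [task]
    for subtasks in methods.get(task, []):   # try each method in order
        plan = []
        for sub in subtasks:
            sub_plan = htn_plan(sub, methods, primitives)
            if sub_plan is None:
                plan = None
                break
            plan.extend(sub_plan)
        if plan is not None:
            return plan
    return None

# hypothetical fragment of a chemotherapy-protocol task network
methods = {
    "treat_cycle": [["assess_patient", "administer_block", "evaluate_response"]],
    "administer_block": [["give_drug_A", "wait_14_days", "give_drug_B"]],
}
primitives = {"assess_patient", "give_drug_A", "wait_14_days",
              "give_drug_B", "evaluate_response"}
plan = htn_plan("treat_cycle", methods, primitives)
```

A CIG-to-HTN translation would populate `methods` from the guideline's workflow patterns and attach time windows and resource requirements to each task, which the planner then enforces while decomposing.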

  1. A flowsheet model of a coal-fired MHD/steam combined electricity generating cycle, using the access computer model

    International Nuclear Information System (INIS)

    Davison, J.E.; Eldershaw, C.E.


    This document forms the final report on a study of a coal-fired magnetohydrodynamic (MHD)/steam electric power generation system carried out by British Coal Corporation for the Commission of the European Communities. The study objective was to provide mass and energy balances and overall plant efficiency predictions for MHD to assist the Commission in their evaluation of advanced power generation technologies. In early 1990 the British Coal Corporation completed a study for the Commission in which a computer flowsheet modelling package was used to predict the performance of a conceptual air blown MHD plant. Since that study was carried out, increasing emphasis has been placed on the possible need to reduce CO₂ emissions to counter the so-called greenhouse effect. Air blown MHD could greatly reduce CO₂ emissions per kWh by virtue of its high thermal efficiency. However, if even greater reductions in CO₂ emissions were required, the CO₂ produced by coal combustion may have to be disposed of, for example into the deep ocean or underground caverns. To achieve this at minimum cost a concentrated CO₂ flue gas would be required. This could be achieved in an MHD plant by using a mixture of high purity oxygen and recycled CO₂ flue gas in the combustor. To assess this plant concept the European Commission awarded British Coal a contract to produce performance predictions using the access computer program


    CERN Multimedia

    I. Fisk


    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  3. On random field Completely Automated Public Turing Test to Tell Computers and Humans Apart generation. (United States)

    Kouritzin, Michael A; Newton, Fraser; Wu, Biao


    Herein, we propose generating CAPTCHAs through random field simulation and give a novel, effective and efficient algorithm to do so. Indeed, we demonstrate that sufficient information about word tests for easy human recognition is contained in the site marginal probabilities and the site-to-nearby-site covariances and that these quantities can be embedded directly into certain conditional probabilities, designed for effective simulation. The CAPTCHAs are then partial random realizations of the random CAPTCHA word. We start with an initial random field (e.g., randomly scattered letter pieces) and use Gibbs resampling to re-simulate portions of the field repeatedly using these conditional probabilities until the word becomes human-readable. The residual randomness from the initial random field together with the random implementation of the CAPTCHA word provide significant resistance to attack. This results in a CAPTCHA, which is unrecognizable to modern optical character recognition but is recognized about 95% of the time in a human readability study.
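Gibbs resampling from site conditional probabilities can be sketched with an Ising-like binary field; the neighbour-agreement count at the end shows the spatial clustering such resampling induces. This toy omits the word-template marginals and covariances the actual scheme embeds into its conditionals.

```python
import math
import random

def gibbs_field(w, h, beta=1.2, sweeps=50, seed=1):
    """Resample each site of a binary field from its conditional
    probability given its 4 neighbours (Ising-like conditionals)."""
    random.seed(seed)
    f = [[random.choice([-1, 1]) for _ in range(w)] for _ in range(h)]
    for _ in range(sweeps):
        for y in range(h):
            for x in range(w):
                s = sum(f[ny][nx]
                        for ny, nx in ((y - 1, x), (y + 1, x),
                                       (y, x - 1), (y, x + 1))
                        if 0 <= ny < h and 0 <= nx < w)
                p_up = 1.0 / (1.0 + math.exp(-2.0 * beta * s))
                f[y][x] = 1 if random.random() < p_up else -1
    return f

field = gibbs_field(16, 16)
# vertical neighbour agreement: high after resampling (clustered field)
agree = sum(a == b
            for row, nxt in zip(field, field[1:])
            for a, b in zip(row, nxt))
```

Starting from an i.i.d. random field, repeated conditional resampling drives neighbouring sites into agreement; in the CAPTCHA scheme the conditionals are biased toward the word template, so the emerging clusters trace out readable letters while the residual randomness resists OCR.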

  4. Computational investigations of the mixing performance inside liquid slugs generated by a microfluidic T-junction. (United States)

    Li, Yuehao; Reddy, Rupesh K; Kumar, Challa S S R; Nandakumar, Krishnaswamy


    Droplet-based microfluidics has gained extensive research interest as it overcomes several challenges confronted by conventional single-phase microfluidics. The mixing performance inside droplets/slugs is critical in many applications such as advanced material syntheses and in situ kinetic measurements. In order to understand the effects of operating conditions on the mixing performance inside liquid slugs generated by a microfluidic T-junction, we have adopted the volume of fluid method coupled with the species transport model to study and quantify the mixing efficiencies inside slugs. Our simulation results demonstrate that an efficient mixing process is achieved by the intimate collaboration of the twirling effect and the recirculating flow. Only if the reagents are distributed transversely by the twirling effect can the recirculating flow bring in a convection mechanism, thus facilitating mixing. By comparing the mixing performance inside slugs at various operating conditions, we find that slug size plays the key role in influencing the mixing performance, as it determines the amount of fluid to be distributed by the twirling effect. For the cases where short slugs are generated, the mixing process is governed by the fast convection mechanism because the twirling effect can distribute the fluid to the flow path of the recirculating flow effectively. For cases with long slugs, the mixing process is dominated by the slow diffusion mechanism, since the twirling effect is insufficient to distribute the large amount of fluid. In addition, our results show that increasing the operating velocity has limited effects on improving the mixing performance. This study provides insight into the mixing process and may benefit the design and operation of droplet-based microfluidics.

  5. Improvement of Sodium Neutronic Nuclear Data for the Computation of Generation IV Reactors

    International Nuclear Information System (INIS)

    Archier, P.


    The safety criteria to be met for Generation IV sodium fast reactors (SFR) require reduced and well-mastered uncertainties on the neutronic quantities of interest. Part of these uncertainties comes from nuclear data and, in the particular case of SFR, from sodium nuclear data, which show significant differences between the available international libraries (JEFF-3.1.1, ENDF/B-VII.0, JENDL-4.0). The objective of this work is to improve the knowledge of sodium nuclear data for a better calculation of SFR neutronic parameters and reliable associated uncertainties. After an overview of existing 23Na data, the impact of the differences is quantified, particularly on sodium void reactivity effects, with both deterministic and stochastic neutronic codes. Results show that a complete re-evaluation of sodium nuclear data is necessary. Several developments have been made in the evaluation code Conrad to integrate new nuclear reaction models and their associated parameters and to perform adjustments with integral measurements. Following these developments, the analysis of differential data and the propagation of experimental uncertainties have been performed with Conrad. The resolved resonance range has been extended up to 2 MeV, with the continuum range beginning directly beyond this energy. A new 23Na evaluation and the associated multigroup covariance matrices were generated for future uncertainty calculations. The last part of this work focuses on the feedback from sodium void integral data, using methods of integral data assimilation to reduce the uncertainties on sodium cross sections. This work ends with uncertainty calculations for an industrial-like SFR, which show an improved prediction of its neutronic parameters with the new evaluation. (author)

  6. Designing Second Generation Anti-Alzheimer Compounds as Inhibitors of Human Acetylcholinesterase: Computational Screening of Synthetic Molecules and Dietary Phytochemicals.

    Directory of Open Access Journals (Sweden)

    Hafsa Amat-Ur-Rasool

    Full Text Available Alzheimer's disease (AD, a big cause of memory loss, is a progressive neurodegenerative disorder. The disease leads to irreversible loss of neurons that result in reduced level of acetylcholine neurotransmitter (ACh. The reduction of ACh level impairs brain functioning. One aspect of AD therapy is to maintain ACh level up to a safe limit, by blocking acetylcholinesterase (AChE, an enzyme that is naturally responsible for its degradation. This research presents an in-silico screening and designing of hAChE inhibitors as potential anti-Alzheimer drugs. Molecular docking results of the database retrieved (synthetic chemicals and dietary phytochemicals and self-drawn ligands were compared with Food and Drug Administration (FDA approved drugs against AD as controls. Furthermore, computational ADME studies were performed on the hits to assess their safety. Human AChE was found to be most approptiate target site as compared to commonly used Torpedo AChE. Among the tested dietry phytochemicals, berberastine, berberine, yohimbine, sanguinarine, elemol and naringenin are the worth mentioning phytochemicals as potential anti-Alzheimer drugs The synthetic leads were mostly dual binding site inhibitors with two binding subunits linked by a carbon chain i.e. second generation AD drugs. Fifteen new heterodimers were designed that were computationally more efficient inhibitors than previously reported compounds. Using computational methods, compounds present in online chemical databases can be screened to design more efficient and safer drugs against cognitive symptoms of AD.
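    A common first step in the computational ADME screening the abstract mentions is a Lipinski "rule of five" pre-filter. A hedged sketch (the property values for berberine below are approximate literature figures, used purely for illustration):

```python
# Lipinski rule-of-five pre-filter, as often used before docking/ADME work.
def passes_lipinski(mw, logp, h_donors, h_acceptors):
    """Return True if at most one Lipinski criterion is violated."""
    violations = sum([
        mw > 500,          # molecular weight in Da
        logp > 5,          # octanol-water partition coefficient
        h_donors > 5,      # hydrogen-bond donors
        h_acceptors > 10,  # hydrogen-bond acceptors
    ])
    return violations <= 1

# Berberine (approximate values, for illustration only)
print(passes_lipinski(mw=336.4, logp=3.6, h_donors=0, h_acceptors=4))  # -> True
```

Hits passing such a filter would then proceed to docking and more detailed ADME prediction.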

  7. Role of a computer-generated three-dimensional laryngeal model in anatomy teaching for advanced learners. (United States)

    Tan, S; Hu, A; Wilson, T; Ladak, H; Haase, P; Fung, K


    (1) To investigate the efficacy of a computer-generated three-dimensional laryngeal model for laryngeal anatomy teaching; (2) to explore the relationship between students' spatial ability and acquisition of anatomical knowledge; and (3) to assess participants' opinion of the computerised model. Forty junior doctors were randomised to undertake laryngeal anatomy study supplemented by either a three-dimensional computer model or two-dimensional images. Outcome measurements comprised a laryngeal anatomy test, the modified Vandenberg and Kuse mental rotation test, and an opinion survey. Mean scores ± standard deviations for the anatomy test were 15.7 ± 2.0 for the 'three dimensions' group and 15.5 ± 2.3 for the 'standard' group (p = 0.7222). Pearson's correlation between the rotation test scores and the scores for the spatial ability questions in the anatomy test was 0.4791 (p = 0.086, n = 29). Opinion survey answers revealed significant differences in respondents' perceptions of the clarity and 'user friendliness' of, and their preferences for, the three-dimensional model as regards anatomical study. The three-dimensional computer model was equivalent to standard two-dimensional images, for the purpose of laryngeal anatomy teaching. There was no association between students' spatial ability and functional anatomy learning. However, students preferred to use the three-dimensional model.
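    The between-group comparison reported above (15.7 ± 2.0 vs. 15.5 ± 2.3) can be reproduced approximately from the summary statistics alone. A hedged sketch using Welch's t statistic, assuming the 40 participants split evenly into n = 20 per arm (the paper's exact test may differ):

```python
# Welch's t statistic for two independent groups, from summary statistics.
import math

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    """t = (mean1 - mean2) / sqrt(sd1^2/n1 + sd2^2/n2)."""
    se = math.sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)
    return (mean1 - mean2) / se

# Anatomy-test scores as reported in the abstract; n = 20 per group assumed.
t = welch_t(15.7, 2.0, 20, 15.5, 2.3, 20)
print(round(t, 3))  # -> 0.293
```

A t value this small is consistent with the non-significant p value (0.7222) reported in the abstract.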

  8. Application of computer graphics to generate coal resources of the Cache coal bed, Recluse geologic model area, Campbell County, Wyoming (United States)

    Schneider, G.B.; Crowley, S.S.; Carey, M.A.


    Low-sulfur subbituminous coal resources have been calculated, using both manual and computer methods, for the Cache coal bed in the Recluse Model Area, which covers the White Tail Butte, Pitch Draw, Recluse, and Homestead Draw SW 7 1/2 minute quadrangles, Campbell County, Wyoming. Approximately 275 coal thickness measurements obtained from drill hole data are evenly distributed throughout the area. The Cache coal and associated beds are in the Paleocene Tongue River Member of the Fort Union Formation. The depth from the surface to the Cache bed ranges from 269 to 1,257 feet. The thickness of the coal is as much as 31 feet, but in places the Cache coal bed is absent. Comparisons between hand-drawn and computer-generated isopach maps show minimal differences. Total coal resources calculated by computer show the bed to contain 2,316 million short tons or about 6.7 percent more than the hand-calculated figure of 2,160 million short tons.
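    The resource arithmetic behind such an isopach-based estimate is tonnage = area x average thickness x a rank-dependent conversion factor. A minimal sketch, using the USGS convention of 1,770 short tons per acre-foot for subbituminous coal; the area and thickness values are hypothetical, not the Cache bed figures:

```python
# Isopach-based coal resource estimate: tons = acres x feet x tons/(acre-ft).
TONS_PER_ACRE_FOOT = 1770  # USGS convention for subbituminous coal

def coal_tons(area_acres, avg_thickness_ft):
    """Short tons of coal in a block of given area and average thickness."""
    return area_acres * avg_thickness_ft * TONS_PER_ACRE_FOOT

# e.g. a hypothetical 10,000-acre block averaging 20 ft of coal
print(coal_tons(10_000, 20))  # -> 354000000 short tons
```

The computer and hand methods in the study differ mainly in how the average thickness is interpolated from the ~275 drill-hole measurements, which explains the 6.7 percent discrepancy.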

  9. Realization of the developing potential of training to computer science in conditions of adoption of the second generation state educational standards

    Directory of Open Access Journals (Sweden)

    Сергей Георгиевич Григорьев


    Full Text Available The article describes the requirements for teaching computer science and information technology, formulated from the standpoint of the planned learning outcomes presented in the second-generation standard.

  10. Automated a complex computer aided design concept generated using macros programming (United States)

    Rizal Ramly, Mohammad; Asrokin, Azharrudin; Abd Rahman, Safura; Zulkifly, Nurul Ain Md


    Changing a complex computer-aided design profile, such as car and aircraft surfaces, has always been difficult and challenging. The capability of CAD software such as AutoCAD and CATIA shows that a simple configuration of a CAD design can be easily modified without hassle, but this is not the case with complex design configurations. Design changes help users to test and explore various configurations of the design concept before the production of a model. The purpose of this study is to look into macros programming as a parametric method for commercial aircraft design. Macros programming is a method in which the configuration of the design is changed by recording a script of commands, editing the data values and adding new command lines to create an element of parametric design. The steps and procedure to create a macro program are discussed, along with some difficulties encountered during the process of creation and the advantages of its usage. Generally, the advantages of macros programming as a method of parametric design are: flexibility for design exploration, increased usability of the design solution, proper containment of selected parameters within the model while restricting others, and real-time feedback on changes.
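    The record-edit-replay idea behind macro-based parametric design can be sketched generically. All command names below are hypothetical placeholders, not the CATIA or AutoCAD macro API:

```python
# A recorded macro as a list of (command, parameters) pairs; parametric
# variants are produced by editing data values and replaying the script.
recorded_macro = [
    ("create_profile", {"length": 120.0, "height": 40.0}),  # hypothetical commands
    ("fillet_edges", {"radius": 5.0}),
]

def replay(macro, overrides):
    """Replay a recorded macro, substituting edited parameter values.

    overrides maps (command, parameter) pairs to new values."""
    result = []
    for command, params in macro:
        params = {k: overrides.get((command, k), v) for k, v in params.items()}
        result.append((command, params))
    return result

# Edit one data value and replay to regenerate a design variant
variant = replay(recorded_macro, {("fillet_edges", "radius"): 8.0})
print(variant[1])  # -> ('fillet_edges', {'radius': 8.0})
```

In a real CAD system the replayed commands would drive the geometry kernel; here the replay simply returns the edited command list to show the mechanism.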

  11. Computer-assisted generation of individual training concepts for advanced education in manufacturing metrology

    International Nuclear Information System (INIS)

    Werner, Teresa; Weckenmann, Albert


    Due to increasing requirements on the accuracy and reproducibility of measurement results, together with the rapid development of novel technologies for the execution of measurements, there is a high demand for adequately qualified metrologists. Accordingly, a variety of training offers are provided by machine manufacturers, universities and other institutions. Yet, for an interested learner it is very difficult to define an optimal training schedule for his/her individual demands. Therefore, a computer-based assistance tool is developed to support demand-responsive scheduling of training. Based on the difference between the actual and intended competence profiles, and under consideration of changing requirements, an optimally customized qualification concept is derived. For this, available training offers are categorized along several dimensions: the contents of the course, but also the intended target groups, the focus of the imparted competences, the implemented methods of learning and teaching, the expected constraints for learning and the necessary prior knowledge. After completing a course, the achieved competences and the transferability of the gathered knowledge are evaluated. Based on the results, recommendations for supplementary learning measures are provided. Thus, a customized qualification for manufacturing metrology is facilitated, adapted to the specific needs and constraints of each individual learner.

  12. Automated a complex computer aided design concept generated using macros programming

    International Nuclear Information System (INIS)

    Ramly, Mohammad Rizal; Asrokin, Azharrudin; Rahman, Safura Abd; Zulkifly, Nurul Ain Md


    Changing a complex computer-aided design profile, such as car and aircraft surfaces, has always been difficult and challenging. The capability of CAD software such as AutoCAD and CATIA shows that a simple configuration of a CAD design can be easily modified without hassle, but this is not the case with complex design configurations. Design changes help users to test and explore various configurations of the design concept before the production of a model. The purpose of this study is to look into macros programming as a parametric method for commercial aircraft design. Macros programming is a method in which the configuration of the design is changed by recording a script of commands, editing the data values and adding new command lines to create an element of parametric design. The steps and procedure to create a macro program are discussed, along with some difficulties encountered during the process of creation and the advantages of its usage. Generally, the advantages of macros programming as a method of parametric design are: flexibility for design exploration, increased usability of the design solution, proper containment of selected parameters within the model while restricting others, and real-time feedback on changes.

  13. Computer analysis of flow perturbations generated by placement of choke bumps in a wind tunnel (United States)

    Campbell, R. L.


    An inviscid analytical study was conducted to determine the upstream flow perturbations caused by placing choke bumps in a wind tunnel. A computer program based on the stream-tube curvature method was used to calculate the resulting flow fields for a nominal free-stream Mach number range of 0.6 to 0.9. The choke bump geometry was also varied to investigate the effect of bump shape on the disturbance produced. Results from the study indicate that a region of significant variation from the free-stream conditions exists upstream of the throat of the tunnel. The extent of the disturbance region was, as a rule, dependent on Mach number and the geometry of the choke bump. In general, the upstream disturbance distance decreased with increasing nominal free-stream Mach number and with decreasing length-to-height ratio of the bump. A polynomial-curve choke bump usually produced less of a disturbance than did a circular-arc bump, and going to an axisymmetric configuration (modeling choke bumps on all the tunnel walls) generally resulted in a lower disturbance than the corresponding two-dimensional case.

  14. The schemes and methods for producing of the visual security features used in the color hologram stereography (United States)

    Lushnikov, D. S.; Zherdev, A. Y.; Odinokov, S. B.; Markin, V. V.; Smirnov, A. V.


    Visual security elements used in color holographic stereograms (three-dimensional colored security holograms) and methods of their production are described in this article. These visual security elements include color microtext, color-hidden images, and horizontal and vertical flip-flop effects produced by changes of color and image. The article also presents variants of optical systems that allow recording of the visual security elements as part of holographic stereograms. Methods for solving the optical problems arising when recording visual security elements are presented. Perception features of the visual security elements, relevant to the verification of security holograms, are also noted. The work was partially funded under the Agreement with the RF Ministry of Education and Science № 14.577.21.0197, grant RFMEFI57715X0197.

  15. Research on the superimposed frame number of terahertz digital holograms in double-exposed phase retrieval algorithm (United States)

    Hu, Jia-qi; Li, Qi; Chen, Guang-hao


    Phase retrieval algorithms are gradually being applied to terahertz in-line digital holography, owing to their effectiveness in removing the zero-order diffraction light and the twin image. Based on experiments, reconstruction results of the double-exposure phase retrieval algorithm are obtained which demonstrate that the method can improve the quality of reconstructed images compared to the angular spectrum method. Furthermore, the influence of the number of superimposed hologram frames in the double-exposure phase retrieval algorithm is discussed. The results show that the recording time can be reduced by reasonably decreasing the number of hologram frames, which contributes to real-time imaging and further improves the practicality of the algorithm.
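    The core of such double-exposure phase retrieval is an alternating-projection (Gerchberg-Saxton-style) iteration between two planes where only amplitudes are measured. A simplified sketch, using plain FFTs between Fourier-conjugate planes instead of the angular-spectrum propagator a real terahertz setup would use:

```python
# Two-plane iterative phase retrieval: enforce the measured amplitude in
# each plane in turn, keeping the current phase estimate. Synthetic data.
import numpy as np

rng = np.random.default_rng(0)
true_field = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, (32, 32)))  # pure-phase object
amp_near = np.abs(true_field)              # "measured" amplitude in plane 1
amp_far = np.abs(np.fft.fft2(true_field))  # "measured" amplitude in plane 2

def far_error(f):
    """Relative residual of the plane-2 amplitude constraint."""
    return np.linalg.norm(np.abs(np.fft.fft2(f)) - amp_far) / np.linalg.norm(amp_far)

field = amp_near.astype(complex)  # initial guess: zero phase
err0 = far_error(field)
for _ in range(200):
    far = np.fft.fft2(field)
    far = amp_far * np.exp(1j * np.angle(far))       # impose plane-2 amplitude
    field = np.fft.ifft2(far)
    field = amp_near * np.exp(1j * np.angle(field))  # impose plane-1 amplitude

err = far_error(field)
print(err < err0)  # the residual is non-increasing in this iteration -> True
```

Swapping the FFT pair for an angular-spectrum propagator (and using two recorded holograms as the amplitude constraints) turns this sketch into the double-exposure scheme discussed in the record.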

  16. Optimum aberration coefficients for recording high-resolution off-axis holograms in a Cs-corrected TEM

    International Nuclear Information System (INIS)

    Linck, Martin


    Amongst the impressive improvements in high-resolution electron microscopy, the Cs-corrector has also significantly enhanced the capabilities of off-axis electron holography. Recently, it has been shown that the signal above noise in the reconstructable phase can be significantly improved by combining holography and hardware aberration correction. Additionally, with a spherical aberration close to zero, the traditional optimum focus for recording high-resolution holograms ("Lichte's defocus") has become less stringent, and both defocus and spherical aberration can be selected freely within a certain range. This new degree of freedom can be used to improve the signal resolution in the holographically reconstructed object wave locally, e.g. at the atomic positions. A brute-force simulation study for an aberration-corrected 200 kV TEM is performed to determine optimum values of defocus and spherical aberration for the best possible signal to noise in the reconstructed atomic phase signals. Compared to the optimum aberrations for conventional phase-contrast imaging (NCSI), which produce "bright atoms" in the image intensity, the resulting optimum values of defocus and spherical aberration for off-axis holography yield "black atom contrast" in the hologram. However, they can significantly enhance the local signal resolution at the atomic positions, while the benefits of hardware aberration correction for high-resolution off-axis holography are preserved. It turns out that the optimum depends on the object and its thickness and is therefore not universal. Highlights: optimized aberration parameters for high-resolution off-axis holography; simulation and analysis of noise in high-resolution off-axis holograms; improving signal resolution in the holographically reconstructed phase shift; comparison of "black" and "white" atom contrast in off-axis holograms.

  17. Morphometric description of the feline radius and ulna generated from computed tomography. (United States)

    Preston, Timothy; Glyde, Mark; Hosgood, Giselle; Snow, Lynne


    This study aimed to describe the length, internal and external diameters, cancellous bone volume and extent, and cortical thickness at predetermined locations in the radius and ulna of a cohort of skeletally mature, disease-free feline cadavers using radiography and computed tomography (CT). Five feline cadavers were used (mean weight 3.31 kg, range 2.55-4.24 kg). Antebrachii (n = 10) were radiographed to confirm skeletal maturity and normal radiographic appearance prior to CT. Reconstructed CT images were used to measure bone length, cortical thickness, internal and external diameters, and cancellous extent. Cancellous bone volume was calculated automatically using OsiriX after manual segmentation (350-850 Hounsfield units window) from axial CT slices. Mean radial length was 95.89 mm (95% confidence interval [CI] 88.52-103.26 mm) and mean ulna length was 114.67 mm (95% CI 105.53-123.81 mm). The olecranon had the largest mean cancellous bone volume (94.16 mm(3); 95% CI 72.09-116.23 mm(3)) and it extended a mean of 13.12 mm (95% CI 11.73-14.51 mm) distally. The radius at the level of the trochlea and the ulna at the level of the coronoid processes had the largest external diameters, respectively. The medullary canal narrowed at the level of the coronoid processes and became cranially eccentric at the proximal third of the diaphysis. The cranial cortex at the level of the coronoid processes and the caudal cortex of the olecranon were markedly thicker than other cortices at those levels. Morphometry of the feline antebrachium was described using CT, and should be a useful reference for future research investigations and clinical applications. © ISFM and AAFP 2014.
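    The 95% confidence intervals quoted above follow the standard small-sample form: mean ± t x s/sqrt(n). A hedged sketch with hypothetical measurements (not the study's raw data; the critical t value for 9 degrees of freedom is supplied directly rather than looked up from a distribution):

```python
# Mean and 95% CI half-width from a small sample: half = t * s / sqrt(n).
import math
import statistics

def mean_ci95(samples, t_crit):
    """Return (mean, half-width). t_crit is the two-sided 95% critical t,
    e.g. 2.262 for n = 10 (9 degrees of freedom)."""
    n = len(samples)
    m = statistics.mean(samples)
    half = t_crit * statistics.stdev(samples) / math.sqrt(n)
    return m, half

# Hypothetical radial lengths (mm), n = 10, for illustration only
lengths = [90.1, 93.4, 95.0, 96.2, 97.8, 94.5, 99.0, 92.3, 98.6, 101.0]
m, half = mean_ci95(lengths, t_crit=2.262)
print(round(m, 2), round(half, 2))  # mean and CI half-width in mm
```

The wide CIs in the abstract (about ±7 mm on a 96 mm mean) reflect the small cohort of five cadavers.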

  18. The influence of computer-generated path on the robot’s effector stability of motion (United States)

    Foit, K.; Banaś, W.; Gwiazda, A.; Ćwikła, G.


    Off-line trajectory planning is often carried out for economical and practical reasons: the robot is not excluded from the production process, and the operator benefits from testing programs in a virtual environment. On the other hand, dedicated off-line programming and simulation software is often limited in features and is intended only to roughly check the program. It should be expected that the arm of the real robot's manipulator will realize the trajectory in a different manner: the acceleration and deceleration phases may trigger vibrations of the kinematic chain that could affect the precision of effector positioning and degrade the quality of the process realized by the robot. The purpose of this work is the analysis of selected cases in which the robot's effector has been moved along a programmed path. The off-line generated test trajectories have different arrangements of points: this approach has allowed evaluating the time needed to complete each of the tasks, as well as measuring the level of vibration of the robot's wrist. All tests were performed without load. The conclusions of the experiment may be useful during trajectory planning in order to avoid critical configurations of points.

  19. KNOW-BLADE task-3.3 report: Rotor blade computations with 3D vortex generators

    DEFF Research Database (Denmark)

    Johansen, J.; Sørensen, Niels N.; Reck, M.


    The present report describes the work done in work package WP3.3: Aerodynamic Accessories in 3D in the EC project KNOW-BLADE. Vortex generators (VGs) are modelled in 3D Navier-Stokes solvers and applied to the flow around an airfoil and a wind turbine blade. Three test cases have been investigated: 1) a non-rotating airfoil section with VGs; 2) a rotating airfoil section with VGs; 3) a non-rotating wind turbine blade with VGs. The airfoil section was the FFA-W3-241 airfoil, which has been measured in the VELUX wind tunnel with and without VGs placed at different chordwise positions. Three of the partners have modelled the airfoil section as a thin airfoil section with symmetry boundary conditions in the spanwise direction to simulate an array of VGs. The wind turbine blade is the LM19.1 blade equipped with one pair of VGs placed at radius = 8.5 m. In general, all partners have successfully modelled...

  20. Proton generation and transport in the fuel cell environment: atomistic computer simulations (United States)

    Spohr, Eckhard


    Hydrogen atoms in direct methanol fuel cells are produced 'in situ' by dissociation of methanol on precious metal catalysts (Pt, Pt/Ru) in an aqueous environment. The abstraction of the first hydrogen atom via C-H bond cleavage is generally considered to be the rate-limiting step of dissociative methanol adsorption on the catalyst surface. This oxidation reaction on platinum particles in a fuel cell is investigated by means of a combined approach of classical molecular dynamics (MD) simulations and ab initio calculations, in order to obtain an understanding of the role of the solvent in the stabilization of intermediates and in the enhancement of proton desorption from the catalyst surface and subsequent transfer into the nearby polymer electrolyte membrane (PEM). The anodically generated protons need to migrate efficiently through the membrane to the cathode, where they are consumed. At the same time, water and methanol (in a direct methanol fuel cell) transport should be slow. Humidified PEMs are considered to consist of a nanometer-scale phase-separated bicontinuous network of polymer regions providing structural integrity, and of aqueous regions providing the pathways for proton conduction. MD simulations provide a powerful theoretical tool for the investigation and clarification of the relationship between molecular structure and these transport phenomena. In order to atomistically model larger fractions of a humidified PEM, a coarse-grained model of humidified polymer electrolyte membranes has been developed.


    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...


    CERN Multimedia

    I. Fisk


    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  3. Computer program MCAP-TOSS calculates steady-state fluid dynamics of coolant in parallel channels and temperature distribution in surrounding heat-generating solid (United States)

    Lee, A. Y.


    Computer program calculates the steady state fluid distribution, temperature rise, and pressure drop of a coolant, the material temperature distribution of a heat generating solid, and the heat flux distributions at the fluid-solid interfaces. It performs the necessary iterations automatically within the computer, in one machine run.
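    The fluid-distribution part of such a calculation balances flows so that every parallel channel sees the same pressure drop. A hedged sketch, assuming a simple quadratic loss law dP = k*Q^2 in each channel (the coefficients are hypothetical, and the real code also couples in heat transfer):

```python
# Steady-state flow split among parallel channels sharing one pressure drop,
# assuming dP = k * Q**2 per channel: Q_i is proportional to 1/sqrt(k_i).
def split_flow(k, q_total):
    """Return per-channel flows for loss coefficients k summing to q_total."""
    w = [1.0 / k_i ** 0.5 for k_i in k]
    s = sum(w)
    return [q_total * w_i / s for w_i in w]

q = split_flow([1.0, 4.0], q_total=3.0)  # channel 2 is 4x as resistive
dp = [k_i * q_i ** 2 for k_i, q_i in zip([1.0, 4.0], q)]
print(q)   # -> [2.0, 1.0]
print(dp)  # -> [4.0, 4.0]  (equal pressure drops, as required)
```

With temperature-dependent coefficients the split can no longer be written in closed form, which is why such programs iterate between the hydraulic and thermal solutions within one machine run.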

  4. Effect of the Power Balance® hologram on balance, flexibility, strength and speed-coordination among university students

    Directory of Open Access Journals (Sweden)

    Rafael Merino Marban, Daniel Mayorga Vega, Emilio Fernández Rodríguez, Francisco José Santana Pérez, Oscar Romero Ramos


    Full Text Available Based on the body's energy field, the inventors of Power Balance® have created a hologram that theoretically resonates with frequencies present in our natural environment. Its creators claim that people may experience improved balance, strength, flexibility, endurance, concentration, coordination and recovery time, among other benefits. The purpose of this research is to evaluate the effect of the Power Balance® hologram on balance, flexibility, strength and speed-coordination in university students. A sample of 105 young volunteer physical education students (age 20.91 ± 3.36 years, mass 69.69 ± 11.35 kg, height 171.70 ± 8.07 cm) was used. A between-group experimental design with a double-blind control group was used to evaluate the possible effects of the Power Balance® on dynamic balance, flexibility, abdominal strength, endurance and speed-coordination, measured with the Dynamic Balance Test, Sit and Reach, Sit-ups in 30 seconds and the 10 x 5 m shuttle run, respectively. A Student's t-test for independent and dependent samples was used to assess potential between-group and intra-group effects, respectively. The Power Balance® hologram produced no significant effects on balance, flexibility, strength or speed-coordination among university students.

  5. Controlling the optical fiber output beam profile by focused ion beam machining of a phase hologram on fiber tip. (United States)

    Han, Jiho; Sparkes, Martin; O'Neill, William


    A phase hologram was machined on an optical fiber tip using a focused ion beam (FIB) system so that a ring-shaped beam emerges from the fiber tip. The fiber used for this work was a commercial single-mode optical fiber patch cable for a design wavelength of 633 nm with a germanosilicate core. The ring-shaped beam was chosen to ensure a simple geometry in the required phase hologram, though the Gerchberg-Saxton algorithm can be used to calculate a hologram for an arbitrary beam shape. The FIB machining took approximately 45 min at 30 kV and 200 pA. The radius of the resulting ring beam was 0.083 m at 1 m standoff, as compared to 0.1 m as was initially desired. Results suggest that this imaging technique may provide a basis for a beam-shaping method with several advantages over the current commercial solutions, having permanent alignment, compactness, and mechanical robustness. However, it would appear that minimizing the speckle pattern will remain a critical challenge for this technique to become widely implemented.

  6. Quantitative assessment of scatter correction techniques incorporated in next generation dual-source computed tomography (United States)

    Mobberley, Sean David

    Accurate, cross-scanner assessment of in-vivo air density, used to quantitatively assess the amount and distribution of emphysema in COPD subjects, has remained elusive. Hounsfield units (HU) within tracheal air can be considerably more positive than -1000 HU. With the advent of new dual-source scanners which employ dedicated scatter correction techniques, it is of interest to evaluate how quantitative measures of lung density compare between dual-source and single-source scan modes. This study has sought to characterize in-vivo and phantom-based air metrics using dual-energy computed tomography technology, where the nature of the technology has required adjustments to scatter correction. Anesthetized ovine (N=6) and swine (N=13: more human-like rib cage shape) subjects, a lung phantom and a thoracic phantom were studied using a dual-source MDCT scanner (Siemens Definition Flash). Multiple dual-source dual-energy (DSDE) and single-source (SS) scans taken at different energy levels and scan settings were acquired for direct quantitative comparison. Density histograms were evaluated for the lung, tracheal, water and blood segments. Image data were obtained at 80, 100, 120, and 140 kVp in the SS mode (B35f kernel) and at 80, 100, 140, and 140-Sn (tin filtered) kVp in the DSDE mode (B35f and D30f kernels), in addition to variations in dose, rotation time, and pitch. To minimize the effect of cross-scatter, the phantom scans in the DSDE mode were obtained by reducing the tube current of one of the tubes to its minimum (near zero) value. When using image data obtained in the DSDE mode, the median HU values in the tracheal regions of all animals and the phantom were consistently closer to -1000 HU regardless of reconstruction kernel (chapters 3 and 4). Similarly, HU values of water and blood were consistently closer to their nominal values of 0 HU and 55 HU respectively. When using image data obtained in the SS mode, the air CT numbers demonstrated a consistent positive shift of up to 35 HU...
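    The air-calibration metric at the heart of this comparison is simple: the median HU inside a tracheal air region, and its offset from the nominal -1000 HU of air. A minimal sketch with hypothetical voxel values:

```python
# Median tracheal-air HU and its bias from the nominal -1000 HU of air;
# a positive bias indicates the scatter-induced shift discussed above.
import statistics

tracheal_sample = [-1004, -998, -1001, -995, -1010, -999, -1002]  # hypothetical HU
median_hu = statistics.median(tracheal_sample)
bias = median_hu - (-1000)
print(median_hu, bias)  # -> -1001 -1
```

In the study this statistic, computed per scan mode and kernel, is what shows the DSDE mode sitting near -1000 HU while the SS mode drifts up to +35 HU.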


    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...


    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing Readiness Challenges (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier-0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier-0, processing a massive number of very large files with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier-1s: response time, data transfer rate and success rate for Tape-to-Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations laid the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...


    CERN Multimedia

    I. Fisk


    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...


    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...


    CERN Multimedia

    I. Fisk


    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...


    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...


    CERN Multimedia

    I. Fisk


    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  14. Atomistics of carbon nanotube-polyacrylonitrile interfaces for next-generation carbon fibers: A multiscale computational study (United States)

    Lee, Juho; Choi, Ji Il; Jang, Seung Soon; Kumar, Satish; Cho, Art E.; Kim, Yong-Hoon

    Atomic-scale understanding of carbon nanotube (CNT)-polyacrylonitrile (PAN) interfaces is a critical missing element for the development of next-generation carbon fibers. In this presentation, we provide systematic atomistic analyses of the CNT-PAN interfaces based on a multiscale computational approach combining density-functional theory (DFT) and force-field molecular dynamics (FFMD) simulations. Based on DFT calculations, we identify the preferable CNT-PAN configurations and furthermore elucidate the electronic origin of the CNT-PAN binding. Next, via FFMD simulations, we extract more realistic large-scale interfacial CNT-PAN atomic configurations and confirm that they faithfully reflect the geometric motifs identified in DFT calculations. Implications of our findings in the context of the development of advanced carbon fibers will be discussed.

  15. Measurement of T1 by echo-planar imaging and the construction of computer-generated images

    Energy Technology Data Exchange (ETDEWEB)

    Mansfield, P.; Guilfoyle, D.N.; Ordidge, R.J.; Coupland, R.E.


    The high-speed echo-planar imaging (EPI) technique is used to obtain rapid T1 and spin density measurements by a two-point method. It is shown that neglect of edge effects in the slice selection procedure leads to significant systematic errors in T1. T1 maps for two young patients, obtained at 4.0 MHz, are presented. The T1 and spin density values obtained are used to produce computer-generated images in inversion recovery simulations. These results demonstrate marked improvement in image contrast without paying the time penalty incurred in real experiments, thereby greatly increasing patient throughput potential.
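The abstract does not spell out the two-point formula; one common reading (an assumption, not necessarily the paper's exact sequence) takes a spin-density measurement S0 and one inversion-recovery signal S(TI) = S0(1 - 2·exp(-TI/T1)), which inverts to T1 = TI / ln(2 / (1 - S/S0)). A round-trip sketch:

```python
import math

def t1_two_point(s_ir, s0, ti):
    """Invert S(TI) = S0 * (1 - 2*exp(-TI/T1)) for T1, given the
    inversion-recovery signal s_ir, the spin-density signal s0, and TI."""
    return ti / math.log(2.0 / (1.0 - s_ir / s0))

# Round trip: synthesize a signal for T1 = 400 ms at TI = 500 ms, then recover T1.
t1_true, ti, s0 = 400.0, 500.0, 100.0
s = s0 * (1.0 - 2.0 * math.exp(-ti / t1_true))
print(t1_two_point(s, s0, ti))  # recovers 400.0 ms
```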

  16. DInSAR time series generation within a cloud computing environment: from ERS to Sentinel-1 scenario (United States)

    Casu, Francesco; Elefante, Stefano; Imperatore, Pasquale; Lanari, Riccardo; Manunta, Michele; Zinno, Ivana; Mathot, Emmanuel; Brito, Fabrice; Farres, Jordi; Lengert, Wolfgang


    One of the techniques that will strongly benefit from the advent of the Sentinel-1 system is Differential SAR Interferometry (DInSAR), which has successfully been demonstrated to be an effective tool to detect and monitor ground displacements with centimetre accuracy. The geoscience communities (volcanology, seismicity, …), as well as those related to hazard monitoring and risk mitigation, make extensive use of the DInSAR technique, and they will take advantage of the huge amount of SAR data acquired by Sentinel-1. Indeed, such information will permit the generation of Earth's surface displacement maps and time series both over large areas and over long time spans. However, the issue of managing, processing and analysing the large Sentinel data stream is envisaged by the scientific community to be a major bottleneck, particularly during crisis phases. The emerging need to create a common ecosystem in which data, results and processing tools are shared is envisaged to be a successful way to address this problem and to contribute to the spreading of information and knowledge. The Supersites initiative as well as the ESA SuperSites Exploitation Platform (SSEP) and the ESA Cloud Computing Operational Pilot (CIOP) projects provide effective answers to this need, and they are pushing towards the development of such an ecosystem. It is clear that all the current and existing tools for querying, processing and analysing SAR data need not only to be updated to manage the large data stream of the Sentinel-1 satellite, but also to be reorganized to reply quickly to simultaneous and highly demanding user requests, mainly during emergency situations. This translates into the automatic and unsupervised processing of large amounts of data as well as the availability of scalable, widely accessible and high-performance computing capabilities. The cloud computing environment makes it possible to achieve all of these objectives, particularly in case of spike and peak

  17. High-resolution full-parallax computer-generated holographic stereogram created by e-beam technology (United States)

    Goncharsky, Alexander; Goncharsky, Anton; Durlevich, Svyatoslav


    A high-resolution computer-generated stereogram for forming full-parallax three-dimensional (3-D) images is proposed. A full-parallax 3-D image is formed from 825 two-dimensional (2-D) projections and can be observed in a wide angular range. The stereogram is a reflective diffractive optical element (DOE) that consists of 50×50 μm² hogels, where each hogel corresponds to one pixel of the 2-D frames. A phase-type kinoform is computed in every hogel by solving a nonlinear inverse problem. The DOE relief is fabricated using electron-beam technology with a pixel size of 0.2×0.2 μm². The effectiveness of the technology developed is illustrated by photographs and a video of a real DOE under monochromatic light and under white light. The new high-resolution full-parallax stereograms can be used for protecting bank notes, documents, and ID cards against counterfeit.
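The paper's nonlinear inverse solver is not specified; a standard iterative phase-retrieval scheme (Gerchberg-Saxton, shown here as a sketch under a far-field Fourier-transform assumption) is one common way to compute a phase-only kinoform for a target intensity pattern:

```python
import numpy as np

rng = np.random.default_rng(0)

def gerchberg_saxton(target_amp, n_iter=50):
    """Iteratively retrieve a phase-only (kinoform) profile whose far-field
    (Fourier) magnitude approximates target_amp."""
    phase = rng.uniform(0, 2 * np.pi, target_amp.shape)
    for _ in range(n_iter):
        far = np.fft.fft2(np.exp(1j * phase))          # propagate to far field
        far = target_amp * np.exp(1j * np.angle(far))  # impose target magnitude
        near = np.fft.ifft2(far)                       # propagate back
        phase = np.angle(near)                         # keep phase only (kinoform)
    return phase

# Toy target: a single bright off-axis spot.
target = np.zeros((32, 32))
target[8, 8] = 1.0
phi = gerchberg_saxton(target)
recon = np.abs(np.fft.fft2(np.exp(1j * phi)))
i, j = np.unravel_index(recon.argmax(), recon.shape)
print(int(i), int(j))  # the reconstructed spot lands at the target position
```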

  18. Final Technical Report: Sparse Grid Scenario Generation and Interior Algorithms for Stochastic Optimization in a Parallel Computing Environment

    Energy Technology Data Exchange (ETDEWEB)

    Mehrotra, Sanjay [Northwestern Univ., Evanston, IL (United States)


    The support from this grant resulted in seven published papers and a technical report. Two papers are published in SIAM J. on Optimization [87, 88]; two papers are published in IEEE Transactions on Power Systems [77, 78]; one paper is published in Smart Grid [79]; one paper is published in Computational Optimization and Applications [44] and one in INFORMS J. on Computing [67]. The works in [44, 67, 87, 88] were funded primarily by this DOE grant. The applied papers in [77, 78, 79] were also supported through a subcontract from the Argonne National Lab. We start by presenting our main research results on the scenario generation problem in Sections 1–2. We present our algorithmic results on interior point methods for convex optimization problems in Section 3. We describe a new ‘central’ cutting surface algorithm developed for solving large scale convex programming problems (as is the case with our proposed research) with a semi-infinite number of constraints in Section 4. In Sections 5–6 we present our work on two application problems of interest to DOE.

  19. Operational procedure for computer program for design point characteristics of a compressed-air generator with through-flow combustor for V/STOL applications (United States)

    Krebs, R. P.


    The computer program described in this report calculates the design-point characteristics of a compressed-air generator for use in V/STOL applications such as systems with a tip-turbine-driven lift fan. The program computes the dimensions and mass, as well as the thermodynamic performance of a model air generator configuration which involves a straight through-flow combustor. Physical and thermodynamic characteristics of the air generator components are also given. The program was written in FORTRAN IV language. Provision has been made so that the program will accept input values in either SI units or U.S. customary units. Each air generator design-point calculation requires about 1.5 seconds of 7094 computer time for execution.

  20. Computational model to investigate the relative contributions of different neuromuscular properties of tibialis anterior on force generated during ankle dorsiflexion. (United States)

    Siddiqi, Ariba; Poosapadi Arjunan, Sridhar; Kumar, Dinesh Kant


    This study describes a new model of the force generated by the tibialis anterior (TA) muscle with three new features: single-fiber action potential, twitch force, and pennation angle. This model was used to investigate the relative effects and interaction of ten age-associated neuromuscular parameters. Regression analysis (significance level of 0.05) between the neuromuscular properties and the corresponding simulated force produced at the footplate was performed. Standardized slope coefficients were computed to rank the effect of the parameters. The results show that reduction in the average firing rate is the main reason for the sharp decline in force, followed by other factors such as the number of muscle fibers, specific force, pennation angle, and innervation ratio. The fast fiber ratio affects the simulated force through two significant interactions. This study has ranked the individual contributions of the neuromuscular factors to the muscle strength decline of the TA and identified firing rate decline as the biggest cause, followed by decreases in muscle fiber number and specific force. The strategy for strength preservation in the elderly should focus on improving firing rate. Graphical abstract: Neuromuscular properties of the tibialis anterior affecting force generated during ankle dorsiflexion.
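Standardized slope coefficients, as used above to rank parameter influence, are ordinary regression slopes fitted on z-scored variables. A sketch on synthetic data (the two parameters, their distributions, and effect sizes below are illustrative assumptions, not the study's values):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for the sensitivity analysis: simulated force as a linear
# function of two hypothetical parameters with very different true effects.
n = 200
firing_rate = rng.normal(10.0, 2.0, n)     # strong effect on force
fibre_count = rng.normal(150e3, 30e3, n)   # weak effect on force
force = 5.0 * firing_rate + 1e-5 * fibre_count + rng.normal(0.0, 0.1, n)

def standardized_slopes(X, y):
    """Regress z-scored y on z-scored columns of X; the resulting slopes are
    unit-free, so their magnitudes can be ranked across parameters."""
    Xz = (X - X.mean(axis=0)) / X.std(axis=0)
    yz = (y - y.mean()) / y.std()
    coef, *_ = np.linalg.lstsq(Xz, yz, rcond=None)
    return coef

beta = standardized_slopes(np.column_stack([firing_rate, fibre_count]), force)
print(abs(beta[0]) > abs(beta[1]))  # True: firing rate ranks higher
```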

  1. Computational and experimental progress on laser-activated gas avalanche switches for broadband, high-power electromagnetic pulse generation

    International Nuclear Information System (INIS)

    Mayhall, D.J.; Yee, J.H.; Villa, F.


    This paper discusses the gas avalanche switch, a proposed high-voltage, picosecond-speed switch. The basic switch consists of pulse-charged electrodes immersed in a high-pressure gas. An avalanche discharge is induced in the gas between the electrodes by ionization from a picosecond-scale laser pulse. The avalanching electrons move toward the anode, causing the applied voltage to collapse in picoseconds. This voltage collapse, if rapid enough, generates electromagnetic waves. A two-dimensional (2D), finite difference computer code solves Maxwell's equations for transverse magnetic modes for rectilinear electrodes between parallel plate conductors, along with electron conservation equations for continuity, momentum, and energy. Collision frequencies for ionization and momentum and energy transfer to neutral molecules are assumed to scale linearly with neutral pressure. Electrode charging and laser-driven electron deposition are assumed to be instantaneous. Code calculations are done for a pulse generator geometry, consisting of a 0.7 mm wide by 0.8 mm high, beveled, rectangular center electrode between grounded parallel plates at 2 mm spacing in air.
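The cited code is a 2D transverse-magnetic Maxwell solver coupled to electron fluid equations; a minimal 1-D vacuum Yee-scheme sketch of just the finite-difference field update (the avalanche physics, geometry, and sources are omitted, so this is only an illustration of the numerical method):

```python
import numpy as np

# 1-D vacuum FDTD (Yee leapfrog) in normalized units with dx = dt = 1,
# i.e. Courant number 1, where the scheme is stable and dispersion-free.
nx, nt = 200, 150
ez = np.zeros(nx)       # electric field on integer grid points
hy = np.zeros(nx - 1)   # magnetic field on half-integer grid points

for t in range(nt):
    hy += ez[1:] - ez[:-1]        # update H from the curl of E
    ez[1:-1] += hy[1:] - hy[:-1]  # update E from the curl of H
    ez[nx // 2] += np.exp(-((t - 30) / 10.0) ** 2)  # soft Gaussian source

print(np.isfinite(ez).all())  # the leapfrog update stays bounded
```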

  2. Computational methodology of sodium-water reaction phenomenon in steam generator of sodium-cooled fast reactor

    International Nuclear Information System (INIS)

    Takata, Takashi; Yamaguchi, Akira; Uchibori, Akihiro; Ohshima, Hiroyuki


    A new computational methodology of sodium-water reaction (SWR), which occurs in a steam generator of a liquid-sodium-cooled fast reactor when a heat transfer tube in the steam generator fails, has been developed considering multidimensional and multiphysics thermal hydraulics. Two kinds of reaction models are proposed in accordance with the phase of sodium as a reactant. One is the surface reaction model, in which water vapor reacts directly with liquid sodium at the interface between the liquid sodium and the water vapor. The reaction heat leads to a vigorous evaporation of liquid sodium, resulting in a reaction of gas-phase sodium. This is designated as the gas-phase reaction model. These two models are coupled with a multidimensional, multicomponent gas, and multiphase thermal hydraulics simulation method with compressibility (named the 'SERAPHIM' code). Using the present methodology, a numerical investigation of the SWR under a pin-bundle configuration (a benchmark analysis of the SWAT-1R experiment) has been carried out. As a result, a maximum gas temperature of approximately 1,300°C is predicted stably, which lies within the range of previous experimental observations. It is also demonstrated that the maximum mass-weighted average temperature in the analysis agrees reasonably well with the experimental result measured by thermocouples. The present methodology is promising for establishing a theoretical and mechanical modeling of secondary failure propagation of heat transfer tubes due to mechanisms such as overheating rupture and wastage. (author)

  3. A Computer-Aided Bibliometrics System for Journal Citation Analysis and Departmental Core Journal Ranking List Generation

    Directory of Open Access Journals (Sweden)

    Yih-Chearng Shiue


    Full Text Available Due to the tremendous increase and variation in serial publications, faculties in university departments find it difficult to generate and update their departmental core journal lists regularly and accurately, and libraries find it difficult to maintain their current serial collections for different departments. Therefore, the evaluation of a departmental core journal list is an important task for departmental faculties and librarians. A departmental core journal list not only helps departments understand the research performance of faculty and students, but also helps librarians decide which journals to retain and which to cancel. In this study, a Computer-Aided Bibliometrics System was implemented and two methodologies (JCDF and LibJF) were proposed in order to generate a departmental core journal ranking list and perform the journal citation analysis. Six departments were taken as examples, with MIS as the major one. One journal citation pattern was found, and the ratio of turning point to number of journals was always around 0.07 among the 10 journals and 6 departments. After comparison with four methodologies via overlapping rate and standard deviation distances, the two proposed methodologies were shown to be better than the questionnaire and library-subscription methods.
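The "overlapping rate" used to compare ranking methodologies can be read as the fraction of journals two core lists share; that reading, and the journal names below, are illustrative assumptions rather than the paper's definition or data:

```python
def overlap_rate(list_a, list_b):
    """Jaccard-style overlap between two core-journal lists: shared journals
    divided by all distinct journals (one plausible reading; an assumption)."""
    a, b = set(list_a), set(list_b)
    return len(a & b) / len(a | b)

# Hypothetical core lists produced by two different methodologies.
core_by_citation = ["MISQ", "ISR", "JMIS", "CACM"]
core_by_subscription = ["MISQ", "ISR", "HBR", "CACM"]
print(overlap_rate(core_by_citation, core_by_subscription))  # 3 shared of 5 -> 0.6
```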

  4. Impact on Image Quality and Radiation Dose of Third-Generation Dual-Source Computed Tomography of the Coronary Arteries. (United States)

    Apfaltrer, Georg; Szolar, Dieter H; Wurzinger, Eric; Takx, Richard A P; Nance, John W; Dutschke, Anja; Tschauner, Sebastian; Loewe, Christian; Ringl, Helmut; Sorantin, Erich; Apfaltrer, Paul


    The aim of this study was to assess the image quality (IQ) and radiation dose of third-generation dual-source computed tomography (CT) coronary angiography (cCTA) in comparison with 64-slice single-source CT. This retrospective study included 140 patients (73 men, mean age 62 ± 11 years) with low-to-intermediate probability of coronary artery disease who underwent either third-generation dual-source cCTA using prospectively electrocardiography-triggered high-pitch spiral acquisition (n = 70) (group 1) or retrospective electrocardiography-gated cCTA on a 64-slice CT system (n = 70) (group 2). Contrast-to-noise and signal-to-noise ratios were measured within the aorta and coronary arteries. Subjective IQ was assessed using a 5-point Likert scale. Effective dose was estimated using specific conversion factors. The contrast-to-noise ratio of group 1 was significantly higher than that of group 2 at all levels. The use of a third-generation dual-source CT system for cCTA leads to improved IQ with significant radiation dose savings. Copyright © 2017 Elsevier Inc. All rights reserved.
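The contrast-to-noise ratio compared above is commonly computed as the attenuation difference between two regions of interest divided by image noise; the HU values below are illustrative assumptions, not measurements from the study:

```python
import numpy as np

def cnr(roi_signal, roi_background, noise_sd):
    """Contrast-to-noise ratio: mean HU difference between two regions of
    interest divided by the image noise (standard deviation, in HU)."""
    return (np.mean(roi_signal) - np.mean(roi_background)) / noise_sd

# Illustrative ROI samples (HU): contrast-enhanced aorta vs. perivascular fat.
aorta = np.array([420.0, 430.0, 425.0])
fat = np.array([-90.0, -95.0, -85.0])
print(cnr(aorta, fat, 20.0))  # (425 - (-90)) / 20 = 25.75
```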

  5. Three-dimensional computational fluid dynamics analysis of buoyancy-driven natural ventilation and entropy generation in a prismatic greenhouse

    Directory of Open Access Journals (Sweden)

    Aich Walid


    Full Text Available A computational analysis of the natural ventilation process and entropy generation in a 3-D prismatic greenhouse was performed using CFD. The aim of the study is to investigate how buoyancy forces influence air-flow and temperature patterns inside the greenhouse, which has a lower-level opening in its right heated façade and an upper-level opening near the roof top in the opposite cooled façade. The bottom and all other walls are assumed to be perfect thermal insulators. The Rayleigh number is the main parameter, varied from 10^3 to 10^6, and the Prandtl number is fixed at Pr = 0.71. Results are reported in terms of particle trajectories, iso-surfaces of temperature, mean Nusselt number, and entropy generation. It has been found that the flow structure is sensitive to the value of the Rayleigh number and that heat transfer increases with this parameter. It has also been noticed that using asymmetric opening positions improves the natural ventilation and facilitates the occurrence of buoyancy-induced upward cross air-flow (low-level supply and upper-level extraction) inside the greenhouse.
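The two dimensionless groups named above can be computed directly from fluid properties; the values below are illustrative for air (the study instead varies Ra parametrically from 10^3 to 10^6 and fixes Pr = 0.71):

```python
# Rayleigh and Prandtl numbers for buoyancy-driven flow:
#   Ra = g * beta * dT * L**3 / (nu * alpha),  Pr = nu / alpha
g = 9.81        # gravitational acceleration [m/s^2]
beta = 3.4e-3   # thermal expansion coefficient of air [1/K]
dT = 10.0       # facade-to-facade temperature difference [K] (assumed)
L = 1.0         # characteristic height [m] (assumed)
nu = 1.5e-5     # kinematic viscosity of air [m^2/s]
alpha = 2.1e-5  # thermal diffusivity of air [m^2/s]

Ra = g * beta * dT * L**3 / (nu * alpha)
Pr = nu / alpha  # ~0.71 for air, matching the paper's fixed value
print(f"Ra = {Ra:.2e}, Pr = {Pr:.2f}")
```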


    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...


    CERN Document Server


    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...


    CERN Multimedia

    I. Fisk


    Computing activity has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...


    CERN Multimedia

    I. Fisk


      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users; we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...


    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, involving the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...


    CERN Multimedia

    Contributions from I. Fisk


    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  12. Future tense: call for a new generation of artists (United States)

    Ohlmann, Dietmar


    Some people try hard to educate others about the beauty and technical benefits of holographic applications, but another generation is already waiting to learn more about the media which talk to them about the future. Today the most common question is 'How can I do holograms with a computer?' 'Can I do it with an Amiga?' For the MIT specialists these are now very simple questions. We can expect to see the present shape of the holographic laboratory pass into history. I personally like to work with a VHS camera and mix it with CAD/CAM images, but computer and video are not the only media which will change the face of holography. The He-Ne laser will be replaced by the diode laser. At a wavelength of 690 nm, some of them deliver 40 mW in single mode and single line, in a package no bigger than your little finger. With such energy in so small a container, the state of the art drifts rapidly into more flexibility. Using new media and introducing them in our societies gives us a new responsibility. Would too much media kill the art? I do not think so, because I like the variety of media which give new possibilities of expression. The game with new media is the power of creativity, and it will find its meaning by itself.


    CERN Multimedia

    I. Fisk


    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, are provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each others time zones by monitoring/debugging pilot jobs sent from the facto...

  14. Production of an optimized matrix hologram and development of a copying process for large format holograms for solar applications. Final report; Herstellung von optimierten Materhologrammen und Entwicklung eines Kopierverfahrens fuer grossformatige Hologramme fuer die Solaranwendung; Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Stojanoff, C.G.; Brasseur, O.; Froehlich, K.; Kubitzek, R.; Schuette, H.; Tropartz, S.


    The aim of the research project is the development of a technique for the low-cost mass production of large-area holographic concentrators for photovoltaics. The holograms, up to 1 m{sup 2} in size, act simultaneously as spectrally dispersing and focusing optical elements that concentrate the sunlight in discrete bands onto spectrally matched solar cells. The three most important characteristics of the holographic concentrators are the diffraction efficiency, the spectral bandwidth, and the central working wavelength. The optimum values of these parameters are determined by the variation of the layer thickness and the mean refractive index over the aperture of the hologram, and by the ability of the layer to support large refractive-index modulations. The mass production of the large-format holographic lenses demands precise coating of the carrier substrates of the hologram copies with dichromate gelatine and is carried out by means of a scanning copying process. (orig.)
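    The interplay of diffraction efficiency, index modulation, and layer thickness mentioned above can be illustrated with Kogelnik's coupled-wave result for a lossless transmission volume phase grating at Bragg incidence. This is a standard textbook model, not taken from the report, and the numerical values below are purely illustrative:

```python
import numpy as np

def kogelnik_efficiency(delta_n, thickness, wavelength, theta):
    """On-Bragg diffraction efficiency of a lossless transmission volume
    phase hologram (Kogelnik's coupled-wave theory):

        eta = sin^2(pi * delta_n * d / (lambda * cos(theta)))

    theta is the incidence angle inside the medium, in radians.
    """
    nu = np.pi * delta_n * thickness / (wavelength * np.cos(theta))
    return np.sin(nu) ** 2

# Illustrative numbers: a 12.5-um dichromated-gelatine layer with an index
# modulation of 0.02 reaches 100% efficiency at 500 nm and normal incidence.
eta = kogelnik_efficiency(0.02, 12.5e-6, 500e-9, 0.0)
```

The sin² dependence shows the trade-off the abstract alludes to: for a fixed target efficiency, a thinner layer demands a larger index modulation, while the layer thickness also sets the spectral bandwidth of the grating.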

  15. LSG: An External-Memory Tool to Compute String Graphs for Next-Generation Sequencing Data Assembly. (United States)

    Bonizzoni, Paola; Vedova, Gianluca Della; Pirola, Yuri; Previtali, Marco; Rizzi, Raffaella


    The large amount of short read data that has to be assembled in future applications, such as in metagenomics or cancer genomics, strongly motivates the investigation of disk-based approaches to index next-generation sequencing (NGS) data. Positive results in this direction stimulate the investigation of efficient external-memory algorithms for de novo assembly from NGS data. Our article is also motivated by the open problem of designing a space-efficient algorithm to compute a string graph using an indexing procedure based on the Burrows-Wheeler transform (BWT). We have developed a disk-based algorithm for computing string graphs in external memory: the light string graph (LSG). LSG relies on a new representation of the FM-index whose main-memory requirement is independent of the size of the data set. Moreover, we have developed a pipeline for genome assembly from NGS data that integrates LSG with the assembly step of SGA (Simpson and Durbin, 2012), a state-of-the-art string graph-based assembler, and uses BEETL for indexing the input data. LSG is open source software and is available online. We have analyzed our implementation on an 875-million-read whole-genome dataset, on which LSG has built the string graph using only 1 GB of main memory (reducing the memory occupation by a factor of 50 with respect to SGA), while requiring slightly more than twice the running time of SGA. The analysis of the entire pipeline shows an important decrease in memory usage with only a moderate increase in running time.
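    The BWT indexing that the pipeline relies on can be illustrated with a naive in-memory construction. The whole point of LSG and BEETL is to compute this structure on disk for read sets far too large for the approach below; the `bwt` helper here is illustrative only, not part of either tool:

```python
def bwt(text):
    """Naive Burrows-Wheeler transform: append a sentinel, sort all cyclic
    rotations lexicographically, and take the last column.

    External-memory tools such as BEETL produce the same column without
    ever materializing the O(n^2) rotation table built here.
    """
    s = text + "$"  # unique, lexicographically smallest end marker
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(r[-1] for r in rotations)

# The classic example: bwt("banana") == "annb$aa"
```

The BWT is a permutation of the input, which is why an FM-index built on it can recover substring occurrences: the test below checks both the known result for "banana" and the permutation property.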

  16. Generation of Gaussian 09 Input Files for the Computation of 1H and 13C NMR Chemical Shifts of Structures from a Spartan’14 Conformational Search




    Authors: Spencer Reisbick & Patrick Willoughby. Abstract: This protocol describes an approach to preparing a series of Gaussian 09 computational input files for an ensemble of conformers generated in Spartan’14. The resulting input files are necessary for computing optimum geometries, relative conformer energies, and NMR shielding tensors using Gaussian. Using the conformational search feature within Spartan’14, an ensemble of conformational isomers was obtained. To convert the str...
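    The conversion step the protocol describes — turning each Spartan conformer into a Gaussian 09 input file — can be sketched as follows. The `write_gjf` helper, the route line (level of theory, the `NMR=GIAO` keyword), and the file naming are illustrative assumptions, not the protocol's exact script:

```python
def write_gjf(path, atoms, charge=0, mult=1,
              route="# B3LYP/6-31G(d) NMR=GIAO"):
    """Write a minimal Gaussian 09 input file (.gjf) for one conformer.

    atoms: list of (symbol, x, y, z) tuples, e.g. Cartesian coordinates
    exported from a Spartan'14 conformational search.
    The route line is illustrative; adjust method/basis to the protocol.
    """
    lines = [route, "", "conformer", "", f"{charge} {mult}"]
    lines += [f"{sym:2s} {x:12.6f} {y:12.6f} {z:12.6f}"
              for sym, x, y, z in atoms]
    lines.append("")  # Gaussian requires a trailing blank line
    with open(path, "w") as fh:
        fh.write("\n".join(lines) + "\n")
```

Looping this helper over every member of the ensemble yields one input file per conformer, ready for geometry optimization and shielding-tensor calculations.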

  17. Exploring Shifts in Middle School Learners' Modeling Activity While Generating Drawings, Animations, and Computational Simulations of Molecular Diffusion (United States)

    Wilkerson-Jerde, Michelle H.; Gravel, Brian E.; Macrander, Christopher A.


    Modeling and using technology are two practices of particular interest to K-12 science educators. These practices are inextricably linked among professionals, who engage in modeling activity with and across a variety of representational technologies. In this paper, we explore the practices of five sixth-grade girls as they generated models of smell diffusion using drawing, stop-motion animation, and computational simulation during a multi-day workshop. We analyze video, student discourse, and artifacts to address the questions: In what ways did learners' modeling practices, reasoning about mechanism, and ideas about smell shift as they worked across this variety of representational technologies? And, what supports enabled them to persist and progress in the modeling activity? We found that the girls engaged in two distinct modeling cycles that reflected persistence and deepening engagement in the task. In the first, messing about, they focused on describing and representing many ideas related to the spread of smell at once. In the second, digging in, they focused on testing and revising specific mechanisms that underlie smell diffusion. Upon deeper analysis, we found these cycles were linked to the girls' invention of "oogtom," a representational object that encapsulated many ideas from the first cycle and allowed the girls to restart modeling with the mechanistic focus required to construct simulations. We analyze the role of activity design, facilitation, and technological infrastructure in this pattern of engagement over the course of the workshop and discuss implications for future research, curriculum design, and classroom practice.

  18. A study of the first-generation pipeline embolization device morphology using intraoperative angiographic computed tomography (ACT)

    International Nuclear Information System (INIS)

    Aurboonyawat, Thaweesak; Schmidt, Paul J.; Piotin, Michel; Blanc, Raphael; Spelle, Laurant; Moret, Jacques


    The pipeline embolization device (PED, Chestnut Medical, Menlo Park, CA, USA) has been used in our department since September 2008. The first-generation PED had limited radio-opacity. Before September 2008, we began obtaining an angiographic computed tomography (ACT) before and after each procedure to detect intracranial complications. We retrospectively examined the ACT of our patients with the PED to evaluate the in vivo stent morphology. Twelve patients had a PED placed in our department from September 2008 to January 2009. The stent morphology (stent profile and wall apposition) of three segments of each stent was evaluated. Metal coils adjacent to the stent created too much artifact to evaluate the stent morphology in 4 of 12 patients. Two of the 12 patients were excluded for other reasons. Post-processing of the ACT images was necessary to optimize the evaluation of the stent morphology. Six intracranial PEDs could be adequately evaluated by the ACT, and for these particular cases, 18 of 18 stent segments showed an optimal stent profile and 14 of 18 stent segments showed optimal arterial wall apposition. ACT provided detailed images of the morphology of the PED in six patients. ACT helped detect two stent segments that required balloon dilation to improve the stent-arterial wall apposition; and during the retrospective analysis (after refining post-processing techniques), we identified one additional stent with suboptimal arterial wall apposition. The main limitation of the ACT was the additional radiation dose to the patient. (orig.)

  19. Pixel-size-maintained image reconstruction of digital holograms on arbitrarily tilted planes by the angular spectrum method. (United States)

    Jeong, Seung Jun; Hong, Chung Ki


    We present an effective method for the pixel-size-maintained reconstruction of images on arbitrarily tilted planes in digital holography. The method is based on the plane wave expansion of the diffracted wave fields and the three-axis rotation of the wave vectors. The images on the tilted planes are reconstructed without loss of the frequency content of the hologram and have the same pixel sizes. Our method gives good results even in the extreme cases of large tilt angles and at distances shorter than the paraxial approximation allows. The effectiveness of the method is demonstrated by both simulation and experiment.
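    The parallel-plane core of the angular spectrum method can be sketched in a few lines of numpy. The paper's tilted-plane reconstruction additionally rotates the wave vectors about three axes and resamples the spectrum, which this illustrative `angular_spectrum_propagate` helper (assumed names and a square grid) omits:

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a sampled complex field to a parallel plane at distance z
    using the angular spectrum (plane-wave expansion) method.

    field: square N x N complex array sampled at pitch dx.
    Each plane-wave component (kx, ky) is advanced by exp(i * kz * z);
    evanescent components (kz^2 < 0) are discarded.
    """
    n = field.shape[0]
    k = 2 * np.pi / wavelength
    fx = np.fft.fftfreq(n, d=dx)               # spatial frequencies
    fxx, fyy = np.meshgrid(fx, fx)
    kz_sq = k**2 - (2 * np.pi * fxx)**2 - (2 * np.pi * fyy)**2
    kz = np.sqrt(np.maximum(kz_sq, 0.0))
    H = np.exp(1j * kz * z) * (kz_sq > 0)      # transfer function
    return np.fft.ifft2(np.fft.fft2(field) * H)
```

Because the transfer function has unit modulus on the propagating components, the reconstruction preserves the hologram's frequency content; the tilted-plane extension applies a rotation to (kx, ky, kz) before the inverse transform.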

  20. Spiral phase plates with radial discontinuities for the generation of multiring orbital angular momentum beams: fabrication, characterization, and application (United States)

    Ruffato, Gianluca; Massari, Michele; Carli, Marta; Romanato, Filippo


    A design of spiral phase plates for the generation of multiring beams carrying orbital angular momentum (OAM) is presented. Besides the usual helical profile, these phase plates present radial π-discontinuities at the zeros of the associated Laguerre polynomials. Samples were fabricated by electron beam lithography over glass substrates coated with a polymethylmethacrylate resist layer. The optical response was analyzed and the purity of the generated beams was investigated in terms of Laguerre-Gaussian mode contributions. The far-field intensity pattern was compared with theoretical models and numerical simulations, while the expected phase features were confirmed by interferometric analysis with a Mach-Zehnder setup. The high quality of the output beams confirms the applicability of these phase plates for the generation of high-order OAM beams with nonzero radial index. An application consisting of the design of computer-generated holograms encoding information for light beams carrying phase singularities is presented and described. A numerical code based on an iterative Fourier transform algorithm has been developed for the computation of phase-only diffractive optical elements for illumination under OAM beams. Numerical analysis and preliminary experimental results confirm the applicability of these devices as high-security optical elements for anticounterfeiting applications.
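    The phase profile described — a helical term ℓφ plus radial π-jumps wherever the associated Laguerre polynomial L_p^{|ℓ|} changes sign — can be sketched as follows. The helper names and the beam-waist scaling `w0` are assumptions for illustration, not the paper's design parameters:

```python
import numpy as np

def genlaguerre(p, a, x):
    """Generalized Laguerre polynomial L_p^a(x) via the standard
    three-term recurrence."""
    if p == 0:
        return np.ones_like(x)
    lk_1, lk = np.ones_like(x), 1 + a - x
    for k in range(2, p + 1):
        lk_1, lk = lk, ((2 * k - 1 + a - x) * lk
                        - (k - 1 + a) * lk_1) / k
    return lk

def spp_phase(ell, p, r, phi, w0=1.0):
    """Phase (mod 2π) of a spiral phase plate for an LG_{p,ell}-like
    multiring beam: helical term ell*phi plus a π jump where
    L_p^{|ell|}(2 r^2 / w0^2) is negative (the radial discontinuities
    of the paper). w0 is an assumed beam-waist scaling."""
    radial_sign = genlaguerre(p, abs(ell), 2 * r**2 / w0**2) < 0
    return np.mod(ell * phi + np.pi * radial_sign, 2 * np.pi)
```

Evaluating `spp_phase` on a polar grid and quantizing it to the lithography depth levels would give the height map of one plate; for p = 0 the radial term vanishes and the profile reduces to the usual single-ring spiral plate.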