WorldWideScience

Sample records for streak camera based

  1. Design of microcontroller based system for automation of streak camera

    International Nuclear Information System (INIS)

    Joshi, M. J.; Upadhyay, J.; Deshpande, P. P.; Sharma, M. L.; Navathe, C. P.

    2010-01-01

    A microcontroller-based system has been developed for automation of the S-20 optical streak camera, which is used as a diagnostic tool to measure ultrafast light phenomena. An 8-bit MCS-family microcontroller is employed to generate all control signals for the streak camera. All biasing voltages required for the various electrodes of the tubes are generated using dc-to-dc converters. A high-voltage ramp signal is generated through a step generator unit followed by an integrator circuit and is applied to the camera's deflecting plates. The slope of the ramp can be changed by varying the values of the capacitor and inductor. A programmable digital delay generator has been developed for synchronization of the ramp signal with the optical signal. An independent hardwired interlock circuit has been developed for machine safety. A LabVIEW-based graphical user interface has been developed which enables the user to program the settings of the camera and capture the image. The image is displayed with intensity profiles along the horizontal and vertical axes. The streak camera was calibrated using nanosecond and femtosecond lasers.
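
    As a rough illustration of the sweep electronics described above, the ramp slope and the tube's deflection sensitivity together fix the streak speed and full-sweep time. The following sketch uses hypothetical numbers (ramp slope, deflection sensitivity, screen width) that are not taken from the paper.

    ```python
    # Illustrative sketch (hypothetical numbers): how a deflection ramp's slope
    # sets the streak (sweep) speed and full-sweep time on the phosphor screen.
    # None of these values are taken from the paper above.

    ramp_slope_v_per_ns = 50.0              # assumed high-voltage ramp slope [V/ns]
    deflection_sensitivity_v_per_mm = 15.0  # assumed tube deflection sensitivity [V/mm]
    screen_width_mm = 40.0                  # assumed usable screen width [mm]

    # Streak speed on the screen: how fast the electron spot moves per unit time.
    streak_speed_mm_per_ns = ramp_slope_v_per_ns / deflection_sensitivity_v_per_mm

    # Full-sweep time: time for the spot to cross the whole screen.
    full_sweep_time_ns = screen_width_mm / streak_speed_mm_per_ns

    print(f"streak speed   : {streak_speed_mm_per_ns:.2f} mm/ns")
    print(f"full-sweep time: {full_sweep_time_ns:.1f} ns")
    ```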

  2. Design of microcontroller based system for automation of streak camera.

    Science.gov (United States)

    Joshi, M J; Upadhyay, J; Deshpande, P P; Sharma, M L; Navathe, C P

    2010-08-01

    A microcontroller-based system has been developed for automation of the S-20 optical streak camera, which is used as a diagnostic tool to measure ultrafast light phenomena. An 8-bit MCS-family microcontroller is employed to generate all control signals for the streak camera. All biasing voltages required for the various electrodes of the tubes are generated using dc-to-dc converters. A high-voltage ramp signal is generated through a step generator unit followed by an integrator circuit and is applied to the camera's deflecting plates. The slope of the ramp can be changed by varying the values of the capacitor and inductor. A programmable digital delay generator has been developed for synchronization of the ramp signal with the optical signal. An independent hardwired interlock circuit has been developed for machine safety. A LabVIEW-based graphical user interface has been developed which enables the user to program the settings of the camera and capture the image. The image is displayed with intensity profiles along the horizontal and vertical axes. The streak camera was calibrated using nanosecond and femtosecond lasers.

  3. Design of microcontroller based system for automation of streak camera

    Energy Technology Data Exchange (ETDEWEB)

    Joshi, M. J.; Upadhyay, J.; Deshpande, P. P.; Sharma, M. L.; Navathe, C. P. [Laser Electronics Support Division, RRCAT, Indore 452013 (India)

    2010-08-15

    A microcontroller-based system has been developed for automation of the S-20 optical streak camera, which is used as a diagnostic tool to measure ultrafast light phenomena. An 8-bit MCS-family microcontroller is employed to generate all control signals for the streak camera. All biasing voltages required for the various electrodes of the tubes are generated using dc-to-dc converters. A high-voltage ramp signal is generated through a step generator unit followed by an integrator circuit and is applied to the camera's deflecting plates. The slope of the ramp can be changed by varying the values of the capacitor and inductor. A programmable digital delay generator has been developed for synchronization of the ramp signal with the optical signal. An independent hardwired interlock circuit has been developed for machine safety. A LabVIEW-based graphical user interface has been developed which enables the user to program the settings of the camera and capture the image. The image is displayed with intensity profiles along the horizontal and vertical axes. The streak camera was calibrated using nanosecond and femtosecond lasers.

  4. Soft x-ray streak cameras

    International Nuclear Information System (INIS)

    Stradling, G.L.

    1988-01-01

    This paper discusses the development and current state of the art of picosecond soft x-ray streak camera technology. Accomplishments from a number of institutions are discussed. X-ray streak cameras differ from standard visible streak camera designs in the use of an x-ray transmitting window and an x-ray sensitive photocathode. The spectral sensitivity range of these instruments includes portions of the near UV and extends from the subkilovolt x-ray region to several tens of kilovolts. Attendant challenges encountered in the design and use of x-ray streak cameras include the accommodation of high-voltage and vacuum requirements, as well as manipulation of a photocathode structure which is often fragile. The x-ray transmitting window is generally too fragile to withstand atmospheric pressure, necessitating active vacuum pumping and a vacuum line of sight to the x-ray signal source. Because of the difficulty of manipulating x-ray beams with conventional optics, as is done with visible light, the size of the photocathode sensing area, access to the front of the tube, the ability to insert the streak tube into a vacuum chamber, and the capability to trigger the sweep with very short internal delay times are issues uniquely relevant to x-ray streak camera use. The physics of electron imaging may place more stringent limitations on the temporal and spatial resolution obtainable with x-ray photocathodes than with their visible counterparts. Other issues which are common to the entire streak camera community also concern x-ray streak camera users and manufacturers.

  5. Laser-based terahertz-field-driven streak camera for the temporal characterization of ultrashort processes

    Energy Technology Data Exchange (ETDEWEB)

    Schuette, Bernd

    2011-09-15

    In this work, a novel laser-based terahertz-field-driven streak camera is presented. It allows for pulse-length characterization of femtosecond (fs) extreme ultraviolet (XUV) pulses by cross-correlation with terahertz (THz) pulses generated with a Ti:sapphire laser. The XUV pulses are emitted by a high-order harmonic generation (HHG) source in which an intense near-infrared (NIR) fs laser pulse is focused into a gaseous medium. The design and characterization of a high-intensity THz source needed for the streak camera is also part of this thesis. The source is based on optical rectification of the same NIR laser pulse in a lithium niobate crystal. For this purpose, the pulse front of the NIR beam is tilted via a diffraction grating to achieve velocity matching between the NIR and THz beams within the crystal. For the temporal characterization of the XUV pulses, both the HHG and THz beams are focused onto a gas target. The harmonic radiation creates photoelectron wavepackets which are then accelerated by the THz field depending on its phase at the time of ionization. This principle, adopted from the conventional streak camera, is now widely used in attosecond metrology. The streak camera presented here is an advancement of a terahertz-field-driven streak camera implemented at the Free-Electron Laser in Hamburg (FLASH). The advantages of the laser-based streak camera lie in its compactness, cost efficiency, and accessibility, while providing measurements of the same quality as those obtained at FLASH. In addition, its flexibility allows for a systematic investigation of streaked Auger spectra, which is presented in this thesis. With its fs time resolution, the terahertz-field-driven streak camera thereby bridges the gap between attosecond and conventional streak cameras. (orig.)
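
    The streaking principle summarized above maps a photoelectron's kinetic-energy shift onto its ionization time within the locally linear region of the THz vector potential. The sketch below illustrates that mapping with a hypothetical streaking-speed calibration; the value is not taken from the thesis.

    ```python
    # Illustrative sketch of the streaking principle described above: in the
    # (approximately) linear region of the THz vector potential, a photoelectron's
    # kinetic-energy shift maps linearly onto its ionization time. The streaking
    # speed (eV per fs) below is a hypothetical calibration value, not from the thesis.

    import numpy as np

    streaking_speed_ev_per_fs = 0.05   # assumed calibrated slope of energy shift vs. time
    measured_energy_shifts_ev = np.array([-0.8, -0.2, 0.0, 0.3, 0.9])  # example shifts

    # Relative ionization times (fs) with respect to the zero crossing of the shift.
    relative_times_fs = measured_energy_shifts_ev / streaking_speed_ev_per_fs
    print(relative_times_fs)
    ```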

  6. Laser-based terahertz-field-driven streak camera for the temporal characterization of ultrashort processes

    International Nuclear Information System (INIS)

    Schuette, Bernd

    2011-09-01

    In this work, a novel laser-based terahertz-field-driven streak camera is presented. It allows for pulse-length characterization of femtosecond (fs) extreme ultraviolet (XUV) pulses by cross-correlation with terahertz (THz) pulses generated with a Ti:sapphire laser. The XUV pulses are emitted by a high-order harmonic generation (HHG) source in which an intense near-infrared (NIR) fs laser pulse is focused into a gaseous medium. The design and characterization of a high-intensity THz source needed for the streak camera is also part of this thesis. The source is based on optical rectification of the same NIR laser pulse in a lithium niobate crystal. For this purpose, the pulse front of the NIR beam is tilted via a diffraction grating to achieve velocity matching between the NIR and THz beams within the crystal. For the temporal characterization of the XUV pulses, both the HHG and THz beams are focused onto a gas target. The harmonic radiation creates photoelectron wavepackets which are then accelerated by the THz field depending on its phase at the time of ionization. This principle, adopted from the conventional streak camera, is now widely used in attosecond metrology. The streak camera presented here is an advancement of a terahertz-field-driven streak camera implemented at the Free-Electron Laser in Hamburg (FLASH). The advantages of the laser-based streak camera lie in its compactness, cost efficiency, and accessibility, while providing measurements of the same quality as those obtained at FLASH. In addition, its flexibility allows for a systematic investigation of streaked Auger spectra, which is presented in this thesis. With its fs time resolution, the terahertz-field-driven streak camera thereby bridges the gap between attosecond and conventional streak cameras. (orig.)

  7. Compact Optical Technique for Streak Camera Calibration

    International Nuclear Information System (INIS)

    Bell, P; Griffith, R; Hagans, K; Lerche, R; Allen, C; Davies, T; Janson, F; Justin, R; Marshall, B; Sweningsen, O

    2004-01-01

    The National Ignition Facility (NIF) is under construction at the Lawrence Livermore National Laboratory (LLNL) for the U.S. Department of Energy Stockpile Stewardship Program. Optical streak cameras are an integral part of the experimental diagnostics instrumentation. To accurately reduce data from the streak cameras, a temporal calibration is required. This article describes a technique for generating trains of precisely timed short-duration optical pulses (optical comb generators) that are suitable for temporal calibrations. These optical comb generators (Figure 1) are used with the LLNL optical streak cameras. They are small, portable light sources that produce a series of temporally short, uniformly spaced optical pulses. Comb generators have been produced with 0.1, 0.5, 1, 3, 6, and 10-GHz pulse trains of 780-nm wavelength light with individual pulse durations of ∼25-ps FWHM. Signal output is via a fiber-optic connector. The signal is transported from comb generator to streak camera through multi-mode, graded-index optical fibers. At the NIF, ultrafast streak cameras are used by the Laser Fusion Program experimentalists to record fast transient optical signals. Their temporal resolution is unmatched by any other transient recorder. Their ability to spatially discriminate an image along the input slit allows them to function as a one-dimensional image recorder, time-resolved spectrometer, or multichannel transient recorder. Depending on the choice of photocathode, they can be made sensitive to photon energies from 1.1 eV to 30 keV and beyond. Comb generators perform two important functions for LLNL streak-camera users. First, comb generators are used as precision time-mark generators for calibrating streak camera sweep rates. Accuracy is achieved by averaging many streak camera images of comb generator signals. Time-base calibrations with portable comb generators are easily done both in the calibration laboratory and in situ. Second, comb signals are applied
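
    The sweep-rate calibration described above follows from the known comb spacing: the comb pulses are separated in time by 1/f_comb, so fitting their measured peak positions against those known times yields the sweep rate (and, with a higher-order fit, the sweep nonlinearity). The sketch below uses invented peak positions purely for illustration.

    ```python
    # Illustrative sketch of a temporal sweep-rate calibration with a comb generator:
    # the comb pulses are equally spaced in time (1/f_comb), so fitting their measured
    # peak positions on the streak record against their known times gives the sweep rate.
    # Peak positions below are invented example numbers, not measured data.

    import numpy as np

    comb_frequency_hz = 3e9                       # e.g. a 3-GHz comb -> ~333 ps spacing
    pulse_spacing_ps = 1e12 / comb_frequency_hz

    # Hypothetical measured peak positions of successive comb pulses (in pixels).
    peak_positions_px = np.array([102.3, 180.9, 259.1, 337.8, 416.0])
    pulse_times_ps = np.arange(len(peak_positions_px)) * pulse_spacing_ps

    # Linear fit: time = a * pixel + b. The slope 'a' is the sweep rate (ps per pixel);
    # a higher-order polynomial would additionally capture sweep nonlinearity.
    a, b = np.polyfit(peak_positions_px, pulse_times_ps, 1)
    print(f"sweep rate ~ {a:.3f} ps/pixel")
    ```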

  8. rf streak camera based ultrafast relativistic electron diffraction.

    Science.gov (United States)

    Musumeci, P; Moody, J T; Scoby, C M; Gutierrez, M S; Tran, T

    2009-01-01

    We theoretically and experimentally investigate the possibility of using an rf streak camera to time resolve, in a single shot, structural changes at the sub-100 fs time scale via relativistic electron diffraction. We experimentally tested this novel concept at the UCLA Pegasus rf photoinjector. Time-resolved diffraction patterns from a thin Al foil are recorded. Averaging over 50 shots is required in order to obtain sufficient statistics to uncover a variation in time of the diffraction patterns. In the absence of an external pump laser, this variation is attributed to the energy chirp of the beam exiting the electron gun. With further improvements to the electron source, rf streak camera based ultrafast electron diffraction has the potential to yield truly single-shot measurements of ultrafast processes.

  9. Streak camera recording of interferometer fringes

    International Nuclear Information System (INIS)

    Parker, N.L.; Chau, H.H.

    1977-01-01

    The use of an electronic high-speed camera in the streaking mode to record interference fringe motion from a velocity interferometer is discussed. Advantages of this method over the photomultiplier-tube/oscilloscope approach are delineated. Performance testing and data for the electronic streak camera are discussed. The velocity profile of a Mylar flyer accelerated by an electrically exploded bridge, and the jump-off velocity of metal targets struck by these Mylar flyers, are measured in the camera tests. Advantages of the streak camera include portability, low cost, ease of operation and maintenance, simplified interferometer optics, and rapid data analysis.

  10. Microprocessor-controlled wide-range streak camera

    Science.gov (United States)

    Lewis, Amy E.; Hollabaugh, Craig

    2006-08-01

    Bechtel Nevada/NSTec recently announced deployment of their fifth-generation streak camera. This camera incorporates many advanced features beyond those currently available for streak cameras. The arc-resistant driver includes a trigger lockout mechanism, actively monitors input trigger levels, and incorporates a high-voltage fault interrupter for user safety and tube protection. The camera is completely modular and may deflect over a variable full-sweep time of 15 nanoseconds to 500 microseconds. The camera design is compatible with both large- and small-format commercial tubes from several vendors. The embedded microprocessor offers Ethernet connectivity and XML [extensible markup language]-based configuration management with non-volatile parameter storage using flash-based storage media. The camera's user interface is platform-independent (Microsoft Windows, Unix, Linux, Macintosh OS X) and is accessible using an AJAX [asynchronous JavaScript and XML]-equipped modern browser, such as Internet Explorer 6, Firefox, or Safari. User interface operation requires no installation of client software or browser plug-in technology. Automation software can also access the camera configuration and control using HTTP [hypertext transfer protocol]. The software architecture supports multiple simultaneous clients, multiple cameras, and multiple-module access with a standard browser. The entire user interface can be customized.
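
    Because the camera exposes its configuration as XML over HTTP, a client can read it with a plain HTTP GET. The sketch below illustrates this with Python's standard library; the host name and the /config path are hypothetical placeholders, since the abstract does not specify the actual URL scheme.

    ```python
    # Hedged sketch: reading an XML configuration document from an HTTP-accessible
    # streak-camera controller. The host name and '/config' path are hypothetical
    # placeholders; the abstract only states that configuration is XML over HTTP.

    import urllib.request
    import xml.etree.ElementTree as ET

    CAMERA_URL = "http://streak-camera.local/config"   # hypothetical address

    with urllib.request.urlopen(CAMERA_URL, timeout=5) as response:
        config_xml = response.read()

    root = ET.fromstring(config_xml)
    # Print every element/value pair in the (hypothetical) configuration document.
    for element in root.iter():
        if element.text and element.text.strip():
            print(element.tag, "=", element.text.strip())
    ```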

  11. Microprocessor-controlled, wide-range streak camera

    International Nuclear Information System (INIS)

    Amy E. Lewis; Craig Hollabaugh

    2006-01-01

    Bechtel Nevada/NSTec recently announced deployment of their fifth-generation streak camera. This camera incorporates many advanced features beyond those currently available for streak cameras. The arc-resistant driver includes a trigger lockout mechanism, actively monitors input trigger levels, and incorporates a high-voltage fault interrupter for user safety and tube protection. The camera is completely modular and may deflect over a variable full-sweep time of 15 nanoseconds to 500 microseconds. The camera design is compatible with both large- and small-format commercial tubes from several vendors. The embedded microprocessor offers Ethernet connectivity and XML [extensible markup language]-based configuration management with non-volatile parameter storage using flash-based storage media. The camera's user interface is platform-independent (Microsoft Windows, Unix, Linux, Macintosh OS X) and is accessible using an AJAX [asynchronous JavaScript and XML]-equipped modern browser, such as Internet Explorer 6, Firefox, or Safari. User interface operation requires no installation of client software or browser plug-in technology. Automation software can also access the camera configuration and control using HTTP [hypertext transfer protocol]. The software architecture supports multiple simultaneous clients, multiple cameras, and multiple-module access with a standard browser. The entire user interface can be customized.

  12. Notes on the IMACON 500 streak camera system

    International Nuclear Information System (INIS)

    Clendenin, J.E.

    1985-01-01

    The notes provided are intended to supplement the instruction manual for the IMACON 500 streak camera system. The notes cover the streak analyzer, instructions for timing the streak camera, and calibration.

  13. Traveling wave deflector design for femtosecond streak camera

    International Nuclear Information System (INIS)

    Pei, Chengquan; Wu, Shengli; Luo, Duan; Wen, Wenlong; Xu, Junkai; Tian, Jinshou; Zhang, Minrui; Chen, Pin; Chen, Jianzhong; Liu, Rong

    2017-01-01

    In this paper, a traveling-wave deflector (TWD) with a slow-wave property induced by a microstrip transmission line is proposed for femtosecond streak cameras. Its passband and dispersion properties were simulated. In addition, the dynamic temporal resolution of the femtosecond camera was simulated using CST software. The results showed that with the proposed TWD a femtosecond streak camera can achieve a dynamic temporal resolution of less than 600 fs. Experiments were done to test the femtosecond streak camera, and an 800 fs dynamic temporal resolution was obtained. Guidance is provided for optimizing a femtosecond streak camera to obtain higher temporal resolution.

  14. Traveling wave deflector design for femtosecond streak camera

    Energy Technology Data Exchange (ETDEWEB)

    Pei, Chengquan; Wu, Shengli [Key Laboratory for Physical Electronics and Devices of the Ministry of Education, Xi'an Jiaotong University, Xi'an 710049 (China); Luo, Duan [Xi'an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences, Xi'an 710119 (China); University of Chinese Academy of Sciences, Beijing 100049 (China); Wen, Wenlong [Xi'an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences, Xi'an 710119 (China); Xu, Junkai [Xi'an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences, Xi'an 710119 (China); University of Chinese Academy of Sciences, Beijing 100049 (China); Tian, Jinshou, E-mail: tianjs@opt.ac.cn [Xi'an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences, Xi'an 710119 (China); Collaborative Innovation Center of Extreme Optics, Shanxi University, Taiyuan, Shanxi 030006 (China); Zhang, Minrui; Chen, Pin [Xi'an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences, Xi'an 710119 (China); University of Chinese Academy of Sciences, Beijing 100049 (China); Chen, Jianzhong [Key Laboratory for Physical Electronics and Devices of the Ministry of Education, Xi'an Jiaotong University, Xi'an 710049 (China); Liu, Rong [Xi'an Technological University, Xi'an 710021 (China)

    2017-05-21

    In this paper, a traveling-wave deflector (TWD) with a slow-wave property induced by a microstrip transmission line is proposed for femtosecond streak cameras. Its passband and dispersion properties were simulated. In addition, the dynamic temporal resolution of the femtosecond camera was simulated using CST software. The results showed that with the proposed TWD a femtosecond streak camera can achieve a dynamic temporal resolution of less than 600 fs. Experiments were done to test the femtosecond streak camera, and an 800 fs dynamic temporal resolution was obtained. Guidance is provided for optimizing a femtosecond streak camera to obtain higher temporal resolution.

  15. Microprocessor-controlled, wide-range streak camera

    Energy Technology Data Exchange (ETDEWEB)

    Amy E. Lewis, Craig Hollabaugh

    2006-09-01

    Bechtel Nevada/NSTec recently announced deployment of their fifth-generation streak camera. This camera incorporates many advanced features beyond those currently available for streak cameras. The arc-resistant driver includes a trigger lockout mechanism, actively monitors input trigger levels, and incorporates a high-voltage fault interrupter for user safety and tube protection. The camera is completely modular and may deflect over a variable full-sweep time of 15 nanoseconds to 500 microseconds. The camera design is compatible with both large- and small-format commercial tubes from several vendors. The embedded microprocessor offers Ethernet connectivity and XML [extensible markup language]-based configuration management with non-volatile parameter storage using flash-based storage media. The camera's user interface is platform-independent (Microsoft Windows, Unix, Linux, Macintosh OS X) and is accessible using an AJAX [asynchronous JavaScript and XML]-equipped modern browser, such as Internet Explorer 6, Firefox, or Safari. User interface operation requires no installation of client software or browser plug-in technology. Automation software can also access the camera configuration and control using HTTP [hypertext transfer protocol]. The software architecture supports multiple simultaneous clients, multiple cameras, and multiple-module access with a standard browser. The entire user interface can be customized.

  16. Ultra fast x-ray streak camera

    International Nuclear Information System (INIS)

    Coleman, L.W.; McConaghy, C.F.

    1975-01-01

    A unique ultrafast x-ray sensitive streak camera, with a time resolution of 50 ps, has been built and operated. A 100-Å-thick gold photocathode on a beryllium vacuum window is used in a modified commercial image converter tube. The x-ray streak camera has been used in experiments to observe time-resolved emission from laser-produced plasmas. (author)

  17. Sweep devices for picosecond image-converter streak cameras

    International Nuclear Information System (INIS)

    Cunin, B.; Miehe, J.A.; Sipp, B.; Schelev, M.Ya.; Serduchenko, J.N.; Thebault, J.

    1979-01-01

    Four different sweep devices based on microwave tubes, avalanche transistors, krytrons, and laser-triggered spark gaps are treated in detail. These control circuits are developed for picosecond image-converter cameras and generate sweep pulses providing streak speeds in the range of 10^7 to 5×10^10 cm/s with maximum time resolution better than 10^-12 s. Special low-jitter triggering schemes reduce the jitter to less than 5×10^-11 s. Some problems arising in the construction and matching of the sweep devices and image-streak tube are discussed. Comparative parameters of nanosecond switching elements are presented. The results described can be used by others involved in streak camera development.

  18. A time-resolved image sensor for tubeless streak cameras

    Science.gov (United States)

    Yasutomi, Keita; Han, SangMan; Seo, Min-Woong; Takasawa, Taishi; Kagawa, Keiichiro; Kawahito, Shoji

    2014-03-01

    This paper presents a time-resolved CMOS image sensor with draining-only modulation (DOM) pixels for tube-less streak cameras. Although the conventional streak camera has high time resolution, the device requires high voltage and a bulky system due to its vacuum-tube structure. The proposed time-resolved imager with simple optics realizes a streak camera without any vacuum tubes. The proposed image sensor has DOM pixels, a delay-based pulse generator, and readout circuitry. The delay-based pulse generator in combination with an in-pixel logic allows us to create and provide a short gating clock to the pixel array. A prototype time-resolved CMOS image sensor with the proposed pixel is designed and implemented using 0.11-μm CMOS image sensor technology. The image array has 30 (vertical) × 128 (memory length) pixels with a pixel pitch of 22.4 μm.

  19. Characterization of X-ray streak cameras for use on Nova

    International Nuclear Information System (INIS)

    Kalantar, D.H.; Bell, P.M.; Costa, R.L.; Hammel, B.A.; Landen, O.L.; Orzechowski, T.J.; Hares, J.D.; Dymoke-Bradshaw, A.K.L.

    1996-09-01

    There are many different types of measurements that require a continuous time history of x-ray emission, which can be provided with an x-ray streak camera. In order to properly analyze the images that are recorded with the x-ray streak cameras operated on Nova, it is important to account for the streak characterization of each camera. We have performed a number of calibrations of the streak cameras, both on the bench and with Nova disk target shots, where we use a time-modulated laser intensity profile (self-beating of the laser) on the target to generate an x-ray comb. We have measured the streak camera sweep direction and spatial offset, curvature of the electron optics, sweep rate, and magnification and resolution of the electron optics.

  20. Compact Optical Technique for Streak Camera Calibration

    International Nuclear Information System (INIS)

    Curt Allen; Terence Davies; Frans Janson; Ronald Justin; Bruce Marshall; Oliver Sweningsen; Perry Bell; Roger Griffith; Karla Hagans; Richard Lerche

    2004-01-01

    The National Ignition Facility is under construction at the Lawrence Livermore National Laboratory for the U.S. Department of Energy Stockpile Stewardship Program. Optical streak cameras are an integral part of the experimental diagnostics instrumentation. To accurately reduce data from the streak cameras, a temporal calibration is required. This article describes a technique for generating trains of precisely timed short-duration optical pulses that are suitable for temporal calibrations.

  1. Reliable and repeatable characterization of optical streak cameras

    International Nuclear Information System (INIS)

    Charest, Michael R. Jr.; Torres, Peter III; Silbernagel, Christopher T.; Kalantar, Daniel H.

    2008-01-01

    Optical streak cameras are used as primary diagnostics for a wide range of physics and laser experiments at facilities such as the National Ignition Facility. To meet the strict accuracy requirements needed for these experiments, the systematic nonlinearities of the streak cameras (attributed to nonlinearities in the optical and electrical components that make up the streak camera system) must be characterized. In some cases the characterization information is used as a guide to help determine how experiment data should be taken. In other cases, the characterization data are applied to the raw data images to correct for the nonlinearities. In order to characterize an optical streak camera, a specific set of data is collected, in which the responses to defined inputs are recorded. A set of analysis software routines has been developed to extract information such as spatial resolution, dynamic range, and temporal resolution from this data set. The routines are highly automated, requiring very little user input, and thus provide very reliable and repeatable results that are not subject to interpretation. An emphasis on quality control has been placed on these routines due to the high importance of the camera characterization information.
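
    As a rough sketch of two of the quantities such routines extract, the snippet below estimates dynamic range (usable signal ceiling over the background noise) and temporal resolution (FWHM of a streaked impulse) from a synthetic record. The thresholds, regions, and calibration values are assumptions for illustration, not the NIF algorithms.

    ```python
    # Hedged sketch of two characterization quantities mentioned above, applied to a
    # synthetic streak record: dynamic range and temporal resolution. The exact
    # algorithms used by the facility routines are not specified in the abstract.

    import numpy as np

    def dynamic_range(image, saturation_level):
        """Ratio of the largest usable signal to the background noise (1 sigma)."""
        background = image[:, :16]          # assume the first columns are signal-free
        noise = background.std()
        peak = min(image.max(), saturation_level)
        return peak / noise

    def temporal_fwhm(profile, ps_per_pixel):
        """FWHM of a streaked impulse-response profile, in picoseconds."""
        profile = profile - profile.min()
        half = profile.max() / 2.0
        above = np.where(profile >= half)[0]
        return (above[-1] - above[0]) * ps_per_pixel

    # Synthetic example data: Gaussian impulse on a noisy background.
    rng = np.random.default_rng(0)
    image = rng.normal(10.0, 2.0, size=(256, 512))
    t = np.arange(512)
    image[120:136, :] += 3000.0 * np.exp(-((t - 250.0) / 8.0) ** 2)[None, :]

    print("dynamic range ~", round(dynamic_range(image, saturation_level=4095), 1))
    print("temporal FWHM ~", round(temporal_fwhm(image[128], ps_per_pixel=2.0), 1), "ps")
    ```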

  2. Reliable and Repeatable Characterization of Optical Streak Cameras

    International Nuclear Information System (INIS)

    Kalantar, D; Charest, M; Torres III, P; Charest, M

    2008-01-01

    Optical streak cameras are used as primary diagnostics for a wide range of physics and laser experiments at facilities such as the National Ignition Facility (NIF). To meet the strict accuracy requirements needed for these experiments, the systematic nonlinearities of the streak cameras (attributed to nonlinearities in the optical and electrical components that make up the streak camera system) must be characterized. In some cases the characterization information is used as a guide to help determine how experiment data should be taken. In other cases, the characterization data are applied to the raw data images to correct for the nonlinearities. In order to characterize an optical streak camera, a specific set of data is collected, in which the responses to defined inputs are recorded. A set of analysis software routines has been developed to extract information such as spatial resolution, dynamic range, and temporal resolution from this data set. The routines are highly automated, requiring very little user input, and thus provide very reliable and repeatable results that are not subject to interpretation. An emphasis on quality control has been placed on these routines due to the high importance of the camera characterization information.

  3. Reliable and Repeatable Characterization of Optical Streak Cameras

    Energy Technology Data Exchange (ETDEWEB)

    Michael Charest Jr., Peter Torres III, Christopher Silbernagel, and Daniel Kalantar

    2008-10-31

    Optical streak cameras are used as primary diagnostics for a wide range of physics and laser experiments at facilities such as the National Ignition Facility (NIF). To meet the strict accuracy requirements needed for these experiments, the systematic nonlinearities of the streak cameras (attributed to nonlinearities in the optical and electrical components that make up the streak camera system) must be characterized. In some cases the characterization information is used as a guide to help determine how experiment data should be taken. In other cases, the characterization data are applied to the raw data images to correct for the nonlinearities. In order to characterize an optical streak camera, a specific set of data is collected, in which the responses to defined inputs are recorded. A set of analysis software routines has been developed to extract information such as spatial resolution, dynamic range, and temporal resolution from this data set. The routines are highly automated, requiring very little user input, and thus provide very reliable and repeatable results that are not subject to interpretation. An emphasis on quality control has been placed on these routines due to the high importance of the camera characterization information.

  4. Reliable and Repeatable Characterization of Optical Streak Cameras

    Energy Technology Data Exchange (ETDEWEB)

    Kalantar, D; Charest, M; Torres III, P; Charest, M

    2008-05-06

    Optical streak cameras are used as primary diagnostics for a wide range of physics and laser experiments at facilities such as the National Ignition Facility (NIF). To meet the strict accuracy requirements needed for these experiments, the systematic nonlinearities of the streak cameras (attributed to nonlinearities in the optical and electrical components that make up the streak camera system) must be characterized. In some cases the characterization information is used as a guide to help determine how experiment data should be taken. In other cases, the characterization data are applied to the raw data images to correct for the nonlinearities. In order to characterize an optical streak camera, a specific set of data is collected, in which the responses to defined inputs are recorded. A set of analysis software routines has been developed to extract information such as spatial resolution, dynamic range, and temporal resolution from this data set. The routines are highly automated, requiring very little user input, and thus provide very reliable and repeatable results that are not subject to interpretation. An emphasis on quality control has been placed on these routines due to the high importance of the camera characterization information.

  5. Triggered streak and framing rotating-mirror cameras

    International Nuclear Information System (INIS)

    Huston, A.E.; Tabrar, A.

    1975-01-01

    A pulse motor has been developed which enables a mirror to be rotated to speeds in excess of 20,000 rpm within 10^-4 s. High-speed cameras of both streak and framing type have been assembled which incorporate this mirror drive, giving streak writing speeds up to 2000 m s^-1 and framing speeds up to 500,000 frames s^-1, in each case with the capability of triggering the camera from the event under investigation. (author)

  6. Soft x-ray streak camera for laser fusion applications

    International Nuclear Information System (INIS)

    Stradling, G.L.

    1981-04-01

    This thesis reviews the development and significance of the soft x-ray streak camera (SXRSC) in the context of inertial confinement fusion energy development. A brief introduction to laser fusion and laser fusion diagnostics is presented. The need for a soft x-ray streak camera as a laser fusion diagnostic is shown. Basic x-ray streak camera characteristics, design, and operation are reviewed. The SXRSC design criteria, the requirement for a subkilovolt x-ray transmitting window, and the resulting camera design are explained. Theory and design of reflector-filter pair combinations for three subkilovolt channels centered at 220 eV, 460 eV, and 620 eV are also presented. Calibration experiments are explained, and data showing a dynamic range of 1000 and a sweep speed of 134 ps/mm are presented. Sensitivity modifications to the soft x-ray streak camera for a high-power target shot are described. A preliminary investigation, using a stepped cathode, of the thickness dependence of the gold photocathode response is discussed. Data from a typical Argus laser gold-disk target experiment are shown.

  7. Flat-field response and geometric distortion measurements of optical streak cameras

    International Nuclear Information System (INIS)

    Montgomery, D.S.; Drake, R.P.; Jones, B.A.; Wiedwald, J.D.

    1987-01-01

    To accurately measure pulse amplitude, shape, and relative time histories of optical signals with an optical streak camera, it is necessary to correct each recorded image for spatially dependent gain nonuniformity and geometric distortion. Gain nonuniformities arise from sensitivity variations in the streak-tube photocathode, phosphor screen, image-intensifier tube, and image recording system. By using a 1.053-μm, long-pulse, high-power laser to generate a spatially and temporally uniform source as input to the streak camera, the combined effects of flat-field response and geometric distortion can be measured under the normal dynamic operation of cameras with S-1 photocathodes. Additionally, by using the same laser system to generate a train of short pulses that can be spatially modulated at the input of the streak camera, the authors can create a two-dimensional grid of equally spaced pulses. This allows a dynamic measurement of the geometric distortion of the streak camera. The authors discuss the techniques involved in performing these calibrations, present some of the measured results for LLNL optical streak cameras, and discuss software methods to correct for these effects.
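
    A minimal sketch of the flat-field step described above: the recorded image is divided by a normalized flat-field response so that spatially dependent gain variations cancel. The arrays below are synthetic stand-ins, and the normalization choice (mean of the flat) is an assumption.

    ```python
    # Hedged sketch of a flat-field correction: divide the raw streak record by a
    # normalized flat-field image so that spatially dependent gain variations cancel.
    # The arrays here are synthetic stand-ins for real records.

    import numpy as np

    def flat_field_correct(raw, flat, dark=None, eps=1e-6):
        """Divide a raw streak record by a normalized flat-field image."""
        raw = raw.astype(float)
        flat = flat.astype(float)
        if dark is not None:
            raw = raw - dark
            flat = flat - dark
        flat_norm = flat / max(flat.mean(), eps)   # normalize the flat to its mean level
        return raw / np.clip(flat_norm, eps, None)

    # Synthetic example: a uniform input viewed through a 30% gain roll-off.
    gain = np.linspace(1.0, 0.7, 512)[None, :] * np.ones((256, 1))
    flat = 1000.0 * gain
    raw = 500.0 * gain
    corrected = flat_field_correct(raw, flat)
    print(corrected.min(), corrected.max())   # spatially uniform after correction
    ```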

  8. Fabry-Perot interferometry using an image-intensified rotating-mirror streak camera

    International Nuclear Information System (INIS)

    Seitz, W.L.; Stacy, H.L.

    1983-01-01

    A Fabry-Perot velocity interferometer system is described that uses a modified rotating-mirror streak camera to record the dynamic fringe positions. A Los Alamos Model 72B rotating-mirror streak camera, equipped with a beryllium mirror, was modified to include a high-aperture (f/2.5) relay lens and a 40-mm image-intensifier tube such that the image normally formed at the film plane of the streak camera is projected onto the intensifier tube. Fringe records for thin (0.13 mm) flyers driven by a small bridgewire detonator obtained with Model C1155-01 Hamamatsu and Model 790 Imacon electronic streak cameras are compared with those obtained with the image-intensified rotating-mirror streak camera (I²RMC). Resolution comparisons indicate that the I²RMC gives better time resolution than either the Hamamatsu or the Imacon for total writing times of a few microseconds or longer.

  9. Flat-field response and geometric distortion measurements of optical streak cameras

    International Nuclear Information System (INIS)

    Montgomery, D.S.; Drake, R.P.; Jones, B.A.; Wiedwald, J.D.

    1987-08-01

    To accurately measure pulse amplitude, shape, and relative time histories of optical signals with an optical streak camera, it is necessary to correct each recorded image for spatially dependent gain nonuniformity and geometric distortion. Gain nonuniformities arise from sensitivity variations in the streak-tube photocathode, phosphor screen, image-intensifier tube, and image recording system. These nonuniformities may be severe, and have been observed to be on the order of 100% for some LLNL optical streak cameras. Geometric distortion due to optical couplings, electron optics, and sweep nonlinearity not only affects pulse position and timing measurements, but affects pulse amplitude and shape measurements as well. By using a 1.053-μm, long-pulse, high-power laser to generate a spatially and temporally uniform source as input to the streak camera, the combined effects of flat-field response and geometric distortion can be measured under the normal dynamic operation of cameras with S-1 photocathodes. Additionally, by using the same laser system to generate a train of short pulses that can be spatially modulated at the input of the streak camera, we can effectively create a two-dimensional grid of equally spaced pulses. This allows a dynamic measurement of the geometric distortion of the streak camera. We discuss the techniques involved in performing these calibrations, present some of the measured results for LLNL optical streak cameras, and discuss software methods to correct for these effects. 6 refs., 6 figs.

  10. Dynamic range studies of the RCA streak tube in the LLL streak camera

    International Nuclear Information System (INIS)

    Thomas, S.W.; Phillips, G.E.

    1979-01-01

    As indicated by tests on several cameras, the dynamic range of the Lawrence Livermore Laboratory streak-camera system appears to be about two orders of magnitude greater than that reported for other systems for 10- to 200-ps pulses. The lack of a fine mesh grid in the RCA streak tube used in these cameras probably contributes to lower system dynamic noise and therefore raises the dynamic range. A developmental tube with a mesh grid was tested and supports this conjecture. Order-of-magnitude variations in input slit width do not affect the spot size on the phosphor or the dynamic range of the RCA tube. (author)

  11. A novel simultaneous streak and framing camera without principle errors

    Science.gov (United States)

    Jingzhen, L.; Fengshan, S.; Ningwen, L.; Xiangdong, G.; Bin, H.; Qingyang, W.; Hongyi, C.; Yi, C.; Xiaowei, L.

    2018-02-01

    A novel simultaneous streak and framing camera with continuous access has been developed; such complete information is far more important for the exact interpretation and precise evaluation of many detonation events and shock-wave phenomena. The camera, with a maximum imaging frequency of 2 × 10^6 fps and a maximum scanning velocity of 16.3 mm/μs, has fine imaging properties: an eigen resolution of over 40 lp/mm in the temporal direction and over 60 lp/mm in the spatial direction and a framing-frequency principle error of zero for the framing record, and a maximum time resolving power of 8 ns and a scanning-velocity nonuniformity of 0.136%–0.277% for the streak record. The test data have verified the performance of the camera quantitatively. This camera, which simultaneously gains frames and a streak with parallax-free and identical time bases, is characterized by a plane optical system at oblique incidence (different from a space system), an innovative camera obscura without principle errors, and a high-velocity motor-driven beryllium-like rotating mirror made of high-strength aluminum alloy with a cellular lateral structure. Experiments demonstrate that the camera is very useful and reliable for taking high-quality pictures of detonation events.

  12. Picosecond X-ray streak camera dynamic range measurement

    Energy Technology Data Exchange (ETDEWEB)

    Zuber, C., E-mail: celine.zuber@cea.fr; Bazzoli, S.; Brunel, P.; Gontier, D.; Raimbourg, J.; Rubbelynck, C.; Trosseille, C. [CEA, DAM, DIF, F-91297 Arpajon (France); Fronty, J.-P.; Goulmy, C. [Photonis SAS, Avenue Roger Roncier, BP 520, 19106 Brive Cedex (France)

    2016-09-15

    Streak cameras are widely used to record the spatio-temporal evolution of laser-induced plasma. A prototype picosecond X-ray streak camera has been developed and tested by Commissariat à l’Énergie Atomique et aux Énergies Alternatives to answer the Laser MegaJoule specific needs. The dynamic range of this instrument is measured with picosecond X-ray pulses generated by the interaction of a laser beam and a copper target. The required value of 100 is reached only in configurations combining the slowest sweep speed and optimization of the streak-tube electron throughput by an appropriate choice of the high voltages applied to its electrodes.

  13. Reliable and Repeatable Characterization of Optical Streak Cameras

    International Nuclear Information System (INIS)

    Michael R. Charest, Peter Torres III, Christopher Silbernagel

    2008-01-01

    Optical streak cameras are used as primary diagnostics for a wide range of physics and laser performance verification experiments at the National Ignition Facility (NIF). To meet the strict accuracy requirements needed for these experiments, the systematic nonlinearities of the streak cameras (attributed to nonlinearities in the optical and electronic components that make up the streak camera system) must be characterized. In some cases the characterization information is used as a guide to help determine how experiment data should be taken. In other cases the characterization data are used to "correct" data images, to remove some of the nonlinearities. In order to obtain these camera characterizations, a specific data set is collected in which the response to specific known inputs is recorded. A set of analysis software routines has been developed to extract information such as spatial resolution, dynamic range, temporal resolution, etc., from this data set. The routines are highly automated, requiring very little user input, and thus provide very reliable and repeatable results that are not subject to interpretation. An emphasis on quality control has been placed on these routines due to the high importance of the camera characterization information.

  14. Design of neutron streak camera for fusion diagnostics

    International Nuclear Information System (INIS)

    Wang, C.L.; Kalibjian, R.; Singh, M.S.

    1982-06-01

    The D-T burn time for advanced laser-fusion targets is calculated to be very short. Each fission fragment leaving the cathode generates 400 secondary electrons that are all < 20 eV. These electrons are focused to a point with an extractor and an anode, and are then purified with an electrostatic deflector. The electron beam is streaked and detected with standard streak camera techniques. Careful shielding is needed against x-rays from the fusion target and general background. It appears that the neutron streak camera can be a viable and unique tool for studying the temporal history of fusion burns in D-T plasmas with ion temperatures of a few keV.

  15. Picosecond x-ray streak cameras

    Science.gov (United States)

    Averin, V. I.; Bryukhnevich, Gennadii I.; Kolesov, G. V.; Lebedev, Vitaly B.; Miller, V. A.; Saulevich, S. V.; Shulika, A. N.

    1991-04-01

    The first multistage image converter with an X-ray photocathode (UMI-93 SR) was designed at VNIIOFI in 1974 [1]. Experiments carried out at IOFAN showed that X-ray electron-optical cameras using this tube provided a temporal resolution of up to 12 picoseconds [2]. Later work developed into the creation of separate streak and intensifying tubes. Thus, the PV-003R tube has been built on the basis of the UMI-93SR design, fiber-optically connected to a PMU-2V image intensifier carrying a microchannel plate.

  16. Streak cameras and their applications

    International Nuclear Information System (INIS)

    Bernet, J.M.; Imhoff, C.

    1987-01-01

    Over the last several years, development of various measurement techniques in the nanosecond and picosecond range has led to increased reliance on streak cameras. This paper presents the main electronic and optoelectronic performance of the Thomson-CSF TSN 506 cameras and their associated devices used to build an automatic image acquisition and processing system (NORMA). A brief survey of the diversity and spread of the use of high-speed electronic cinematography is illustrated by a few typical applications.

  17. Improved approach to characterizing and presenting streak camera performance

    International Nuclear Information System (INIS)

    Wiedwald, J.D.; Jones, B.A.

    1985-01-01

    The performance of a streak camera recording system is strongly linked to the technique used to amplify, detect, and quantify the streaked image. At the Lawrence Livermore National Laboratory (LLNL), streak camera images have been recorded both on film and by fiber-optically coupling to charge-coupled devices (CCDs). During the development of a new process for recording these images (lens coupling the image onto a cooled CCD), the definitions of important performance characteristics such as resolution and dynamic range were re-examined. As a result of this development, these performance characteristics are now presented to the streak camera user in a more useful format than in the past. This paper describes how these techniques are used within the Laser Fusion Program at LLNL. The system resolution is presented as a modulation transfer function, including the seldom-reported effects that flare and light scattering have at low spatial frequencies. Data are presented such that a user can adjust image-intensifier gain and pixel averaging to optimize the useful dynamic range in any particular application.
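
    A minimal sketch of presenting resolution as a modulation transfer function: the MTF can be taken as the magnitude of the Fourier transform of a measured line-spread function, normalized to unity at zero frequency. The Gaussian line-spread function and pixel pitch below are assumptions for illustration.

    ```python
    # Hedged sketch: compute a modulation transfer function (MTF) from a line-spread
    # function (LSF). The Gaussian LSF below is a synthetic stand-in for a measured
    # profile, and the pixel pitch is an assumed value.

    import numpy as np

    pixel_pitch_mm = 0.02                      # assumed detector pixel pitch
    x = np.arange(-128, 128) * pixel_pitch_mm
    lsf = np.exp(-0.5 * (x / 0.05) ** 2)       # synthetic LSF (sigma = 50 um)

    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0]                              # normalize: MTF(0) = 1
    freq_lp_per_mm = np.fft.rfftfreq(lsf.size, d=pixel_pitch_mm)

    # Spatial frequency where modulation falls to 10% (a common resolution figure).
    cutoff = freq_lp_per_mm[np.argmax(mtf < 0.1)]
    print(f"10% MTF cutoff ~ {cutoff:.1f} lp/mm")
    ```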

  18. Characterization results from several commercial soft X-ray streak cameras

    Science.gov (United States)

    Stradling, G. L.; Studebaker, J. K.; Cavailler, C.; Launspach, J.; Planes, J.

    The spatio-temporal performance of four soft X-ray streak cameras has been characterized. The objective in evaluating the performance capability of these instruments is to enable us to optimize experiment designs, to encourage quantitative analysis of streak data, and to educate the ultra-high-speed photography and photonics community about the X-ray detector performance which is available. These measurements have been made collaboratively over the space of two years at the Forge pulsed X-ray source at Los Alamos and at the Ketjak laser facility at CEA Limeil-Valenton. The X-ray pulse lengths used for these measurements at these facilities were 150 ps and 50 ps, respectively. The results are presented as dynamically measured modulation transfer functions. Limiting temporal resolution values were also calculated. Emphasis is placed upon shot-noise statistical limitations in the analysis of the data. Space-charge repulsion in the streak tube limits the peak flux at ultrashort experiment durations. This limit results in a reduction of total signal and a decrease in signal-to-noise ratio in the streak image. The four cameras perform well, with 20 lp/mm resolution discernible in data from the French C650X, the Hadland X-Chron 540, and the Hamamatsu C1936X streak cameras. The Kentech X-ray streak camera has lower modulation and does not resolve below 10 lp/mm but has a longer photocathode.

  19. Improvements in Off-Center Focusing in an X-ray Streak Camera

    International Nuclear Information System (INIS)

    McDonald, J W; Weber, F; Holder, J P; Bell, P M

    2003-01-01

    Due to the planar construction of present x-ray streak tubes, significant off-center defocusing is observed in both static and dynamic images taken with one-dimensional resolution slits. Based on the streak tube geometry, curved photocathodes with radii of curvature ranging from 3.5 to 18 inches have been fabricated. We report initial off-center focusing performance data on the evaluation of these "improved" photocathodes in an x-ray streak camera and an update on the theoretical simulations to predict the optimum cathode curvature.

  20. Structured photocathodes for improved high-energy x-ray efficiency in streak cameras

    Energy Technology Data Exchange (ETDEWEB)

    Opachich, Y. P., E-mail: opachiyp@nv.doe.gov; Huffman, E.; Koch, J. A. [National Security Technologies, LLC, Livermore, California 94551 (United States); Bell, P. M.; Bradley, D. K.; Hatch, B.; Landen, O. L.; MacPhee, A. G.; Nagel, S. R. [Lawrence Livermore National Laboratory, Livermore, California 94551 (United States); Chen, N.; Gopal, A.; Udin, S. [Nanoshift LLC, Emeryville, California 94608 (United States); Feng, J. [Lawrence Berkeley National Laboratory, Berkeley, California 94720 (United States); Hilsabeck, T. J. [General Atomics, San Diego, California 92121 (United States)

    2016-11-15

    We have designed and fabricated a structured streak camera photocathode to provide enhanced efficiency for high energy X-rays (1–12 keV). This gold coated photocathode was tested in a streak camera and compared side by side against a conventional flat thin film photocathode. Results show that the measured electron yield enhancement at energies ranging from 1 to 10 keV scales well with predictions, and that the total enhancement can be more than 3×. The spatial resolution of the streak camera does not show degradation in the structured region. We predict that the temporal resolution of the detector will also not be affected as it is currently dominated by the slit width. This demonstration with Au motivates exploration of comparable enhancements with CsI and may revolutionize X-ray streak camera photocathode design.

  1. Cheap streak camera based on the LD-S-10 intensifier tube

    Science.gov (United States)

    Dashevsky, Boris E.; Krutik, Mikhail I.; Surovegin, Alexander L.

    1992-01-01

    Basic properties of a new streak camera and its test results are reported. To intensify images on its screen, we employed modular G1 tubes, the LD-A-1.0 and LD-A-0.33, enabling magnifications of 1.0 and 0.33, respectively. If necessary, the LD-A-0.33 tube may be substituted by any other image intensifier of the LDA series, the choice being determined by the size of the CCD matrix with fiber-optical windows. The reported camera employs a 12.5-mm-long CCD strip consisting of 1024 pixels, each 12 × 500 μm in size. Registered radiation was imaged onto a 5 × 0.04 mm slit diaphragm tightly connected with the LD-S-10 fiber-optical input window. Electrons escaping the cathode are accelerated in a 5 kV electric field and focused onto a phosphor screen covering a fiber-optical plate as they travel between the deflection plates. The sensitivity of the latter was 18 V/mm, which implies that the total deflecting voltage was 720 V per 40 mm of the screen surface, since reversed-polarity scan pulses of +360 V and -360 V were applied across the deflection plates. The streak camera provides full scan times over the screen of 15, 30, 50, 100, 250, and 500 ns. Timing of the electrically or optically driven camera was done using a 10-ns-step controlled-delay (0–500 ns) circuit.

  2. Streak camera imaging of single photons at telecom wavelength

    Science.gov (United States)

    Allgaier, Markus; Ansari, Vahid; Eigner, Christof; Quiring, Viktor; Ricken, Raimund; Donohue, John Matthew; Czerniuk, Thomas; Aßmann, Marc; Bayer, Manfred; Brecht, Benjamin; Silberhorn, Christine

    2018-01-01

    Streak cameras are powerful tools for temporal characterization of ultrafast light pulses, even at the single-photon level. However, the low signal-to-noise ratio in the infrared range prevents measurements on weak light sources in the telecom regime. We present an approach to circumvent this problem, utilizing an up-conversion process in periodically poled waveguides in lithium niobate. We convert single photons from a parametric down-conversion source in order to reach the point of maximum detection efficiency of commercially available streak cameras. We explore phase-matching configurations to apply the up-conversion scheme in real-world applications.

  3. Performance of Laser Megajoule’s x-ray streak camera

    Energy Technology Data Exchange (ETDEWEB)

    Zuber, C., E-mail: celine.zuber@cea.fr; Bazzoli, S.; Brunel, P.; Burillo, M.; Gontier, D.; Moreau, I.; Oudot, G.; Rubbelynck, C.; Soullié, G.; Stemmler, P.; Trosseille, C. [CEA, DAM, DIF, F-91297 Arpajon (France); Fronty, J. P.; Goulmy, C. [Photonis France SAS, Avenue Roger Roncier, BP 520, 19106 Brive Cedex (France)

    2016-11-15

    A prototype of a picosecond x-ray streak camera has been developed and tested by Commissariat à l’Énergie Atomique et aux Énergies Alternatives to provide plasma-diagnostic support for the Laser Megajoule. We report on the measured performance of this streak camera, which almost fulfills the requirements: 50-μm spatial resolution over a 15-mm field in the photocathode plane, 17-ps temporal resolution in a 2-ns timebase, a detection threshold lower than 625 nJ/cm² in the 0.05–15 keV spectral range, and a dynamic range greater than 100.

  4. The LLL compact 10-ps streak camera

    International Nuclear Information System (INIS)

    Thomas, S.W.; Houghton, J.W.; Tripp, G.R.; Coleman, L.W.

    1975-01-01

    The 10-ps streak camera has been redesigned to simplify its operation, reduce manufacturing costs, and improve its appearance. The electronics have been simplified, a film indexer added, and a contacted slit has been evaluated. Data support a 10-ps resolution. (author)

  5. C.C.D. readout of a picosecond streak camera with an intensified C.C.D

    International Nuclear Information System (INIS)

    Lemonier, M.; Richard, J.C.; Cavailler, C.; Mens, A.; Raze, G.

    1984-08-01

    This paper deals with a digital streak camera readout device. The device consists of a low-light-level television camera made of a solid-state CCD array coupled to an image intensifier, associated with a video digitizer coupled to a microcomputer system. The streak camera images are picked up as a video signal, digitized, and stored. This system allows fast recording and automatic processing of the data provided by the streak tube.

  6. Optical Comb Generation for Streak Camera Calibration for Inertial Confinement Fusion Experiments

    International Nuclear Information System (INIS)

    Ronald Justin; Terence Davies; Frans Janson; Bruce Marshall; Perry Bell; Daniel Kalantar; Joseph Kimbrough; Stephen Vernon; Oliver Sweningsen

    2008-01-01

    The National Ignition Facility (NIF) at Lawrence Livermore National Laboratory (LLNL) is coming on-line to support physics experimentation for the U.S. Department of Energy (DOE) programs in Inertial Confinement Fusion (ICF) and Stockpile Stewardship (SS). Optical streak cameras are an integral part of the experimental diagnostics instrumentation at NIF. To accurately reduce streak camera data, a highly accurate temporal calibration is required. This article describes a technique for simultaneously generating a precise ±2 ps optical marker pulse (fiducial reference) and trains of precisely timed, short-duration optical pulses (so-called 'comb' pulse trains) that are suitable for the timing calibrations. These optical pulse generators are used with the LLNL optical streak cameras. They are small, portable light sources that, in the comb mode, produce a series of temporally short, uniformly spaced optical pulses, using a laser diode source. Comb generators have been produced with pulse-train repetition rates up to 10 GHz at 780 nm, and somewhat lower frequencies at 664 nm. Individual pulses can be as short as 25-ps FWHM. Signal output is via a fiber-optic connector on the front panel of the generator box. The optical signal is transported from comb generator to streak camera through multi-mode, graded-index optical fiber.

  7. STREAK CAMERA MEASUREMENTS OF THE APS PC GUN DRIVE LASER

    Energy Technology Data Exchange (ETDEWEB)

    Dooling, J. C.; Lumpkin, A. H.

    2017-06-25

    We report recent pulse-duration measurements of the APS PC Gun drive laser at both second harmonic and fourth harmonic wavelengths. The drive laser is a Nd:Glass-based chirped-pulse amplifier (CPA) operating at an IR wavelength of 1053 nm, twice frequency-doubled to obtain UV output for the gun. A Hamamatsu C5680 streak camera and an M5675 synchroscan unit are used for these measurements; the synchroscan unit is tuned to 119 MHz, the 24th subharmonic of the linac S-band operating frequency. Calibration is accomplished both electronically and optically. Electronic calibration utilizes a programmable delay line in the 119 MHz rf path. The optical delay uses an etalon with known spacing between reflecting surfaces and is coated for the visible, SH wavelength. IR pulse duration is monitored with an autocorrelator. Fitting the projected streak camera image profiles with Gaussians, UV rms pulse durations are found to vary from 2.1 ps to 3.5 ps as the IR varies from 2.2 ps to 5.2 ps.
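
    As a rough illustration (not the authors' analysis code), fitting a Gaussian to a projected streak profile and converting the fitted sigma to an rms duration with a known time-per-pixel calibration might look like the following; the synthetic profile and the 0.25 ps/pixel calibration are assumed values.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, mu, sigma, offset):
    return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2) + offset

def rms_duration_ps(profile, ps_per_pixel):
    """Fit a Gaussian to a projected streak profile; return the rms width in ps."""
    x = np.arange(profile.size)
    p0 = [profile.max() - profile.min(), float(np.argmax(profile)), 10.0, profile.min()]
    popt, _ = curve_fit(gaussian, x, profile, p0=p0)
    return abs(popt[2]) * ps_per_pixel  # sigma in pixels -> rms duration in ps

# Synthetic example: a ~3 ps rms pulse sampled at 0.25 ps/pixel plus noise.
rng = np.random.default_rng(0)
x = np.arange(512)
profile = gaussian(x, 1000.0, 256.0, 12.0, 50.0) + rng.normal(0, 5, x.size)
print(rms_duration_ps(profile, ps_per_pixel=0.25))  # ~3 ps
```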

  8. X-ray streak camera for observation of tightly pinched relativistic electron beams

    International Nuclear Information System (INIS)

    Johnson, D.J.

    1977-01-01

    A pinhole camera is coupled with a Pilot-B scintillator and image-intensified TRW streak camera to study pinched electron beam profiles via observation of anode target bremsstrahlung. Streak intensification is achieved with an EMI image intensifier operated at a gain of up to 10⁶, which allows optimizing the pinhole configuration so that resolution is simultaneously limited by photon-counting statistics and pinhole geometry. The pinhole used is one-dimensional and is fabricated by inserting uranium shims with hyperbolic curved edges between two 5-cm-thick lead blocks. The loss of spatial resolution due to the x-ray transmission through the perimeter of the pinhole is calculated and a streak photograph of a Gamble I pinched beam interacting with a brass anode is presented.

  9. Sweep time performance of optic streak camera

    International Nuclear Information System (INIS)

    Wang Zhebin; Yang Dong; Zhang Huige

    2012-01-01

    The sweep time performance of the optic streak camera (OSC) is of critical importance to its application. A systematic analysis of the full-screen sweep velocity shows that the traditional method, based on the averaged velocity and its nonlinearity, increases the uncertainty of the sweep time and cannot reflect the influence of the spatial distortion of the OSC. An elaborate method for the sweep time has been developed with the aid of the full-screen sweep velocity and its uncertainty. Theoretical analysis and experimental study show that the method reduces the uncertainty of the sweep time to within 1%, which improves the accuracy of the sweep time and the reliability of OSC applications. (authors)
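
    In outline (our restatement of the idea, not the authors' notation), instead of dividing the streak displacement by an averaged sweep velocity $\bar v$, the time coordinate is obtained by integrating the measured full-screen velocity $v(x)$,

$$ t(x) = \int_{x_0}^{x} \frac{dx'}{v(x')}, \qquad \delta t(x) \approx -\int_{x_0}^{x} \frac{\delta v(x')}{v^{2}(x')}\,dx', $$

    so that both the nonlinearity of the sweep and the position-dependent velocity uncertainty $\delta v(x')$ propagate directly into the sweep-time uncertainty rather than being lumped into a single averaged value.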

  10. Precise measurement of a subpicosecond electron single bunch by the femtosecond streak camera

    International Nuclear Information System (INIS)

    Uesaka, M.; Ueda, T.; Kozawa, T.; Kobayashi, T.

    1998-01-01

    Precise measurement of a subpicosecond electron single bunch by the femtosecond streak camera is presented. The subpicosecond electron single bunch of energy 35 MeV was generated by the achromatic magnetic pulse compressor at the S-band linear accelerator of the Nuclear Engineering Research Laboratory (NERL), University of Tokyo. The electric charge per bunch is 0.5 nC, and the horizontal and vertical beam sizes are 3.3 and 5.5 mm (full width at half maximum; FWHM), respectively. The pulse shape of the electron single bunch is measured via Cherenkov radiation emitted in air by the femtosecond streak camera. Optical parameters of the measurement system were optimized on the basis of extensive experiments and numerical analysis in order to achieve a subpicosecond time resolution. Using the optimized optical measurement system, the subpicosecond pulse shape, its variation for different rf phases in the accelerating tube, the jitter of the total system, and the correlation between measured streak images and calculated longitudinal phase-space distributions were precisely evaluated. This measurement system is going to be utilized in several subpicosecond analyses for radiation physics and chemistry. (orig.)

  11. Commissioning of the advanced light source dual-axis streak camera

    International Nuclear Information System (INIS)

    Hinkson, J.; Keller, R.; Byrd, J.

    1997-05-01

    A dual-axis camera, Hamamatsu model C5680, has been installed on the Advanced Light Source photon-diagnostics beam-line to investigate electron-beam parameters. During its commissioning process, the camera has been used to measure single-bunch length vs. current, relative bunch charge in adjacent RF buckets, and bunch-phase stability. In this paper the authors describe the visible-light branch of the diagnostics beam-line, the streak-camera installation, and the timing electronics. They will show graphical results of beam measurements taken during a variety of accelerator conditions.

  12. A sampling ultra-high-speed streak camera based on the use of a unique photomultiplier

    International Nuclear Information System (INIS)

    Marode, Emmanuel

    An apparatus reproducing the "streak" mode of a high-speed camera is proposed for the case of a slit AB whose variations in luminosity are repetitive. A photomultiplier, analysing the object AB point by point, and a still camera, photographing a slit fixed on the oscilloscope screen parallel to the sweep direction, are placed on a mobile platform P. The movement of P ensures a time-resolved analysis of AB. The resolution is of the order of 2×10⁻⁹ s and can be improved.

  13. Characteristics of uranium oxide cathode for neutron streak camera

    International Nuclear Information System (INIS)

    Niki, H.; Itoga, K.; Yamanaka, M.; Yamanaka, T.; Yamanaka, C.

    1986-01-01

    In laser fusion research, time-resolved neutron measurements require 20-ps resolution in order to obtain the time history of the D-T burn. Uranium oxide was expected to be a sensitive material as the cathode of a neutron streak camera because of its large fission cross section. The authors report their measurements of some characteristics of a uranium oxide cathode connected to a conventional streak tube. 14 MeV neutron signals were observed as bright spots on a TV monitor using focus-mode operation. The detection efficiency was ∼1 × 10⁻⁶ for a 1-μm-thick cathode. Each signal consisted of more than several tens of components, corresponding to the secondary electrons dragged out from the cathode by a fission fragment. The time resolution is thought to be limited mainly by the transit-time spread of the secondary electrons. A 14-ps resolution was obtained in streak-mode operation for a single fission event.

  14. Compact streak camera for the shock study of solids by using the high-pressure gas gun

    Science.gov (United States)

    Nagayama, Kunihito; Mori, Yasuhito

    1993-01-01

    For the precise observation of high-speed impact phenomena, a compact high-speed streak camera recording system has been developed. The system consists of a high-pressure gas gun, a streak camera, and a long-pulse dye laser. The gas gun installed in our laboratory has a muzzle 40 mm in diameter and a launch tube 2 m long. Projectile velocity is measured by the laser-beam-cut method. The gun is capable of accelerating a 27 g projectile up to 500 m/s if helium gas is used as the driver. The system has been designed on the principle that the precise optical measurement methods developed in other areas of research can be applied to gun studies. The streak camera is 300 mm in diameter, with a rectangular rotating mirror driven by an air-turbine spindle. The attainable streak velocity is 3 mm/μs. The camera is deliberately small, for portability and economy; the streak velocity is therefore lower than that of faster cameras, but low-sensitivity, high-resolution film can be used as the recording medium. We have also constructed a pulsed dye laser of 25-30 μs duration. The laser can be used as the light source for observation. The advantages of using the laser are manifold, e.g., good directivity and nearly single-frequency output. The feasibility of the system has been demonstrated by performing several experiments.
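
    For context, in a rotating-mirror streak camera the reflected ray turns at twice the mirror's angular rate, so the writing speed at the film is v = 2ωL, with L the mirror-to-film distance. A quick numerical check of the quoted 3 mm/μs figure, with an assumed (not stated) mirror-to-film distance of 150 mm:

```python
import math

def writing_speed_mm_per_us(rpm, mirror_to_film_mm):
    """Streak (writing) speed of a rotating-mirror camera.

    The reflected ray turns at twice the mirror's angular rate, so
    v = 2 * omega * L, with omega in rad/s and L the mirror-to-film distance.
    """
    omega = 2.0 * math.pi * rpm / 60.0        # rad/s
    v_mm_per_s = 2.0 * omega * mirror_to_film_mm
    return v_mm_per_s / 1e6                   # mm per microsecond

# Assumed geometry (not from the paper): 150 mm mirror-to-film distance.
for rpm in (30000, 60000, 96000):
    print(rpm, "rpm ->", round(writing_speed_mm_per_us(rpm, 150.0), 2), "mm/us")
```

    With these assumptions, roughly 96,000 rpm on the air-turbine spindle reproduces the quoted 3 mm/μs streak velocity.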

  15. Development of intelligent control system for X-ray streak camera in diagnostic instrument manipulator

    International Nuclear Information System (INIS)

    Pei, Chengquan; Wu, Shengli; Tian, Jinshou; Liu, Zhen; Fang, Yuman; Gao, Guilong; Liang, Lingliang; Wen, Wenlong

    2015-01-01

    An intelligent control system for an X-ray streak camera in a diagnostic instrument manipulator (DIM) is proposed and implemented; it controls the time delay, electric focusing, image-gain adjustment, and sweep-voltage switching, and acquires environmental parameters. The system consists of 16 A/D converters, 16 D/A converters, a 32-channel general-purpose input/output (GPIO), and two sensors. An isolated multi-output DC/DC converter and a single-mode fiber were adopted to reduce the interference generated by the common ground among the A/D, D/A, and I/O. The software was designed using a graphical programming language and can remotely access the instrument from a website. The entire intelligent control system can acquire the desired data at a speed of 30 Mb/s and store it for later analysis. The intelligent system was implemented on a streak camera in a DIM and shows a temporal resolution of 11.25 ps, spatial distortion of less than 10%, and a dynamic range of 279:1. The intelligent control system has been successfully used with a streak camera to verify the synchronization of a multi-channel laser on the Inertial Confinement Fusion Facility.

  16. Development of intelligent control system for X-ray streak camera in diagnostic instrument manipulator

    Energy Technology Data Exchange (ETDEWEB)

    Pei, Chengquan [Key Laboratory for Physical Electronics and Devices of the Ministry of Education, Xi'an Jiaotong University, Xi'an 710049 (China); Wu, Shengli, E-mail: slwu@mail.xjtu.edu.cn [Key Laboratory for Physical Electronics and Devices of the Ministry of Education, Xi'an Jiaotong University, Xi'an 710049 (China); Tian, Jinshou [Xi'an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences, Xi'an 710119 (China); Liu, Zhen [Key Laboratory for Physical Electronics and Devices of the Ministry of Education, Xi'an Jiaotong University, Xi'an 710049 (China); Fang, Yuman [Key Laboratory for Physical Electronics and Devices of the Ministry of Education, Xi'an Jiaotong University, Xi'an 710049 (China); University of the Chinese Academy of Sciences, Beijing 100039 (China); Gao, Guilong; Liang, Lingliang [Key Laboratory for Physical Electronics and Devices of the Ministry of Education, Xi'an Jiaotong University, Xi'an 710049 (China); Xi'an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences, Xi'an 710119 (China); University of the Chinese Academy of Sciences, Beijing 100039 (China); Wen, Wenlong [Key Laboratory for Physical Electronics and Devices of the Ministry of Education, Xi'an Jiaotong University, Xi'an 710049 (China)

    2015-11-01

    An intelligent control system for an X-ray streak camera in a diagnostic instrument manipulator (DIM) is proposed and implemented; it controls the time delay, electric focusing, image-gain adjustment, and sweep-voltage switching, and acquires environmental parameters. The system consists of 16 A/D converters, 16 D/A converters, a 32-channel general-purpose input/output (GPIO), and two sensors. An isolated multi-output DC/DC converter and a single-mode fiber were adopted to reduce the interference generated by the common ground among the A/D, D/A, and I/O. The software was designed using a graphical programming language and can remotely access the instrument from a website. The entire intelligent control system can acquire the desired data at a speed of 30 Mb/s and store it for later analysis. The intelligent system was implemented on a streak camera in a DIM and shows a temporal resolution of 11.25 ps, spatial distortion of less than 10%, and a dynamic range of 279:1. The intelligent control system has been successfully used with a streak camera to verify the synchronization of a multi-channel laser on the Inertial Confinement Fusion Facility.

  17. Improving the off-axis spatial resolution and dynamic range of the NIF X-ray streak cameras (invited)

    Energy Technology Data Exchange (ETDEWEB)

    MacPhee, A. G., E-mail: macphee2@llnl.gov; Hatch, B. W.; Bell, P. M.; Bradley, D. K.; Datte, P. S.; Landen, O. L.; Palmer, N. E.; Piston, K. W.; Rekow, V. V. [Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, California 94551-0808 (United States); Dymoke-Bradshaw, A. K. L.; Hares, J. D. [Kentech Instruments Ltd., Isis Building, Howbery Park, Wallingford, Oxfordshire OX10 8BD (United Kingdom); Hassett, J. [Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, California 94551-0808 (United States); Department of Electrical and Computer Engineering, University of Rochester, Rochester, New York 14627 (United States); Meadowcroft, A. L. [AWE Aldermaston, Reading, Berkshire RG7 4PR (United Kingdom); Hilsabeck, T. J.; Kilkenny, J. D. [General Atomics, P.O. Box 85608, San Diego, California 92186-5608 (United States)

    2016-11-15

    We report simulations and experiments that demonstrate an increase in spatial resolution of the NIF core diagnostic x-ray streak cameras by at least a factor of two, especially off axis. A design was achieved by using a corrector electron optic to flatten the field curvature at the detector plane and corroborated by measurement. In addition, particle in cell simulations were performed to identify the regions in the streak camera that contribute the most to space charge blurring. These simulations provide a tool for convolving synthetic pre-shot spectra with the instrument function so signal levels can be set to maximize dynamic range for the relevant part of the streak record.

  18. Improving the off-axis spatial resolution and dynamic range of the NIF X-ray streak cameras (invited).

    Science.gov (United States)

    MacPhee, A G; Dymoke-Bradshaw, A K L; Hares, J D; Hassett, J; Hatch, B W; Meadowcroft, A L; Bell, P M; Bradley, D K; Datte, P S; Landen, O L; Palmer, N E; Piston, K W; Rekow, V V; Hilsabeck, T J; Kilkenny, J D

    2016-11-01

    We report simulations and experiments that demonstrate an increase in spatial resolution of the NIF core diagnostic x-ray streak cameras by at least a factor of two, especially off axis. A design was achieved by using a corrector electron optic to flatten the field curvature at the detector plane and corroborated by measurement. In addition, particle in cell simulations were performed to identify the regions in the streak camera that contribute the most to space charge blurring. These simulations provide a tool for convolving synthetic pre-shot spectra with the instrument function so signal levels can be set to maximize dynamic range for the relevant part of the streak record.

  19. Identification and Removal of High Frequency Temporal Noise in a Nd:YAG Macro-Pulse Laser Assisted with a Diagnostic Streak Camera

    International Nuclear Information System (INIS)

    Kent Marlett; Ke-Xun Sun

    2004-01-01

    This paper discusses the use of a reference streak camera (SC) to diagnose laser performance and guide modifications to remove high frequency noise from Bechtel Nevada's long-pulse laser. The upgraded laser exhibits less than 0.1% high frequency noise in cumulative spectra, exceeding National Ignition Facility (NIF) calibration specifications. Inertial Confinement Fusion (ICF) experiments require full characterization of streak cameras over a wide range of sweep speeds (10 ns to 480 ns). This paradigm of metrology poses stringent spectral requirements on the laser source for streak camera calibration. Recently, Bechtel Nevada worked with a laser vendor to develop a high performance, multi-wavelength Nd:YAG laser to meet NIF calibration requirements. For a typical NIF streak camera with a 4096 x 4096 pixel CCD, the flat field calibration at 30 ns requires a smooth laser spectrum over 33 MHz to 68 GHz. Streak cameras are the appropriate instrumentation for measuring laser amplitude noise at these very high frequencies since the upper end spectral content is beyond the frequency response of typical optoelectronic detectors for a single shot pulse. The SC was used to measure a similar laser at its second harmonic wavelength (532 nm), to establish baseline spectra for testing signal analysis algorithms. The SC was then used to measure the new custom calibration laser. In both spatial-temporal measurements and cumulative spectra, 6-8 GHz oscillations were identified. The oscillations were found to be caused by inter-surface reflections between amplifiers. Additional variations in the SC spectral data were found to result from temperature instabilities in the seeding laser. Based on these findings, laser upgrades were made to remove the high frequency noise from the laser output

  20. Temporal resolution limit estimation of x-ray streak cameras using a CsI photocathode

    Energy Technology Data Exchange (ETDEWEB)

    Li, Xiang; Gu, Li; Zong, Fangke; Zhang, Jingjin; Yang, Qinlao, E-mail: qlyang@szu.edu.cn [Key Laboratory of Optoelectronic Devices and Systems of Ministry of Education and Guangdong Province, Institute of Optoelectronics, Shenzhen University, Shenzhen 518060 (China)

    2015-08-28

    A Monte Carlo model is developed and implemented to calculate the characteristics of x-ray induced secondary electron (SE) emission from a CsI photocathode used in an x-ray streak camera. Time distributions of emitted SEs are investigated with an incident x-ray energy range from 1 to 30 keV and a CsI thickness range from 100 to 1000 nm. Simulation results indicate that SE time distribution curves have little dependence on the incident x-ray energy and CsI thickness. The calculated time dispersion within the CsI photocathode is about 70 fs, which should be the temporal resolution limit of x-ray streak cameras that use CsI as the photocathode material.
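
    A heavily simplified, illustrative Monte Carlo in the same spirit (not the authors' model): sample x-ray absorption depths in the CsI layer, let each secondary electron's emission time scale with its creation depth, and histogram the emission times of the electrons that escape. The attenuation length, escape depth, and effective transport speed below are placeholder assumptions chosen only to show the structure of such a calculation.

```python
import numpy as np

def se_emission_times(n_photons, thickness_nm, atten_len_nm,
                      escape_len_nm, transport_speed_nm_per_fs, rng):
    """Toy Monte Carlo of secondary-electron emission times from a CsI layer.

    X rays enter at depth 0; electrons escape from the same (front) surface.
    Only electrons created within a few escape lengths of the surface get out.
    """
    # Absorption depth of each photon (exponential law, truncated to the film).
    depth = rng.exponential(atten_len_nm, n_photons)
    depth = depth[depth < thickness_nm]
    # Escape probability falls off exponentially with creation depth.
    escapes = rng.random(depth.size) < np.exp(-depth / escape_len_nm)
    d = depth[escapes]
    # Transport time back to the surface at an effective constant speed.
    return d / transport_speed_nm_per_fs          # femtoseconds

rng = np.random.default_rng(1)
t = se_emission_times(200000, thickness_nm=300, atten_len_nm=400,
                      escape_len_nm=10, transport_speed_nm_per_fs=0.3, rng=rng)
print("rms emission-time spread (fs):", t.std())
print("FWHM-equivalent spread   (fs):", 2.355 * t.std())
```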

  1. Compact optical technique for streak camera calibration

    International Nuclear Information System (INIS)

    Bell, Perry; Griffith, Roger; Hagans, Karla; Lerche, Richard; Allen, Curt; Davies, Terence; Janson, Frans; Justin, Ronald; Marshall, Bruce; Sweningsen, Oliver

    2004-01-01

    To produce accurate data from optical streak cameras requires accurate temporal calibration sources. We have reproduced an older technology for generating optical timing marks that had been lost due to component availability. Many improvements have been made which allow the modern units to service a much larger need. Optical calibrators are now available that produce optical pulse trains of 780 nm wavelength light at frequencies ranging from 0.1 to 10 GHz, with individual pulse widths of approximately 25 ps full width half maximum. Future plans include the development of single units that produce multiple frequencies to cover a wide temporal range, and that are fully controllable via an RS232 interface

  2. Compact optical technique for streak camera calibration

    Science.gov (United States)

    Bell, Perry; Griffith, Roger; Hagans, Karla; Lerche, Richard; Allen, Curt; Davies, Terence; Janson, Frans; Justin, Ronald; Marshall, Bruce; Sweningsen, Oliver

    2004-10-01

    To produce accurate data from optical streak cameras requires accurate temporal calibration sources. We have reproduced an older technology for generating optical timing marks that had been lost due to component availability. Many improvements have been made which allow the modern units to service a much larger need. Optical calibrators are now available that produce optical pulse trains of 780 nm wavelength light at frequencies ranging from 0.1 to 10 GHz, with individual pulse widths of approximately 25 ps full width half maximum. Future plans include the development of single units that produce multiple frequencies to cover a wide temporal range, and that are fully controllable via an RS232 interface.

  3. A Robust In-Situ Warp-Correction Algorithm For VISAR Streak Camera Data at the National Ignition Facility

    International Nuclear Information System (INIS)

    Labaria, George R.; Warrick, Abbie L.; Celliers, Peter M.; Kalantar, Daniel H.

    2015-01-01

    The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a 192-beam pulsed laser system for high-energy-density physics experiments. Sophisticated diagnostics have been designed around key performance metrics to achieve ignition. The Velocity Interferometer System for Any Reflector (VISAR) is the primary diagnostic for measuring the timing of shocks induced into an ignition capsule. The VISAR system utilizes three streak cameras; these streak cameras are inherently nonlinear and require warp corrections to remove these nonlinear effects. A detailed calibration procedure has been developed with National Security Technologies (NSTec) and applied to the camera correction analysis in production. However, the camera nonlinearities drift over time, affecting the performance of this method. An in-situ fiber array is used to inject a comb of pulses to generate a calibration correction in order to meet the timing accuracy requirements of VISAR. We develop a robust algorithm for the analysis of the comb calibration images to generate the warp correction that is then applied to the data images. Our algorithm utilizes the method of thin-plate splines (TPS) to model the complex nonlinear distortions in the streak camera data. In this paper, we focus on the theory and implementation of the TPS warp-correction algorithm for the use in a production environment.
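
    A minimal sketch of the same idea (a thin-plate-spline warp correction built from comb fiducial points), assuming we already have the detected comb spot centroids and their ideal, undistorted positions; SciPy's radial-basis interpolator with a thin-plate kernel stands in here for the production implementation, and the fiducial arrays are hypothetical inputs.

```python
import numpy as np
from scipy.interpolate import Rbf
from scipy.ndimage import map_coordinates

def build_tps_warp(meas_xy, ideal_xy):
    """Return a function mapping ideal (undistorted) coords to measured coords.

    meas_xy, ideal_xy : (N, 2) arrays of matched fiducial positions (x, y).
    """
    fx = Rbf(ideal_xy[:, 0], ideal_xy[:, 1], meas_xy[:, 0], function='thin_plate')
    fy = Rbf(ideal_xy[:, 0], ideal_xy[:, 1], meas_xy[:, 1], function='thin_plate')
    return lambda x, y: (fx(x, y), fy(x, y))

def unwarp_image(image, warp, shape=None):
    """Resample a distorted streak image onto an undistorted output grid."""
    shape = shape or image.shape
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    mx, my = warp(xx.ravel(), yy.ravel())         # where each output pixel lives
    coords = np.vstack([my, mx])                  # map_coordinates wants (row, col)
    out = map_coordinates(image, coords, order=1, mode='nearest')
    return out.reshape(shape)

# Usage (with hypothetical fiducial lists `measured` and `ideal`, each (N, 2)):
# warp = build_tps_warp(measured, ideal)
# corrected = unwarp_image(raw_streak_image, warp)
```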

  4. A Robust In-Situ Warp-Correction Algorithm For VISAR Streak Camera Data at the National Ignition Facility

    Energy Technology Data Exchange (ETDEWEB)

    Labaria, George R. [Univ. of California, Santa Cruz, CA (United States); Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Warrick, Abbie L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Celliers, Peter M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kalantar, Daniel H. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-01-12

    The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a 192-beam pulsed laser system for high-energy-density physics experiments. Sophisticated diagnostics have been designed around key performance metrics to achieve ignition. The Velocity Interferometer System for Any Reflector (VISAR) is the primary diagnostic for measuring the timing of shocks induced into an ignition capsule. The VISAR system utilizes three streak cameras; these streak cameras are inherently nonlinear and require warp corrections to remove these nonlinear effects. A detailed calibration procedure has been developed with National Security Technologies (NSTec) and applied to the camera correction analysis in production. However, the camera nonlinearities drift over time, affecting the performance of this method. An in-situ fiber array is used to inject a comb of pulses to generate a calibration correction in order to meet the timing accuracy requirements of VISAR. We develop a robust algorithm for the analysis of the comb calibration images to generate the warp correction that is then applied to the data images. Our algorithm utilizes the method of thin-plate splines (TPS) to model the complex nonlinear distortions in the streak camera data. In this paper, we focus on the theory and implementation of the TPS warp-correction algorithm for the use in a production environment.

  5. Light field driven streak-camera for single-shot measurements of the temporal profile of XUV-pulses from a free-electron laser; Lichtfeld getriebene Streak-Kamera zur Einzelschuss Zeitstrukturmessung der XUV-Pulse eines Freie-Elektronen Lasers

    Energy Technology Data Exchange (ETDEWEB)

    Fruehling, Ulrike

    2009-10-15

    The Free Electron Laser in Hamburg (FLASH) is a source of highly intense, ultrashort extreme-ultraviolet (XUV) light pulses with durations of a few femtoseconds. Due to the stochastic nature of the light-generation scheme based on self-amplified spontaneous emission (SASE), the duration and temporal profile of the XUV pulses fluctuate from shot to shot. In this thesis, a THz-field-driven streak camera capable of single-pulse measurements of the XUV pulse profile has been realized. In a first XUV-THz pump-probe experiment at FLASH, the XUV pulses are overlapped in a gas target with synchronized THz pulses generated by a new THz undulator. The electromagnetic field of the THz light accelerates photoelectrons produced by the XUV pulses, with the resulting change of the photoelectron momenta depending on the phase of the THz field at the time of ionisation. This technique is intensively used in attosecond metrology, where near-infrared streaking fields are employed for the temporal characterisation of attosecond XUV pulses. Here, it is adapted to the analysis of pulse durations in the few-femtosecond range by choosing a hundred-times-longer, far-infrared streaking wavelength. Thus, the gap between conventional streak cameras, with typical resolutions of hundreds of femtoseconds, and techniques with attosecond resolution is filled. Using the THz streak camera, the time-dependent electric field of the THz pulses was sampled in great detail, while the duration and even details of the time structure of the XUV pulses were characterized. (orig.)
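
    For reference, the streaking relation this measurement rests on (standard streaking physics restated here, not a result of the thesis) is that a photoelectron released at time $t_0$ with momentum $\vec p_0$ into a field of vector potential $\vec A(t)$ ends up, once the field has passed, with

$$ \vec p_{\mathrm{final}} = \vec p_0 - e\,\vec A(t_0), \qquad \Delta W \approx -e\,\vec v_0 \cdot \vec A(t_0), $$

    so the measured momentum (or kinetic-energy) shift maps the ionisation time onto the phase of the streaking field; replacing the near-infrared field by a THz field stretches the usable, nearly linear part of $A(t)$ from the attosecond range to tens of femtoseconds.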

  6. Streak camera measurements of laser pulse temporal dispersion in short graded-index optical fibers

    International Nuclear Information System (INIS)

    Lerche, R.A.; Phillips, G.E.

    1981-01-01

    Streak camera measurements were used to determine temporal dispersion in short (5 to 30 meter) graded-index optical fibers. Results show that 50-ps, 1.06-μm and 0.53-μm laser pulses can be propagated without significant dispersion when care is taken to prevent propagation of energy in fiber cladding modes

  7. Lasers and laser applications. Imaging implosion dynamics: The x-ray pinhole/streak camera

    International Nuclear Information System (INIS)

    Attwood, D.T.

    1976-01-01

    A Livermore-developed x-ray-sensitive streak camera was combined with a unique x-ray pinhole camera to make dynamic photographs of laser-irradiated fusion target implosions. These photographs show x radiation emitted from the imploding shell during its 100-ps implosion; they are the first continuous observations of an imploding laser-driven fusion capsule. The diagnostic system has a time resolution of 15 ps and a spatial resolution of about 6 μm. Results agree very well with those predicted by our LASNEX calculations, confirming that the essential physics are correctly described in the code and providing further confidence in the soundness of this approach to inertial confinement fusion

  8. Synchronization of streak and framing camera measurements of an intense relativistic electron beam propagating through gas

    International Nuclear Information System (INIS)

    Weidman, D.J.; Murphy, D.P.; Myers, M.C.; Meger, R.A.

    1994-01-01

    The expansion of the radius of a 5 MeV, 20 kA, 40 ns electron beam from SuperIBEX during propagation through gas is being measured. The beam is generated, conditioned, equilibrated, and then passed through a thin foil that produces Cherenkov light, which is recorded by a streak camera. At a second location, the beam hits another Cherenkov emitter, which is viewed by a framing camera. Measurements at these two locations can provide a time-resolved measure of the beam expansion. The two measurements, however, must be synchronized with each other, because the beam radius is not constant throughout the pulse due to variations in beam current and energy. To correlate the timing of the two diagnostics, several shots have been taken with both diagnostics viewing Cherenkov light from the same foil. Experimental measurements of the Cherenkov light from one foil viewed by both diagnostics will be presented to demonstrate the feasibility of correlating the diagnostics with each other. Streak camera data showing the optical fiducial, as well as the final correlation of the two diagnostics, will also be presented. Preliminary beam radius measurements from Cherenkov light measured at two locations will be shown.

  9. Fiber scintillator/streak camera detector for burn history measurement in inertial confinement fusion experiment

    International Nuclear Information System (INIS)

    Miyanaga, N.; Ohba, N.; Fujimoto, K.

    1997-01-01

    To measure the burn history in an inertial confinement fusion experiment, we have developed a new neutron detector based on plastic scintillation fibers. Twenty-five fiber scintillators were arranged in a geometry compensation configuration by which the time-of-flight difference of the neutrons is compensated by the transit time difference of light passing through the fibers. Each fiber scintillator is spliced individually to an ultraviolet optical fiber that is coupled to a streak camera. We have demonstrated a significant improvement of sensitivity compared with the usual bulk scintillator coupled to a bundle of the same ultraviolet fibers. copyright 1997 American Institute of Physics
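
    A back-of-the-envelope sketch of the geometry-compensation idea (all numbers assumed, not taken from the paper): a fiber sitting a distance Δd farther from the target sees the 14 MeV neutrons later by Δd/vₙ, so its optical path to the streak camera must be shorter by the length that light in the fiber covers in that same time. A classical (non-relativistic) estimate:

```python
# Geometry-compensation estimate for a fiber-scintillator neutron detector.
C = 2.998e8                      # speed of light, m/s
N_CORE = 1.5                     # assumed fiber core refractive index

def neutron_speed(energy_mev):
    """Classical speed of a neutron of the given kinetic energy (m/s)."""
    mass_mev = 939.565           # neutron rest mass in MeV/c^2
    return C * (2.0 * energy_mev / mass_mev) ** 0.5

def fiber_length_offset(extra_distance_m, energy_mev=14.0):
    """Fiber shortening that cancels the extra neutron time of flight."""
    dt = extra_distance_m / neutron_speed(energy_mev)   # extra neutron TOF
    return dt * (C / N_CORE)                             # light path in fiber

# Example: fibers spaced 5 mm apart along the neutron flight direction.
print("14 MeV neutron speed: %.3e m/s" % neutron_speed(14.0))
print("fiber length offset per 5 mm: %.1f mm" % (1e3 * fiber_length_offset(0.005)))
```

    With these assumptions the 14 MeV neutrons travel at roughly 5 cm/ns, so each 5 mm of extra flight path corresponds to about 20 mm of fiber length to be removed (or added on the near side) to keep the arrival times aligned at the streak camera.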

  10. Overall comparison of subpicosecond electron beam diagnostics by the polychromator, the interferometer and the femtosecond streak camera

    CERN Document Server

    Watanabe, T; Yoshimatsu, T; Sasaki, S; Sugiyama, Y; Ishi, K; Shibata, Y; Kondo, Y; Yoshii, K; Ueda, T; Uesaka, M

    2002-01-01

    Measurements of the longitudinal bunch length of subpicosecond and picosecond electron beams have been performed by three methods with three radiation sources at the 35 MeV S-band twin linear accelerators at the Nuclear Engineering Research Laboratory, University of Tokyo. The methods we adopt are the femtosecond streak camera with nondispersive reflective optics, the coherent transition radiation (CTR) Michelson interferometer, and the 10-channel polychromator that detects the spectrum of CTR and coherent diffraction radiation (CDR). The measurements by the two CTR methods were done independently of the streak camera, and their results were consistent with one another. As a result, the reliability of the polychromator for the diagnostics of sub-picosecond electron bunches and the usefulness of the diagnostics for single-shot measurements were verified. Furthermore, fully nondestructive diagnostics for subpicosecond bunches was performed utilizing CDR interferometry. Then the good agreement between CDR interfero...

  11. Picosecond streak camera diagnostics of CO2 laser-produced plasmas

    International Nuclear Information System (INIS)

    Jaanimagi, P.A.; Marjoribanks, R.S.; Sancton, R.W.; Enright, G.D.; Richardson, M.C.

    1979-01-01

    The interaction of intense laser radiation with solid targets is currently of considerable interest in laser fusion studies. Its understanding requires temporal knowledge of both laser and plasma parameters on a picosecond time scale. In this paper we describe the progress we have recently made in analysing, with picosecond time resolution, various features of intense nanosecond CO₂ laser pulse interaction experiments. An infrared upconversion scheme, having linear response and <20 ps temporal resolution, has been utilized to characterise the 10 μm laser pulse. Various features of the interaction have been studied with the aid of picosecond IR and x-ray streak cameras. These include the temporal and spatial characteristics of high harmonic emission from the plasma, and the temporal development of the x-ray continuum spectrum. (author)

  12. Aluminum-coated optical fibers as efficient infrared timing fiducial photocathodes for synchronizing x-ray streak cameras

    International Nuclear Information System (INIS)

    Koch, J.A.; MacGowan, B.J.

    1991-01-01

    The timing fiducial system at the Nova Two-Beam Facility allows time-resolved x-ray and optical streak camera data from laser-produced plasmas to be synchronized to within 30 ps. In this system, an Al-coated optical fiber is inserted into an aperture in the cathode plate of each streak camera. The coating acts as a photocathode for a low-energy pulse of 1ω (λ = 1.054 μm) light which is synchronized to the main Nova beam. The use of the fundamental (1ω) for this fiducial pulse has been found to offer significant advantages over the use of the 2ω second harmonic (λ = 0.53 μm). These advantages include brighter signals, greater reliability, and a higher relative damage threshold, allowing routine use without fiber replacement. The operation of the system is described, and experimental data and interpretations are discussed which suggest that the electron production in the Al film is due to thermionic emission. The results of detailed numerical simulations of the relevant thermal processes, undertaken to model the response of the coated fiber to 1ω laser pulses, are also presented, which give qualitative agreement with experimental data. Quantitative discrepancies between the modeling results and the experimental data are discussed, and suggestions for further research are given

  13. Streak electronic camera with slow-scanning storage tube used in the field of high-speed cineradiography

    International Nuclear Information System (INIS)

    Marilleau, J.; Bonnet, L.; Garcin, G.; Guix, R.; Loichot, R.

    The cineradiographic machine designed for measurements in the field of detonics consists of a linear accelerator associated with a braking target, a scintillator and a remote-controlled electronic camera. The quantum factor of X-ray detection and the energetic efficiency of the scintillator are given. The electronic camera is built upon a deflection-converter tube (RCA C. 73 435 AJ) coupled by optical fibres to a photosensitive storage tube (TH-CSF Esicon) used in a slow-scanning process with electronic recording of the information. The different parts of the device are described. Some capabilities, such as numerical data-processing outputs, measurements and display, are outlined. A streak cineradiogram of a typical implosion experiment is given.

  14. Towards jitter free synchronization of synchroscan streak cameras by noisy periodic laser pulses

    International Nuclear Information System (INIS)

    Cunin, B.; Heisel, F.; Miehe, J.A.

    1991-01-01

    Starting from the parameters characterizing the phase noise in cw mode-locked lasers, and for streak cameras operated with sine-wave deflection, the timing capabilities of the measuring system are discussed using a stochastic description for two commonly used synchronization techniques. In particular, the power spectrum of the sweep signal versus the laser phase noise is examined in detail. The theoretical results are used to interpret experimental observations recorded by means of actively and passively mode-locked lasers. One interesting application of synchroscan operation to metrology is the determination of short-term instabilities of the oscillator on a time scale close to the period. (author) 12 refs.; 3 figs.

  15. Evaluation of dynamic range for LLNL streak cameras using high contrast pulses and pulse podiatry on the Nova laser system

    International Nuclear Information System (INIS)

    Richards, J.B.; Weiland, T.L.; Prior, J.A.

    1990-01-01

    This paper reports on a standard LLNL streak camera that has been used to analyze high contrast pulses on the Nova laser facility. These pulses have a plateau at their leading edge (foot) with an amplitude which is approximately 1% of the maximum pulse height. Relying on other features of the pulses and on signal multiplexing, we were able to determine how accurately the foot amplitude was being represented by the camera. Results indicate that the useful single channel dynamic range of the instrument approaches 100:1

  16. Multislit streak photography for plasma dynamics studies

    International Nuclear Information System (INIS)

    Tou, T.Y.; Lee, S.

    1988-01-01

    A microscope slide with several transparent slits installed in a streak camera is used to record time-resolved two-dimensional information when a curved luminous plasma sheath traverses these slits. Applying this method to the plasma focus experiment, the axial run-down trajectory and the shapes of the plasma sheath at various moments can be obtained from a single streak photograph

  17. Streak tube development

    International Nuclear Information System (INIS)

    Hinrichs, C.K.; Estrella, R.M.

    1979-01-01

    A research program for the development of a high-speed, high-resolution streak image tube is described. This is one task in the development of a streak camera system with digital electronic readout, whose primary application is for diagnostics in underground nuclear testing. This program is concerned with the development of a high-resolution streak image tube compatible with x-ray input and electronic digital output. The tube must be capable of time resolution down to 100 psec and spatial resolution to provide greater than 1000 resolution elements across the cathode (much greater than presently available). Another objective is to develop the capability to make design changes in tube configurations to meet different experimental requirements. A demountable prototype streak tube was constructed, mounted on an optical bench, and placed in a vacuum system. Initial measurements of the tube resolution with an undeflected image show a resolution of 32 line pairs per millimeter over a cathode diameter of one inch, which is consistent with the predictions of the computer simulations. With the initial set of unoptimized deflection plates, the resolution pattern appeared to remain unchanged for static deflections of ±1/2-inch, a total streak length of one inch, also consistent with the computer simulations. A passively mode-locked frequency-doubled dye laser is being developed as an ultraviolet pulsed light source to measure dynamic tube resolution during streaking. A sweep circuit to provide the deflection voltage in the prototype tube has been designed and constructed and provides a relatively linear ramp voltage with ramp durations adjustable between 10 and 1000 nsec.

  18. Deflection system of a high-speed streak camera in the form of a delay line

    International Nuclear Information System (INIS)

    Korzhenevich, I.M.; Fel'dman, G.G.

    1993-01-01

    This paper presents an analysis of the operation of a meander deflection system, well-known in oscillography, when it is used to scan the image in a streak-camera tube. Effects that are specific to high-speed photography are considered. It is shown that such a deflection system imposes reduced requirements both on the steepness and on the duration of the linear leading edges of the pulses of the spark gaps that generate the sweep voltage. An example of the design of a meander deflection system whose sensitivity is a factor of two higher than for a conventional system is considered. 5 refs., 3 figs

  19. Evaluation of dynamic range for LLNL streak cameras using high contrast pulses and "pulse podiatry" on the Nova laser system

    Energy Technology Data Exchange (ETDEWEB)

    Richards, J.B.; Weiland, T.L.; Prior, J.A.

    1990-07-01

    A standard LLNL streak camera has been used to analyze high contrast pulses on the Nova laser facility. These pulses have a plateau at their leading edge (foot) with an amplitude which is approximately 1% of the maximum pulse height. Relying on other features of the pulses and on signal multiplexing, we were able to determine how accurately the foot amplitude was being represented by the camera. Results indicate that the useful single channel dynamic range of the instrument approaches 100:1. 1 ref., 4 figs., 1 tab.

  20. Development and performance test of picosecond pulse x-ray excited streak camera system for scintillator characterization

    International Nuclear Information System (INIS)

    Yanagida, Takayuki; Fujimoto, Yutaka; Yoshikawa, Akira

    2010-01-01

    To observe time- and wavelength-resolved scintillation events, a picosecond pulsed X-ray excited streak camera system has been developed. The wavelength range spreads from the vacuum ultraviolet (VUV) to the near-infrared region (110-900 nm), and the instrumental response function is around 80 ps. This work describes the principle of the newly developed instrument and a first performance test using a BaF₂ single-crystal scintillator. The core-valence luminescence of BaF₂, peaking around 190 and 220 nm, is clearly detected by our system, and the decay time turned out to be 0.7 ns. These results are consistent with the literature and confirm that our system works properly. (author)
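
    For context, extracting such a decay time from a streaked time profile is often just a single-exponential fit to the tail; a minimal sketch on synthetic data (not the authors' analysis, and with assumed numbers) follows.

```python
import numpy as np
from scipy.optimize import curve_fit

def decay(t, amp, tau, bg):
    return amp * np.exp(-t / tau) + bg

def fit_decay_time(t_ns, counts, t_start_ns):
    """Fit a single exponential to the profile tail; return tau in ns."""
    sel = t_ns >= t_start_ns
    p0 = [counts[sel].max(), 1.0, counts[sel].min()]
    popt, _ = curve_fit(decay, t_ns[sel] - t_start_ns, counts[sel], p0=p0)
    return popt[1]

# Synthetic BaF2-like profile: 0.7 ns decay plus flat background and noise.
rng = np.random.default_rng(2)
t = np.linspace(0.0, 10.0, 400)                      # ns
y = decay(t, 500.0, 0.7, 20.0) + rng.normal(0, 3, t.size)
print("fitted decay time: %.2f ns" % fit_decay_time(t, y, t_start_ns=0.2))
```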

  1. Realization of an optical multi and mono-channel analyzer, associated to a streak camera. Application to metrology of picosecond low intensity luminous pulses

    International Nuclear Information System (INIS)

    Roth, J.M.

    1985-02-01

    An electronic system including a low-light-level television tube (Nocticon) to digitize images from streak cameras is studied and realized. Its performance (sensitivity, signal-to-noise ratio) is studied and compared with that of a multi-channel analyzer using a linear array of photodiodes. It is applied to duration and amplitude measurements of short, low-intensity light pulses.

  2. Large-grazing-angle, multi-image Kirkpatrick-Baez microscope as the front end to a high-resolution streak camera for OMEGA

    International Nuclear Information System (INIS)

    Gotchev, O.V.; Hayes, L.J.; Jaanimagi, P.A.; Knauer, J.P.; Marshall, F.J.; Meyerhofer, D.D.

    2003-01-01

    A high-resolution x-ray microscope with a large grazing angle has been developed, characterized, and fielded at the Laboratory for Laser Energetics. It increases the sensitivity and spatial resolution in planar direct-drive hydrodynamic stability experiments, relevant to inertial confinement fusion research. It has been designed to work as the optical front end of the PJX - a high-current, high-dynamic-range x-ray streak camera. Optical design optimization, results from numerical ray tracing, mirror-coating choice, and characterization have been described previously [O. V. Gotchev, et al., Rev. Sci. Instrum. 74, 2178 (2003)]. This work highlights the optics' unique mechanical design and flexibility and considers certain applications that benefit from it. Characterization of the microscope's resolution in terms of its modulation transfer function over the field of view is shown. Recent results from hydrodynamic stability experiments, diagnosed with the optic and the PJX, are provided to confirm the microscope's advantages as a high-resolution, high-throughput x-ray optical front end for streaked imaging

  3. Absolute calibration method for fast-streaked, fiber optic light collection, spectroscopy systems

    International Nuclear Information System (INIS)

    Johnston, Mark D.; Frogget, Brent; Oliver, Bryan Velten; Maron, Yitzhak; Droemer, Darryl W.; Crain, Marlon D.

    2010-01-01

    This report outlines a convenient method to calibrate fast (<1ns resolution) streaked, fiber optic light collection, spectroscopy systems. Such a system is used to collect spectral data on plasmas generated in the A-K gap of electron beam diodes fielded on the RITS-6 accelerator (8-12MV, 140-200kA). On RITS, light is collected through a small diameter (200 micron) optical fiber and recorded on a fast streak camera at the output of 1 meter Czerny-Turner monochromator (F/7 optics). To calibrate such a system, it is necessary to efficiently couple light from a spectral lamp into a 200 micron diameter fiber, split it into its spectral components, with 10 Angstroms or less resolution, and record it on a streak camera with 1ns or less temporal resolution.

  4. Streak-Camera Measurements with High Currents in PEP-II and Variable Optics in SPEAR3

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, Weixeng; Fisher, Alan; Corbett, Jeff [SLAC]

    2008-06-05

    A dual-axis, synchroscan streak camera was used to measure longitudinal bunch profiles in three storage rings at SLAC: the PEP-II low- and high-energy rings, and SPEAR3. At high currents, both PEP rings exhibit a transient synchronous-phase shift along the bunch train due to RF-cavity beam loading. Bunch length and profile asymmetry were measured along the train for a range of beam currents. To avoid the noise inherent in a dual-axis sweep, we accumulated single-axis synchroscan images while applying a 50-ns gate to the microchannel plate. To improve the extinction ratio, an upstream mirror pivoting at 1 kHz was synchronized with the 2kHz MCP gate to deflect light from other bunches off the photocathode. Bunch length was also measured on the HER as a function of beam energy. For SPEAR3 we measured bunch length as a function of single-bunch current for several lattices: achromatic, low-emittance and low momentum compaction. In the first two cases, resistive and reactive impedance components can be extracted from the longitudinal bunch profiles. In the low-alpha configurations, we observed natural bunch lengths approaching the camera resolution, requiring special care to remove instrumental effects, and saw evidence of periodic bursting.

  5. X-ray streak and framing camera techniques

    International Nuclear Information System (INIS)

    Coleman, L.W.; Attwood, D.T.

    1975-01-01

    This paper reviews recent developments and applications of ultrafast diagnostic techniques for x-ray measurements. These techniques, based on image-converter devices, already offer significant resolution capabilities. Techniques capable of time resolution in the sub-nanosecond regime are considered. Mechanical cameras are excluded from consideration, as are devices using phosphors or fluors as x-ray converters.

  6. Two-color spatial and temporal temperature measurements using a streaked soft x-ray imager

    Energy Technology Data Exchange (ETDEWEB)

    Moore, A. S., E-mail: alastair.moore@physics.org; Ahmed, M. F.; Soufli, R.; Pardini, T.; Hibbard, R. L.; Bailey, C. G.; Bell, P. M.; Hau-Riege, S. [Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, California 94551-0808 (United States); Benstead, J.; Morton, J.; Guymer, T. M.; Garbett, W. J.; Rubery, M. S.; Skidmore, J. W. [Directorate Science and Technology, AWE Aldermaston, Reading RG7 4PR (United Kingdom); Bedzyk, M.; Shoup, M. J.; Regan, S. P.; Agliata, T.; Jungquist, R. [Laboratory for Laser Energetics, University of Rochester, Rochester, New York 14623 (United States); Schmidt, D. W. [Los Alamos National Laboratory, Los Alamos, New Mexico 87545 (United States); and others

    2016-11-15

    A dual-channel streaked soft x-ray imager has been designed and used on high energy-density physics experiments at the National Ignition Facility. This streaked imager creates two images of the same x-ray source using two slit apertures and a single shallow angle reflection from a nickel mirror. Thin filters are used to create narrow-bandpass images at 510 eV and 360 eV. When measuring a Planckian spectrum, the brightness ratio of the two images can be translated into a color-temperature, provided that the spectral sensitivity of the two images is well known. To reduce uncertainty and remove spectral features in the streak camera photocathode from this photon energy range, a thin 100 nm CsI on 50 nm Al streak camera photocathode was implemented. Provided that the spectral shape is well known, uncertainties in the spectral sensitivity limit the accuracy of the temperature measurement to approximately 4.5% at 100 eV.
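
    A minimal numerical sketch of that inversion (idealized, delta-function-like channels at 360 eV and 510 eV with equal relative response; the real instrument folds in measured filter, mirror, and photocathode responses): compute the Planckian brightness ratio as a function of temperature and solve for the temperature that reproduces a measured ratio.

```python
import numpy as np
from scipy.optimize import brentq

def planck_ratio(T_ev, e_hi_ev=510.0, e_lo_ev=360.0):
    """Ratio of Planckian spectral brightness at two photon energies.

    B(E, T) is proportional to E**3 / (exp(E/T) - 1), with E and T in eV.
    Instrument response is idealized as equal in the two channels.
    """
    b = lambda e: e**3 / np.expm1(e / T_ev)
    return b(e_hi_ev) / b(e_lo_ev)

def temperature_from_ratio(measured_ratio, t_lo=20.0, t_hi=1000.0):
    """Invert the ratio for the color temperature (eV) by root finding."""
    return brentq(lambda T: planck_ratio(T) - measured_ratio, t_lo, t_hi)

# Example: the ratio a 100 eV Planckian would give, then invert it back.
r = planck_ratio(100.0)
print("ratio at 100 eV:", round(r, 4))
print("recovered temperature:", round(temperature_from_ratio(r), 2), "eV")
```

    The ratio is monotonic in temperature over this range, so a simple bracketed root find suffices; in practice the same idea is applied with band-integrated, response-weighted sensitivities for each channel.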

  7. Large-Grazing-Angle, Multi-Image Kirkpatrick-Baez Microscope as the Front End to a High-Resolution Streak Camera for OMEGA

    International Nuclear Information System (INIS)

    Gotchev, O.V.; Hayes, L.J.; Jaanimagi, P.A.; Knauer, J.P.; Marshall, F.J.; Meyerhofer, D. D.

    2003-01-01

    A new, high-resolution x-ray microscope with a large grazing angle has been developed, characterized, and fielded at the Laboratory for Laser Energetics. It increases the sensitivity and spatial resolution in planar direct-drive hydrodynamic stability experiments, relevant to inertial confinement fusion (ICF) research. It has been designed to work as the optical front end of the PJX - a high-current, high-dynamic-range x-ray streak camera. Optical design optimization, results from numerical ray tracing, mirror-coating choice, and characterization have been described previously [O. V. Gotchev, et al., Rev. Sci. Instrum. 74, 2178 (2003)]. This work highlights the optics' unique mechanical design and flexibility and considers certain applications that benefit from it. Characterization of the microscope's resolution in terms of its modulation transfer function (MTF) over the field of view is shown. Recent results from hydrodynamic stability experiments, diagnosed with the optic and the PJX, are provided to confirm the microscope's advantages as a high-resolution, high-throughput x-ray optical front end for streaked imaging.

  8. Initial tests of the dual-sweep streak camera system planned for APS particle-beam diagnostics

    International Nuclear Information System (INIS)

    Lumpkin, A.; Yang, B.; Gai, W.; Cieslik, W.

    1995-01-01

    Initial tests of a dual-sweep streak system planned for use on the Advanced Photon Source (APS) have been performed using assets of the Argonne Wakefield Accelerator (AWA) facility. The short light pulses from the photoelectric injector drive laser in both the visible (λ=496 nm, Δt∼1.5 ps (FWHM)), and the ultraviolet (λ=248 nm, Δt∼5 ps (FWHM)) were used. Both a UV-visible S20 photocathode streak tube and a UV-to-x-ray Au photocathode streak tube were tested. Calibration data with an etalon were also obtained. A sample of dual-sweep streak data using optical synchrotron radiation on the APS injector synchrotron is also presented
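
    The etalon calibration mentioned here relies on the fixed round-trip delay of an etalon: for a solid etalon of thickness d and refractive index n, successive reflections are separated by Δt = 2nd/c. A tiny worked example with assumed etalon parameters (not values from the report):

```python
C = 2.998e8          # speed of light, m/s

def etalon_spacing_ps(thickness_mm, refractive_index):
    """Round-trip delay between successive etalon reflections, in ps."""
    return 2.0 * refractive_index * (thickness_mm * 1e-3) / C * 1e12

# Assumed 5 mm fused-silica etalon (n ~ 1.46 in the visible):
print(round(etalon_spacing_ps(5.0, 1.46), 1), "ps between calibration pulses")
```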

  9. Imacon 600 ultrafast streak camera evaluation

    International Nuclear Information System (INIS)

    Owen, T.C.; Coleman, L.W.

    1975-01-01

    The Imacon 600 has a number of designed-in disadvantages for use as an ultrafast diagnostic instrument. The unit is physically large (approximately 5' long) and uses an external power supply rack for the image intensifier. Water cooling is required for the intensifier; it is quiet but not conducive to portability. There is no interlock on the cooling water. The camera does have several switch-selectable sweep speeds. This is desirable if one is working with both slow and fast events. The camera can be run in a framing mode. (MOW)

  10. Time- and wavelength-resolved luminescence evaluation of several types of scintillators using streak camera system equipped with pulsed X-ray source

    Energy Technology Data Exchange (ETDEWEB)

    Furuya, Yuki, E-mail: f.yuki@mail.tagen.tohoku.ac.j [Institute of Multidisciplinary Research for Advanced Materials, Tohoku University, 2-1-1 Katahira, Aoba-ku, Sendai 980-8577 (Japan); Yanagida, Takayuki; Fujimoto, Yutaka; Yokota, Yuui; Kamada, Kei [Institute of Multidisciplinary Research for Advanced Materials, Tohoku University, 2-1-1 Katahira, Aoba-ku, Sendai 980-8577 (Japan); Kawaguchi, Noriaki [Institute of Multidisciplinary Research for Advanced Materials, Tohoku University, 2-1-1 Katahira, Aoba-ku, Sendai 980-8577 (Japan); Research and Development Division, Tokuyama., Co. Ltd., ICR-Building, Minamiyoshinari, Aoba-ku, Sendai (Japan); Ishizu, Sumito [Research and Development Division, Tokuyama., Co. Ltd., ICR-Building, Minamiyoshinari, Aoba-ku, Sendai (Japan); Uchiyama, Koro; Mori, Kuniyoshi [Hamamatsu Photonics K.K., 325-6, Sunayama-cho, Naka-ku, Hamamatsu, Shizuoka 430-8587 (Japan); Kitano, Ken [Vacuum and Optical Instruments, 2-18-18 Shimomaruko, Ota, Tokyo 146-0092 (Japan); Nikl, Martin [Institute of Physics ASCR, Cukrovarnicka 10, Prague 6, 162-53 (Czech Republic); Yoshikawa, Akira [Institute of Multidisciplinary Research for Advanced Materials, Tohoku University, 2-1-1 Katahira, Aoba-ku, Sendai 980-8577 (Japan); NICHe, Tohoku University, 6-6-10 Aoba, Aramaki, Aoba-ku, Sendai 980-8579 (Japan)

    2011-04-01

    To design new scintillating materials, it is very important to have detailed information about the events that occur during the excitation and emission processes under ionizing-radiation excitation. We developed a streak camera system equipped with a picosecond pulsed X-ray source to observe time- and wavelength-resolved scintillation events. In this report, we test the performance of this new system using several types of scintillators, including bulk oxide/halide crystals, transparent ceramics, plastics, and powders. For all samples, the results were consistent with those reported previously. The results demonstrate that the developed system is suitable for evaluating scintillation properties.

  11. A new streaked soft x-ray imager for the National Ignition Facility

    Energy Technology Data Exchange (ETDEWEB)

    Benstead, J., E-mail: james.benstead@awe.co.uk; Morton, J.; Guymer, T. M.; Garbett, W. J.; Rubery, M. S.; Skidmore, J. W. [AWE, Aldermaston, Reading, Berkshire RG7 4PR (United Kingdom); Moore, A. S.; Ahmed, M. F.; Soufli, R.; Pardini, T.; Hibbard, R. L.; Bailey, C. G.; Bell, P. M.; Hau-Riege, S. [Lawrence Livermore National Laboratory, Livermore, California 94550 (United States); Bedzyk, M.; Shoup, M. J.; Reagan, S.; Agliata, T.; Jungquist, R. [Laboratory for Laser Energetics, University of Rochester, Rochester, New York 14623 (United States); Schmidt, D. W. [Los Alamos National Laboratory, Los Alamos, New Mexico 87545 (United States); and others

    2016-05-15

    A new streaked soft x-ray imager has been designed for use on high energy-density (HED) physics experiments at the National Ignition Facility based at the Lawrence Livermore National Laboratory. This streaked imager uses a slit aperture, single shallow angle reflection from a nickel mirror, and soft x-ray filtering to, when coupled to one of the NIF’s x-ray streak cameras, record a 4× magnification, one-dimensional image of an x-ray source with a spatial resolution of less than 90 μm. The energy band pass produced depends upon the filter material used; for the first qualification shots, vanadium and silver-on-titanium filters were used to gate on photon energy ranges of approximately 300–510 eV and 200–400 eV, respectively. A two-channel version of the snout is available for x-ray sources up to 1 mm and a single-channel is available for larger sources up to 3 mm. Both the one and two-channel variants have been qualified on quartz wire and HED physics target shots.

  12. Neutron imaging system based on a video camera

    International Nuclear Information System (INIS)

    Dinca, M.

    2004-01-01

    Charge-injection devices (CIDs) possess versatile and unique readout capabilities that have established their utility in scientific and especially radiation-field applications. A detector for neutron radiography based on a cooled CID camera offers the following capabilities:
    - Extended linear dynamic range up to 10⁹ without blooming or streaking;
    - Arbitrary pixel selection and nondestructive readout, which make it possible to introduce a high degree of exposure control to low-light viewing of static scenes;
    - Reading of multiple areas of interest within a given frame at higher rates;
    - Wide spectral response (185 nm - 1100 nm);
    - Tolerance of high radiation environments, up to 3 Mrad integrated dose;
    - A contiguous pixel structure that contributes to accurate imaging because there are virtually no opaque areas between pixels. (author)

  13. X-ray streak-camera study of the dynamics of laser-imploded microballoons

    International Nuclear Information System (INIS)

    Key, M.H.; Lamb, M.J.; Lewis, C.L.S.; Moore, A.; Evans, R.G.

    1979-01-01

    The time and space development of the x-ray emission from the irradiated target surface and the implosion core in laser-compressed glass microballoons is recorded by x-ray streak photography. The experimental variation of implosion time with target mass and laser energy is considered and compared with computer modeling of the implosion

  14. Image-converter streak cameras with very high gain

    International Nuclear Information System (INIS)

    1975-01-01

    A new camera is described with slit scanning and very high photonic gain (G=5000). Development of the technology of tubes and microchannel plates has enabled integration of such an amplifying element in an image converter tube which does away with the couplings and the intermediary electron-photon-electron conversions of the classical converter systems having external amplification. It is thus possible to obtain equal or superior performance while retaining considerable gain for the camera, great compactness, great flexibility in use, and easy handling. (author)

  15. Initial Demonstration of 9-MHz Framing Camera Rates on the FAST UV Drive Laser Pulse Trains

    Energy Technology Data Exchange (ETDEWEB)

    Lumpkin, A. H. [Fermilab; Edstrom Jr., D. [Fermilab; Ruan, J. [Fermilab

    2016-10-09

    We report the configuration of a Hamamatsu C5680 streak camera as a framing camera to record transverse spatial information of green-component laser micropulses at 3- and 9-MHz rates for the first time. The latter is near the time scale of the ~7.5-MHz revolution frequency of the Integrable Optics Test Accelerator (IOTA) ring and its expected synchrotron radiation source temporal structure. The 2-D images are recorded with a Gig-E readout CCD camera. We also report a first proof of principle with an OTR source using the linac streak camera in a semi-framing mode.

  16. Streaked, x-ray-transmission-grating spectrometer

    International Nuclear Information System (INIS)

    Ceglio, N.M.; Roth, M.; Hawryluk, A.M.

    1981-08-01

    A free-standing x-ray transmission grating has been coupled with a soft x-ray streak camera to produce a time-resolved x-ray spectrometer. The instrument has a temporal resolution of approximately 20 ps, is capable of covering a broad spectral range, 2 to 120 Å, has high sensitivity, and is simple to use, requiring no complex alignment procedure. In recent laser fusion experiments the spectrometer successfully recorded time-resolved spectra over the range 10 to 120 Å with a spectral resolving power λ/Δλ of 4 to 50, limited primarily by source size and collimation effects

  17. Ultrafast streak and framing technique for the observation of laser driven shock waves in transparent solid targets

    International Nuclear Information System (INIS)

    Van Kessel, C.G.M.; Sachsenmaier, P.; Sigel, R.

    1975-01-01

    Shock waves driven by laser ablation in plane transparent plexiglass and solid hydrogen targets have been observed with streak and framing techniques using a high speed image converter camera, and a dye laser as a light source. The framing pictures have been made by mode locking the dye laser and using a wide streak slit. In both materials a growing hemispherical shock wave is observed with the maximum velocity at the onset of laser radiation. (author)

  18. Wind Streaks on Earth; Exploration and Interpretation

    Science.gov (United States)

    Cohen-Zada, Aviv Lee; Blumberg, Dan G.; Maman, Shimrit

    2015-04-01

    Wind streaks, one of the most common aeolian features on planetary surfaces, are observable on the surface of the planets Earth, Mars and Venus. Due to their reflectance properties, wind streaks are distinguishable from their surroundings, and they have thus been widely studied by remote sensing since the early 1970s, particularly on Mars. In imagery, these streaks are interpreted as the presence - or lack thereof - of small loose particles on the surface deposited or eroded by wind. The existence of wind streaks serves as evidence for past or present active aeolian processes. Therefore, wind streaks are thought to represent integrative climate processes. As opposed to the comprehensive and global studies of wind streaks on Mars and Venus, wind streaks on Earth are understudied and poorly investigated, both geomorphologically and by remote sensing. The aim of this study is, thus, to fill the knowledge gap about the wind streaks on Earth by: generating a global map of Earth wind streaks from modern high-resolution remotely sensed imagery; incorporating the streaks in a geographic information system (GIS); and overlaying the GIS layers with boundary layer wind data from general circulation models (GCMs) and data from the ECMWF Reanalysis Interim project. The study defines wind streaks (and thereby distinguishes them from other aeolian features) based not only on their appearance in imagery but more importantly on their surface appearance. This effort is complemented by a focused field investigation to study wind streaks on the ground and from a variety of remotely sensed images (both optical and radar). In this way, we provide a better definition of the physical and geomorphic characteristics of wind streaks and acquire a deeper knowledge of terrestrial wind streaks as a means to better understand global and planetary climate and climate change. In a preliminary study, we detected and mapped over 2,900 wind streaks in the desert regions of Earth distributed in

  19. Potential applications of a dual-sweep streak camera system for characterizing particle and photon beams of VUV, XUV, and x-ray FELS

    Energy Technology Data Exchange (ETDEWEB)

    Lumpkin, A. [Argonne National Lab., IL (United States)

    1995-12-31

    The success of time-resolved imaging techniques in the characterization of particle beams and photon beams of the recent generation of L-band linac-driven or storage ring FELs in the infrared, visible, and ultraviolet wavelength regions can be extended to the VUV, XUV, and x-ray FELs. Tests and initial data have been obtained with the Hamamatsu C5680 dual-sweep streak camera system which includes a demountable photocathode (thin Au) assembly and a flange that allows windowless operation with the transport vacuum system. This system can be employed at wavelengths shorter than 100 nm and down to 1 Å. First tests on such a system at 248-nm wavelengths have been performed on the Argonne Wakefield Accelerator (AWA) drive laser source. A quartz window was used at the tube entrance aperture. A preliminary test using a Be window mounted on a different front flange of the streak tube to look at an x-ray bremsstrahlung source at the AWA was limited by photon statistics. This system's limiting resolution of σ ≈ 1.1 ps observed at 248 nm would increase with higher incoming photon energies to the photocathode. This effect is related to the fundamental spread in energies of the photoelectrons released from the photocathode. Possible uses of the synchrotron radiation sources at the Advanced Photon Source and emerging short-wavelength FELs to test the system will be presented.

  20. Streaking into middle school science: The Dell Streak pilot project

    Science.gov (United States)

    Austin, Susan Eudy

    A case study is conducted implementing the Dell Streak seven-inch Android device into the eighth-grade science classes of one teacher in a rural middle school in the Piedmont region of North Carolina. The purpose of the study is to determine whether the use of the Dell Streaks would increase student achievement on standardized subject testing, whether the Streak could be used as an effective instructional tool, and whether it could be considered an effective instructional resource for reviewing and preparing for the science assessments. A mixed-methods research design was used to analyze both quantitative and qualitative results and to determine whether, with the Dell Streaks in use, (1) instructional strategies would change, (2) the device would be an effective instructional tool, and (3) a comparison of the students' test scores and benchmark assessment scores would show a statistically significant difference. An ANOVA determined that a statistically significant difference had occurred, and a post hoc analysis was conducted to identify where the difference occurred. Finally, a t-test determined that there was no statistically significant difference between the mean End-of-Grade test and four quarterly benchmark scores of the control and experimental groups. Qualitative research methods were used to determine whether the Streaks were an effective instructional tool. Classroom observations identified that changes in the teacher's teaching style and new instructional strategies were implemented throughout the pilot project. Students had the opportunity to complete a questionnaire three times during the pilot project; the results revealed what the students liked about using the devices and the challenges they were facing. The teacher completed a reflective questionnaire throughout the pilot project and offered valuable reflections about the use of the devices in an educational setting. The reflection data supporting the case study was drawn

  1. Target 3-D reconstruction of streak tube imaging lidar based on Gaussian fitting

    Science.gov (United States)

    Yuan, Qingyu; Niu, Lihong; Hu, Cuichun; Wu, Lei; Yang, Hongru; Yu, Bing

    2018-02-01

    Streak images obtained by the streak tube imaging lidar (STIL) contain the distance-azimuth-intensity information of a scanned target, and a 3-D reconstruction of the target can be carried out by extracting the characteristic data of multiple streak images. Simple peak detection introduces significant errors into the reconstruction because of noise and other factors. To obtain a more precise 3-D reconstruction, a peak detection method based on trust-region Gaussian fitting is proposed in this work. A Gaussian model is fitted to the returned waveform of each time channel of each frame; the fitted result, which effectively suppresses noise and possesses a unique peak, is taken as the new returned waveform, and its feature data are then extracted by peak detection. Experimental data from an aerial target were used to verify the method. This work shows that the Gaussian-fitting peak detection method reduces the extraction error of the feature data to less than 10%; using it to extract the feature data and reconstruct the target achieves a spatial resolution as fine as 30 cm in the depth direction and thereby improves the 3-D imaging accuracy of the STIL.
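
    A minimal sketch of the Gaussian-fitting peak detection idea is given below; the model, parameter guesses, and synthetic test pulse are illustrative assumptions rather than the paper's implementation, and scipy's curve_fit stands in for the trust-region fit.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def gaussian(t, a, t0, sigma, offset):
        """Single-peak Gaussian model for one returned waveform."""
        return a * np.exp(-((t - t0) ** 2) / (2.0 * sigma ** 2)) + offset

    def fit_peak_time(t, waveform):
        """Fit a Gaussian to one time channel and return the fitted peak time.

        The fitted model has a unique maximum, so its peak location is far less
        sensitive to noise spikes than a raw argmax over the samples.
        """
        p0 = [waveform.max() - waveform.min(),   # amplitude guess
              t[np.argmax(waveform)],            # peak-position guess
              (t[-1] - t[0]) / 10.0,             # width guess
              waveform.min()]                    # baseline guess
        popt, _ = curve_fit(gaussian, t, waveform, p0=p0, maxfev=5000)
        return popt[1]

    # Illustrative use on a noisy synthetic return pulse (hypothetical numbers)
    t = np.linspace(0.0, 50.0, 500)                                # ns
    clean = gaussian(t, a=1.0, t0=23.4, sigma=1.5, offset=0.05)
    noisy = clean + 0.05 * np.random.randn(t.size)
    print(f"fitted peak at {fit_peak_time(t, noisy):.2f} ns")
    ```

    Repeating such a fit for every time channel of every frame yields the range samples from which the 3-D point cloud is assembled.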

  2. Automated computer analysis of plasma-streak traces from SCYLLAC

    International Nuclear Information System (INIS)

    Whitman, R.L.; Jahoda, F.C.; Kruger, R.P.

    1977-01-01

    An automated computer analysis technique that locates and references the approximate centroid of single- or dual-streak traces from the Los Alamos Scientific Laboratory SCYLLAC facility is described. The technique also determines the plasma-trace width over a limited self-adjusting region. The plasma traces are recorded with streak cameras on Polaroid film, then scanned and digitized for processing. The analysis technique uses scene segmentation to separate the plasma trace from a reference fiducial trace. The technique employs two methods of peak detection; one for the plasma trace and one for the fiducial trace. The width is obtained using an edge-detection, or slope, method. Timing data are derived from the intensity modulation of the fiducial trace. To smooth (despike) the output graphs showing the plasma-trace centroid and width, a technique of "twicing" developed by Tukey was employed. In addition, an interactive sorting algorithm allows retrieval of the centroid, width, and fiducial data from any test shot plasma for post analysis. As yet, only a limited set of sixteen plasma traces has been processed using this technique

  3. Automated computer analysis of plasma-streak traces from SCYLLAC

    International Nuclear Information System (INIS)

    Whiteman, R.L.; Jahoda, F.C.; Kruger, R.P.

    1977-11-01

    An automated computer analysis technique that locates and references the approximate centroid of single- or dual-streak traces from the Los Alamos Scientific Laboratory SCYLLAC facility is described. The technique also determines the plasma-trace width over a limited self-adjusting region. The plasma traces are recorded with streak cameras on Polaroid film, then scanned and digitized for processing. The analysis technique uses scene segmentation to separate the plasma trace from a reference fiducial trace. The technique employs two methods of peak detection; one for the plasma trace and one for the fiducial trace. The width is obtained using an edge-detection, or slope, method. Timing data are derived from the intensity modulation of the fiducial trace. To smooth (despike) the output graphs showing the plasma-trace centroid and width, a technique of "twicing" developed by Tukey was employed. In addition, an interactive sorting algorithm allows retrieval of the centroid, width, and fiducial data from any test shot plasma for post analysis. As yet, only a limited set of the plasma traces has been processed with this technique
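
    Tukey's "twicing", used in both versions of this report to despike the centroid and width graphs, is compact enough to sketch: smooth once, then smooth the residuals and add them back. The moving-average smoother below is an assumed stand-in; the report does not state which base smoother was twiced.

    ```python
    import numpy as np

    def moving_average(y, window=5):
        """Assumed stand-in smoother; the original report does not name one."""
        kernel = np.ones(window) / window
        return np.convolve(y, kernel, mode="same")

    def twice(y, smoother=moving_average):
        """Tukey 'twicing': smooth, then smooth the residual and add it back.

        The second pass restores signal that the first pass over-smoothed,
        which despikes a centroid or width trace without flattening real trends.
        """
        first = smoother(y)
        return first + smoother(y - first)
    ```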

  4. Streak detection and analysis pipeline for optical images

    Science.gov (United States)

    Virtanen, J.; Granvik, M.; Torppa, J.; Muinonen, K.; Poikonen, J.; Lehti, J.; Säntti, T.; Komulainen, T.; Flohrer, T.

    2014-07-01

    We describe a novel data processing and analysis pipeline for optical observations of moving objects, either of natural (asteroids, meteors) or artificial origin (satellites, space debris). The monitoring of the space object populations requires reliable acquisition of observational data to support the development and validation of population models, and to build and maintain catalogues of orbital elements. The orbital catalogues are, in turn, needed for the assessment of close approaches (for asteroids, with the Earth; for satellites, with each other) and for the support of contingency situations or launches. For both types of populations, there is also increasing interest to detect fainter objects corresponding to the small end of the size distribution. We focus on the low signal-to-noise (SNR) detection of objects with high angular velocities, resulting in long and faint object trails, or streaks, in the optical images. The currently available, mature image processing algorithms for detection and astrometric reduction of optical data cover objects that cross the sensor field-of-view comparably slowly, and, particularly for satellites, within a rather narrow, predefined range of angular velocities. By applying specific tracking techniques, the objects appear point-like or as short trails in the exposures. However, the general survey scenario is always a 'track-before-detect' problem, resulting in streaks of arbitrary lengths. Although some considerations for low-SNR processing of streak-like features are available in the current image processing and computer vision literature, algorithms are not readily available yet. In the ESA-funded StreakDet (Streak detection and astrometric reduction) project, we develop and evaluate an automated processing pipeline applicable to single images (as compared to consecutive frames of the same field) obtained with any observing scenario, including space-based surveys and both low- and high-altitude populations. The algorithmic

  5. Triton's streaks as windblown dust

    Science.gov (United States)

    Sagan, Carl; Chyba, Christopher

    1990-01-01

    Explanations for the surface streaks observed by Voyager 2 on Triton's southern hemisphere are discussed. It is shown that, despite Triton's tenuous atmosphere, low-cohesion dust grains with diameters of about 5 microns or less may be carried into suspension by aeolian surface shear stress, given expected geostrophic wind speeds of about 10 m/s. For geyser-like erupting dust plumes, it is shown that dust-settling time scales and expected wind velocities can produce deposits with length scales in good agreement with those of the observed streaks. Thus, both geyser-like eruptions and direct lifting by surface winds appear to be viable mechanisms for the origin of the streaks.

  6. Streak detection and analysis pipeline for space-debris optical images

    Science.gov (United States)

    Virtanen, Jenni; Poikonen, Jonne; Säntti, Tero; Komulainen, Tuomo; Torppa, Johanna; Granvik, Mikael; Muinonen, Karri; Pentikäinen, Hanna; Martikainen, Julia; Näränen, Jyri; Lehti, Jussi; Flohrer, Tim

    2016-04-01

    We describe a novel data-processing and analysis pipeline for optical observations of moving objects, either of natural (asteroids, meteors) or artificial origin (satellites, space debris). The monitoring of the space object populations requires reliable acquisition of observational data, to support the development and validation of population models and to build and maintain catalogues of orbital elements. The orbital catalogues are, in turn, needed for the assessment of close approaches (for asteroids, with the Earth; for satellites, with each other) and for the support of contingency situations or launches. For both types of populations, there is also increasing interest to detect fainter objects corresponding to the small end of the size distribution. The ESA-funded StreakDet (streak detection and astrometric reduction) activity has aimed at formulating and discussing suitable approaches for the detection and astrometric reduction of object trails, or streaks, in optical observations. Our two main focuses are objects in lower altitudes and space-based observations (i.e., high angular velocities), resulting in long (potentially curved) and faint streaks in the optical images. In particular, we concentrate on single-image (as compared to consecutive frames of the same field) and low-SNR detection of objects. Particular attention has been paid to the process of extraction of all necessary information from one image (segmentation), and subsequently, to efficient reduction of the extracted data (classification). We have developed an automated streak detection and processing pipeline and demonstrated its performance with an extensive database of semisynthetic images simulating streak observations both from ground-based and space-based observing platforms. The average processing time per image is about 13 s for a typical 2k-by-2k image. For long streaks (length >100 pixels), primary targets of the pipeline, the detection sensitivity (true positives) is about 90% for
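
    The abstracts above do not spell out the detection algorithm, so the sketch below is only a generic illustration of single-image streak detection — threshold, label connected components, keep long and highly elongated ones — with hypothetical parameter values; it is not the StreakDet pipeline itself.

    ```python
    import numpy as np
    from skimage import measure

    def detect_streak_candidates(image, k_sigma=3.0, min_length=100.0,
                                 min_eccentricity=0.98):
        """Generic single-image streak finder (illustration only).

        Pixels brighter than mean + k_sigma*std are labelled, and connected
        components that are both long and highly elongated are returned as
        streak candidates.  A production pipeline adds matched filtering,
        segment linking across gaps, centroiding, and astrometric reduction.
        """
        mask = image > image.mean() + k_sigma * image.std()
        labels = measure.label(mask)
        candidates = []
        for region in measure.regionprops(labels):
            if (region.major_axis_length >= min_length
                    and region.eccentricity >= min_eccentricity):
                candidates.append({
                    "centroid_rc": region.centroid,          # (row, col)
                    "orientation_rad": region.orientation,
                    "length_px": region.major_axis_length,
                })
        return candidates
    ```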

  7. Simultaneous streak and frame interferometry for electron density measurements of laser produced plasmas

    Energy Technology Data Exchange (ETDEWEB)

    Quevedo, H. J., E-mail: hjquevedo@utexas.edu; McCormick, M.; Wisher, M.; Bengtson, Roger D.; Ditmire, T. [Center for High Energy Density Science, Department of Physics, University of Texas at Austin, Austin, Texas 78712 (United States)

    2016-01-15

    A system of two collinear probe beams with different wavelengths and pulse durations was used to capture simultaneously snapshot interferograms and streaked interferograms of laser produced plasmas. The snapshots measured the two dimensional, path-integrated, electron density on a charge-coupled device while the radial temporal evolution of a one dimensional plasma slice was recorded by a streak camera. This dual-probe combination allowed us to select plasmas that were uniform and axisymmetric along the laser direction suitable for retrieving the continuous evolution of the radial electron density of homogeneous plasmas. Demonstration of this double probe system was done by measuring rapidly evolving plasmas on time scales less than 1 ns produced by the interaction of femtosecond, high intensity, laser pulses with argon gas clusters. Experiments aimed at studying homogeneous plasmas from high intensity laser-gas or laser-cluster interaction could benefit from the use of this probing scheme.

  8. Nonlinear streak computation using boundary region equations

    Energy Technology Data Exchange (ETDEWEB)

    Martin, J A; Martel, C, E-mail: juanangel.martin@upm.es, E-mail: carlos.martel@upm.es [Depto. de Fundamentos Matematicos, E.T.S.I Aeronauticos, Universidad Politecnica de Madrid, Plaza Cardenal Cisneros 3, 28040 Madrid (Spain)

    2012-08-01

    The boundary region equations (BREs) are applied for the simulation of the nonlinear evolution of a spanwise periodic array of streaks in a flat plate boundary layer. The well-known BRE formulation is obtained from the complete Navier-Stokes equations in the high Reynolds number limit, and provides the correct asymptotic description of three-dimensional boundary layer streaks. In this paper, a fast and robust streamwise marching scheme is introduced to perform their numerical integration. Typical streak computations present in the literature correspond to linear streaks or to small-amplitude nonlinear streaks computed using direct numerical simulation (DNS) or the nonlinear parabolized stability equations (PSEs). We use the BREs to numerically compute high-amplitude streaks, a method which requires much lower computational effort than DNS and does not have the consistency and convergence problems of the PSE. It is found that the flow configuration changes substantially as the amplitude of the streaks grows and the nonlinear effects come into play. The transversal motion (in the wall normal-streamwise plane) becomes more important and strongly distorts the streamwise velocity profiles, which end up being quite different from those of the linear case. We analyze in detail the resulting flow patterns for the nonlinearly saturated streaks and compare them with available experimental results. (paper)

  9. 100ps UV/x-ray framing camera

    International Nuclear Information System (INIS)

    Eagles, R.T.; Freeman, N.J.; Allison, J.M.; Sibbett, W.; Sleat, W.E.; Walker, D.R.

    1988-01-01

    The requirement for a sensitive two-dimensional imaging diagnostic with picosecond time resolution, particularly in the study of laser-produced plasmas, has previously been discussed. A temporal sequence of framed images would provide useful supplementary information to that provided by time resolved streak images across a spectral region of interest from visible to x-ray. To fulfill this requirement the Picoframe camera system has been developed. Results pertaining to the operation of a camera having S20 photocathode sensitivity are reviewed and the characteristics of an UV/x-ray sensitive version of the Picoframe system are presented

  10. Ultra-fast framing camera tube

    Science.gov (United States)

    Kalibjian, Ralph

    1981-01-01

    An electronic framing camera tube features focal plane image dissection and synchronized restoration of the dissected electron line images to form two-dimensional framed images. Ultra-fast framing is performed by first streaking a two-dimensional electron image across a narrow slit, thereby dissecting the two-dimensional electron image into sequential electron line images. The dissected electron line images are then restored into a framed image by a restorer deflector operated synchronously with the dissector deflector. The number of framed images on the tube's viewing screen is equal to the number of dissecting slits in the tube. The distinguishing features of this ultra-fast framing camera tube are the focal plane dissecting slits, and the synchronously-operated restorer deflector which restores the dissected electron line images into a two-dimensional framed image. The framing camera tube can produce image frames having high spatial resolution of optical events in the sub-100 picosecond range.

  11. Transition due to streamwise streaks in a supersonic flat plate boundary layer

    Science.gov (United States)

    Paredes, Pedro; Choudhari, Meelan M.; Li, Fei

    2016-12-01

    Transition induced by stationary streaks undergoing transient growth in a supersonic flat plate boundary layer flow is studied using numerical computations. While the possibility of strong transient growth of small-amplitude stationary perturbations in supersonic boundary layer flows has been demonstrated in previous works, its relation to laminar-turbulent transition cannot be established within the framework of linear disturbances. Therefore, this paper investigates the nonlinear evolution of initially linear optimal disturbances that evolve into finite amplitude streaks in the downstream region, and then studies the modal instability of those streaks as a likely cause for the onset of bypass transition. The nonmodal evolution of linearly optimal stationary perturbations in a supersonic, Mach 3 flat plate boundary layer is computed via the nonlinear plane-marching parabolized stability equations (PSE) for stationary perturbations, or equivalently, the perturbation form of parabolized Navier-Stokes equations. To assess the effect of the nonlinear finite-amplitude streaks on transition, the linear form of plane-marching PSE is used to investigate the instability of the boundary layer flow modified by the spanwise periodic streaks. The onset of transition is estimated using an N-factor criterion based on modal amplification of the secondary instabilities of the streaks. In the absence of transient growth disturbances, first mode instabilities in a Mach 3, zero pressure gradient boundary layer reach N = 10 at Re_x ≈ 10⁷. However, secondary instability modes of the stationary streaks undergoing transient growth are able to achieve the same N-factor at Re_x < 2 × 10⁶ when the initial streak amplitude is sufficiently large. In contrast to the streak instabilities in incompressible flows, subharmonic instability modes with twice the fundamental spanwise wavelength of the streaks are found to have higher amplification ratios than the streak instabilities at fundamental
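
    The N-factor criterion used above is the standard e^N transition correlation; in the usual notation of stability theory (not quoted from this paper) it reads:

    ```latex
    % e^N transition criterion: integrated modal amplification
    N(x) = \ln\frac{A(x)}{A_0} = \int_{x_0}^{x} \sigma(x')\,\mathrm{d}x',
    \qquad \text{transition assumed where } N \ge N_{\mathrm{crit}}
    ```

    Here A is the disturbance amplitude, A_0 its value at the neutral point x_0, and σ = -α_i the local spatial growth rate; the comparison above uses N_crit = 10 for both the first-mode and the streak secondary-instability routes.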

  12. Movement-based Interaction in Camera Spaces

    DEFF Research Database (Denmark)

    Eriksson, Eva; Riisgaard Hansen, Thomas; Lykke-Olesen, Andreas

    2006-01-01

    In this paper we present three concepts that address movement-based interaction using camera tracking. Based on our work with several movement-based projects, we present four selected applications and use these applications to leverage our discussion and to describe our three main concepts: space, relations, and feedback. We see these as central for describing and analysing movement-based systems using camera tracking, and we show how these three concepts can be used to analyse other camera tracking applications.

  13. MULTIMODAL IMAGING OF ANGIOID STREAKS ASSOCIATED WITH TURNER SYNDROME.

    Science.gov (United States)

    Chiu, Bing Q; Tsui, Edmund; Hussnain, Syed Amal; Barbazetto, Irene A; Smith, R Theodore

    2018-02-13

    To report multimodal imaging in a novel case of angioid streaks in a patient with Turner syndrome with 10-year follow-up. Case report of a patient with Turner syndrome and angioid streaks followed at Bellevue Hospital Eye Clinic from 2007 to 2017. Fundus photography, fluorescein angiography, and optical coherence tomography angiography were obtained. Angioid streaks with choroidal neovascularization were noted in this patient with Turner syndrome without other systemic conditions previously correlated with angioid streaks. We report a case of angioid streaks with choroidal neovascularization in a patient with Turner syndrome. We demonstrate that angioid streaks, previously associated with pseudoxanthoma elasticum, Ehlers-Danlos syndrome, Paget disease of bone, and hemoglobinopathies, may also be associated with Turner syndrome, and may continue to develop choroidal neovascularization, suggesting the need for careful ophthalmic examination in these patients.

  14. A numerical algorithm to evaluate the transient response for a synchronous scanning streak camera using a time-domain Baum–Liu–Tesche equation

    Energy Technology Data Exchange (ETDEWEB)

    Pei, Chengquan [Key Laboratory for Physical Electronics and Devices of the Ministry of Education, Xi' an Jiaotong University, Xi’an 710049 (China); Tian, Jinshou [Xi' an Institute of Optics and Precision Mechanics, Chinese Academy of Sciences, Xi' an 710119 (China); Wu, Shengli, E-mail: slwu@mail.xjtu.edu.cn [Key Laboratory for Physical Electronics and Devices of the Ministry of Education, Xi' an Jiaotong University, Xi’an 710049 (China); He, Jiai [School of Computer and Communication, Lanzhou University of Technology, Lanzhou, Gansu 730050 (China); Liu, Zhen [Key Laboratory for Physical Electronics and Devices of the Ministry of Education, Xi' an Jiaotong University, Xi’an 710049 (China)

    2016-10-01

    The transient response is of great influence on the electromagnetic compatibility of synchronous scanning streak cameras (SSSCs). In this paper we propose a numerical method to evaluate the transient response of the scanning deflection plate (SDP). First, we created a simplified circuit model for the SDP used in an SSSC, and then derived the Baum–Liu–Tesche (BLT) equation in the frequency domain. From the frequency-domain BLT equation, its transient counterpart was derived. These parameters, together with the transient-BLT equation, were used to compute the transient load voltage and load current, and then a novel numerical method to fulfill the continuity equation was used. Several numerical simulations were conducted to verify this proposed method. The computed results were then compared with transient responses obtained by a frequency-domain/fast Fourier transform (FFT) method, and the accordance was excellent for highly conducting cables. The benefit of deriving the BLT equation in the time domain is that it may be used with slight modifications to calculate the transient response and the error can be controlled by a computer program. The result showed that the transient voltage was up to 1000 V and the transient current was approximately 10 A, so some protective measures should be taken to improve the electromagnetic compatibility.

  15. A numerical algorithm to evaluate the transient response for a synchronous scanning streak camera using a time-domain Baum–Liu–Tesche equation

    International Nuclear Information System (INIS)

    Pei, Chengquan; Tian, Jinshou; Wu, Shengli; He, Jiai; Liu, Zhen

    2016-01-01

    The transient response is of great influence on the electromagnetic compatibility of synchronous scanning streak cameras (SSSCs). In this paper we propose a numerical method to evaluate the transient response of the scanning deflection plate (SDP). First, we created a simplified circuit model for the SDP used in an SSSC, and then derived the Baum–Liu–Tesche (BLT) equation in the frequency domain. From the frequency-domain BLT equation, its transient counterpart was derived. These parameters, together with the transient-BLT equation, were used to compute the transient load voltage and load current, and then a novel numerical method to fulfill the continuity equation was used. Several numerical simulations were conducted to verify this proposed method. The computed results were then compared with transient responses obtained by a frequency-domain/fast Fourier transform (FFT) method, and the accordance was excellent for highly conducting cables. The benefit of deriving the BLT equation in the time domain is that it may be used with slight modifications to calculate the transient response and the error can be controlled by a computer program. The result showed that the transient voltage was up to 1000 V and the transient current was approximately 10 A, so some protective measures should be taken to improve the electromagnetic compatibility.
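
    The frequency-domain/FFT reference used in the comparison above follows a generic recipe: sample the load transfer function on a frequency grid, multiply by the excitation spectrum, and inverse-transform. The sketch below shows that recipe with a made-up first-order transfer function; it is not the BLT network of the scanning deflection plate.

    ```python
    import numpy as np

    def transient_via_fft(excitation, dt, transfer_fn):
        """Frequency-domain/FFT route to a transient load response.

        excitation  : real time-domain source waveform sampled every dt seconds
        transfer_fn : callable H(f) giving the load response at frequency f [Hz]
        Returns the load waveform on the same time grid.
        """
        n = excitation.size
        freqs = np.fft.rfftfreq(n, d=dt)
        spectrum = np.fft.rfft(excitation)
        return np.fft.irfft(spectrum * transfer_fn(freqs), n=n)

    # Illustrative use with a hypothetical first-order low-pass "network"
    dt = 1e-10                                      # 0.1 ns sampling (assumed)
    t = np.arange(4096) * dt
    source = np.exp(-(((t - 50e-9) / 5e-9) ** 2))   # made-up Gaussian drive pulse
    H = lambda f: 1.0 / (1.0 + 1j * f / 1.0e8)      # hypothetical transfer function
    load_voltage = transient_via_fft(source, dt, H)
    ```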

  16. The origin and structure of streak-like instabilities in laminar boundary layer flames

    Science.gov (United States)

    Gollner, Michael; Miller, Colin; Tang, Wei; Finney, Mark

    2017-11-01

    Streamwise streaks are consistently observed in wildland fires, at the base of pool fires, and in other heated flows within a boundary layer. This study examines both the origin of these structures and their role in influencing some of the macroscopic properties of the flow. Streaks were reproduced and characterized via experiments on stationary heated strips and liquid and gas-fueled burners in laminar boundary layer flows, providing a framework to develop theory based on both observed and measured physical phenomena. The incoming boundary layer was established as the controlling mechanism in forming streaks, which are generated by pre-existing coherent structures, while the amplification of streaks was determined to be compatible with quadratic growth of Rayleigh-Taylor Instabilities, providing credence to the idea that the downstream growth of streaks is strongly tied to buoyancy. These local instabilities were also found to affect macroscopic properties of the flow, including heat transfer to the surface, indicating that a two-dimensional assumption may fail to adequately describe heat and mass transfer during flame spread and other reacting boundary layer flows. This work was supported by NSF (CBET-1554026) and the USDA-FS (13-CS-11221637-124).

  17. StreakDet data processing and analysis pipeline for space debris optical observations

    Science.gov (United States)

    Virtanen, Jenni; Flohrer, Tim; Muinonen, Karri; Granvik, Mikael; Torppa, Johanna; Poikonen, Jonne; Lehti, Jussi; Santti, Tero; Komulainen, Tuomo; Naranen, Jyri

    We describe a novel data processing and analysis pipeline for optical observations of space debris. The monitoring of space object populations requires reliable acquisition of observational data, to support the development and validation of space debris environment models, the build-up and maintenance of a catalogue of orbital elements. In addition, data is needed for the assessment of conjunction events and for the support of contingency situations or launches. The currently available, mature image processing algorithms for detection and astrometric reduction of optical data cover objects that cross the sensor field-of-view comparably slowly, and within a rather narrow, predefined range of angular velocities. By applying specific tracking techniques, the objects appear point-like or as short trails in the exposures. However, the general survey scenario is always a “track before detect” problem, resulting in streaks, i.e., object trails of arbitrary lengths, in the images. The scope of the ESA-funded StreakDet (Streak detection and astrometric reduction) project is to investigate solutions for detecting and reducing streaks from optical images, particularly in the low signal-to-noise ratio (SNR) domain, where algorithms are not readily available yet. For long streaks, the challenge is to extract precise position information and related registered epochs with sufficient precision. Although some considerations for low-SNR processing of streak-like features are available in the current image processing and computer vision literature, there is a need to discuss and compare these approaches for space debris analysis, in order to develop and evaluate prototype implementations. In the StreakDet project, we develop algorithms applicable to single images (as compared to consecutive frames of the same field) obtained with any observing scenario, including space-based surveys and both low- and high-altitude populations. The proposed processing pipeline starts from the

  18. Homography-based multiple-camera person-tracking

    Science.gov (United States)

    Turk, Matthew R.

    2009-01-01

    Multiple video cameras are cheaply installed overlooking an area of interest. While computerized single-camera tracking is well-developed, multiple-camera tracking is a relatively new problem. The main multi-camera problem is to give the same tracking label to all projections of a real-world target. This is called the consistent labelling problem. Khan and Shah (2003) introduced a method to use field of view lines to perform multiple-camera tracking. The method creates inter-camera meta-target associations when objects enter at the scene edges. They also said that a plane-induced homography could be used for tracking, but this method was not well described. Their homography-based system would not work if targets use only one side of a camera to enter the scene. This paper overcomes this limitation and fully describes a practical homography-based tracker. A new method to find the feet feature is introduced. The method works especially well if the camera is tilted, when using the bottom centre of the target's bounding-box would produce inaccurate results. The new method is more accurate than the bounding-box method even when the camera is not tilted. Next, a method is presented that uses a series of corresponding point pairs "dropped" by oblivious, live human targets to find a plane-induced homography. The point pairs are created by tracking the feet locations of moving targets that were associated using the field of view line method. Finally, a homography-based multiple-camera tracking algorithm is introduced. Rules governing when to create the homography are specified. The algorithm ensures that homography-based tracking only starts after a non-degenerate homography is found. The method works when not all four field of view lines are discoverable; only one line needs to be found to use the algorithm. To initialize the system, the operator must specify pairs of overlapping cameras. Aside from that, the algorithm is fully automatic and uses the natural movement of
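
    A minimal sketch of the homography step is given below, assuming OpenCV is available; the feet correspondences are the "dropped" point pairs described in the abstract, and the pixel threshold for consistent labelling is an illustrative choice, not a value from the paper.

    ```python
    import numpy as np
    import cv2

    def fit_ground_homography(feet_a, feet_b):
        """Estimate the ground-plane homography from corresponding feet points.

        feet_a, feet_b : (N, 2) arrays of feet locations of the same targets seen
        simultaneously in camera A and camera B.  RANSAC rejects bad pairs.
        """
        H, _ = cv2.findHomography(
            np.asarray(feet_a, dtype=np.float32),
            np.asarray(feet_b, dtype=np.float32),
            method=cv2.RANSAC,
            ransacReprojThreshold=3.0,
        )
        return H

    def project_feet(H, point_a):
        """Map a feet point from camera A into camera B via the homography."""
        q = H @ np.array([point_a[0], point_a[1], 1.0])
        return q[:2] / q[2]

    # Consistent labelling (illustrative rule): a track in camera B whose feet
    # point lies within a few pixels of project_feet(H, feet_in_A) receives the
    # same label as the corresponding track in camera A.
    ```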

  19. Streaking tremor in Cascadia

    Science.gov (United States)

    Vidale, J. E.; Ghosh, A.; Sweet, J. R.; Creager, K. C.; Wech, A.; Houston, H.

    2009-12-01

    Details of tremor deep in subduction zones are damnably difficult to glimpse because of the lack of crisp initial arrivals, low waveform coherence, uncertain focal mechanisms, and the probability of simultaneous activity across extended regions. Yet such details hold out the best hope to illuminate the unknown mechanisms underlying episodic tremor and slip. Attacking this problem with brute force, we pointed a small, very dense seismic array down at the migration path of a good-sized episodic tremor and slip (ETS) event. In detail, it was an 84-element, 1300-m-aperture temporary seismic array in northern Washington, and the migration path of the May 2008 ETS event was 30-40 km directly underneath. Our beamforming technique tracked the time, incident angle, and azimuth of tremor radiation in unprecedented detail. We located the tremor by assuming it occurs on the subduction interface, estimated relative tremor moment released by each detected tremor window, and mapped it on the interface [Ghosh et al., GRL, 2009]. Fortunately for our ability to image it, the tremor generally appears to emanate from small regions, and we were surprised by how steadily the regions migrated with time. For the first time in Cascadia, we found convergence-parallel transient streaks of tremor migrating at velocities of several tens of km/hr, with movement in both up- and down-dip directions. Similar patterns have been seen in Japan [Shelly, G3, 2007]. This is in contrast to the long-term along-strike marching of tremor at 10 km/day. These streaks tend to propagate steadily and often repeat the same track on the interface multiple times. They light up persistent moment patches on the interface by a combination of increased amplitude and longer residence time within the patches. The up- and down-dip migration dominates the 2 days of tremor most clearly imaged by our array. The tendency of the streaks to fill in bands is the subject of the presentation of Ghosh et al. here. The physical

  20. A compact large-format streak tube for imaging lidar

    Science.gov (United States)

    Hui, Dandan; Luo, Duan; Tian, Liping; Lu, Yu; Chen, Ping; Wang, Junfeng; Sai, Xiaofeng; Wen, Wenlong; Wang, Xing; Xin, Liwei; Zhao, Wei; Tian, Jinshou

    2018-04-01

    The streak tubes with a large effective photocathode area, large effective phosphor screen area, and high photocathode radiant sensitivity are essential for improving the field of view, depth of field, and detectable range of the multiple-slit streak tube imaging lidar. In this paper, a high spatial resolution, large photocathode area, and compact meshless streak tube with a spherically curved cathode and screen is designed and tested. Its spatial resolution reaches 20 lp/mm over the entire Φ28 mm photocathode working area, and the simulated physical temporal resolution is better than 30 ps. The temporal distortion in our large-format streak tube, which is shown to be a non-negligible factor, has a minimum value as the radius of curvature of the photocathode varies. Furthermore, the photocathode radiant sensitivity and radiant power gain reach 41 mA/W and 18.4 at the wavelength of 550 nm, respectively. Most importantly, the external dimensions of our streak tube are no more than Φ60 mm × 110 mm.

  1. Betting Decision Under Break-Streak Pattern: Evidence from Casino Gaming.

    Science.gov (United States)

    Fong, Lawrence Hoc Nang; So, Amy Siu Ian; Law, Rob

    2016-03-01

    Cognitive bias is prevalent among gamblers, especially those with gambling problems. Grounded in the heuristics theories, this study contributes to the literature by examining a cognitive bias triggered by the break streak pattern in the casino setting. We postulate that gamblers tend to bet on the latest outcome when there is a break-streak pattern. Moreover, three determinants of the betting decision under break-streak pattern, including the streak length of the alternative outcome, the frequency of the latest outcome, and gender, were identified and examined in this study. A non-participatory observational study was conducted among the Cussec gamblers in a casino in Macao. An analysis of 1229 bets confirms our postulation, particularly when the streak of the alternative outcome is long, the latest outcome is frequent, and the gamblers are females. The findings provide meaningful implications for casino management and public policymakers regarding the minimization of gambling harm.

  2. Video Sharing System Based on Wi-Fi Camera

    OpenAIRE

    Qidi Lin; Hewei Yu; Jinbin Huang; Weile Liang

    2015-01-01

    This paper introduces a video sharing platform based on Wi-Fi, which consists of a camera, a mobile phone, and a PC server. The platform receives the wireless signal from the camera and shows the live video captured by the camera on the mobile phone. In addition, it is able to send commands to the camera and make the camera's holder rotate. The platform can be applied to interactive teaching, monitoring of dangerous areas, and so on. Testing results show that the platform can share ...

  3. Orientation tuning of contrast masking caused by motion streaks.

    Science.gov (United States)

    Apthorp, Deborah; Cass, John; Alais, David

    2010-08-01

    We investigated whether the oriented trails of blur left by fast-moving dots (i.e., "motion streaks") effectively mask grating targets. Using a classic overlay masking paradigm, we varied mask contrast and target orientation to reveal underlying tuning. Fast-moving Gaussian blob arrays elevated thresholds for detection of static gratings, both monoptically and dichoptically. Monoptic masking at high mask (i.e., streak) contrasts is tuned for orientation and exhibits a similar bandwidth to masking functions obtained with grating stimuli (∼30 degrees). Dichoptic masking fails to show reliable orientation-tuned masking, but dichoptic masks at very low contrast produce a narrowly tuned facilitation (∼17 degrees). For iso-oriented streak masks and grating targets, we also explored masking as a function of mask contrast. Interestingly, dichoptic masking shows a classic "dipper"-like TVC function, whereas monoptic masking shows no dip and a steeper "handle". There is a very strong unoriented component to the masking, which we attribute to transiently biased temporal frequency masking. Fourier analysis of "motion streak" images shows interesting differences between dichoptic and monoptic functions and the information in the stimulus. Our data add weight to the growing body of evidence that the oriented blur of motion streaks contributes to the processing of fast motion signals.

  4. Angioid streaks, clinical course, complications, and current therapeutic management

    Directory of Open Access Journals (Sweden)

    Ilias Georgalas

    2008-12-01

    Angioid streaks are visible irregular crack-like dehiscences in Bruch’s membrane that are associated with atrophic degeneration of the overlying retinal pigmented epithelium. Angioid streaks may be associated with pseudoxanthoma elasticum, Paget’s disease, sickle-cell anemia, acromegaly, Ehlers–Danlos syndrome, and diabetes mellitus, but also appear in patients without any systemic disease. Patients with angioid streaks are generally asymptomatic, unless the lesions extend towards the foveola or develop complications such as traumatic Bruch’s membrane rupture or macular choroidal neovascularization (CNV). The visual prognosis in patients with CNV secondary to angioid streaks, if untreated, is poor, and most treatment modalities, until recently, have failed to limit the devastating impact of CNV on central vision. However, treatment with anti-vascular endothelial growth factor agents, especially in treatment-naive eyes, is likely to yield favorable results in the future, and this has to be investigated in future studies. Keywords: angioid streaks, pseudoxanthoma elasticum, choroidal neovascularization

  5. Earth aeolian wind streaks: Comparison to wind data from model and stations

    Science.gov (United States)

    Cohen-Zada, A. L.; Maman, S.; Blumberg, D. G.

    2017-05-01

    Wind streak is a collective term for a variety of aeolian features that display distinctive albedo surface patterns. Wind streaks have been used to map near-surface winds and to estimate atmospheric circulation patterns on Mars and Venus. However, because wind streaks have been studied mostly on Mars and Venus, much of the knowledge regarding the mechanism and time frame of their formation and their relationship to the atmospheric circulation cannot be verified. This study aims to validate previous studies' results by a comparison of real and modeled wind data with wind streak orientations as measured from remote-sensing images. Orientations of Earth wind streaks were statistically correlated to resultant drift direction (RDD) values calculated from reanalysis and wind data from 621 weather stations. The results showed good agreement between wind streak orientations and reanalysis RDD (r = 0.78). A moderate correlation was found between the wind streak orientations and the weather station data (r = 0.47); a similar trend was revealed on a regional scale when the analysis was performed by continent, with r ranging from 0.641 in North America to 0.922 in Antarctica. At sites where wind streak orientations did not correspond to the RDDs (i.e., a difference of 45°), seasonal and diurnal variations in the wind flow were found to be responsible for deviation from the global pattern. The study thus confirms that Earth wind streaks were formed by the present wind regime and they are indeed indicative of the long-term prevailing wind direction on global and regional scales.
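
    As a rough illustration of how a resultant drift direction (RDD) can be computed from station or reanalysis winds, the sketch below uses a Fryberger-style weighting of above-threshold winds; the threshold value and weighting form are assumptions, not parameters reported in this study.

    ```python
    import numpy as np

    def resultant_drift_direction(speeds, directions_deg, threshold=6.0):
        """Resultant drift direction (bearing in degrees) from a wind record.

        Each above-threshold observation contributes a sand-drift vector that
        points downwind and is weighted by s**2 * (s - threshold), a simple
        Fryberger-style drift-potential factor.  Directions are meteorological
        "from" bearings in degrees; the RDD is the bearing of the vector sum.
        """
        s = np.asarray(speeds, dtype=float)
        d = np.radians(np.asarray(directions_deg, dtype=float))
        w = np.where(s > threshold, s ** 2 * (s - threshold), 0.0)
        east = np.sum(w * np.sin(d + np.pi))    # drift points downwind
        north = np.sum(w * np.cos(d + np.pi))
        return (np.degrees(np.arctan2(east, north)) + 360.0) % 360.0
    ```

    Correlating such RDDs with measured streak orientations then has to respect the circular nature of both quantities, for example by comparing angular differences modulo 360°.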

  6. Approaches to diagnosis and detection of cassava brown streak ...

    African Journals Online (AJOL)

    Cassava brown streak disease (CBSD) has been a problem in the East African coastal cassava growing areas for more than 70 years. The disease is caused by successful infection with Cassava Brown Streak Virus (CBSV) (Family, Potyviridae: Genus, Ipomovirus). Diagnosis of CBSD has for long been primarily leaf ...

  7. Simulation-based camera navigation training in laparoscopy-a randomized trial

    DEFF Research Database (Denmark)

    Nilsson, Cecilia; Sørensen, Jette Led; Konge, Lars

    2017-01-01

    patient safety. The objectives of this trial were to examine how to train laparoscopic camera navigation and to explore the transfer of skills to the operating room. MATERIALS AND METHODS: A randomized, single-center superiority trial with three groups: The first group practiced simulation-based camera...... navigation tasks (camera group), the second group practiced performing a simulation-based cholecystectomy (procedure group), and the third group received no training (control group). Participants were surgical novices without prior laparoscopic experience. The primary outcome was assessment of camera.......033), had a higher score. CONCLUSIONS: Simulation-based training improves the technical skills required for camera navigation, regardless of practicing camera navigation or the procedure itself. Transfer to the clinical setting could, however, not be demonstrated. The control group demonstrated higher...

  8. Postprocessing method to clean up streaks due to noisy detectors

    International Nuclear Information System (INIS)

    Tuy, H.K.; Mattson, R.A.

    1990-01-01

    This paper reports that occasionally, one of the thousands of detectors in a CT scanner will intermittently produce erroneous data, creating streaks in the reconstructed image. The authors propose a method to identify and clean up the streaks automatically. To find the rays along which the data values are bad, a binary image registering the edges of the original image is created. Forward projection is applied to the binary image to single out edges along rays. Data along views containing the identified bad rays are estimated by means of forward projecting the original image. Back projection of the negative of the estimated convolved data along these views onto the streaky image will remove streaks from the image. Image enhancement is achieved by means of back projecting the convolved data estimated from the image after the streak removal along views of bad rays
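
    The forward-projection/back-projection idea can be illustrated with standard parallel-beam Radon tools; the sketch below simply re-synthesizes the corrupted views from the reconstructed image and reconstructs again, a simplified stand-in for the scanner-specific correction described above.

    ```python
    import numpy as np
    from skimage.transform import radon, iradon

    def replace_bad_views(image, sinogram, bad_view_indices, theta):
        """Toy illustration of streak clean-up for intermittently bad detectors.

        The reconstructed (streaky) image is forward-projected to estimate what
        the measurements along the corrupted views should have been; those views
        are replaced in the sinogram and the image is reconstructed again.
        Assumes `sinogram` uses the same parallel-beam geometry and `theta` grid.
        """
        estimate = radon(image, theta=theta, circle=False)
        cleaned = sinogram.copy()
        cleaned[:, bad_view_indices] = estimate[:, bad_view_indices]
        return iradon(cleaned, theta=theta, circle=False)
    ```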

  9. Novel computer-based endoscopic camera

    Science.gov (United States)

    Rabinovitz, R.; Hai, N.; Abraham, Martin D.; Adler, Doron; Nissani, M.; Fridental, Ron; Vitsnudel, Ilia

    1995-05-01

    We have introduced a computer-based endoscopic camera which includes (a) unique real-time digital image processing to optimize image visualization by reducing overexposed, glared areas and brightening dark areas, and by accentuating sharpness and fine structures, and (b) patient data documentation and management. The image processing is based on i Sight's iSP1000™ digital video processor chip and Adaptive Sensitivity™ patented scheme for capturing and displaying images with a wide dynamic range of light, taking into account local neighborhood image conditions and global image statistics. It provides the medical user with the ability to view images under difficult lighting conditions, without losing details "in the dark" or in completely saturated areas. The patient data documentation and management allows storage of images (approximately 1 MB per image for a full 24-bit color image) to any storage device installed in the camera, or to external host media via a network. The patient data included with every image describe essential information on the patient and procedure. The operator can assign custom data descriptors, and can search for the stored image/data by typing any image descriptor. The camera optics has an extended zoom range of f = 20-45 mm, allowing control of the diameter of the field displayed on the monitor such that the complete field of view of the endoscope can be shown over the full area of the screen. All these features provide a versatile endoscopic camera with excellent image quality and documentation capabilities.

  10. Control of Vascular Streak Dieback Disease of Cocoa with Flutriafol Fungicides

    Directory of Open Access Journals (Sweden)

    Febrilia Nur'aini

    2014-12-01

    Vascular streak dieback, caused by the fungus Oncobasidium theobromae, is one of the important diseases of the cocoa crop in Indonesia. One approach to controlling the disease is the use of fungicides. The aim of this research was to determine the effect of triazole-class fungicides on the intensity of vascular streak dieback disease at the cocoa seedling stage and on immature and mature cocoa. Experiments were conducted in Kotta Blater, PTPN XII, and Kaliwining, Indonesian Coffee and Cocoa Research Institute. Flutriafol 250 g/l at concentrations of 0.05%, 0.1%, and 0.15% was foliar-sprayed on cocoa seedlings and on immature and mature cocoa. A combination of the active compounds azoxystrobin and difenoconazole at a 0.1% concentration was used as a comparison fungicide. The results showed that flutriafol at 0.05%, 0.1%, and 0.15% concentrations and azoxystrobin & difenoconazole at a 0.1% concentration could suppress vascular streak dieback disease on seedlings. On immature plants, the application of flutriafol did not effectively suppress the disease, whereas the comparison fungicide suppressed it with an efficacy level of 46.22%. On mature plants, neither fungicide could suppress the disease. Key words: Fungicide, cocoa, vascular streak dieback, triazole, flutriafol, azoxystrobin+difenoconazol

  11. Laparoscopic Removal of Streak Gonads in Turner Syndrome.

    Science.gov (United States)

    Mandelberger, Adrienne; Mathews, Shyama; Andikyan, Vaagn; Chuang, Linus

    To demonstrate the skills necessary for complete resection of bilateral streak gonads in Turner syndrome. Video case presentation with narration highlighting the key techniques used. The video was deemed exempt from formal review by our institutional review board. Turner syndrome is a form of gonadal dysgenesis that affects 1 in 2500 live births. Patients often have streak gonads and may present with primary amenorrhea or premature ovarian failure. Patients with a mosaic karyotype that includes a Y chromosome are at increased risk for gonadoblastoma and subsequent transformation into malignancy. Gonadectomy is recommended for these patients, typically at adolescence. Streak gonads can be difficult to identify, and tissue margins are often in close proximity to critical retroperitoneal structures. Resection can be technically challenging and requires a thorough understanding of retroperitoneal anatomy and precise dissection techniques to ensure complete removal. Laparoscopic approach to bilateral salpingo-oophorectomy of streak gonads. Retroperitoneal dissection and ureterolysis are performed, with the aid of the Ethicon Harmonic Ace, to ensure complete gonadectomy. Careful and complete resection of gonadal tissue in the hands of a skilled laparoscopic surgeon is key for effective cancer risk reduction surgery in Turner syndrome mosaics. Copyright © 2016 AAGL. Published by Elsevier Inc. All rights reserved.

  12. Principal axis-based correspondence between multiple cameras for people tracking.

    Science.gov (United States)

    Hu, Weiming; Hu, Min; Zhou, Xue; Tan, Tieniu; Lou, Jianguang; Maybank, Steve

    2006-04-01

    Visual surveillance using multiple cameras has attracted increasing interest in recent years. Correspondence between multiple cameras is one of the most important and basic problems which visual surveillance using multiple cameras brings. In this paper, we propose a simple and robust method, based on principal axes of people, to match people across multiple cameras. The correspondence likelihood reflecting the similarity of pairs of principal axes of people is constructed according to the relationship between "ground-points" of people detected in each camera view and the intersections of principal axes detected in different camera views and transformed to the same view. Our method has the following desirable properties: 1) Camera calibration is not needed. 2) Accurate motion detection and segmentation are less critical due to the robustness of the principal axis-based feature to noise. 3) Based on the fused data derived from correspondence results, positions of people in each camera view can be accurately located even when the people are partially occluded in all views. The experimental results on several real video sequences from outdoor environments have demonstrated the effectiveness, efficiency, and robustness of our method.
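
    The principal axis of a detected person is straightforward to compute from a foreground mask; the helper below (an illustration, not the authors' code) takes the dominant eigenvector of the pixel-coordinate covariance, which for an upright person runs roughly head to feet and degrades gracefully under noisy segmentation.

    ```python
    import numpy as np

    def principal_axis(foreground_mask):
        """Centroid and unit principal-axis vector of a person blob.

        foreground_mask : 2-D boolean array from motion detection/segmentation.
        The principal axis is the eigenvector of the pixel-coordinate covariance
        matrix with the largest eigenvalue.
        """
        ys, xs = np.nonzero(foreground_mask)
        pts = np.stack([xs, ys], axis=1).astype(float)   # (x, y) pixel coords
        centroid = pts.mean(axis=0)
        cov = np.cov((pts - centroid).T)
        eigvals, eigvecs = np.linalg.eigh(cov)
        axis = eigvecs[:, np.argmax(eigvals)]
        return centroid, axis / np.linalg.norm(axis)
    ```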

  13. Stabilization of the hypersonic boundary layer by finite-amplitude streaks

    Science.gov (United States)

    Ren, Jie; Fu, Song; Hanifi, Ardeshir

    2016-02-01

    Stabilization of two-dimensional disturbances in hypersonic boundary layer flows by finite-amplitude streaks is investigated using nonlinear parabolized stability equations. The boundary-layer flows at Mach numbers 4.5 and 6.0 are studied in which both first and second modes are supported. The streaks considered here are driven either by the so-called optimal perturbations (Klebanoff-type) or the centrifugal instability (Görtler-type). When the streak amplitude is in an appropriate range, i.e., large enough to modulate the laminar boundary layer but low enough to not trigger secondary instability, both first and second modes can effectively be suppressed.

  14. LAMOST CCD camera-control system based on RTS2

    Science.gov (United States)

    Tian, Yuan; Wang, Zheng; Li, Jian; Cao, Zi-Huang; Dai, Wei; Wei, Shou-Lin; Zhao, Yong-Heng

    2018-05-01

    The Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST) is the largest existing spectroscopic survey telescope, having 32 scientific charge-coupled-device (CCD) cameras for acquiring spectra. Stability and automation of the camera-control software are essential, but cannot be provided by the existing system. The Remote Telescope System 2nd Version (RTS2) is an open-source and automatic observatory-control system. However, all previous RTS2 applications were developed for small telescopes. This paper focuses on implementation of an RTS2-based camera-control system for the 32 CCDs of LAMOST. A virtual camera module inherited from the RTS2 camera module is built as a device component working on the RTS2 framework. To improve the controllability and robustness, a virtualized layer is designed using the master-slave software paradigm, and the virtual camera module is mapped to the 32 real cameras of LAMOST. The new system is deployed in the actual environment and experimentally tested. Finally, multiple observations are conducted using this new RTS2-framework-based control system. The new camera-control system is found to satisfy the requirements for automatic camera control in LAMOST. This is the first time that RTS2 has been applied to a large telescope, and provides a referential solution for full RTS2 introduction to the LAMOST observatory control system.
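
    The master-slave virtualization described above can be sketched in a few lines: one virtual device fans each command out to the real camera controllers and aggregates their status. The Python below is only an illustration of that pattern with hypothetical class and method names; the actual RTS2 modules are C++ components with their own interfaces.

```python
from concurrent.futures import ThreadPoolExecutor

class RealCameraClient:
    """Stand-in for one physical CCD controller (hypothetical interface)."""
    def __init__(self, cam_id):
        self.cam_id = cam_id
    def expose(self, seconds):
        # in a real system this would talk to the CCD controller over the network
        return {"cam": self.cam_id, "status": "EXPOSING", "exptime": seconds}

class VirtualCamera:
    """Master device: presents one camera to the framework, drives all slaves."""
    def __init__(self, n_cams=32):
        self.slaves = [RealCameraClient(i) for i in range(n_cams)]
    def expose(self, seconds):
        # fan the command out in parallel and collect per-camera status
        with ThreadPoolExecutor(max_workers=len(self.slaves)) as pool:
            results = list(pool.map(lambda c: c.expose(seconds), self.slaves))
        failed = [r["cam"] for r in results if r["status"] != "EXPOSING"]
        return {"ok": not failed, "failed": failed, "results": results}

if __name__ == "__main__":
    vc = VirtualCamera()
    print(vc.expose(20.0)["ok"])   # True when all 32 slaves accepted the command
```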

  15. Effects of Hot Streak Shape on Rotor Heating in a High-Subsonic Single-Stage Turbine

    Science.gov (United States)

    Dorney, Daniel J.; Gundy-Burlet, Karen L.; Norvig, Peter (Technical Monitor)

    1999-01-01

    Experimental data have shown that combustor temperature non-uniformities can lead to the excessive heating of first-stage rotor blades in turbines. This heating of the rotor blades can lead to thermal fatigue and degrade turbine performance. The results of recent studies have shown that variations in the circumferential location (clocking) of the hot streak relative to the first-stage vane airfoils can be used to minimize the adverse effects of the hot streak. The effects of the hot streak/airfoil count ratio on the heating patterns of turbine airfoils have also been evaluated. In the present investigation, three-dimensional unsteady Navier-Stokes simulations have been performed for a single-stage high-pressure turbine operating in high subsonic flow. In addition to a simulation of the baseline turbine, simulations have been performed for circular and elliptical hot streaks of varying sizes in an effort to represent different combustor designs. The predicted results for the baseline simulation show good agreement with the available experimental data. The results of the hot streak simulations indicate that: a) elliptical hot streaks mix more rapidly than circular hot streaks, b) for small hot streak surface areas the average rotor temperature is not a strong function of hot streak temperature ratio or shape, and c) hot streaks with larger surface areas interact with the secondary flows at the rotor hub endwall, generating an additional high temperature region.

  16. Development of miniaturized proximity focused streak tubes for visible light and x-ray applications. Final report and progress, April-September 1977

    International Nuclear Information System (INIS)

    Cuny, J.J.; Knight, A.J.

    1978-02-01

    Research performed to develop miniaturized proximity focused streak camera tubes (PFST) for application in the visible and the x-ray modes of operation is described. The objective of this research was to provide an engineering design and to fabricate a visible and an x-ray prototype tube to be provided to LASL for test and evaluation. Materials selection and fabrication procedures, particularly the joining of beryllium to a suitable support ring for use as the x-ray window, are described in detail. The visible and x-ray PFSTs were successfully fabricated.

  17. Diving-related visual loss in the setting of angioid streaks: report of two cases.

    Science.gov (United States)

    Angulo Bocco, Maria I; Spielberg, Leigh; Coppens, Greet; Catherine, Janet; Verougstraete, Claire; Leys, Anita M

    2012-01-01

    The purpose of this study was to report diving-related visual loss in the setting of angioid streaks. Observational case reports of two patients with angioid streaks suffering sudden visual loss immediately after diving. Two young adult male patients presented with visual loss after diving headfirst. Funduscopy revealed angioid streaks, peau d'orange, subretinal hemorrhages, and ruptures of Bruch membrane. Choroidal neovascularization developed during follow-up. Both patients had an otherwise uneventful personal and familial medical history. In patients with angioid streaks, diving headfirst can lead to subretinal hemorrhages and traumatic ruptures in Bruch membrane and increase the risk of maculopathy. Ophthalmologists should caution patients with angioid streaks against diving headfirst.

  18. External Mask Based Depth and Light Field Camera

    Science.gov (United States)

    2013-12-08

    Reddy, Dikpal (NVIDIA Research, Santa Clara, CA); Bai, Jiamin (University of California) ... passive depth acquisition technology is illustrated by the emergence of light field camera companies like Lytro [1], Raytrix [2] and Pelican Imaging

  19. Camera Based Navigation System with Augmented Reality

    Directory of Open Access Journals (Sweden)

    M. Marcu

    2012-06-01

    Full Text Available Nowadays smart mobile devices have enough processing power, memory, storage and always-connected wireless communication bandwidth to be available for any type of application. Augmented reality (AR) proposes a new type of application that tries to enhance the real world by superimposing or combining virtual objects or computer-generated information with it. In this paper we present a camera-based navigation system with augmented reality integration. The proposed system aims at the following: the user points the camera of the smartphone towards a point of interest, like a building or any other place, and the application searches for relevant information about that specific place and superimposes the data over the video feed on the display. When the user moves the camera away, changing its orientation, the data changes as well, in real time, with the proper information about the place that is now in the camera view.

  20. Picosecond Streaked K-Shell Spectroscopy of Near Solid-Density Aluminum Plasmas

    Science.gov (United States)

    Stillman, C. R.; Nilson, P. M.; Ivancic, S. T.; Mileham, C.; Froula, D. H.; Golovkin, I. E.

    2016-10-01

    The thermal x-ray emission from rapidly heated solid targets containing a buried-aluminum layer was measured. The targets were driven by high-contrast 1 ω or 2 ω laser pulses at focused intensities up to 1 × 10¹⁹ W/cm². A streaked x-ray spectrometer recorded the Al Heα and lithium-like satellite lines with 2-ps temporal resolution and moderate resolving power (E/ΔE ≈ 700). Time-integrated measurements over the same spectral range were used to correct the streaked data for variations in photocathode sensitivity. Line widths and intensity ratios from the streaked data were interpreted using a collisional radiative atomic model to provide the average plasma conditions in the buried layer as a function of time. It was observed that the resonance line tends toward lower photon energies at high electron densities. The measured shifts will be compared to predicted shifts from Stark-operator calculations at the inferred plasma conditions. This material is based upon work supported by the Department of Energy National Nuclear Security Administration under Award Number DE-NA0001944, the office of Fusion Energy Sciences Award Number DE-SC0012317, and the Stewardship Science Graduate Fellowship Grant Number DE-NA0002135.

  1. 100-ps framing-camera tube

    International Nuclear Information System (INIS)

    Kalibjian, R.

    1978-01-01

    The optoelectronic framing-camera tube described is capable of recording two-dimensional image frames with high spatial resolution in the <100-ps range. Framing is performed by streaking a two-dimensional electron image across narrow slits. The resulting dissected electron line images from the slits are restored into framed images by a restorer deflector operating synchronously with the dissector deflector. The number of framed images on the tube's viewing screen equals the number of dissecting slits in the tube. Performance has been demonstrated in a prototype tube by recording 135-ps-duration framed images of 2.5-mm patterns at the cathode. The limitation in the framing speed is in the external drivers for the deflectors and not in the tube design characteristics. Faster frame speeds in the <100-ps range can be obtained by use of faster deflection drivers

  2. Cassava brown streak disease in Rwanda, the associated viruses and disease phenotypes.

    Science.gov (United States)

    Munganyinka, E; Ateka, E M; Kihurani, A W; Kanyange, M C; Tairo, F; Sseruwagi, P; Ndunguru, J

    2018-02-01

    Cassava brown streak disease (CBSD) was first observed on cassava ( Manihot esculenta ) in Rwanda in 2009. In 2014 eight major cassava-growing districts in the country were surveyed to determine the distribution and variability of symptom phenotypes associated with CBSD, and the genetic diversity of cassava brown streak viruses. Distribution of the CBSD symptom phenotypes and their combinations varied greatly between districts, cultivars and their associated viruses. The symptoms on leaf alone recorded the highest (32.2%) incidence, followed by roots (25.7%), leaf + stem (20.3%), leaf + root (10.4%), leaf + stem + root (5.2%), stem + root (3.7%), and stem (2.5%) symptoms. Analysis by RT-PCR showed that single infections of Ugandan cassava brown streak virus (UCBSV) were most common (74.2% of total infections) and associated with all the seven phenotypes studied. Single infections of Cassava brown streak virus (CBSV) were predominant (15.3% of total infections) in CBSD-affected plants showing symptoms on stems alone. Mixed infections (CBSV + UCBSV) comprised 10.5% of total infections and predominated in the combinations of leaf + stem + root phenotypes. Phylogenetic analysis and the estimates of evolutionary divergence, using partial sequences (210 nt) of the coat protein gene, revealed that in Rwanda there is one type of CBSV and an indication of diverse UCBSV. This study is the first to report the occurrence and distribution of both CBSV and UCBSV based on molecular techniques in Rwanda.

  3. Streaked x-ray spectrometer having a discrete selection of Bragg geometries for Omega

    Energy Technology Data Exchange (ETDEWEB)

    Millecchia, M.; Regan, S. P.; Bahr, R. E.; Romanofsky, M.; Sorce, C. [Laboratory for Laser Energetics, University of Rochester, Rochester, New York 14623-1299 (United States)

    2012-10-15

    The streaked x-ray spectrometer (SXS) is used with streak cameras [D. H. Kalantar, P. M. Bell, R. L. Costa, B. A. Hammel, O. L. Landen, T. J. Orzechowski, J. D. Hares, and A. K. L. Dymoke-Bradshaw, in 22nd International Congress on High-Speed Photography and Photonics, edited by D. L. Paisley and A. M. Frank (SPIE, Bellingham, WA, 1997), Vol. 2869, p. 680] positioned with a ten-inch manipulator on OMEGA [T. R. Boehly et al., Opt. Commun. 133, 495 (1997)] and OMEGA EP [L. J. Waxer et al., Presented at CLEO/QELS 2008, San Jose, CA, 4-9 May 2008 (Paper JThB1)] for time-resolved, x-ray spectroscopy of laser-produced plasmas in the 1.4- to 20-keV photon-energy range. These experiments require measuring a portion of this photon-energy range to monitor a particular emission or absorption feature of interest. The SXS relies on a pinned mechanical reference system to create a discrete set of Bragg reflection geometries for a variety of crystals. A wide selection of spectral windows is achieved accurately and efficiently using this technique. It replaces the previous spectrometer designs that had a continuous Bragg angle adjustment and required a tedious alignment calibration procedure. The number of spectral windows needed for the SXS was determined by studying the spectral ranges selected by OMEGA users over the last decade. These selections are easily configured in the SXS using one of the 25 discrete Bragg reflection geometries and one of the six types of Bragg crystals, including two curved crystals.

  4. X-ray imaging of JET. A design study for a streak camera application

    International Nuclear Information System (INIS)

    Bateman, J.E.; Hobby, M.G.

    1980-03-01

    A single dimensional imaging system is proposed which will image a strip of the JET plasma up to 320 times per shot with a time resolution of better than 50 μs using the bremsstrahlung X-rays. The images are obtained by means of a pinhole camera followed by an X-ray image intensifier system the output of which is in turn digitised by a photodiode array. The information is stored digitally in a fast memory and is immediately available for display or analysis. (author)

  5. A generic model for camera based intelligent road crowd control ...

    African Journals Online (AJOL)

    This research proposes a model for intelligent traffic flow control by implementing a camera-based surveillance and feedback system. A series of cameras is set a minimum of three signals ahead of the target junction. The complete software system is developed to help integrate the multiple cameras on the road as feedback to ...

  6. Spatiotemporal mechanical variation reveals critical role for rho kinase during primitive streak morphogenesis.

    Science.gov (United States)

    Henkels, Julia; Oh, Jaeho; Xu, Wenwei; Owen, Drew; Sulchek, Todd; Zamir, Evan

    2013-02-01

    Large-scale morphogenetic movements during early embryo development are driven by complex changes in biochemical and biophysical factors. Current models for amniote primitive streak morphogenesis and gastrulation take into account numerous genetic pathways but largely ignore the role of mechanical forces. Here, we used atomic force microscopy (AFM) to obtain for the first time precise biomechanical properties of the early avian embryo. Our data reveal that the primitive streak is significantly stiffer than neighboring regions of the epiblast, and that it is stiffer than the pre-primitive streak epiblast. To test our hypothesis that these changes in mechanical properties are due to a localized increase of actomyosin contractility, we inhibited actomyosin contractility via the Rho kinase (ROCK) pathway using the small-molecule inhibitor Y-27632. Our results using several different assays show the following: (1) primitive streak formation was blocked; (2) the time-dependent increase in primitive streak stiffness was abolished; and (3) convergence of epiblast cells to the midline was inhibited. Taken together, our data suggest that actomyosin contractility is necessary for primitive streak morphogenesis, and specifically, ROCK plays a critical role. To better understand the underlying mechanisms of this fundamental process, future models should account for the findings presented in this study.

  7. Whole body scan system based on γ camera

    International Nuclear Information System (INIS)

    Ma Tianyu; Jin Yongjie

    2001-01-01

    Most existing domestic γ cameras cannot perform a whole body scan protocol, which is of important use in the clinic. The authors designed a whole body scan system made up of a scan bed, an ISA interface card controlling the scan bed, and data acquisition software based on a data acquisition and image processing system for γ cameras. Images were obtained in clinical experiments, and the authors consider that they meet the needs of clinical diagnosis. Application of this system to γ cameras can provide whole body scan functionality at low cost.

  8. Electro-optical design of a long slit streak tube

    Science.gov (United States)

    Tian, Liping; Tian, Jinshou; Wen, Wenlong; Chen, Ping; Wang, Xing; Hui, Dandan; Wang, Junfeng

    2017-11-01

    A small-size, long-slit streak tube with high spatial resolution was designed and optimized. A curved photocathode and screen were adopted to increase the photocathode working area and spatial resolution. High physical temporal resolution was obtained by using a slit accelerating electrode. The deflection sensitivity of the streak tube was improved by adopting two-folded deflection plates. The simulations indicate that the photocathode effective working area can reach 30 mm × 5 mm. The static spatial resolution is higher than 40 lp/mm and 12 lp/mm along the scanning and slit directions respectively, while the physical temporal resolution is higher than 60 ps. The magnification is 0.75 and 0.77 in the scanning and slit directions. Also, the deflection sensitivity is as high as 37 mm/kV. The external dimensions of the streak tube are only Ø74 mm × 231 mm. Thus, it can be applied to laser imaging radar systems for large field of view and high range-precision detection.

  9. Slope streaks on Mars: A new “wet” mechanism

    Science.gov (United States)

    Kreslavsky, Mikhail A.; Head, James W.

    2009-06-01

    Slope streaks are one of the most intriguing modern phenomena observed on Mars. They have mostly been interpreted as some specific type of granular flow. We propose another mechanism for slope streak formation on Mars. It involves natural seasonal formation of a modest amount of highly concentrated chloride brines within a seasonal thermal skin, and runaway propagation of percolation fronts. Given the current state of knowledge of temperature regimes and the composition and structure of the surface layer in the slope streak regions, this mechanism is consistent with the observational constraints; it requires the assumption that a significant part of the observed chlorine is in the form of calcium and ferric chloride, and a small part of the observed hydrogen is in the form of water ice. This "wet" mechanism has a number of appealing advantages in comparison to the widely accepted "dry" granular flow mechanism. Potential tests for the "wet" mechanism include better modeling of the temperature regime and observations of the seasonality of streak formation.

  10. Pea Streak Virus Recorded in Europe

    Czech Academy of Sciences Publication Activity Database

    Sarkisova, Tatiana; Bečková, M.; Fránová, Jana; Petrzik, Karel

    2016-01-01

    Roč. 52, č. 3 (2016), s. 164-166 ISSN 1212-2580 R&D Projects: GA MZe QH71145 Institutional support: RVO:60077344 Keywords : Pea streak virus * alfalfa * carlavirus * partial sequence Subject RIV: EE - Microbiology, Virology Impact factor: 0.742, year: 2016

  11. Global Calibration of Multiple Cameras Based on Sphere Targets

    Directory of Open Access Journals (Sweden)

    Junhua Sun

    2016-01-01

    Full Text Available Global calibration methods for multi-camera systems are critical to the accuracy of vision measurement. Proposed in this paper is such a method based on several groups of sphere targets and a precision auxiliary camera. Each camera to be calibrated observes a group of spheres (at least three, while the auxiliary camera observes all the spheres. The global calibration can be achieved after each camera reconstructs the sphere centers in its field of view. In the process of reconstructing a sphere center, a parameter equation is used to describe the sphere projection model. Theoretical analysis and computer simulation are carried out to analyze the factors that affect the calibration accuracy. Simulation results show that the parameter equation can largely improve the reconstruction accuracy. In the experiments, a two-camera system calibrated by our method is used to measure a distance of about 578 mm, and the root mean squared error is within 0.14 mm. Furthermore, the experiments indicate that the method has simple operation and good flexibility, especially for onsite multiple cameras without a common field of view.
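
    Once every camera has reconstructed the centers of the spheres it can see, and the auxiliary camera has reconstructed the same centers, the global calibration reduces to estimating the rigid transform between the two sets of 3D points. A minimal sketch of that step, using the standard SVD-based (Kabsch) solution, is shown below; the sphere-center coordinates are synthetic, and the sketch omits the paper's parameter-equation reconstruction of the centers themselves.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rotation R and translation t with dst ≈ R @ src + t.
    src, dst: (N, 3) arrays of corresponding sphere centers, N >= 3 (non-collinear)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # guard against a reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

# toy check: four sphere centers seen by a camera and by the auxiliary camera
centers_cam = np.array([[0.0, 0.0, 1.0], [0.2, 0.0, 1.1],
                        [0.0, 0.3, 0.9], [0.1, 0.1, 1.3]])
R_true = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
centers_aux = centers_cam @ R_true.T + np.array([0.5, -0.1, 2.0])
R, t = rigid_transform(centers_cam, centers_aux)
print(np.allclose(R, R_true), np.round(t, 3))
```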

  12. A cooperative control algorithm for camera based observational systems.

    Energy Technology Data Exchange (ETDEWEB)

    Young, Joseph G.

    2012-01-01

    Over the last several years, there has been considerable growth in camera based observation systems for a variety of safety, scientific, and recreational applications. In order to improve the effectiveness of these systems, we frequently desire the ability to increase the number of observed objects, but solving this problem is not as simple as adding more cameras. Quite often, there are economic or physical restrictions that prevent us from adding additional cameras to the system. As a result, we require methods that coordinate the tracking of objects between multiple cameras in an optimal way. In order to accomplish this goal, we present a new cooperative control algorithm for a camera based observational system. Specifically, we present a receding horizon control where we model the underlying optimal control problem as a mixed integer linear program. The benefit of this design is that we can coordinate the actions between each camera while simultaneously respecting its kinematics. In addition, we further improve the quality of our solution by coupling our algorithm with a Kalman filter. Through this integration, we not only add a predictive component to our control, but we use the uncertainty estimates provided by the filter to encourage the system to periodically observe any outliers in the observed area. This combined approach allows us to intelligently observe the entire region of interest in an effective and thorough manner.
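
    A much-simplified, single-step version of the idea (not the mixed-integer program of the paper) can be sketched as follows: each tracked object is predicted with a constant-velocity Kalman filter, and cameras are then assigned to objects by a linear assignment that favours nearby and highly uncertain targets, mimicking the periodic re-observation of outliers. The matrices, noise levels and cost weighting below are assumptions made for the illustration.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# constant-velocity Kalman prediction for one target: state [x, y, vx, vy]
F = np.block([[np.eye(2), np.eye(2)], [np.zeros((2, 2)), np.eye(2)]])
Q = 0.1 * np.eye(4)                      # process noise (assumed)

def predict(x, P):
    return F @ x, F @ P @ F.T + Q

def assign_cameras(cam_positions, targets):
    """Assign each camera to one target, favouring close and uncertain targets.
    targets: list of (state, covariance) tuples after the predict step."""
    cost = np.zeros((len(cam_positions), len(targets)))
    for i, cam in enumerate(cam_positions):
        for j, (x, P) in enumerate(targets):
            dist = np.linalg.norm(cam - x[:2])
            uncertainty = np.trace(P[:2, :2])
            cost[i, j] = dist - 2.0 * uncertainty   # reward observing uncertain targets
    rows, cols = linear_sum_assignment(cost)
    return [(int(r), int(c)) for r, c in zip(rows, cols)]

cams = np.array([[0.0, 0.0], [10.0, 0.0]])
t1 = predict(np.array([1.0, 1.0, 0.5, 0.0]), np.eye(4))
t2 = predict(np.array([9.0, 2.0, 0.0, 0.0]), 5.0 * np.eye(4))
print(assign_cameras(cams, [t1, t2]))     # e.g. [(0, 0), (1, 1)]
```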

  13. Multi-camera synchronization core implemented on USB3 based FPGA platform

    Science.gov (United States)

    Sousa, Ricardo M.; Wäny, Martin; Santos, Pedro; Dias, Morgado

    2015-03-01

    Centered on Awaiba's NanEye CMOS image sensor family and an FPGA platform with USB3 interface, the aim of this paper is to demonstrate a new technique to synchronize up to 8 individual self-timed cameras with minimal error. Small form factor self-timed camera modules of 1 mm x 1 mm or smaller do not normally allow external synchronization. However, for stereo vision or 3D reconstruction with multiple cameras, as well as for applications requiring pulsed illumination, it is necessary to synchronize multiple cameras. In this work, the challenge of synchronizing multiple self-timed cameras with only a 4-wire interface has been solved by adaptively regulating the power supply for each of the cameras. To that effect, a control core was created to constantly monitor the operating frequency of each camera by measuring the line period in each frame based on a well-defined sampling signal. The frequency is adjusted by varying the voltage level applied to the sensor based on the error between the measured line period and the desired line period. To ensure phase synchronization between frames, a Master-Slave interface was implemented. A single camera is defined as the Master, with its operating frequency being controlled directly through a PC-based interface. The remaining cameras are set up in Slave mode and are interfaced directly with the Master camera control module. This enables the remaining cameras to monitor the Master's line and frame period and adjust their own to achieve phase and frequency synchronization. The result of this work will allow the implementation of 3D stereo vision equipment smaller than 3 mm in diameter in a medical endoscopic context, such as endoscopic surgical robotics or minimally invasive surgery.
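
    The heart of the synchronization scheme is a feedback loop: measure each self-timed sensor's line period, compare it with the master's, and adjust the sensor's supply voltage until the error vanishes. The sketch below models that loop with a toy sensor whose line period falls linearly with voltage and a simple proportional controller; the voltage-to-period slope, gain and safe voltage range are assumptions, and the real implementation runs in FPGA logic rather than Python.

```python
class SensorModel:
    """Toy self-timed sensor: line period (microseconds) falls as the supply rises."""
    def __init__(self, volts=1.8):
        self.volts = volts
    def line_period_us(self):
        return 40.0 - 10.0 * (self.volts - 1.8)   # assumed, purely illustrative slope

def sync_to_master(sensor, target_period_us, kp=0.02, steps=50):
    """Proportional controller: nudge the supply voltage until the measured line
    period matches the master camera's line period."""
    for _ in range(steps):
        error = sensor.line_period_us() - target_period_us
        sensor.volts += kp * error                       # too slow -> raise the voltage
        sensor.volts = min(max(sensor.volts, 1.6), 2.0)  # stay inside a safe range
    return sensor.line_period_us()

slave = SensorModel(volts=1.75)                          # starts slightly slow
print(round(sync_to_master(slave, target_period_us=40.0), 3))   # converges to ~40.0
```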

  14. Detection, Occurrence, and Survey of Rice Stripe and Black-Streaked Dwarf Diseases in Zhejiang Province, China

    OpenAIRE

    Heng-mu ZHANG; Hua-di WANG; Jian YANG; Michael J ADAMS; Jian-ping CHEN

    2013-01-01

    The major viral diseases that occur on rice plants in Zhejiang Province, eastern China, are stripe and rice black-streaked dwarf diseases. Rice stripe disease is only caused by rice stripe tenuivirus (RSV), while rice black-streaked dwarf disease can be caused by rice black-streaked dwarf fijivirus (RBSDV) and/or southern rice black-streaked dwarf fijivirus (SRBSDV). Here we review the characterization of these viruses, methods for their detection, and extensive surveys showing their occurren...

  15. Novel technique for addressing streak artifact in gated dual-source MDCT angiography utilizing ECG-editing

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Laura T.; Boll, Daniel T. [Duke University Medical Center, Department of Radiology, Box 3808, Durham, NC (United States)

    2008-11-15

    Streak artifact is an important source of image degradation in computed tomographic imaging. In coronary MDCT angiography, streak artifact from pacemaker leads in the SVC can render segments of the right coronary artery uninterpretable. With current technology in clinical practice, there is no effective way to eliminate streak artifact in coronary MDCT angiography entirely. We propose a technique to minimize the impact of streak artifact in retrospectively gated coronary MDCT angiography by utilizing small shifts in the reconstruction window. In our experience, previously degraded portions of the coronary vasculature were able to be well evaluated using this technique. (orig.)

  16. New light field camera based on physical based rendering tracing

    Science.gov (United States)

    Chung, Ming-Han; Chang, Shan-Ching; Lee, Chih-Kung

    2014-03-01

    Even though light field technology was first invented more than 50 years ago, it did not gain popularity due to the limitation imposed by the computation technology. With the rapid advancement of computer technology over the last decade, the limitation has been uplifted and the light field technology quickly returns to the spotlight of the research stage. In this paper, PBRT (Physical Based Rendering Tracing) was introduced to overcome the limitation of using traditional optical simulation approach to study the light field camera technology. More specifically, traditional optical simulation approach can only present light energy distribution but typically lack the capability to present the pictures in realistic scenes. By using PBRT, which was developed to create virtual scenes, 4D light field information was obtained to conduct initial data analysis and calculation. This PBRT approach was also used to explore the light field data calculation potential in creating realistic photos. Furthermore, we integrated the optical experimental measurement results with PBRT in order to place the real measurement results into the virtually created scenes. In other words, our approach provided us with a way to establish a link of virtual scene with the real measurement results. Several images developed based on the above-mentioned approaches were analyzed and discussed to verify the pros and cons of the newly developed PBRT based light field camera technology. It will be shown that this newly developed light field camera approach can circumvent the loss of spatial resolution associated with adopting a micro-lens array in front of the image sensors. Detailed operational constraint, performance metrics, computation resources needed, etc. associated with this newly developed light field camera technique were presented in detail.

  17. The effect of wall temperature distribution on streaks in compressible turbulent boundary layer

    Science.gov (United States)

    Zhang, Zhao; Tao, Yang; Xiong, Neng; Qian, Fengxue

    2018-05-01

    The thermal boundary condition at the wall is very important for compressible flow due to the coupling of the energy equation, and much research on it has been carried out in past decades. In most of these works, the wall was assumed to be an adiabatic or uniform isothermal surface; the flow over a thermal wall with some special temperature distribution has seldom been studied. Lagha studied the effect of a uniform isothermal wall on the streaks, and pointed out that the higher the wall temperature, the longer the streaks (POF, 2011, 23, 015106). In this paper, we impose streamwise stripes of wall temperature on a compressible turbulent boundary layer at Mach 3.0 and study their effect on the streaks by means of direct numerical simulation. The mean wall temperature is approximately equal to that of the adiabatic case, and the width of the temperature stripes is of the same order as the width of the streaks. The streak patterns in the near-wall region with different temperature stripes are shown in the paper. Moreover, we find that there is a reduction of friction velocity with the wall temperature stripes when compared with the adiabatic case.

  18. Nondipole effects in attosecond photoelectron streaking

    DEFF Research Database (Denmark)

    Spiewanowski, Maciek; Madsen, Lars Bojer

    2012-01-01

    The influence of nondipole terms on the time delay in photoionization by an extreme-ultraviolet attosecond pulse in the presence of a near-infrared femtosecond laser pulse from 1s, 2s, and 2p states in hydrogen is investigated. In this attosecond photoelectron streaking process, the relative...

  19. A compact low cost “master–slave” double crystal monochromator for x-ray cameras calibration of the Laser MégaJoule Facility

    Energy Technology Data Exchange (ETDEWEB)

    Hubert, S., E-mail: sebastien.hubert@cea.fr; Prévot, V.

    2014-12-21

    The Alternative Energies and Atomic Energy Commission (CEA-CESTA, France) built a specific double crystal monochromator (DCM) to perform calibration of x-ray cameras (CCD, streak and gated cameras) by means of a multiple anode diode type x-ray source for the MégaJoule Laser Facility. This DCM, based on pantograph geometry, was specifically modeled to respond to relevant engineering constraints and requirements. The major benefits are the mechanical drive of the second crystal by the first one, through a single drive motor, as well as the compactness of the entire device. Designed for flat beryl or Ge crystals, this DCM covers the 0.9–10 keV range of our High Energy X-ray Source. In this paper we present the mechanical design of the DCM, its quantitatively measured features, and its calibration, which finally provides monochromatized spectra displaying spectral purities better than 98%.

  20. A real-time camera calibration system based on OpenCV

    Science.gov (United States)

    Zhang, Hui; Wang, Hua; Guo, Huinan; Ren, Long; Zhou, Zuofeng

    2015-07-01

    Camera calibration is one of the essential steps in computer vision research. This paper describes a real-time OpenCV-based camera calibration system, developed and implemented in the VS2008 environment. Experimental results show that the system achieves simple and fast camera calibration with higher precision than MATLAB, requires no manual intervention, and can be widely used in various computer vision systems.
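
    The abstract does not give implementation details, but a typical OpenCV calibration loop of the kind it refers to looks like the following sketch: detect chessboard corners in a set of images, refine them to sub-pixel accuracy, and pass them to cv2.calibrateCamera. The board size and image folder are placeholders, not values from the paper.

```python
import glob
import cv2
import numpy as np

pattern = (9, 6)                                    # inner corners (assumed board)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points, size = [], [], None
for fname in glob.glob("calib/*.png"):              # placeholder image folder
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001))
        obj_points.append(objp)
        img_points.append(corners)
        size = gray.shape[::-1]

if not obj_points:
    raise SystemExit("no chessboard views found in calib/*.png")

# returns the RMS reprojection error, camera matrix and distortion coefficients
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_points, img_points, size, None, None)
print("RMS reprojection error:", rms)
print("Camera matrix:\n", K)
```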

  1. A mathematical model for camera calibration based on straight lines

    Directory of Open Access Journals (Sweden)

    Antonio M. G. Tommaselli

    2005-12-01

    Full Text Available In order to facilitate the automation of the camera calibration process, a mathematical model using straight lines was developed, which is based on the equivalent-planes mathematical model. Parameter estimation of the developed model is achieved by the Least Squares Method with Conditions and Observations. The same method of adjustment was used to implement camera calibration with bundles, which is based on points. Experiments using simulated and real data have shown that the developed model based on straight lines gives results comparable to the conventional method with points. Details concerning the mathematical development of the model and experiments with simulated and real data are presented, and the results of both methods of camera calibration, with straight lines and with points, are compared.

  2. Object Detection and Tracking-Based Camera Calibration for Normalized Human Height Estimation

    Directory of Open Access Journals (Sweden)

    Jaehoon Jung

    2016-01-01

    Full Text Available This paper presents a normalized human height estimation algorithm using an uncalibrated camera. To estimate the normalized human height, the proposed algorithm detects a moving object and performs tracking-based automatic camera calibration. The proposed method consists of three steps: (i) moving human detection and tracking, (ii) automatic camera calibration, and (iii) human height estimation and error correction. The proposed method automatically calibrates the camera by detecting moving humans and estimates the human height using error correction. The proposed method can be applied to object-based video surveillance systems and digital forensics.

  3. Streak artifacts on Kidney CT: Ionic vs nonionic contrast media

    International Nuclear Information System (INIS)

    Cho, Eun Ok; Kim, Won Hong; Jung, Myung Suk; Kim, Yong Hoon; Hur, Gham

    1993-01-01

    The authors reviewed findings of enhanced abdominal computed tomography (CT) scans to determine the difference between a higher dose of a conventional ionic contrast medium (iothalamate meglumine) and a lower dose of a new, nonionic contrast material (ioversol). One hundred adult patients were divided into two groups of 50 patients each. Iothalamate meglumine and ioversol were intravenously administered in the respective groups. The male-to-female ratio was 28:22 in the former group and 29:21 in the latter. We examined the degree of renal streak artifact and measured the Hounsfield number of urine in the renal collecting system. There were significant differences in the degree of streak artifact depending upon the osmolality of the contrast media used, and this was related to the urine CT number (P < 0.005). We conclude that nonionic low-osmolar contrast media are more prone to cause streak artifacts and distortion of the renal image than conventional ionic high-osmolar contrast media.

  4. POD analysis of the instability mode of a low-speed streak in a laminar boundary layer

    Science.gov (United States)

    Deng, Si-Chao; Pan, Chong; Wang, Jin-Jun; Rinoshika, Akira

    2017-12-01

    The instability of one single low-speed streak in a zero-pressure-gradient laminar boundary layer is investigated experimentally via both hydrogen bubble visualization and planar particle image velocimetry (PIV) measurement. A single low-speed streak is generated and destabilized by the wake of an interference wire positioned normal to the wall and upstream. The downstream development of the streak includes secondary instability and a self-reproduction process, which leads to the generation of two additional streaks appearing on either side of the primary one. A proper orthogonal decomposition (POD) analysis of the PIV-measured velocity field is used to identify the components of the streak instability in the POD mode space: for a sinuous/varicose type of POD mode, its basis functions present anti-symmetric/symmetric distributions about the streak centerline in the streamwise component, and the symmetry condition reverses in the spanwise component. It is further shown that the sinuous mode dominates the turbulent kinetic energy (TKE) through the whole streak evolution process; the TKE content first increases along the streamwise direction to a saturation value and then decays slowly. In contrast, the varicose mode exhibits a sustained growth of the TKE content, suggesting an increasing competition of varicose instability against sinuous instability.
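
    For readers unfamiliar with snapshot POD, the decomposition used above amounts to an SVD of the mean-subtracted matrix of PIV snapshots: the left singular vectors are the spatial modes, and the squared singular values give each mode's share of the turbulent kinetic energy. The sketch below demonstrates this on synthetic data; it is a generic POD illustration, not the experiment's dataset or post-processing code.

```python
import numpy as np

# synthetic "PIV" data: n_snapshots velocity fields, each flattened to n_points
rng = np.random.default_rng(0)
n_snapshots, n_points = 200, 500
snapshots = rng.standard_normal((n_points, n_snapshots))     # columns = snapshots

mean_field = snapshots.mean(axis=1, keepdims=True)
fluct = snapshots - mean_field                                # subtract the mean flow

# economy SVD: columns of U are spatial POD modes, singular values give the energy
U, s, Vt = np.linalg.svd(fluct, full_matrices=False)
energy = s**2 / np.sum(s**2)                                  # TKE fraction per mode
coeffs = np.diag(s) @ Vt                                      # temporal coefficients

print("TKE captured by first 3 modes: %.1f%%" % (100 * energy[:3].sum()))

# reconstruct the fluctuating field with the leading 3 modes only
recon = mean_field + U[:, :3] @ coeffs[:3, :]
print("relative reconstruction error:",
      np.linalg.norm(recon - snapshots) / np.linalg.norm(snapshots))
```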

  5. High dynamic range image acquisition based on multiplex cameras

    Science.gov (United States)

    Zeng, Hairui; Sun, Huayan; Zhang, Tinghua

    2018-03-01

    High dynamic range imaging is an important technology for photoelectric information acquisition, providing higher dynamic range and more image details, and it can better reflect the real environment, light and color information. Currently, methods of high dynamic range image synthesis based on differently exposed image sequences cannot adapt to dynamic scenes: they fail to overcome the effects of moving targets, resulting in ghosting. Therefore, a new high dynamic range image acquisition method based on a multiplex camera system was proposed. Firstly, differently exposed image sequences were captured with the camera array, using a derivative optical flow method based on color gradients to obtain the deviation between images and align them. Then, the high dynamic range image fusion weighting function was established by combining the inverse camera response function and the deviation between images, and was applied to generate a high dynamic range image. The experiments show that the proposed method can effectively obtain high dynamic range images in dynamic scenes and achieves good results.
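
    The fusion step described above can be illustrated with a Debevec-style weighted average: each exposure is mapped back through an (assumed) inverse camera response, divided by its exposure time, and blended with weights that suppress under- and over-exposed pixels. The gamma response, hat-shaped weight and toy radiance values below are assumptions for the sketch; the paper's method additionally folds the optical-flow deviation into the weighting.

```python
import numpy as np

def inverse_crf(pixels, gamma=2.2):
    """Assumed camera response: map 8-bit pixel values back to relative exposure."""
    return (pixels.astype(np.float64) / 255.0) ** gamma

def hat_weight(pixels):
    """Trust mid-range pixels most; down-weight under- and over-exposed ones."""
    p = pixels.astype(np.float64)
    return np.minimum(p, 255.0 - p) + 1e-3

def fuse_hdr(images, exposure_times):
    """images: list of aligned uint8 images; exposure_times in seconds."""
    num = np.zeros(images[0].shape, np.float64)
    den = np.zeros(images[0].shape, np.float64)
    for img, t in zip(images, exposure_times):
        w = hat_weight(img)
        num += w * inverse_crf(img) / t      # per-frame estimate of scene radiance
        den += w
    return num / den

# toy test: the same radiance seen at two exposure times should fuse consistently
radiance = np.full((4, 4), 0.02)
imgs = [np.clip((radiance * t) ** (1 / 2.2) * 255, 0, 255).astype(np.uint8)
        for t in (0.01, 0.04)]
print(fuse_hdr(imgs, [0.01, 0.04]).mean())   # close to 0.02 up to quantization
```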

  6. Streaked spectrometry using multilayer x-ray-interference mirrors to investigate energy transport in laser-plasma applications

    International Nuclear Information System (INIS)

    Stradling, G.L.; Barbee, T.W. Jr.; Henke, B.L.; Campbell, E.M.; Mead, W.C.

    1981-08-01

    Transport of energy in laser-produced plasmas is scrutinized by devising spectrally and temporally identifiable characteristics in the x-ray emission history which identify the heat-front position at various times in the heating process. Measurements of the relative turn-on times of these characteristics show the rate of energy transport between various points. These measurements can in turn constrain models of energy transport phenomena. We are time-resolving spectrally distinguishable subkilovolt x-ray emissions from different layers of a disk target to examine the transport rate of energy into the target. A similar technique is used to measure the lateral expansion rate of the plasma spot. A soft x-ray streak camera with 15-psec temporal resolution is used to make the temporal measurements. Spectral discrimination of the incident signal is provided by multilayer x-ray interference mirrors

  7. Camera calibration method of binocular stereo vision based on OpenCV

    Science.gov (United States)

    Zhong, Wanzhen; Dong, Xiaona

    2015-10-01

    Camera calibration, an important part of binocular stereo vision research, is the essential foundation of 3D reconstruction of spatial objects. In this paper, a camera calibration method based on OpenCV (the open source computer vision library) is presented to improve the process, obtaining higher precision and efficiency. First, the camera model in OpenCV and an algorithm for camera calibration are presented, with particular consideration of the influence of camera lens radial distortion and decentering distortion. Then, the camera calibration procedure is designed to compute the camera parameters and calculate the calibration errors. A high-accuracy profile extraction algorithm and a checkerboard with 48 corners are also used in this part. Finally, the results of the calibration program are presented, demonstrating the high efficiency and accuracy of the proposed approach. The results meet the requirements of robot binocular stereo vision.
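
    As a rough sketch of the call sequence such a system builds on, OpenCV first estimates each camera's intrinsics and distortion (radial and tangential) and then recovers the rotation and translation between the two cameras with cv2.stereoCalibrate. The corner lists are assumed to have been collected from checkerboard detections in both views, as in the mono-calibration sketch earlier in this document; nothing here is specific to the paper's implementation.

```python
import cv2

def calibrate_stereo(obj_points, img_points_left, img_points_right, image_size):
    """obj_points: per-view 3D checkerboard corners; img_points_*: matching 2D
    detections in the left and right cameras; image_size: (width, height)."""
    # intrinsics and distortion coefficients for each camera separately
    _, K1, d1, _, _ = cv2.calibrateCamera(obj_points, img_points_left,
                                          image_size, None, None)
    _, K2, d2, _, _ = cv2.calibrateCamera(obj_points, img_points_right,
                                          image_size, None, None)

    # extrinsics between the two cameras, keeping the intrinsics fixed
    rms, K1, d1, K2, d2, R, T, E, F = cv2.stereoCalibrate(
        obj_points, img_points_left, img_points_right,
        K1, d1, K2, d2, image_size, flags=cv2.CALIB_FIX_INTRINSIC)
    return rms, K1, d1, K2, d2, R, T
```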

  8. Framing-camera tube developed for sub-100-ps range

    International Nuclear Information System (INIS)

    Anon.

    1978-01-01

    A new framing-camera tube, developed by Electronics Engineering, is capable of recording two-dimensional image frames with high spatial resolution in the sub-100-ps range. Framing is performed by streaking a two-dimensional electron image across narrow slits; the resulting electron-line images from the slits are restored into a framed image by a restorer deflector operating synchronously with the dissector deflector. We have demonstrated its performance in a prototype tube by recording 125-ps-duration framed images of 2.5-mm patterns. The limitation in the framing speed is in the external electronic drivers for the deflectors and not in the tube design characteristics. Shorter frame durations (below 100 ps) can be obtained by use of faster deflection drivers

  9. X-ray streak crystal spectrography

    International Nuclear Information System (INIS)

    Kauffman, R.L.; Brown, T.; Medecki, H.

    1983-01-01

    We have built an x-ray streaked crystal spectrograph for making time-resolved x-ray spectral measurements. This instrument can access Bragg angles from 11° to 38° and x-ray spectra from 200 eV to greater than 10 keV. We have demonstrated resolving powers E/δE > 200 at 1 keV and time resolution less than 20 psec. A description of the instrument and an example of the data are given

  10. Dual cameras acquisition and display system of retina-like sensor camera and rectangular sensor camera

    Science.gov (United States)

    Cao, Nan; Cao, Fengmei; Lin, Yabin; Bai, Tingzhu; Song, Shengyu

    2015-04-01

    For a new kind of retina-like sensor camera and a traditional rectangular sensor camera, a dual-camera acquisition and display system needs to be built. We introduce the principle and development of the retina-like sensor. Image coordinate transformation and sub-pixel interpolation need to be realized for the retina-like sensor's special pixel distribution. The hardware platform is composed of the retina-like sensor camera, the rectangular sensor camera, an image grabber and a PC. Combining the MIL and OpenCV libraries, the software is written in VC++ on VS 2010. Experimental results show that the system can realize acquisition and display for both cameras.
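
    The coordinate transformation and sub-pixel interpolation mentioned above can be illustrated with a generic log-polar layout: each display pixel is mapped to a fractional (ring, sector) position and the retina-like image is sampled there bilinearly. The layout model below is an assumption made for the sketch, not the actual pixel distribution of the sensor described in the paper.

```python
import numpy as np

def bilinear_sample(img, x, y):
    """Sub-pixel lookup in a 2-D array at floating-point column x, row y."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1, y1 = min(x0 + 1, img.shape[1] - 1), min(y0 + 1, img.shape[0] - 1)
    dx, dy = x - x0, y - y0
    return ((1 - dx) * (1 - dy) * img[y0, x0] + dx * (1 - dy) * img[y0, x1] +
            (1 - dx) * dy * img[y1, x0] + dx * dy * img[y1, x1])

def logpolar_to_cartesian(lp_img, out_size=128, r_min=2.0):
    """Render an assumed log-polar (rings x sectors) image onto a square
    Cartesian display grid using sub-pixel bilinear interpolation."""
    n_rings, n_sectors = lp_img.shape
    out = np.zeros((out_size, out_size))
    cx = cy = out_size / 2.0
    r_max = out_size / 2.0 - 1.0
    for v in range(out_size):
        for u in range(out_size):
            r = np.hypot(u - cx, v - cy)
            if r < r_min or r > r_max:
                continue                          # outside the modelled coverage
            ring = (n_rings - 1) * np.log(r / r_min) / np.log(r_max / r_min)
            sector = (np.arctan2(v - cy, u - cx) % (2 * np.pi)) / (2 * np.pi) * n_sectors
            out[v, u] = bilinear_sample(lp_img, min(sector, n_sectors - 1), ring)
    return out

demo = logpolar_to_cartesian(np.random.rand(64, 128))
print(demo.shape)        # (128, 128)
```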

  11. BENCHMARKING THE OPTICAL RESOLVING POWER OF UAV BASED CAMERA SYSTEMS

    Directory of Open Access Journals (Sweden)

    H. Meißner

    2017-08-01

    Full Text Available UAV based imaging and 3D object point generation is an established technology. Some UAV users try to address (very high) accuracy applications, i.e. inspection or monitoring scenarios. In order to guarantee such a level of detail and accuracy, high-resolving imaging systems are mandatory. Furthermore, image quality considerably impacts photogrammetric processing, as the tie point transfer, mandatory for forming the block geometry, fully relies on the radiometric quality of images. Thus, empirical testing of radiometric camera performance is an important issue, in addition to the standard (geometric) calibration, which normally is covered primarily. Within this paper the resolving power of ten different camera/lens installations has been investigated. The selected systems represent different camera classes, like DSLRs, system cameras, larger format cameras and proprietary systems. As the systems have been tested in well-controlled laboratory conditions and objective quality measures have been derived, individual performance can be compared directly, thus representing a first benchmark on the radiometric performance of UAV cameras. The results have shown that not only the selection of an appropriate lens and camera body has an impact; in addition, the image pre-processing, i.e. the use of a specific debayering method, significantly influences the final resolving power.

  12. Genetic Analysis of Streaked and Abnormal Floret Mutant st-fon

    Directory of Open Access Journals (Sweden)

    De-xi CHEN

    2013-07-01

    Full Text Available A double mutant with streaked leaves and abnormal florets was found and temporarily named streaked leaf and floral organ number mutant (st-fon). In this mutant, besides white streaks appearing on the culm, leaves and panicles, the number of floral organs increased and florets cracked. The extreme phenotype was that several small florets grew from one floret, or the branch rachis in small florets extended and developed into panicles. Transmission electron microscopy of the white histocytes of leaves at the seedling stage showed that the white tissues, with abnormal plastids, lamellae and thylakoids, could not develop into normal chloroplasts, and that chloroplast development was blocked at the early growth stage of the plastid. Scanning electron microscopy and paraffin sections were also used to observe the development of the floral organs, and the results indicated that the development of the floral meristem was disordered and unlimited, whereas in the twisted leaves, vascular bundle sheath cells grew excessively or some bubbly cells increased in number. Genetic analyses carried out by means of crosses and backcrosses with four normal-leaf-color materials revealed that the mutant shows cytoplasmic inheritance.

  13. Proceedings of the 18th international congress on high speed photography and photonics

    International Nuclear Information System (INIS)

    Anon.

    1988-01-01

    The subjects addressed at the conference presented in this book include image converter and intensifier cameras; opto-mechanical high speed cameras; X-ray generator and radiography; and Holography and interferometry. The papers include Flash x-ray cineradiography; New picosecond synchroscan streak image tube; and Streak camera CCD readout system

  14. A Compton camera application for the GAMOS GEANT4-based framework

    Energy Technology Data Exchange (ETDEWEB)

    Harkness, L.J., E-mail: ljh@ns.ph.liv.ac.uk [Oliver Lodge Laboratory, The University of Liverpool, Liverpool L69 7ZE (United Kingdom); Arce, P. [Department of Basic Research, CIEMAT, Madrid (Spain); Judson, D.S.; Boston, A.J.; Boston, H.C.; Cresswell, J.R.; Dormand, J.; Jones, M.; Nolan, P.J.; Sampson, J.A.; Scraggs, D.P.; Sweeney, A. [Oliver Lodge Laboratory, The University of Liverpool, Liverpool L69 7ZE (United Kingdom); Lazarus, I.; Simpson, J. [STFC Daresbury Laboratory, Daresbury, Warrington WA4 4AD (United Kingdom)

    2012-04-11

    Compton camera systems can be used to image sources of gamma radiation in a variety of applications such as nuclear medicine, homeland security and nuclear decommissioning. To locate gamma-ray sources, a Compton camera employs electronic collimation, utilising Compton kinematics to reconstruct the paths of gamma rays which interact within the detectors. The main benefit of this technique is the ability to accurately identify and locate sources of gamma radiation within a wide field of view, vastly improving the efficiency and specificity over existing devices. Potential advantages of this imaging technique, along with advances in detector technology, have brought about a rapidly expanding area of research into the optimisation of Compton camera systems, which relies on significant input from Monte-Carlo simulations. In this paper, the functionality of a Compton camera application that has been integrated into GAMOS, the GEANT4-based Architecture for Medicine-Oriented Simulations, is described. The application simplifies the use of GEANT4 for Monte-Carlo investigations by employing a script based language and plug-in technology. To demonstrate the use of the Compton camera application, simulated data have been generated using the GAMOS application and acquired through experiment for a preliminary validation, using a Compton camera configured with double sided high purity germanium strip detectors. Energy spectra and reconstructed images for the data sets are presented.
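
    The electronic collimation mentioned above rests on Compton kinematics: the energies deposited in the scatterer and absorber fix the scattering angle, and hence a cone of possible source directions. The short sketch below computes that cone angle; it is a generic kinematics illustration (assuming full absorption of the scattered photon), not part of the GAMOS application itself.

```python
import numpy as np

M_E_C2 = 510.999  # electron rest energy in keV

def compton_cone_angle(e_scatter_keV, e_absorb_keV):
    """Opening angle (radians) of the cone of possible gamma-ray origins, from the
    energy deposited in the scatter (e1) and absorber (e2) detectors."""
    e_total = e_scatter_keV + e_absorb_keV          # full photon energy if absorbed
    cos_theta = 1.0 - M_E_C2 * (1.0 / e_absorb_keV - 1.0 / e_total)
    if abs(cos_theta) > 1.0:
        raise ValueError("kinematically forbidden event (escape or mis-measurement)")
    return np.arccos(cos_theta)

# a 662 keV Cs-137 photon depositing 200 keV in the first (scatter) detector
print(np.degrees(compton_cone_angle(200.0, 462.0)))
```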

  15. Convolutional Neural Network-Based Shadow Detection in Images Using Visible Light Camera Sensor

    Directory of Open Access Journals (Sweden)

    Dong Seop Kim

    2018-03-01

    Full Text Available Recent developments in intelligent surveillance camera systems have enabled more research on the detection, tracking, and recognition of humans. Such systems typically use visible light cameras and images, in which shadows make it difficult to detect and recognize the exact human area. Near-infrared (NIR) light cameras and thermal cameras can be used to mitigate this problem. However, such instruments require a separate NIR illuminator, or are prohibitively expensive. Existing research on shadow detection in images captured by visible light cameras has utilized object and shadow color features for detection. Unfortunately, various environmental factors such as illumination change and background brightness make detection a difficult task. To overcome this problem, we propose a convolutional neural network-based shadow detection method. Experimental results with a database built from various outdoor surveillance camera environments, and from the context-aware vision using image-based active recognition (CAVIAR) open database, show that our method outperforms previous works.
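
    As an illustration of the kind of per-pixel classifier such a method relies on, the sketch below defines a tiny fully convolutional network in PyTorch that maps an RGB frame to a shadow-probability map. The layer sizes, training loss and input resolution are illustrative assumptions, not the architecture used by the authors.

```python
import torch
import torch.nn as nn

class TinyShadowNet(nn.Module):
    """Illustrative fully convolutional shadow detector: RGB in, probability map out."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(inplace=True),
        )
        self.classifier = nn.Conv2d(32, 1, kernel_size=1)   # per-pixel logit

    def forward(self, x):
        return torch.sigmoid(self.classifier(self.features(x)))

model = TinyShadowNet()
frame = torch.rand(1, 3, 128, 128)          # one surveillance frame (toy input)
shadow_map = model(frame)                   # values near 1 mark likely shadow pixels
print(shadow_map.shape)                     # torch.Size([1, 1, 128, 128])

# training would minimise binary cross-entropy against labelled shadow masks
loss = nn.BCELoss()(shadow_map, torch.zeros_like(shadow_map))
```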

  16. Analyses of Twelve New Whole Genome Sequences of Cassava Brown Streak Viruses and Ugandan Cassava Brown Streak Viruses from East Africa: Diversity, Supercomputing and Evidence for Further Speciation

    Science.gov (United States)

    Ndunguru, Joseph; Sseruwagi, Peter; Tairo, Fred; Stomeo, Francesca; Maina, Solomon; Djinkeng, Appolinaire; Kehoe, Monica; Boykin, Laura M.

    2015-01-01

    Cassava brown streak disease is caused by two devastating viruses, Cassava brown streak virus (CBSV) and Ugandan cassava brown streak virus (UCBSV) which are frequently found infecting cassava, one of sub-Saharan Africa’s most important staple food crops. Each year these viruses cause losses of up to $100 million USD and can leave entire families without their primary food source, for an entire year. Twelve new whole genomes, including seven of CBSV and five of UCBSV were uncovered in this research, doubling the genomic sequences available in the public domain for these viruses. These new sequences disprove the assumption that the viruses are limited by agro-ecological zones, show that current diagnostic primers are insufficient to provide confident diagnosis of these viruses and give rise to the possibility that there may be as many as four distinct species of virus. Utilizing NGS sequencing technologies and proper phylogenetic practices will rapidly increase the solution to sustainable cassava production. PMID:26439260

  17. Analyses of Twelve New Whole Genome Sequences of Cassava Brown Streak Viruses and Ugandan Cassava Brown Streak Viruses from East Africa: Diversity, Supercomputing and Evidence for Further Speciation.

    Directory of Open Access Journals (Sweden)

    Joseph Ndunguru

    Full Text Available Cassava brown streak disease is caused by two devastating viruses, Cassava brown streak virus (CBSV) and Ugandan cassava brown streak virus (UCBSV), which are frequently found infecting cassava, one of sub-Saharan Africa's most important staple food crops. Each year these viruses cause losses of up to $100 million USD and can leave entire families without their primary food source, for an entire year. Twelve new whole genomes, including seven of CBSV and five of UCBSV were uncovered in this research, doubling the genomic sequences available in the public domain for these viruses. These new sequences disprove the assumption that the viruses are limited by agro-ecological zones, show that current diagnostic primers are insufficient to provide confident diagnosis of these viruses and give rise to the possibility that there may be as many as four distinct species of virus. Utilizing NGS sequencing technologies and proper phylogenetic practices will rapidly increase the solution to sustainable cassava production.

  18. Feature-based automatic color calibration for networked camera system

    Science.gov (United States)

    Yamamoto, Shoji; Taki, Keisuke; Tsumura, Norimichi; Nakaguchi, Toshiya; Miyake, Yoichi

    2011-01-01

    In this paper, we develop a feature-based automatic color calibration using area-based detection and an adaptive nonlinear regression method. Simple chartless color matching is achieved by using the overlapping image areas between cameras. Accurate detection of a common object is achieved by area-based detection that combines MSER with SIFT. Adaptive color calibration using the colors of the detected object is computed by a nonlinear regression method. This method can indicate the contribution of the object's color to the color calibration, and automatic selection notification for the user is performed by this function. Experimental results show that the accuracy of the calibration improves gradually. This method can support practical multi-camera color calibration if enough samples are obtained.
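
    The final regression step can be sketched as follows: given matched pixel colors sampled from the common object in the overlapping area, each channel of one camera is mapped onto the reference camera with a fitted curve (a polynomial here, standing in for the paper's adaptive nonlinear regression). The MSER/SIFT detection is assumed to have already produced the matched samples; the data below are synthetic.

```python
import numpy as np

def fit_channel_map(src_vals, ref_vals, degree=2):
    """Fit a polynomial mapping one camera's channel values onto the reference's."""
    return np.polyfit(src_vals, ref_vals, degree)

def calibrate_colors(src_pixels, ref_pixels):
    """src_pixels, ref_pixels: (N, 3) matched RGB samples from the common object."""
    return [fit_channel_map(src_pixels[:, c], ref_pixels[:, c]) for c in range(3)]

def apply_calibration(image, coeffs):
    out = np.empty_like(image, dtype=np.float64)
    for c in range(3):
        out[..., c] = np.polyval(coeffs[c], image[..., c].astype(np.float64))
    return np.clip(out, 0, 255).astype(np.uint8)

# toy example: the second camera is darker and slightly nonlinear in every channel
rng = np.random.default_rng(1)
ref = rng.integers(0, 256, size=(500, 3)).astype(np.float64)
src = 0.8 * ref + 0.0003 * ref**2 + rng.normal(0, 2, ref.shape)
coeffs = calibrate_colors(src, ref)
test = apply_calibration(src[:10].reshape(2, 5, 3), coeffs)
print(np.abs(test.astype(float) - ref[:10].reshape(2, 5, 3)).mean())  # a few gray levels
```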

  19. Avoiding acidic region streaking in two-dimensional gel ...

    Indian Academy of Sciences (India)

    Supplementary figure 6: 2DE gel images and the number of acidic streaks reported in earlier studies (Fedyunin et al. 2012; Zuo et al. 2000; Valenete et al.; Nandakumar MP, Shen J, Raman B and Marten MR).

  20. Atomic and molecular phases through attosecond streaking

    DEFF Research Database (Denmark)

    Baggesen, Jan Conrad; Madsen, Lars Bojer

    2011-01-01

    phase of the atomic or molecular ionization matrix elements from the two states through the interference from the two channels. The interference may change the phase of the photoelectron streaking signal within the envelope of the infrared field, an effect to be accounted for when reconstructing short...... pulses from the photoelectron signal and in attosecond time-resolved measurements....

  1. Automatic inference of geometric camera parameters and inter-camera topology in uncalibrated disjoint surveillance cameras

    Science.gov (United States)

    den Hollander, Richard J. M.; Bouma, Henri; Baan, Jan; Eendebak, Pieter T.; van Rest, Jeroen H. C.

    2015-10-01

    Person tracking across non-overlapping cameras and other types of video analytics benefit from spatial calibration information that allows an estimation of the distance between cameras and a relation between pixel coordinates and world coordinates within a camera. In a large environment with many cameras, or for frequent ad-hoc deployments of cameras, the cost of this calibration is high. This creates a barrier for the use of video analytics. Automating the calibration allows for a short configuration time, and the use of video analytics in a wider range of scenarios, including ad-hoc crisis situations and large scale surveillance systems. We show an autocalibration method entirely based on pedestrian detections in surveillance video in multiple non-overlapping cameras. In this paper, we show the two main components of automatic calibration. The first shows the intra-camera geometry estimation that leads to an estimate of the tilt angle, focal length and camera height, which is important for the conversion from pixels to meters and vice versa. The second component shows the inter-camera topology inference that leads to an estimate of the distance between cameras, which is important for spatio-temporal analysis of multi-camera tracking. This paper describes each of these methods and provides results on realistic video data.

  2. The AOTF-Based NO2 Camera

    Science.gov (United States)

    Dekemper, E.; Fussen, D.; Vanhellemont, F.; Vanhamel, J.; Pieroux, D.; Berkenbosch, S.

    2017-12-01

    In an urban environment, nitrogen dioxide is emitted by a multitude of static and moving point sources (cars, industry, power plants, heating systems,…). Air quality models generally rely on a limited number of monitoring stations which do not capture the whole pattern, neither allow for full validation. So far, there has been a lack of instrument capable of measuring NO2 fields with the necessary spatio-temporal resolution above major point sources (power plants), or more extended ones (cities). We have developed a new type of passive remote sensing instrument aiming at the measurement of 2-D distributions of NO2 slant column densities (SCDs) with a high spatial (meters) and temporal (minutes) resolution. The measurement principle has some similarities with the popular filter-based SO2 camera (used in volcanic and industrial sulfur emissions monitoring) as it relies on spectral images taken at wavelengths where the molecule absorption cross section is different. But contrary to the SO2 camera, the spectral selection is performed by an acousto-optical tunable filter (AOTF) capable of resolving the target molecule's spectral features. A first prototype was successfully tested with the plume of a coal-firing power plant in Romania, revealing the dynamics of the formation of NO2 in the early plume. A lighter version of the NO2 camera is now being tested on other targets, such as oil refineries and urban air masses.

  3. Numerical Investigation on the Influence of Hot Streak Temperature Ratio in a High-Pressure Stage of Vaneless Counter-Rotating Turbine

    Directory of Open Access Journals (Sweden)

    Zhao Qingjun

    2007-01-01

    Full Text Available The results of recent studies have shown that combustor exit temperature distortion can cause excessive heat load on high-pressure turbine (HPT) rotor blades. The heating of HPT rotor blades can lead to thermal fatigue and degrade turbine performance. In order to explore the influence of hot streak temperature ratio on the temperature distributions of the HPT airfoil surface, three-dimensional multiblade row unsteady Navier-Stokes simulations have been performed in a vaneless counter-rotating turbine (VCRT). Hot streak temperature ratios from 1.0 (without hot streak) to 2.4 were used in these numerical simulations, namely 1.0, 1.2, 1.6, 2.0, and 2.4. The hot streak is circular in shape with a diameter equal to 25% of the span. The center of the hot streak is located at 50% of span and 0% of pitch (the leading edge of the HPT stator vane). The predicted results show that the hot streak is relatively unaffected as it migrates through the HPT stator. The hot streak mixes with the vane wake and convects towards the pressure surface (PS) of the HPT rotor when it moves over the vane surface of the HPT stator. The heat load of the HPT rotor increases with the hot streak temperature ratio. The existence of the inlet temperature distortion induces a thin layer of cooler air in the HPT rotor, which separates the PS of the HPT rotor from the hotter fluid. The numerical results also indicate that the migration characteristics of the hot streak in the HPT rotor are dominated by the combined effects of secondary flow and buoyancy. These combined effects induce the high-temperature fluid to migrate towards the hub of the HPT rotor. The effect of the secondary flow on the hotter fluid increases as the hot streak temperature ratio is increased. The influence of buoyancy is directly proportional to the hot streak temperature ratio. The predicted results show that the increase of the hot streak temperature ratio tends to increase

  4. Movement-based interaction in camera spaces: a conceptual framework

    DEFF Research Database (Denmark)

    Eriksson, Eva; Hansen, Thomas Riisgaard; Lykke-Olesen, Andreas

    2007-01-01

    In this paper we present three concepts that address movement-based interaction using camera tracking. Based on our work with several movement-based projects we present four selected applications, and use these applications to leverage our discussion and to describe our three main concepts: space,...

  5. Development of a visible framing camera diagnostic for the study of current initiation in z-pinch plasmas

    International Nuclear Information System (INIS)

    Muron, D.J.; Hurst, M.J.; Derzon, M.S.

    1996-01-01

    The authors assembled and tested a visible framing camera system to take 5 ns FWHM images of the early time emission from a z-pinch plasma. This diagnostic was used in conjunction with a visible streak camera, allowing early time emission measurements to diagnose current initiation. Individual frames from gated image intensifiers were proximity coupled to charge injection device (CID) cameras and read out at video rate and 8-bit resolution. A mirror was used to view the pinch from a 90-degree angle. The authors observed the destruction of the mirror surface, due to the high surface heating, and the subsequent reduction in signal reflected from the mirror. Images were obtained that showed early time ejecta and a nonuniform emission from the target. This initial test of the equipment highlighted problems with this measurement. The observed non-uniformities in early time emission are believed to be due to either spatially varying current density or heating of the foam. The results and suggestions for improvement are discussed in the text.

  6. Intensified CCD for ultrafast diagnostics

    International Nuclear Information System (INIS)

    Cheng, J.; Tripp, G.; Coleman, L.

    1978-01-01

    Many of the present laser fusion diagnostics are recorded on either ultrafast streak cameras or on oscilloscopes. For those experiments in which a large volume of data is accumulated, direct computer processing of the information becomes important. We describe an approach which uses a RCA 52501 back-thinned CCD sensor to obtain direct electron readouts for both the streak camera and the CRT. Performance of the 100 GHz streak camera and the 4 GHz CRT are presented. Design parameters and computer interfacing for both systems are described in detail

  7. Image Mosaicking Approach for a Double-Camera System in the GaoFen2 Optical Remote Sensing Satellite Based on the Big Virtual Camera.

    Science.gov (United States)

    Cheng, Yufeng; Jin, Shuying; Wang, Mi; Zhu, Ying; Dong, Zhipeng

    2017-06-20

    The linear array push broom imaging mode is widely used for high resolution optical satellites (HROS). Using double-cameras attached by a high-rigidity support along with push broom imaging is one method to enlarge the field of view while ensuring high resolution. High accuracy image mosaicking is the key factor of the geometrical quality of complete stitched satellite imagery. This paper proposes a high accuracy image mosaicking approach based on the big virtual camera (BVC) in the double-camera system on the GaoFen2 optical remote sensing satellite (GF2). A big virtual camera can be built according to the rigorous imaging model of a single camera; then, each single image strip obtained by each TDI-CCD detector can be re-projected to the virtual detector of the big virtual camera coordinate system using forward-projection and backward-projection to obtain the corresponding single virtual image. After an on-orbit calibration and relative orientation, the complete final virtual image can be obtained by stitching the single virtual images together based on their coordinate information on the big virtual detector image plane. The paper subtly uses the concept of the big virtual camera to obtain a stitched image and the corresponding high accuracy rational function model (RFM) for concurrent post processing. Experiments verified that the proposed method can achieve seamless mosaicking while maintaining the geometric accuracy.

  8. Construct and face validity of a virtual reality-based camera navigation curriculum.

    Science.gov (United States)

    Shetty, Shohan; Panait, Lucian; Baranoski, Jacob; Dudrick, Stanley J; Bell, Robert L; Roberts, Kurt E; Duffy, Andrew J

    2012-10-01

    Camera handling and navigation are essential skills in laparoscopic surgery. Surgeons rely on camera operators, usually the least experienced members of the team, for visualization of the operative field. Essential skills for camera operators include maintaining orientation, an effective horizon, appropriate zoom control, and a clean lens. Virtual reality (VR) simulation may be a useful adjunct to developing camera skills in a novice population. No standardized VR-based camera navigation curriculum is currently available. We developed and implemented a novel curriculum on the LapSim VR simulator platform for our residents and students. We hypothesize that our curriculum will demonstrate construct and face validity in our trainee population, distinguishing levels of laparoscopic experience as part of a realistic training curriculum. Overall, 41 participants with various levels of laparoscopic training completed the curriculum. Participants included medical students, surgical residents (Postgraduate Years 1-5), fellows, and attendings. We stratified subjects into three groups (novice, intermediate, and advanced) based on previous laparoscopic experience. We assessed face validity with a questionnaire. The proficiency-based curriculum consists of three modules: camera navigation, coordination, and target visualization using 0° and 30° laparoscopes. Metrics include time, target misses, drift, path length, and tissue contact. We analyzed data using analysis of variance and Student's t-test. We noted significant differences in repetitions required to complete the curriculum: 41.8 for novices, 21.2 for intermediates, and 11.7 for the advanced group (P < ...) ... medical students during their surgery rotations. Copyright © 2012 Elsevier Inc. All rights reserved.

  9. Sub-Camera Calibration of a Penta-Camera

    Science.gov (United States)

    Jacobsen, K.; Gerke, M.

    2016-03-01

    Penta cameras consisting of a nadir and four inclined cameras are becoming more and more popular, having the advantage of also imaging facades in built-up areas from four directions. Such system cameras require a boresight calibration of the geometric relation of the cameras to each other, but also a calibration of the sub-cameras. Based on data sets of the ISPRS/EuroSDR benchmark for multi-platform photogrammetry, the inner orientation of the used IGI Penta DigiCAM has been analyzed. The required image coordinates of the blocks Dortmund and Zeche Zollern have been determined by Pix4Dmapper and have been independently adjusted and analyzed by the program system BLUH. With 4.1 million image points in 314 images and 3.9 million image points in 248 images, respectively, dense matching was provided by Pix4Dmapper. With up to 19 and 29 images per object point, respectively, the images are well connected; nevertheless, the high numbers of images per object point are concentrated at the block centres, while the inclined images outside the block centres are satisfactorily but not very strongly connected. This leads to very high values of the Student test (T-test) for the finally used additional parameters; in other words, the additional parameters are highly significant. The estimated radial symmetric distortion of the nadir sub-camera corresponds to the laboratory calibration of IGI, but there are still radial symmetric distortions also for the inclined cameras with a size exceeding 5 μm, even if stated as negligible in the laboratory calibration. Radial and tangential effects at the image corners are limited but still present. Remarkable angular affine systematic image errors can be seen especially in the block Zeche Zollern. Such deformations are unusual for digital matrix cameras, but they can be caused by the correlation between inner and exterior orientation if only parallel flight lines are used. With the exception of the angular affinity the systematic image errors for corresponding

  10. A Demographic Model to Evaluate Population Declines in the Endangered Streaked Horned Lark

    Directory of Open Access Journals (Sweden)

    Alaine F. Camfield

    2011-12-01

    Full Text Available The Streaked Horned Lark (Eremophila alpestris strigata) is listed as endangered by the State of Washington, USA and by Canada under the Species at Risk Act and is also classified as a federal candidate for listing under the Endangered Species Act in the USA. A substantial portion of Streaked Horned Lark habitat has been lost or degraded, and range contraction has occurred in Oregon, Washington, and British Columbia. We estimate the vital rates (fecundity, adult survival, and juvenile survival) and population growth rate (λ) for Streaked Horned Larks breeding in Washington, USA and conduct a Life-Stage Simulation Analysis (LSA) to evaluate which vital rate has the greatest influence on λ. We simulated changes in the three vital rates to examine how much they would need to be adjusted either independently or in concert to achieve a stable Streaked Horned Lark population (λ = 1). We also evaluated which fecundity component (the number of fledglings per egg laid or the renesting interval) had the greatest impact on λ. The estimate of population growth suggests that Streaked Horned Larks in Washington are declining rapidly (λ = 0.62 ± 0.10) and that local breeding sites are not sustainable without immigration. The LSA results indicate that adult survival had the greatest influence on λ, followed by juvenile survival and fecundity. However, increases in vital rates led to λ = 1 only when adult survival was raised from 0.47 to 0.85, juvenile survival from 0.17 to 0.58, and fecundity from 0.91 to 3.09. Increases in breeding success and decreases in the renesting interval influenced λ similarly; however, λ did not reach 1 even when breeding success was raised to 100% or renesting intervals were reduced to 1 day. Only when all three vital rates were increased simultaneously did λ approach 1 without requiring highly unrealistic increases in each vital rate. We conclude that conservation activities need to target all or multiple vital rates to be successful. The
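
    The reported rates are consistent with a simple two-stage, female-based matrix model in which λ = adult survival + fecundity × juvenile survival. The check below is an illustrative sketch under that assumption only; it is not the authors' Life-Stage Simulation Analysis code.

    ```python
    def growth_rate(adult_survival, juvenile_survival, fecundity):
        """Two-stage post-breeding model (assumption): lambda = Sa + F * Sj."""
        return adult_survival + fecundity * juvenile_survival

    # Estimated vital rates from the abstract give lambda close to 0.62.
    print(round(growth_rate(0.47, 0.17, 0.91), 2))   # ~0.62
    # Raising adult survival to 0.85 (other rates unchanged) brings lambda to ~1.
    print(round(growth_rate(0.85, 0.17, 0.91), 2))   # ~1.00
    ```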

  11. Advances in top-down and bottom-up approaches to video-based camera tracking

    OpenAIRE

    Marimón Sanjuán, David

    2007-01-01

    Video-based camera tracking consists in trailing the three dimensional pose followed by a mobile camera using video as sole input. In order to estimate the pose of a camera with respect to a real scene, one or more three dimensional references are needed. Examples of such references are landmarks with known geometric shape, or objects for which a model is generated beforehand. By comparing what is seen by a camera with what is geometrically known from reality, it is possible to recover the po...

  12. Advances in top-down and bottom-up approaches to video-based camera tracking

    OpenAIRE

    Marimón Sanjuán, David; Ebrahimi, Touradj

    2008-01-01

    Video-based camera tracking consists in trailing the three dimensional pose followed by a mobile camera using video as sole input. In order to estimate the pose of a camera with respect to a real scene, one or more three dimensional references are needed. Examples of such references are landmarks with known geometric shape, or objects for which a model is generated beforehand. By comparing what is seen by a camera with what is geometrically known from reality, it is possible to recover the po...

  13. Prism-based single-camera system for stereo display

    Science.gov (United States)

    Zhao, Yue; Cui, Xiaoyu; Wang, Zhiguo; Chen, Hongsheng; Fan, Heyu; Wu, Teresa

    2016-06-01

    This paper combines a prism with a single camera and puts forward a low-cost method of stereo imaging. First, according to the principles of geometrical optics, the relationship between the prism single-camera system and a dual-camera system is deduced; according to the principles of binocular vision, the relationship between binocular viewing and a dual-camera system is deduced. Thus we can establish the relationship between the prism single-camera system and binocular viewing and obtain the positional relation of prism, camera, and object that gives the best stereo display. Finally, using the active shutter stereo glasses of NVIDIA Company, we can realize the three-dimensional (3-D) display of the object. The experimental results show that the proposed approach can make use of the prism single-camera system to simulate the various observation manners of the eyes. The stereo imaging system designed by the method proposed in this paper can faithfully restore the 3-D shape of the photographed object.

  14. A Portable, Inexpensive, Nonmydriatic Fundus Camera Based on the Raspberry Pi® Computer

    Directory of Open Access Journals (Sweden)

    Bailey Y. Shen

    2017-01-01

    Full Text Available Purpose. Nonmydriatic fundus cameras allow retinal photography without pharmacologic dilation of the pupil. However, currently available nonmydriatic fundus cameras are bulky, not portable, and expensive. Taking advantage of recent advances in mobile technology, we sought to create a nonmydriatic fundus camera that was affordable and could be carried in a white coat pocket. Methods. We built a point-and-shoot prototype camera using a Raspberry Pi computer, an infrared-sensitive camera board, a dual infrared and white light light-emitting diode, a battery, a 5-inch touchscreen liquid crystal display, and a disposable 20-diopter condensing lens. Our prototype camera was based on indirect ophthalmoscopy with both infrared and white lights. Results. The prototype camera measured 133 mm × 91 mm × 45 mm and weighed 386 grams. The total cost of the components, including the disposable lens, was $185.20. The camera was able to obtain good-quality fundus images without pharmacologic dilation of the pupils. Conclusion. A fully functional, inexpensive, handheld, nonmydriatic fundus camera can be easily assembled from a relatively small number of components. With modest improvements, such a camera could be useful for a variety of healthcare professionals, particularly those who work in settings where a traditional table-mounted nonmydriatic fundus camera would be inconvenient.
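
    A capture cycle on such a prototype could look roughly like the sketch below. The legacy picamera library, the RPi.GPIO library, and BCM pin 18 for the white-light LED are assumptions made for illustration; the paper does not specify the actual wiring or software.

    ```python
    # Minimal capture sketch for a Raspberry Pi camera board with an LED on a
    # GPIO pin. Library choice and pin number are assumptions, not the
    # prototype's documented design.
    from time import sleep

    import RPi.GPIO as GPIO
    from picamera import PiCamera

    LED_PIN = 18                      # assumed BCM pin driving the white-light LED

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(LED_PIN, GPIO.OUT)

    camera = PiCamera()
    camera.resolution = (1640, 1232)  # a mode supported by common Pi camera boards

    try:
        GPIO.output(LED_PIN, GPIO.HIGH)   # flash on for the visible-light exposure
        sleep(0.1)
        camera.capture('fundus.jpg')
    finally:
        GPIO.output(LED_PIN, GPIO.LOW)
        camera.close()
        GPIO.cleanup()
    ```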

  15. Global Calibration of Multi-Cameras Based on Refractive Projection and Ray Tracing

    Directory of Open Access Journals (Sweden)

    Mingchi Feng

    2017-10-01

    Full Text Available Multi-camera systems are widely applied in three dimensional (3D) computer vision, especially when multiple cameras are distributed on both sides of the measured object. The calibration methods of multi-camera systems are critical to the accuracy of vision measurement and the key is to find an appropriate calibration target. In this paper, a high-precision camera calibration method for multi-camera systems based on transparent glass checkerboards and ray tracing is described, and is used to calibrate multiple cameras distributed on both sides of the glass checkerboard. Firstly, the intrinsic parameters of each camera are obtained by Zhang's calibration method. Then, multiple cameras capture several images from the front and back of the glass checkerboard with different orientations, and all images contain distinct grid corners. As the cameras on one side are not affected by the refraction of the glass checkerboard, extrinsic parameters can be directly calculated. However, the cameras on the other side are influenced by the refraction of the glass checkerboard, and the direct use of the projection model will produce a calibration error. A multi-camera calibration method using a refractive projection model and ray tracing is developed to eliminate this error. Furthermore, both synthetic and real data are employed to validate the proposed approach. The experimental results of refractive calibration show that the error of the 3D reconstruction is smaller than 0.2 mm, the relative errors of both rotation and translation are less than 0.014%, and the mean and standard deviation of reprojection error of the four-camera system are 0.00007 and 0.4543 pixels, respectively. The proposed method is flexible, highly accurate, and simple to carry out.
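
    The first step the abstract mentions, per-camera intrinsic calibration by Zhang's method, is commonly performed with OpenCV. The snippet below is a generic sketch of that step only (checkerboard size and file names are placeholders); it does not implement the paper's refractive projection model or ray tracing.

    ```python
    import glob

    import cv2
    import numpy as np

    pattern = (9, 6)          # inner corners of the checkerboard (assumed size)
    # 3D corner coordinates in the board frame, z = 0, unit square spacing.
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

    obj_points, img_points = [], []
    for fname in glob.glob('cam0_*.png'):      # placeholder image names
        gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
        found, corners = cv2.findChessboardCorners(gray, pattern)
        if found:
            obj_points.append(objp)
            img_points.append(corners)

    # Zhang-style calibration: recovers the camera matrix K and distortion terms.
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, gray.shape[::-1], None, None)
    print('RMS reprojection error:', rms)
    ```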

  16. Studies on a silicon-photomultiplier-based camera for Imaging Atmospheric Cherenkov Telescopes

    Science.gov (United States)

    Arcaro, C.; Corti, D.; De Angelis, A.; Doro, M.; Manea, C.; Mariotti, M.; Rando, R.; Reichardt, I.; Tescaro, D.

    2017-12-01

    Imaging Atmospheric Cherenkov Telescopes (IACTs) represent a class of instruments which are dedicated to the ground-based observation of cosmic VHE gamma ray emission based on the detection of the Cherenkov radiation produced in the interaction of gamma rays with the Earth atmosphere. One of the key elements of such instruments is a pixelized focal-plane camera consisting of photodetectors. To date, photomultiplier tubes (PMTs) have been the common choice given their high photon detection efficiency (PDE) and fast time response. Recently, silicon photomultipliers (SiPMs) are emerging as an alternative. This rapidly evolving technology has strong potential to become superior to that based on PMTs in terms of PDE, which would further improve the sensitivity of IACTs, and see a price reduction per square millimeter of detector area. We are working to develop a SiPM-based module for the focal-plane cameras of the MAGIC telescopes to probe this technology for IACTs with large focal plane cameras of an area of few square meters. We will describe the solutions we are exploring in order to balance a competitive performance with a minimal impact on the overall MAGIC camera design using ray tracing simulations. We further present a comparative study of the overall light throughput based on Monte Carlo simulations and considering the properties of the major hardware elements of an IACT.

  17. Relationships between early spring wheat streak mosaic severity levels and grain yield: Implications for management decisions

    Science.gov (United States)

    Wheat streak mosaic (WSM) caused by Wheat streak mosaic virus, which is transmitted by the wheat curl mite (Aceria tosichella), is a major yield-limiting disease in the Texas High Plains. In addition to its impact on grain production, the disease reduces water-use efficiency by affecting root develo...

  18. Molecular characterization of Banana streak virus isolate from Musa Acuminata in China.

    Science.gov (United States)

    Zhuang, Jun; Wang, Jian-Hua; Zhang, Xin; Liu, Zhi-Xin

    2011-12-01

    Banana streak virus (BSV), a member of the genus Badnavirus, is a causal agent of banana streak disease throughout the world. The genetic diversity of BSVs from different regions of banana plantations has previously been investigated, but there are relatively few reports of the genetic characteristics of episomal (non-integrated) BSV genomes isolated from China. Here, the complete genome, a total of 7,722 bp (GenBank accession number DQ092436), of an isolate of Banana streak virus (BSV) on cultivar Cavendish (BSAcYNV) in Yunnan, China was determined. The genome is organised in the typical manner of badnaviruses. The intergenic region of the genomic DNA contains a large stem-loop, which may contribute to the ribosome shift into the following open reading frames (ORFs). The coding region of BSAcYNV consists of three overlapping ORFs: ORF1 (with a non-AUG start codon) and ORF2 encode two small proteins individually involved in viral movement, and ORF3 encodes a polyprotein. Besides the complete genome, a defective genome of 6,525 bp, lacking the whole RNA leader region and a majority of ORF1, was also isolated and sequenced from this BSV DNA reservoir in infected banana plants. Sequence analyses showed that BSAcYNV has the closest similarity in terms of genome organization and coding assignments with a BSV isolate from Vietnam (BSAcVNV). The corresponding coding regions shared identities of 88% and ~95% at the nucleotide and amino acid levels, respectively. Phylogenetic analysis also indicated that BSAcYNV shares the closest geographical evolutionary relationship with BSAcVNV among sequenced banana streak badnaviruses.

  19. Calibration of high resolution digital camera based on different photogrammetric methods

    International Nuclear Information System (INIS)

    Hamid, N F A; Ahmad, A

    2014-01-01

    This paper presents a method of calibrating a high-resolution digital camera based on different configurations, comprising stereo and convergent setups. Both configurations are used for calibration in the laboratory and in the field. Laboratory calibration is based on a 3D test field where a calibration plate of dimension 0.4 m × 0.4 m with a grid of targets at different heights is used. Field calibration uses the same 3D test field concept, comprising 81 target points located on flat ground over a 9 m × 9 m area. In this study, a non-metric high-resolution digital camera (Canon PowerShot SX230 HS) was calibrated in the laboratory and in the field using different configurations for data acquisition. The aim of the calibration is to investigate whether the internal camera parameters, such as the focal length, principal point and other parameters, remain the same or change between configurations. In the laboratory, a scale bar is placed in the test field for scaling the images, and approximate coordinates were used for the calibration process. A similar method is utilized in the field calibration. For both test fields, the digital images were acquired within a short period using the stereo and convergent configurations. For field calibration, aerial digital images were acquired using an unmanned aerial vehicle (UAV) system. All the images were processed using photogrammetric calibration software. Different calibration results were obtained for the laboratory and field calibrations. The accuracy of the results is evaluated based on the standard deviation. In general, for photogrammetric and other applications the digital camera must be calibrated to obtain accurate measurements or results. The best method of calibration depends on the type of application. Finally, for most applications the digital camera is calibrated on site; hence, field calibration is the best method of calibration and could be employed for obtaining accurate

  20. PC based simulation of gamma camera for training of operating and maintenance staff

    International Nuclear Information System (INIS)

    Singh, B.; Kataria, S.K.; Samuel, A.M.

    2000-01-01

    The gamma camera, a sophisticated imaging system, is used for functional assessment of biological subsystems/organs in nuclear medicine. The radioactive tracer attached to the native substance is injected into the patient. The distribution of radioactivity in the patient is imaged by the gamma camera. This report describes a PC based package for simulation of gamma cameras and the effect of malfunctioning of its subsystems on images of different phantoms

  1. Joint Calibration of 3d Laser Scanner and Digital Camera Based on Dlt Algorithm

    Science.gov (United States)

    Gao, X.; Li, M.; Xing, L.; Liu, Y.

    2018-04-01

    We designed a calibration target that can be scanned by a 3D laser scanner while being photographed by a digital camera, yielding a point cloud and photos of the same target. A method to jointly calibrate the 3D laser scanner and the digital camera based on the Direct Linear Transformation (DLT) algorithm is proposed. This method adds a distortion model of the digital camera to the traditional DLT algorithm; after repeated iteration, it can solve the interior and exterior orientation elements of the camera as well as the joint calibration of the 3D laser scanner and digital camera. Experiments prove that this method is reliable.
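
    As a rough illustration of the DLT core (without the added distortion model), the 3 × 4 projection matrix can be solved linearly from 3D-2D correspondences via SVD. The sketch below assumes at least six well-distributed control points and is not the paper's full iterative procedure.

    ```python
    import numpy as np

    def dlt_projection_matrix(world_pts, image_pts):
        """Estimate a 3x4 projection matrix P with the basic DLT (no distortion).

        world_pts: (N, 3) laser-scanner coordinates of control points
        image_pts: (N, 2) corresponding pixel coordinates, N >= 6
        """
        rows = []
        for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
            rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
            rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
        A = np.asarray(rows, dtype=float)
        # The solution is the right singular vector of the smallest singular value.
        _, _, vt = np.linalg.svd(A)
        return vt[-1].reshape(3, 4)
    ```

    In the paper's method, such a linear solution would typically serve only as the starting point for the iterative refinement that includes the lens-distortion terms.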

  2. A maize resistance gene functions against bacterial streak disease in rice.

    Science.gov (United States)

    Zhao, Bingyu; Lin, Xinghua; Poland, Jesse; Trick, Harold; Leach, Jan; Hulbert, Scot

    2005-10-25

    Although cereal crops all belong to the grass family (Poacea), most of their diseases are specific to a particular species. Thus, a given cereal species is typically resistant to diseases of other grasses, and this nonhost resistance is generally stable. To determine the feasibility of transferring nonhost resistance genes (R genes) between distantly related grasses to control specific diseases, we identified a maize R gene that recognizes a rice pathogen, Xanthomonas oryzae pv. oryzicola, which causes bacterial streak disease. Bacterial streak is an important disease of rice in Asia, and no simply inherited sources of resistance have been identified in rice. Although X. o. pv. oryzicola does not cause disease on maize, we identified a maize gene, Rxo1, that conditions a resistance reaction to a diverse collection of pathogen strains. Surprisingly, Rxo1 also controls resistance to the unrelated pathogen Burkholderia andropogonis, which causes bacterial stripe of sorghum and maize. The same gene thus controls resistance reactions to both pathogens and nonpathogens of maize. Rxo1 has a nucleotide-binding site-leucine-rich repeat structure, similar to many previously identified R genes. Most importantly, Rxo1 functions after transfer as a transgene to rice, demonstrating the feasibility of nonhost R gene transfer between cereals and providing a valuable tool for controlling bacterial streak disease.

  3. traits and resistance to maize streak virus disease in kenya

    African Journals Online (AJOL)

    African Crop Science Journal, Vol. 14. No. 4, pp. ... Kenya Agricultural Research Institute, Muguga-South, P.O. Box 30148, Nairobi, Kenya .... streak disease has been identified in various maize recycling and development of pure-lines at.

  4. Introgression of chromosome segments from multiple alien species in wheat breeding lines with wheat streak mosaic virus resistance

    Science.gov (United States)

    Pyramiding of alien-derived Wheat streak mosaic virus (WSMV) resistance and resistance-enhancing genes in wheat is a cost-effective and environmentally safe strategy for disease control. PCR-based markers and cytogenetic analysis with genomic in situ hybridisation were applied to identify alien chrom...

  5. High speed photography, videography, and photonics III; Proceedings of the Meeting, San Diego, CA, August 22, 23, 1985

    Science.gov (United States)

    Ponseggi, B. G. (Editor); Johnson, H. C. (Editor)

    1985-01-01

    Papers are presented on the picosecond electronic framing camera, photogrammetric techniques using high-speed cineradiography, picosecond semiconductor lasers for characterizing high-speed image shutters, the measurement of dynamic strain by high-speed moire photography, the fast framing camera with independent frame adjustments, design considerations for a data recording system, and nanosecond optical shutters. Consideration is given to boundary-layer transition detectors, holographic imaging, laser holographic interferometry in wind tunnels, heterodyne holographic interferometry, a multispectral video imaging and analysis system, a gated intensified camera, a charge-injection-device profile camera, a gated silicon-intensified-target streak tube and nanosecond-gated photoemissive shutter tubes. Topics discussed include high time-space resolved photography of lasers, time-resolved X-ray spectrographic instrumentation for laser studies, a time-resolving X-ray spectrometer, a femtosecond streak camera, streak tubes and cameras, and a short pulse X-ray diagnostic development facility.

  6. A G-APD based Camera for Imaging Atmospheric Cherenkov Telescopes

    International Nuclear Information System (INIS)

    Anderhub, H.; Backes, M.; Biland, A.; Boller, A.; Braun, I.; Bretz, T.; Commichau, S.; Commichau, V.; Dorner, D.; Gendotti, A.; Grimm, O.; Gunten, H. von; Hildebrand, D.; Horisberger, U.; Koehne, J.-H.; Kraehenbuehl, T.; Kranich, D.; Lorenz, E.; Lustermann, W.; Mannheim, K.

    2011-01-01

    Imaging Atmospheric Cherenkov Telescopes (IACT) for Gamma-ray astronomy are presently using photomultiplier tubes as photo sensors. Geiger-mode avalanche photodiodes (G-APD) promise an improvement in sensitivity and, important for this application, ease of construction, operation and ruggedness. G-APDs have proven many of their features in the laboratory, but a qualified assessment of their performance in an IACT camera is best undertaken with a prototype. This paper describes the design and construction of a full-scale camera based on G-APDs realized within the FACT project (First G-APD Cherenkov Telescope).

  7. ANTIOXIDANT EFFECTS OF L-SERINE AGAINST FATTY STREAK FORMATION IN HYPERCHOLESTEROLEMIC ANIMALS

    Directory of Open Access Journals (Sweden)

    Ahmad Movahedian

    2010-12-01

    Full Text Available Abstract INTRODUCTION: Peroxidation of blood lipoproteins is regarded as a key event in the development of atherosclerosis. Evidence suggests that oxidative modification of amino acids in low-density lipoprotein (LDL) particles leads to its conversion into an atherogenic form, which is taken up by macrophages. Therefore, reducing the oxidative modification of lipoproteins by increasing plasma antioxidant capacity may prevent cardiovascular disease. METHODS: In this study, the antioxidant and anti-fatty streak effects of L-serine were investigated in hypercholesterolemic rabbits. Rabbits were randomly divided into three groups which were fed a high-cholesterol diet (hypercholesterolemic control group), a high-cholesterol + L-serine diet (treatment group), and a normal diet (control) for twelve weeks, and then blood samples were obtained to measure plasma cholesterol, triglyceride (TG), high-density lipoprotein (HDL), low-density lipoprotein (LDL), antioxidant capacity (AC), malondialdehyde (MDA), and conjugated dienes (CDS). Right and left coronary arteries were also obtained for histological evaluation. RESULTS: No significant difference was observed in plasma cholesterol, TG, HDL, LDL and CDS levels between the treatment and hypercholesterolemic control groups (P>0.05). The levels of plasma MDA and AC were 0.29 µM and 56%, respectively, in the treatment group, which showed a significant change in comparison with the hypercholesterolemic control group (P<0.05). The mean size of the produced fatty streaks also showed a significant reduction in the treatment group compared to the hypercholesterolemic group (P<0.05). CONCLUSIONS: The results showed that L-serine has antioxidant and anti-fatty streak effects without any influence on plasma lipid levels in hypercholesterolemic rabbits. Keywords: Atherosclerosis, cholesterol, L-serine, antioxidant, lipids, fatty streak.

  8. Cassava brown streak disease effects on leaf metabolites and ...

    African Journals Online (AJOL)

    Cassava brown streak disease effects on leaf metabolites and pigment accumulation. ... Total reducing sugar and starch content also dropped significantly (-30 and -60%, respectively), much as NASE 14 maintained a relatively higher amount of carbohydrates. Leaf protein levels were significantly reduced at a rate of 0.07 ...

  9. Camera Coverage Estimation Based on Multistage Grid Subdivision

    Directory of Open Access Journals (Sweden)

    Meizhen Wang

    2017-04-01

    Full Text Available Visual coverage is one of the most important quality indexes for depicting the usability of an individual camera or camera network. It is the basis for camera network deployment, placement, coverage-enhancement, planning, etc. Precision and efficiency are critical influences on applications, especially those involving several cameras. This paper proposes a new method to efficiently estimate superior camera coverage. First, the geographic area that is covered by the camera and its minimum bounding rectangle (MBR) without considering obstacles is computed using the camera parameters. Second, the MBR is divided into grids using the initial grid size. The status of the four corners of each grid is estimated by a line of sight (LOS) algorithm. If the camera, considering obstacles, covers a corner, the status is represented by 1, otherwise by 0. Consequently, the status of a grid can be represented by a code that is a combination of 0s and 1s. If the code is not homogeneous (not four 0s or four 1s), the grid will be divided into four sub-grids until the sub-grids are divided to a specified maximum level or their codes are homogeneous. Finally, after performing the process above, total camera coverage is estimated according to the size and status of all grids. Experimental results illustrate that the proposed method's accuracy matches that of the method dividing the coverage area into the smallest grids at the maximum level, while its efficiency is closer to that of the method dividing the coverage area into the initial grids; it thus balances efficiency and accuracy. The initial grid size and maximum level are two critical influences on the proposed method, which can be determined by weighing efficiency against accuracy.
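
    A compact way to picture this multistage subdivision is a recursive quadtree over the MBR in which a cell is only split while its four corner visibilities disagree. The sketch below is illustrative only; corner_covered is a stand-in for the paper's line-of-sight test and the example uses an arbitrary toy visibility rule.

    ```python
    def covered_area(x, y, w, h, corner_covered, max_level, level=0):
        """Recursively estimate the covered area inside one grid cell.

        corner_covered(x, y) -> bool stands in for the LOS visibility test.
        """
        codes = [corner_covered(cx, cy)
                 for cx in (x, x + w) for cy in (y, y + h)]
        if all(codes):                      # homogeneous '1111': fully covered
            return w * h
        if not any(codes):                  # homogeneous '0000': fully uncovered
            return 0.0
        if level >= max_level:              # mixed cell at the finest level
            return w * h * sum(codes) / 4.0
        hw, hh = w / 2.0, h / 2.0           # otherwise split into four sub-grids
        return sum(covered_area(nx, ny, hw, hh, corner_covered, max_level, level + 1)
                   for nx in (x, x + hw) for ny in (y, y + hh))

    # Toy example: everything left of x = 3 is visible, so ~30% of a 10 x 10 MBR.
    print(covered_area(0, 0, 10, 10, lambda px, py: px < 3, max_level=6))
    ```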

  10. Reconstruction in PET cameras with irregular sampling and depth of interaction capability

    International Nuclear Information System (INIS)

    Virador, P.R.G.; Moses, W.W.; Huesman, R.H.

    1998-01-01

    The authors present 2D reconstruction algorithms for a rectangular PET camera capable of measuring depth of interaction (DOI). The camera geometry leads to irregular radial and angular sampling of the tomographic data. DOI information increases sampling density, allowing the use of evenly spaced quarter-crystal width radial bins with minimal interpolation of irregularly spaced data. In the regions where DOI does not increase sampling density (chords normal to crystal faces), fine radial sinogram binning leads to zero efficiency bins if uniform angular binning is used. These zero efficiency sinogram bins lead to streak artifacts if not corrected. To minimize these unnormalizable sinogram bins the authors use two angular binning schemes: Fixed Width and Natural Width. Fixed Width uses a fixed angular width except in the problem regions where appropriately chosen widths are applied. Natural Width uses angle widths which are derived from intrinsic detector sampling. Using a modified filtered-backprojection algorithm to accommodate these angular binning schemes, the authors reconstruct artifact free images with nearly isotropic and position independent spatial resolution. Results from Monte Carlo data indicate that they have nearly eliminated image degradation due to crystal penetration

  11. Pedicle streaking: A novel and simple aid in pedicle positioning in free tissue transfer

    Directory of Open Access Journals (Sweden)

    Aditya Aggarwal

    2015-01-01

    Full Text Available Introduction: Pedicle positioning in free tissue transfer is critical to its success. Long, thin pedicles are especially prone to twisting, where even a slight twist of the perforator can result in flap loss. Pedicles passing through long tunnels are similarly at risk. Streaking the pedicle with methylene blue is a simple and safe method which increases the safety of free tissue transfer. Materials and Methods: Once the flap is islanded on the pedicle and the vascularity of the flap is confirmed, the pedicle is streaked with methylene blue dye at a distance of 6-7 mm. The streaking starts from the origin of the vessels and continues distally onto the under surface of the flap to mark the complete course of the pedicle in alignment. The presence of streaking in some parts and not in the rest indicates a twist in the pedicle. Observation and Results: Four hundred and sixty-five free flaps have been done at our centre in the last 5 years. The overall success rate of free flaps is 95.3% (22 free flap failures). There has not been a single case of pedicle twist leading to flap congestion and failure. Conclusion: This simple and novel method is very reliable for pedicle positioning, avoiding any twist, which is necessary for successful free tissue transfer.

  12. Human tracking over camera networks: a review

    Science.gov (United States)

    Hou, Li; Wan, Wanggen; Hwang, Jenq-Neng; Muhammad, Rizwan; Yang, Mingyang; Han, Kang

    2017-12-01

    In recent years, automated human tracking over camera networks has become essential for video surveillance. The task of tracking humans over camera networks is not only inherently challenging due to changing human appearance, but also has enormous potential for a wide range of practical applications, ranging from security surveillance to retail and health care. This review paper surveys the most widely used techniques and recent advances for human tracking over camera networks. Two important functional modules for human tracking over camera networks are addressed: human tracking within a camera and human tracking across non-overlapping cameras. The core techniques of human tracking within a camera are discussed based on two aspects, i.e., generative trackers and discriminative trackers. The core techniques of human tracking across non-overlapping cameras are then discussed based on the aspects of human re-identification, camera-link model-based tracking and graph model-based tracking. Our survey aims to address existing problems, challenges, and future research directions based on analyses of the current progress made toward human tracking techniques over camera networks.

  13. Accurate measurement of imaging photoplethysmographic signals based camera using weighted average

    Science.gov (United States)

    Pang, Zongguang; Kong, Lingqin; Zhao, Yuejin; Sun, Huijuan; Dong, Liquan; Hui, Mei; Liu, Ming; Liu, Xiaohua; Liu, Lingling; Li, Xiaohui; Li, Rongji

    2018-01-01

    Imaging Photoplethysmography (IPPG) is an emerging technique for the extraction of vital signs of human beings using video recordings. IPPG technology, with advantages such as non-contact measurement, low cost and easy operation, has become one research hot spot in the field of biomedicine. However, noise from non-microarterial areas cannot be removed because of the uneven distribution of microarteries and the different signal strengths of individual regions, which results in a low signal-to-noise ratio of IPPG signals and low accuracy of the heart rate. In this paper, we propose a method of improving the signal-to-noise ratio of camera-based IPPG signals of each sub-region of the face using a weighted average. Firstly, we obtain the regions of interest (ROI) of a subject's face from the camera. Secondly, each region of interest is tracked and feature-matched in each frame of the video. Each tracked region of the face is divided into 60 × 60 pixel blocks. Thirdly, the weight of the PPG signal of each sub-region is calculated based on its signal-to-noise ratio. Finally, we combine the IPPG signals from all the tracked ROI using a weighted average. Compared with existing approaches, the results show that the proposed method yields a modest but significant improvement in the signal-to-noise ratio of the camera-based PPG estimate and in the accuracy of heart rate measurement.
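
    The combination step can be sketched as an SNR-weighted average of the per-block traces. The snippet below is a simplified illustration; the heart-rate band limits and the SNR definition are assumptions, not the paper's exact procedure.

    ```python
    import numpy as np

    def snr_weighted_ippg(block_signals, fs, band=(0.7, 4.0)):
        """Combine per-block IPPG traces with weights proportional to their SNR.

        block_signals: (n_blocks, n_samples) mean intensity trace per 60x60 block
        fs: frame rate in Hz; band: assumed heart-rate band in Hz
        """
        block_signals = np.asarray(block_signals, dtype=float)
        # Remove the DC component per block before estimating spectral power.
        block_signals = block_signals - block_signals.mean(axis=1, keepdims=True)
        freqs = np.fft.rfftfreq(block_signals.shape[1], d=1.0 / fs)
        spectra = np.abs(np.fft.rfft(block_signals, axis=1)) ** 2
        in_band = (freqs >= band[0]) & (freqs <= band[1])
        # Simple SNR proxy: in-band power over out-of-band power, per block.
        snr = spectra[:, in_band].sum(axis=1) / (spectra[:, ~in_band].sum(axis=1) + 1e-12)
        weights = snr / snr.sum()
        return weights @ block_signals     # SNR-weighted average trace
    ```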

  14. Development of a Data Reduction Algorithm for Optical Wide Field Patrol (OWL) II: Improving Measurement of Lengths of Detected Streaks

    Science.gov (United States)

    Park, Sun-Youp; Choi, Jin; Roh, Dong-Goo; Park, Maru; Jo, Jung Hyun; Yim, Hong-Suh; Park, Young-Sik; Bae, Young-Ho; Park, Jang-Hyun; Moon, Hong-Kyu; Choi, Young-Jun; Cho, Sungki; Choi, Eun-Jung

    2016-09-01

    As described in the previous paper (Park et al. 2013), the detector subsystem of optical wide-field patrol (OWL) provides many observational data points of a single artificial satellite or space debris in the form of small streaks, using a chopper system and a time tagger. The position and the corresponding time data are matched assuming that the length of a streak on the CCD frame is proportional to the time duration of the exposure during which the chopper blades do not obscure the CCD window. In the previous study, however, the length was measured using the diagonal of the rectangle of the image area containing the streak; the results were quite ambiguous and inaccurate, allowing possible matching error of positions and time data. Furthermore, because only one (position, time) data point is created from one streak, the efficiency of the observation decreases. To define the length of a streak correctly, it is important to locate the endpoints of a streak. In this paper, a method using a differential convolution mask pattern is tested. This method can be used to obtain the positions where the pixel values are changed sharply. These endpoints can be regarded as directly detected positional data, and the number of data points is doubled by this result.
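
    The endpoint search can be pictured as convolving the intensity profile along the streak direction with a differential (edge) mask and taking the strongest positive and negative responses. The sketch below is a simplified 1-D illustration, not the OWL pipeline itself.

    ```python
    import numpy as np

    def streak_endpoints(profile):
        """Locate the two endpoints of a streak from its 1-D intensity profile.

        A simple differential mask responds with a strong positive value where
        the streak begins (dark -> bright) and a strong negative value where it
        ends (bright -> dark).
        """
        profile = np.asarray(profile, dtype=float)
        response = np.convolve(profile, [1.0, 0.0, -1.0], mode='same')
        start = int(np.argmax(response))   # sharpest rise
        end = int(np.argmin(response))     # sharpest fall
        return start, end

    # Toy profile: background 10, streak of value 100 between pixels 20 and 60.
    profile = np.full(100, 10.0)
    profile[20:60] = 100.0
    print(streak_endpoints(profile))        # edges near pixels 20 and 60
    ```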

  15. Luminescence-induced noise in single photon sources based on BBO crystals

    Czech Academy of Sciences Publication Activity Database

    Machulka, R.; Lemr, Karel; Haderka, Ondřej; Lamperti, M.; Allevi, A.; Bondani, M.

    2014-01-01

    Roč. 47, č. 21 (2014), s. 215501 ISSN 0953-4075 R&D Projects: GA ČR GAP205/12/0382 Institutional support: RVO:68378271 Keywords: luminescence * BBO crystal * photon source * noise * streak camera Subject RIV: BH - Optics, Masers, Lasers Impact factor: 1.975, year: 2014

  16. Quantitative trait loci for resistance to maize streak virus disease in ...

    African Journals Online (AJOL)

    STORAGESEVER

    2008-07-18

    Jul 18, 2008 ... African Journal of Biotechnology Vol. ... development ... Biotechnology Center, Kenya Agricultural Research Institute, P.O. Box 58711-00200, Nairobi, ... Maize streak virus disease is an important disease of maize in Kenya.

  17. People counting with stereo cameras : two template-based solutions

    NARCIS (Netherlands)

    Englebienne, Gwenn; van Oosterhout, Tim; Kröse, B.J.A.

    2012-01-01

    People counting is a challenging task with many applications. We propose a method with a fixed stereo camera that is based on projecting a template onto the depth image. The method was tested on a challenging outdoor dataset with good results and runs in real time.

  18. Construction of a frameless camera-based stereotactic neuronavigator.

    Science.gov (United States)

    Cornejo, A; Algorri, M E

    2004-01-01

    We built an infrared vision system to be used as the real time 3D motion sensor in a prototype low cost, high precision, frameless neuronavigator. The objective of the prototype is to develop accessible technology for increased availability of neuronavigation systems in research labs and small clinics and hospitals. We present our choice of technology including camera and IR emitter characteristics. We describe the methodology for setting up the 3D motion sensor, from the arrangement of the cameras and the IR emitters on surgical instruments, to triangulation equations from stereo camera pairs, high bandwidth computer communication with the cameras and real time image processing algorithms. We briefly cover the issues of camera calibration and characterization. Although our performance results do not yet fully meet the high precision, real time requirements of neuronavigation systems we describe the current improvements being made to the 3D motion sensor that will make it suitable for surgical applications.
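
    The triangulation step from a calibrated stereo pair can be written in the standard linear (DLT) form. The function below is a generic sketch assuming known 3 × 4 projection matrices for the two IR cameras; it is not the prototype's actual code.

    ```python
    import numpy as np

    def triangulate(P1, P2, uv1, uv2):
        """Linear triangulation of one 3D marker position from a stereo pair.

        P1, P2: 3x4 projection matrices of the two calibrated cameras
        uv1, uv2: pixel coordinates of the same IR marker in each image
        """
        u1, v1 = uv1
        u2, v2 = uv2
        A = np.stack([
            u1 * P1[2] - P1[0],
            v1 * P1[2] - P1[1],
            u2 * P2[2] - P2[0],
            v2 * P2[2] - P2[1],
        ])
        _, _, vt = np.linalg.svd(A)
        X = vt[-1]
        return X[:3] / X[3]                 # homogeneous -> Euclidean coordinates
    ```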

  19. Development of plenoptic infrared camera using low dimensional material based photodetectors

    Science.gov (United States)

    Chen, Liangliang

    Infrared (IR) sensors have extended imaging from the submicron visible spectrum to wavelengths of tens of microns and are widely used in military and civilian applications. Conventional IR cameras based on bulk semiconductor materials suffer from low frame rates, low resolution, temperature dependence and high cost, while carbon nanotube (CNT) based low-dimensional nanotechnology has made much progress in research and industry. The unique properties of CNTs motivate the investigation of CNT-based IR photodetectors and imaging systems, addressing the sensitivity, speed and cooling difficulties of state-of-the-art IR imaging. Reliability and stability are critical to the transition from nanoscience to nanoengineering, especially for infrared sensing; this matters not only for the fundamental understanding of CNT photoresponse processes, but also for the development of a novel infrared-sensitive material with unique optical and electrical features. In the proposed research, the sandwich-structured sensor was fabricated within two polymer layers. The polyimide substrate provided the sensor with isolation from background noise, and the top parylene packaging blocked humid environmental factors. At the same time, the fabrication process was optimized by real-time electrically monitored dielectrophoresis and multiple annealing steps to improve fabrication yield and sensor performance. The nanoscale infrared photodetector was characterized with digital microscopy and a precise linear stage in order to understand it fully. In addition, a low-noise, high-gain readout system was designed together with the CNT photodetector to make the nanosensor IR camera possible. To explore more of the infrared light field, compressive sensing algorithms were employed for light-field sampling, 3-D imaging and compressive video sensing. The redundancy of the whole light field, including angular images for the light field, binocular images for the 3-D camera and temporal information of video streams, is extracted and

  20. Trained neurons-based motion detection in optical camera communications

    Science.gov (United States)

    Teli, Shivani; Cahyadi, Willy Anugrah; Chung, Yeon Ho

    2018-04-01

    A concept of trained neurons-based motion detection (TNMD) in optical camera communications (OCC) is proposed. The proposed TNMD is based on neurons in a neural network that perform repetitive analysis in order to provide efficient and reliable motion detection in OCC. This efficient motion detection can be considered another functionality of OCC in addition to the two traditional functionalities of illumination and communication. To verify the proposed TNMD, experiments were conducted in an indoor static downlink OCC, where a mobile phone front camera is employed as the receiver and an 8 × 8 red, green, and blue (RGB) light-emitting diode array as the transmitter. The motion is detected by observing the user's finger movement in the form of centroids through the OCC link via the camera. Unlike conventional trained-neuron approaches, the proposed TNMD is trained not with the motion itself but with centroid data samples, thus providing more accurate detection and a far less complex detection algorithm. The experimental results demonstrate that the TNMD can detect all considered motions accurately with acceptable bit error rate (BER) performance at a transmission distance of up to 175 cm. In addition, while the TNMD is performed, a maximum data rate of 3.759 kbps over the OCC link is obtained. The OCC system with the proposed TNMD can be considered an efficient indoor OCC system that provides illumination, communication, and motion detection in a convenient smart home environment.

  1. High voltage short pulse generation based on avalanche circuit

    International Nuclear Information System (INIS)

    Hu Yuanfeng; Yu Xiaoqi

    2006-01-01

    The series avalanche circuit is simulated with a PSPICE model, and a high voltage short pulse generation circuit using avalanche transistors in series is designed for the sweep deflection circuit of a streak camera. The output voltage reaches 1.2 kV into a 50 ohm load. The rise time of the circuit is less than 3 ns. (authors)

  2. Persistence of Smectic-A Oily Streaks into the Nematic Phase by UV Irradiation of Reactive Mesogens

    Directory of Open Access Journals (Sweden)

    Ines Gharbi

    2017-12-01

    Full Text Available Thin smectic liquid crystal films with competing boundary conditions (planar and homeotropic at opposing surfaces) form well-known striated structures known as “oily streaks”, which are a series of hemicylindrical caps that run perpendicular to the easy axis of the planar substrate. The streaks vanish on heating into the nematic phase, where the film becomes uniform and exhibits hybrid alignment. On adding sufficient reactive mesogen and polymerizing, the oily streak texture is maintained on heating through the entire nematic phase until reaching the bulk isotropic phase, above which the texture vanishes. Depending on the liquid crystal thickness, the oily streak structure may be retrieved after cooling, which demonstrates the strong impact of the polymer backbone on the liquid crystal texture. Polarizing optical, atomic force, and scanning electron microscopy data are presented.

  3. Texton-based super-resolution for achieving high spatiotemporal resolution in hybrid camera system

    Science.gov (United States)

    Kamimura, Kenji; Tsumura, Norimichi; Nakaguchi, Toshiya; Miyake, Yoichi

    2010-05-01

    Many super-resolution methods have been proposed to enhance the spatial resolution of images by using iteration and multiple input images. In a previous paper, we proposed the example-based super-resolution method to enhance an image through pixel-based texton substitution to reduce the computational cost. In this method, however, we only considered the enhancement of a texture image. In this study, we modified this texton substitution method for a hybrid camera to reduce the required bandwidth of a high-resolution video camera. We applied our algorithm to pairs of high- and low-spatiotemporal-resolution videos, which were synthesized to simulate a hybrid camera. The result showed that the fine detail of the low-resolution video can be reproduced compared with bicubic interpolation and the required bandwidth could be reduced to about 1/5 in a video camera. It was also shown that the peak signal-to-noise ratios (PSNRs) of the images improved by about 6 dB in a trained frame and by 1.0-1.5 dB in a test frame, as determined by comparison with the processed image using bicubic interpolation, and the average PSNRs were higher than those obtained by the well-known Freeman’s patch-based super-resolution method. Compared with that of the Freeman’s patch-based super-resolution method, the computational time of our method was reduced to almost 1/10.
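
    The PSNR figures quoted above follow the usual definition for 8-bit images, shown below for reference.

    ```python
    import numpy as np

    def psnr(reference, test, peak=255.0):
        """Peak signal-to-noise ratio in dB for 8-bit images."""
        mse = np.mean((np.asarray(reference, float) - np.asarray(test, float)) ** 2)
        return 10.0 * np.log10(peak ** 2 / mse)
    ```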

  4. Simulation-Based Optimization of Camera Placement in the Context of Industrial Pose Estimation

    DEFF Research Database (Denmark)

    Jørgensen, Troels Bo; Iversen, Thorbjørn Mosekjær; Lindvig, Anders Prier

    2018-01-01

    In this paper, we optimize the placement of a camera in simulation in order to achieve a high success rate for a pose estimation problem. This is achieved by simulating 2D images from a stereo camera in a virtual scene. The stereo images are then used to generate 3D point clouds based on two diff...

  5. Secure Chaotic Map Based Block Cryptosystem with Application to Camera Sensor Networks

    Directory of Open Access Journals (Sweden)

    Muhammad Khurram Khan

    2011-01-01

    Full Text Available Recently, Wang et al. presented an efficient logistic map based block encryption system. The encryption system employs feedback ciphertext to achieve plaintext dependence of sub-keys. Unfortunately, we discovered that their scheme is unable to withstand a keystream attack. To improve its security, this paper proposes a novel chaotic map based block cryptosystem. At the same time, a secure architecture for a camera sensor network is constructed. The network comprises a set of inexpensive camera sensors to capture the images, a sink node equipped with sufficient computation and storage capabilities, and a data processing server. The transmission security between the sink node and the server is achieved by utilizing the improved cipher. Both theoretical analysis and simulation results indicate that the improved algorithm can overcome the flaws and maintain all the merits of the original cryptosystem. In addition, the computational costs and efficiency of the proposed scheme are encouraging for practical implementation in a real environment as well as in a camera sensor network.
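
    To make the general idea concrete, a logistic-map keystream can be generated by iterating x_{k+1} = r·x_k·(1 − x_k) and quantizing the state. The toy sketch below only illustrates that idea; it is neither the original scheme of Wang et al. nor the improved cipher of this paper, and it is not secure on its own.

    ```python
    def logistic_keystream(x0, r, n_bytes, warmup=100):
        """Toy logistic-map keystream: x_{k+1} = r * x_k * (1 - x_k).

        Illustrative only; NOT the paper's cipher and not secure on its own.
        """
        x = x0
        for _ in range(warmup):             # discard transient iterations
            x = r * x * (1.0 - x)
        out = bytearray()
        for _ in range(n_bytes):
            x = r * x * (1.0 - x)
            out.append(int(x * 256) % 256)  # quantize the chaotic state to a byte
        return bytes(out)

    def xor_encrypt(data, key_stream):
        return bytes(d ^ k for d, k in zip(data, key_stream))

    plaintext = b"camera frame block"
    ks = logistic_keystream(x0=0.3141592, r=3.9999, n_bytes=len(plaintext))
    ciphertext = xor_encrypt(plaintext, ks)
    assert xor_encrypt(ciphertext, ks) == plaintext
    ```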

  6. Efficient color correction method for smartphone camera-based health monitoring application.

    Science.gov (United States)

    Duc Dang; Chae Ho Cho; Daeik Kim; Oh Seok Kwon; Jo Woon Chong

    2017-07-01

    Smartphone health monitoring applications have recently been highlighted due to the rapid development of the hardware and software performance of smartphones. However, the color characteristics of images captured by different smartphone models differ from each other, and this difference may give non-identical health monitoring results when smartphone health monitoring applications monitor physiological information using their embedded smartphone cameras. In this paper, we investigate the differences in color properties of the captured images from different smartphone models and apply a color correction method to adjust the dissimilar color values obtained from different smartphone cameras. Experimental results show that the color-corrected images obtained using the correction method provide much smaller color intensity errors compared to the images without correction. These results can be applied to enhance the consistency of smartphone camera-based health monitoring applications by reducing color intensity errors among the images obtained from different smartphones.
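
    A common way to implement such a correction is a linear 3 × 3 matrix fitted by least squares between a phone's measured RGB values of a colour chart and the corresponding reference values. The sketch below illustrates that generic approach under those assumptions; it is not necessarily the authors' exact method.

    ```python
    import numpy as np

    def fit_color_correction(measured_rgb, reference_rgb):
        """Least-squares 3x3 correction matrix M so that measured @ M ~ reference.

        measured_rgb, reference_rgb: (N, 3) values for the same colour-chart
        patches, as captured by a given phone and as defined by the reference.
        """
        M, *_ = np.linalg.lstsq(np.asarray(measured_rgb, float),
                                np.asarray(reference_rgb, float), rcond=None)
        return M

    def apply_correction(image_rgb, M):
        """Apply the correction to an (H, W, 3) image and clip to a valid range."""
        corrected = image_rgb.reshape(-1, 3) @ M
        return np.clip(corrected, 0, 255).reshape(image_rgb.shape)
    ```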

  7. Picosecond x-ray streak camera studies

    International Nuclear Information System (INIS)

    Kasyanov, Yu.S.; Malyutin, A.A.; Richardson, M.C.; Chevokin, V.K.

    1975-01-01

    Some initial results of direct measurement of picosecond x-ray emission from laser-produced plasmas are presented. A PIM-UMI 93 image converter tube, incorporating an x-ray sensitive photocathode, linear deflection, and three stages of image amplification was used to analyse the x-ray radiation emanating from plasmas produced from solid Ti targets by single high-intensity picosecond laser pulses. From such plasmas, the x-ray emission typically persisted for times of 60psec. However, it is shown that this detection system should be capable of resolving x-ray phenomena of much shorter duration. (author)

  8. Parallelised photoacoustic signal acquisition using a Fabry-Perot sensor and a camera-based interrogation scheme

    Science.gov (United States)

    Saeb Gilani, T.; Villringer, C.; Zhang, E.; Gundlach, H.; Buchmann, J.; Schrader, S.; Laufer, J.

    2018-02-01

    Tomographic photoacoustic (PA) images acquired using a Fabry-Perot (FP) based scanner offer high resolution and image fidelity but can result in long acquisition times due to the need for raster scanning. To reduce the acquisition times, a parallelised camera-based PA signal detection scheme is developed. The scheme is based on using an sCMOS camera and FPI sensors with high homogeneity of optical thickness. PA signals were acquired using the camera-based setup and the signal to noise ratio (SNR) was measured. A comparison is made between the SNR of PA signals detected using (1) a photodiode in a conventional raster-scanning detection scheme and (2) an sCMOS camera in the parallelised detection scheme. The results show that the parallelised interrogation scheme has the potential to provide high speed PA imaging.

  9. International Congress on High Speed Photography and Photonics, 17th, Pretoria, Republic of South Africa, Sept. 1-5, 1986, Proceedings. Volumes 1 & 2

    Science.gov (United States)

    McDowell, M. W.; Hollingworth, D.

    1986-01-01

    The present conference discusses topics in mining applications of high speed photography, ballistic, shock wave and detonation studies employing high speed photography, laser and X-ray diagnostics, biomechanical photography, millisec-microsec-nanosec-picosec-femtosec photographic methods, holographic, schlieren, and interferometric techniques, and videography. Attention is given to such issues as the pulse-shaping of ultrashort optical pulses, the performance of soft X-ray streak cameras, multiple-frame image tube operation, moire-enlargement motion-raster photography, two-dimensional imaging with tomographic techniques, photochron TV streak cameras, and streak techniques in detonics.

  10. Inactivation of the Huntington's disease gene (Hdh) impairs anterior streak formation and early patterning of the mouse embryo

    Directory of Open Access Journals (Sweden)

    Conlon Ronald A

    2005-08-01

    Full Text Available Abstract Background Huntingtin, the HD gene encoded protein mutated by polyglutamine expansion in Huntington's disease, is required in extraembryonic tissues for proper gastrulation, implicating its activities in nutrition or patterning of the developing embryo. To test these possibilities, we have used whole mount in situ hybridization to examine embryonic patterning and morphogenesis in homozygous Hdh(ex4/5) huntingtin deficient embryos. Results In the absence of huntingtin, expression of nutritive genes appears normal but E7.0–7.5 embryos exhibit a unique combination of patterning defects. Notable are a shortened primitive streak, absence of a proper node and diminished production of anterior streak derivatives. Reduced Wnt3a, Tbx6 and Dll1 expression signify decreased paraxial mesoderm and reduced Otx2 expression and lack of headfolds denote a failure of head development. In addition, genes initially broadly expressed are not properly restricted to the posterior, as evidenced by the ectopic expression of Nodal, Fgf8 and Gsc in the epiblast and T (Brachyury) and Evx1 in proximal mesoderm derivatives. Despite impaired posterior restriction and anterior streak deficits, overall anterior/posterior polarity is established. A single primitive streak forms and marker expression shows that the anterior epiblast and anterior visceral endoderm (AVE) are specified. Conclusion Huntingtin is essential in the early patterning of the embryo for formation of the anterior region of the primitive streak, and for down-regulation of a subset of dynamic growth and transcription factor genes. These findings provide fundamental starting points for identifying the novel cellular and molecular activities of huntingtin in the extraembryonic tissues that govern normal anterior streak development. This knowledge may prove to be important for understanding the mechanism by which the dominant polyglutamine expansion in huntingtin determines the loss of neurons in

  11. Inactivation of the Huntington's disease gene (Hdh) impairs anterior streak formation and early patterning of the mouse embryo.

    Science.gov (United States)

    Woda, Juliana M; Calzonetti, Teresa; Hilditch-Maguire, Paige; Duyao, Mabel P; Conlon, Ronald A; MacDonald, Marcy E

    2005-08-18

    Huntingtin, the HD gene encoded protein mutated by polyglutamine expansion in Huntington's disease, is required in extraembryonic tissues for proper gastrulation, implicating its activities in nutrition or patterning of the developing embryo. To test these possibilities, we have used whole mount in situ hybridization to examine embryonic patterning and morphogenesis in homozygous Hdh(ex4/5) huntingtin deficient embryos. In the absence of huntingtin, expression of nutritive genes appears normal but E7.0-7.5 embryos exhibit a unique combination of patterning defects. Notable are a shortened primitive streak, absence of a proper node and diminished production of anterior streak derivatives. Reduced Wnt3a, Tbx6 and Dll1 expression signify decreased paraxial mesoderm and reduced Otx2 expression and lack of headfolds denote a failure of head development. In addition, genes initially broadly expressed are not properly restricted to the posterior, as evidenced by the ectopic expression of Nodal, Fgf8 and Gsc in the epiblast and T (Brachyury) and Evx1 in proximal mesoderm derivatives. Despite impaired posterior restriction and anterior streak deficits, overall anterior/posterior polarity is established. A single primitive streak forms and marker expression shows that the anterior epiblast and anterior visceral endoderm (AVE) are specified. Huntingtin is essential in the early patterning of the embryo for formation of the anterior region of the primitive streak, and for down-regulation of a subset of dynamic growth and transcription factor genes. These findings provide fundamental starting points for identifying the novel cellular and molecular activities of huntingtin in the extraembryonic tissues that govern normal anterior streak development. This knowledge may prove to be important for understanding the mechanism by which the dominant polyglutamine expansion in huntingtin determines the loss of neurons in Huntington's disease.

  12. NEMA NU-1 2007 based and independent quality control software for gamma cameras and SPECT

    International Nuclear Information System (INIS)

    Vickery, A; Joergensen, T; De Nijs, R

    2011-01-01

    A thorough quality assurance of gamma and SPECT cameras requires careful handling of the measured quality control (QC) data. Most gamma camera manufacturers provide users with camera-specific QC software, which is indeed a useful tool for following the day-to-day performance of a single camera. However, when it comes to objective performance comparison of different gamma cameras and a deeper understanding of the calculated numbers, camera-specific QC software without access to the source code is best avoided: calculations and definitions might differ, and manufacturer-independent, standardized results are preferred. Based upon the NEMA Standards Publication NU 1-2007, we have developed a suite of easy-to-use data handling software for processing acquired QC data, providing the user with instructive images and text files with the results.
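    As an example of the kind of NEMA-style figure of merit such software computes, the following sketch estimates integral uniformity from a flood image. The 9-point smoothing kernel and the (max − min)/(max + min) definition follow the usual NEMA convention, while the field-of-view masking is simplified and the function names are illustrative.

```python
import numpy as np
from scipy.ndimage import convolve

# NEMA-style 9-point smoothing kernel, normalised to unit sum.
NEMA_KERNEL = np.array([[1, 2, 1],
                        [2, 4, 2],
                        [1, 2, 1]], dtype=float)
NEMA_KERNEL /= NEMA_KERNEL.sum()

def integral_uniformity(flood_image, fov_mask):
    """flood_image: 2-D counts array; fov_mask: boolean array marking the FOV.

    Returns integral uniformity in percent over the masked field of view.
    """
    smoothed = convolve(flood_image.astype(float), NEMA_KERNEL, mode="nearest")
    pixels = smoothed[fov_mask]
    return 100.0 * (pixels.max() - pixels.min()) / (pixels.max() + pixels.min())
```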

  13. A Cherenkov camera with integrated electronics based on the 'Smart Pixel' concept

    International Nuclear Information System (INIS)

    Bulian, Norbert; Hirsch, Thomas; Hofmann, Werner; Kihm, Thomas; Kohnle, Antje; Panter, Michael; Stein, Michael

    2000-01-01

    As an option for the cameras of the HESS telescopes, the concept of a modular camera based on 'Smart Pixels' was developed. A Smart Pixel contains the photomultiplier, the high voltage supply for the photomultiplier, a dual-gain sample-and-hold circuit with a 14 bit dynamic range, a time-to-voltage converter, a trigger discriminator, trigger logic to detect a coincidence of X=1...7 neighboring pixels, and an analog ratemeter. The Smart Pixels plug into a common backplane which provides power, communicates trigger signals between neighboring pixels, and holds a digital control bus as well as an analog bus for multiplexed readout of pixel signals. The performance of the Smart Pixels has been studied using a 19-pixel test camera.

  14. Visceral endoderm and the primitive streak interact to build the fetal-placental interface of the mouse gastrula.

    Science.gov (United States)

    Rodriguez, Adriana M; Downs, Karen M

    2017-12-01

    Hypoblast/visceral endoderm assists in amniote nutrition, axial positioning and formation of the gut. Here, we provide evidence, currently limited to humans and non-human primates, that hypoblast is a purveyor of extraembryonic mesoderm in the mouse gastrula. Fate mapping a unique segment of axial extraembryonic visceral endoderm associated with the allantoic component of the primitive streak, and referred to as the "AX", revealed that visceral endoderm supplies the placentae with extraembryonic mesoderm. Exfoliation of the AX was dependent upon contact with the primitive streak, which modulated Hedgehog signaling. Resolution of the AX's epithelial-to-mesenchymal transition (EMT) by Hedgehog shaped the allantois into its characteristic projectile and individualized placental arterial vessels. A unique border cell separated the delaminating AX from the yolk sac blood islands which, situated beyond the limit of the streak, were not formed by an EMT. Over time, the AX became the hindgut lip, which contributed extensively to the posterior interface, including both embryonic and extraembryonic tissues. The AX, in turn, imparted antero-posterior (A-P) polarity on the primitive streak and promoted its elongation and differentiation into definitive endoderm. Results of heterotopic grafting supported mutually interactive functions of the AX and primitive streak, showing that together, they self-organized into a complete version of the fetal-placental interface, forming an elongated structure that exhibited A-P polarity and was composed of the allantois, an AX-derived rod-like axial extension reminiscent of the embryonic notochord, the placental arterial vasculature and visceral endoderm/hindgut. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  15. Relative Panoramic Camera Position Estimation for Image-Based Virtual Reality Networks in Indoor Environments

    Science.gov (United States)

    Nakagawa, M.; Akano, K.; Kobayashi, T.; Sekiguchi, Y.

    2017-09-01

    Image-based virtual reality (VR) is a virtual space generated with panoramic images projected onto a primitive model. In image-based VR, realistic VR scenes can be generated with lower rendering cost, and network data can be described as relationships among VR scenes. The camera network data are generated manually or by an automated procedure using camera position and rotation data. When panoramic images are acquired in indoor environments, network data should be generated without Global Navigation Satellite Systems (GNSS) positioning data. Thus, we focused on image-based VR generation using a panoramic camera in indoor environments. We propose a methodology to automate network data generation using panoramic images for an image-based VR space. We verified and evaluated our methodology through five experiments in indoor environments, including a corridor, elevator hall, room, and stairs. We confirmed that our methodology can automatically reconstruct network data using panoramic images for image-based VR in indoor environments without GNSS position data.

  16. RELATIVE PANORAMIC CAMERA POSITION ESTIMATION FOR IMAGE-BASED VIRTUAL REALITY NETWORKS IN INDOOR ENVIRONMENTS

    Directory of Open Access Journals (Sweden)

    M. Nakagawa

    2017-09-01

    Full Text Available Image-based virtual reality (VR) is a virtual space generated with panoramic images projected onto a primitive model. In image-based VR, realistic VR scenes can be generated with lower rendering cost, and network data can be described as relationships among VR scenes. The camera network data are generated manually or by an automated procedure using camera position and rotation data. When panoramic images are acquired in indoor environments, network data should be generated without Global Navigation Satellite Systems (GNSS) positioning data. Thus, we focused on image-based VR generation using a panoramic camera in indoor environments. We propose a methodology to automate network data generation using panoramic images for an image-based VR space. We verified and evaluated our methodology through five experiments in indoor environments, including a corridor, elevator hall, room, and stairs. We confirmed that our methodology can automatically reconstruct network data using panoramic images for image-based VR in indoor environments without GNSS position data.

  17. Development of a compact scintillator-based high-resolution Compton camera for molecular imaging

    Energy Technology Data Exchange (ETDEWEB)

    Kishimoto, A., E-mail: daphne3h-aya@ruri.waseda.jp [Research Institute for Science and Engineering, Waseda University, 3-4-1 Ohkubo, Shinjuku, Tokyo (Japan); Kataoka, J.; Koide, A.; Sueoka, K.; Iwamoto, Y.; Taya, T. [Research Institute for Science and Engineering, Waseda University, 3-4-1 Ohkubo, Shinjuku, Tokyo (Japan); Ohsuka, S. [Central Research Laboratory, Hamamatsu Photonics K.K., 5000 Hirakuchi, Hamakita-ku, Hamamatsu, Shizuoka (Japan)

    2017-02-11

    The Compton camera, which images the gamma-ray distribution by utilizing the kinematics of Compton scattering, is a promising detector capable of imaging across a wide range of energies. In this study, we aim to construct a small-animal molecular imaging system covering a wide energy range by using the Compton camera. We developed a compact medical Compton camera based on a Ce-doped Gd3Al2Ga3O12 (Ce:GAGG) scintillator and multi-pixel photon counter (MPPC). Basic performance tests confirmed that, at 662 keV, the typical energy resolution was 7.4% (FWHM) and the angular resolution was 4.5° (FWHM). We then used the medical Compton camera to conduct imaging experiments based on a 3-D image reconstruction algorithm using the multi-angle data acquisition method. The results confirmed that for a 137Cs point source at a distance of 4 cm, the image had a spatial resolution of 3.1 mm (FWHM). Furthermore, we succeeded in producing a 3-D multi-color image of different simultaneous energy sources (22Na [511 keV], 137Cs [662 keV], and 54Mn [834 keV]).

  18. Neutron cameras for ITER

    International Nuclear Information System (INIS)

    Johnson, L.C.; Barnes, C.W.; Batistoni, P.

    1998-01-01

    Neutron cameras with horizontal and vertical views have been designed for ITER, based on systems used on JET and TFTR. The cameras consist of fan-shaped arrays of collimated flight tubes, with suitably chosen detectors situated outside the biological shield. The sight lines view the ITER plasma through slots in the shield blanket and penetrate the vacuum vessel, cryostat, and biological shield through stainless steel windows. This paper analyzes the expected performance of several neutron camera arrangements for ITER. In addition to the reference designs, the authors examine proposed compact cameras, in which neutron fluxes are inferred from 16N decay gammas in dedicated flowing water loops, and conventional cameras with fewer sight lines and more limited fields of view than in the reference designs. It is shown that the spatial sampling provided by the reference designs is sufficient to satisfy target measurement requirements and that some reduction in field of view may be permissible. The accuracy of measurements with 16N-based compact cameras is not yet established, and they fail to satisfy requirements for parameter range and time resolution by large margins.

  19. Parallel Computational Intelligence-Based Multi-Camera Surveillance System

    OpenAIRE

    Orts-Escolano, Sergio; Garcia-Rodriguez, Jose; Morell, Vicente; Cazorla, Miguel; Azorin-Lopez, Jorge; García-Chamizo, Juan Manuel

    2014-01-01

    In this work, we present a multi-camera surveillance system based on the use of self-organizing neural networks to represent events on video. The system processes several tasks in parallel using GPUs (graphic processor units). It addresses multiple vision tasks at various levels, such as segmentation, representation or characterization, analysis and monitoring of the movement. These features allow the construction of a robust representation of the environment and interpret the behavior of mob...

  20. High-resolution Compton cameras based on Si/CdTe double-sided strip detectors

    International Nuclear Information System (INIS)

    Odaka, Hirokazu; Ichinohe, Yuto; Takeda, Shin'ichiro; Fukuyama, Taro; Hagino, Koichi; Saito, Shinya; Sato, Tamotsu; Sato, Goro; Watanabe, Shin; Kokubun, Motohide; Takahashi, Tadayuki; Yamaguchi, Mitsutaka

    2012-01-01

    We have developed a new Compton camera based on silicon (Si) and cadmium telluride (CdTe) semiconductor double-sided strip detectors (DSDs). The camera consists of a 500-μm-thick Si-DSD and four layers of 750-μm-thick CdTe-DSDs, all of which have a common electrode configuration segmented into 128 strips on each side with pitches of 250 μm. In order to realize high angular resolution and to reduce the size of the detector system, a stack of DSDs with short stack pitches of 4 mm is utilized to make the camera. Taking advantage of the excellent energy and position resolutions of the semiconductor devices, the camera achieves high angular resolutions of 4.5° at 356 keV and 3.5° at 662 keV. To obtain such high resolutions together with an acceptable detection efficiency, we demonstrate data reduction methods including energy calibration using the Compton scattering continuum and depth sensing in the CdTe-DSD. We also discuss the imaging capability of the camera and show simultaneous multi-energy imaging.

  1. An Approach to Evaluate Stability for Cable-Based Parallel Camera Robots with Hybrid Tension-Stiffness Properties

    Directory of Open Access Journals (Sweden)

    Huiling Wei

    2015-12-01

    Full Text Available This paper focuses on studying the effect of cable tensions and stiffness on the stability of cable-based parallel camera robots. For this purpose, the tension factor and the stiffness factor are defined, and the expression of stability is deduced. A new approach is proposed to calculate the hybrid-stability index with the minimum cable tension and the minimum singular value. Firstly, the kinematic model of a cable-based parallel camera robot is established. Based on the model, the tensions are solved and a tension factor is defined. In order to obtain the tension factor, an optimization of the cable tensions is carried out. Then, an expression of the system's stiffness is deduced and a stiffness factor is defined. Furthermore, an approach to evaluate the stability of the cable-based camera robots with hybrid tension-stiffness properties is presented. Finally, a typical three-degree-of-freedom cable-based parallel camera robot with four cables is studied as a numerical example. The simulation results show that the approach is both reasonable and effective.

  2. Significance and transmission of maize streak virus disease in Africa ...

    African Journals Online (AJOL)

    STORAGESEVER

    2008-12-29

    Dec 29, 2008 ... soil nutrients, altitude and temperature on the biology of maize streak virus (MSV) / vector populations is discussed. ... status of maize host plants and its effects on population dynamics of Cicadulina mbila Naudé (Homoptera: ... time necessary for the leafhopper to reach the mesophyll of the leaf and ingest ...

  3. Motion camera based on a custom vision sensor and an FPGA architecture

    Science.gov (United States)

    Arias-Estrada, Miguel

    1998-09-01

    A digital camera for custom focal plane arrays was developed. The camera allows the test and development of analog or mixed-mode arrays for focal plane processing. The camera is used with a custom sensor for motion detection to implement a motion computation system. The custom focal plane sensor detects moving edges at the pixel level using analog VLSI techniques. The sensor communicates motion events using the event-address protocol associated with a temporal reference. In a second stage, a coprocessing architecture based on a field programmable gate array (FPGA) computes the time-of-travel between adjacent pixels. The FPGA allows rapid prototyping and flexible architecture development. Furthermore, the FPGA interfaces the sensor to a compact PC computer which is used for high level control and data communication to the local network. The camera could be used in applications such as self-guided vehicles, mobile robotics and smart surveillance systems. The programmability of the FPGA allows the exploration of further signal processing such as spatial edge detection or image segmentation tasks. The article details the motion algorithm, the sensor architecture, the use of the event-address protocol for velocity vector computation and the FPGA architecture used in the motion camera system.

  4. A drone detection with aircraft classification based on a camera array

    Science.gov (United States)

    Liu, Hao; Qu, Fangchao; Liu, Yingjian; Zhao, Wei; Chen, Yitong

    2018-03-01

    In recent years, because of the rapid popularity of drones, many people have begun to operate drones, bringing a range of security issues to sensitive areas such as airports and military sites. Realizing fine-grained classification and providing fast, accurate detection of different drone models is one of the important ways to address these problems. The main challenges of fine-grained classification are that: (1) there are various types of drones, and the models are complex and diverse; (2) recognition must be fast and accurate, yet existing methods are not efficient. In this paper, we propose a fine-grained drone detection system based on a high-resolution camera array. The system can quickly and accurately perform fine-grained drone detection based on the HD camera array.

  5. MR imaging of medullary streaks in osteosclerosis: a case report

    International Nuclear Information System (INIS)

    Lee, Hak Soo; Joo, Kyung Bin; Park, Tae Soo; Song, Ho Taek; Kim, Yong Soo; Park, Dong Woo; Park, Choong Ki

    2000-01-01

    We present a case of medullary sclerosis of the appendicular skeleton in a patient with chronic renal insufficiency for whom MR imaging findings were characteristic. T1- and T2-weighted MR images showed multiple vertical lines (medullary streaks) of low signal intensity in the metaphyses and diaphyses of the distal femur and proximal tibia

  6. Camera calibration based on the back projection process

    Science.gov (United States)

    Gu, Feifei; Zhao, Hong; Ma, Yueyang; Bu, Penghui

    2015-12-01

    Camera calibration plays a crucial role in 3D measurement tasks of machine vision. In typical calibration processes, camera parameters are iteratively optimized in the forward imaging process (FIP). However, the results can only guarantee the minimum of 2D projection errors on the image plane, but not the minimum of 3D reconstruction errors. In this paper, we propose a universal method for camera calibration, which uses the back projection process (BPP). In our method, a forward projection model is used to obtain initial intrinsic and extrinsic parameters with a popular planar checkerboard pattern. Then, the extracted image points are projected back into 3D space and compared with the ideal point coordinates. Finally, the estimation of the camera parameters is refined by a non-linear function minimization process. The proposed method can obtain a more accurate calibration result, which is more physically useful. Simulation and practical data are given to demonstrate the accuracy of the proposed method.
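    The sketch below illustrates the back-projection idea under assumed pinhole geometry: each detected corner is un-projected to a ray, intersected with the Z = 0 plane of the checkerboard, and compared with the known board coordinates, giving a 3-D residual that a non-linear optimizer can minimize instead of the usual 2-D reprojection error. Variable names are illustrative and this is not the authors' exact implementation.

```python
import numpy as np

def backprojection_residuals(K, R, t, image_points, board_points):
    """K: 3x3 intrinsics; R, t: board-to-camera rotation and translation (t shape (3,));
    image_points: (N, 2) pixel corners; board_points: (N, 3) board coordinates, Z = 0."""
    K_inv = np.linalg.inv(K)
    residuals = []
    for uv, Xb in zip(image_points, board_points):
        ray_cam = K_inv @ np.array([uv[0], uv[1], 1.0])   # viewing ray in camera frame
        ray_board = R.T @ ray_cam                          # ray direction in board frame
        origin_board = -R.T @ t                            # camera centre in board frame
        s = -origin_board[2] / ray_board[2]                # intersect ray with Z = 0 plane
        hit = origin_board + s * ray_board
        residuals.append(hit - Xb)                         # 3-D error on the board plane
    return np.concatenate(residuals)
```

    This residual vector could be fed, for instance, to scipy.optimize.least_squares to refine the intrinsic and extrinsic parameters.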

  7. Vibration extraction based on fast NCC algorithm and high-speed camera.

    Science.gov (United States)

    Lei, Xiujun; Jin, Yi; Guo, Jie; Zhu, Chang'an

    2015-09-20

    In this study, a high-speed camera system is developed to perform vibration measurements in real time and to avoid the mass loading introduced by conventional contact measurements. The proposed system consists of a notebook computer and a high-speed camera which can capture images at up to 1000 frames per second. In order to process the captured images in the computer, the normalized cross-correlation (NCC) template tracking algorithm with subpixel accuracy is introduced. Additionally, a modified local search algorithm based on the NCC is proposed to reduce the computation time and to increase efficiency significantly. The modified algorithm can accomplish one displacement extraction 10 times faster than traditional template matching, without installing any target panel onto the structures. Two experiments were carried out under laboratory and outdoor conditions to validate the accuracy and efficiency of the system in practice. The results demonstrated the high accuracy and efficiency of the camera system in extracting vibration signals.
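    A minimal sketch of NCC template tracking restricted to a local search window, which is the kind of speed-up described above, might look as follows; the window size, the omission of sub-pixel refinement, and the camera I/O details are assumptions rather than the paper's implementation.

```python
import cv2

def track_ncc(frame_gray, template, prev_xy, search_radius=32):
    """Return the best template position in frame_gray near the previous position."""
    x0, y0 = prev_xy
    th, tw = template.shape
    # Crop a search window around the previous match instead of scanning the full frame.
    x1, y1 = max(0, x0 - search_radius), max(0, y0 - search_radius)
    x2 = min(frame_gray.shape[1], x0 + tw + search_radius)
    y2 = min(frame_gray.shape[0], y0 + th + search_radius)
    window = frame_gray[y1:y2, x1:x2]
    # Normalized cross-correlation between window and template.
    score = cv2.matchTemplate(window, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(score)
    return x1 + max_loc[0], y1 + max_loc[1]
```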

  8. Parallel Computational Intelligence-Based Multi-Camera Surveillance System

    Directory of Open Access Journals (Sweden)

    Sergio Orts-Escolano

    2014-04-01

    Full Text Available In this work, we present a multi-camera surveillance system based on the use of self-organizing neural networks to represent events on video. The system processes several tasks in parallel using GPUs (graphics processor units). It addresses multiple vision tasks at various levels, such as segmentation, representation or characterization, analysis and monitoring of the movement. These features allow the construction of a robust representation of the environment and interpret the behavior of mobile agents in the scene. It is also necessary to integrate the vision module into a global system that operates in a complex environment by receiving images from multiple acquisition devices at video frequency. Offering relevant information to higher level systems, monitoring and making decisions in real time, it must accomplish a set of requirements, such as: time constraints, high availability, robustness, high processing speed and re-configurability. We have built a system able to represent and analyze the motion in video acquired by a multi-camera network and to process multi-source data in parallel on a multi-GPU architecture.

  9. Time response characteristics of X-ray detector system on Silex-Ⅰ laser facility

    International Nuclear Information System (INIS)

    Yi Rongqing; He Xiao'an; Li Hang; Du Huabing; Zhang Haiying; Cao Zhurong

    2013-01-01

    On the Silex-Ⅰ laser facility, the time response characteristics of the XRD detector were studied. A laser pulse of 32 fs duration and 800 nm wavelength was used to irradiate a plane Au target. A method for calibrating the exposure time of the X-ray framing camera and the time resolution of the X-ray streak camera was explored. The time response characteristics of the XRD detector and the time history of the X-ray emission were obtained from the experiment, and the calibration method for the exposure time of the X-ray framing camera and the time resolution of the X-ray streak camera was established. (authors)

  10. An autonomous sensor module based on a legacy CCTV camera

    Science.gov (United States)

    Kent, P. J.; Faulkner, D. A. A.; Marshall, G. F.

    2016-10-01

    A UK MoD funded programme into autonomous sensor arrays (SAPIENT) has been developing new, highly capable sensor modules together with a scalable modular architecture for control and communication. As part of this system there is a desire to also utilise existing legacy sensors. The paper reports upon the development of a SAPIENT-compliant sensor module using a legacy Closed-Circuit Television (CCTV) pan-tilt-zoom (PTZ) camera. The PTZ camera sensor provides three modes of operation. In the first mode, the camera is automatically slewed to acquire imagery of a specified scene area, e.g. to provide "eyes-on" confirmation for a human operator or for forensic purposes. In the second mode, the camera is directed to monitor an area of interest, with zoom level automatically optimized for human detection at the appropriate range. Open source algorithms (using OpenCV) are used to automatically detect pedestrians; their real world positions are estimated and communicated back to the SAPIENT central fusion system. In the third mode of operation a "follow" mode is implemented where the camera maintains the detected person within the camera field-of-view without requiring an end-user to directly control the camera with a joystick.
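    The pedestrian detection step could, for example, use OpenCV's built-in HOG plus linear SVM people detector, as sketched below. The specific detector choice and thresholds are assumptions (the abstract only states that OpenCV is used), and the mapping from image coordinates to real-world positions via the PTZ pointing model is omitted because it is system-specific.

```python
import cv2

# Built-in HOG descriptor with the default people-detection SVM weights.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def detect_people(frame_bgr, min_confidence=0.5):
    """Return bounding boxes (x, y, w, h) of detected pedestrians in one frame."""
    boxes, weights = hog.detectMultiScale(frame_bgr, winStride=(8, 8),
                                          padding=(8, 8), scale=1.05)
    return [tuple(b) for b, w in zip(boxes, weights) if float(w) > min_confidence]
```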

  11. Design of gamma camera data acquisition system based on PCI9810

    International Nuclear Information System (INIS)

    Zhao Yuanyuan; Zhao Shujun; Liu Yang

    2004-01-01

    This paper describes the design of the gamma camera's data acquisition system, which is based on the PCI9810 data acquisition card from ADLink Technology Inc. The main functions of the PCI9810 and the program of the data acquisition system are described. (authors)

  12. Fast imaging applications in the Nuclear Test Program

    International Nuclear Information System (INIS)

    Lear, R.

    1983-01-01

    Applications of fast imaging employ both streak cameras and fast framing techniques. Image intensifier tubes are gated to provide fast two-dimensional shutters of 2 to 3 ns duration with shutter ratios of greater than 10⁶ and resolution greater than 10⁴ pixels. Shutters of less than 1 ns have been achieved with experimental tubes. Characterization data demonstrate the importance of tube and pulser design. Streak cameras are used to simultaneously record temporal and intensity information from up to 200 spatial points. Streak cameras are combined with remote readout for downhole uses and are coupled to fiber optic cables for uphole uses. Optical wavelength multiplexing is being studied as a means of compressing additional image data onto optical fibers. Performance data demonstrate trade-offs between image resolution and system sensitivity.

  13. Wavefront analysis for plenoptic camera imaging

    International Nuclear Information System (INIS)

    Luan Yin-Sen; Xu Bing; Yang Ping; Tang Guo-Mao

    2017-01-01

    The plenoptic camera is a single-lens stereo camera which can retrieve the direction of light rays while detecting their intensity distribution. In this paper, to gain further insight into plenoptic camera imaging, we present a wavefront analysis of plenoptic camera imaging from the perspective of physical optics rather than the ray-tracing model of geometric optics. Specifically, the wavefront imaging model of a plenoptic camera is analyzed and simulated by scalar diffraction theory, and the depth estimation is redescribed based on physical optics. We simulate a set of raw plenoptic images of an object scene, thereby validating the analysis and derivations, and the differences between the imaging analysis methods based on geometric optics and physical optics are also shown in the simulations. (paper)

  14. Person re-identification using height-based gait in colour depth camera

    NARCIS (Netherlands)

    John, V.; Englebienne, G.; Kröse, B.

    2013-01-01

    We address the problem of person re-identification in colour-depth camera using the height temporal information of people. Our proposed gait-based feature corresponds to the frequency response of the height temporal information. We demonstrate that the discriminative periodic motion associated with

  15. The Light Field Attachment: Turning a DSLR into a Light Field Camera Using a Low Budget Camera Ring

    KAUST Repository

    Wang, Yuwang

    2016-11-16

    We propose a concept for a lens attachment that turns a standard DSLR camera and lens into a light field camera. The attachment consists of 8 low-resolution, low-quality side cameras arranged around the central high-quality SLR lens. Unlike most existing light field camera architectures, this design provides a high-quality 2D image mode, while simultaneously enabling a new high-quality light field mode with a large camera baseline but little added weight, cost, or bulk compared with the base DSLR camera. From an algorithmic point of view, the high-quality light field mode is made possible by a new light field super-resolution method that first improves the spatial resolution and image quality of the side cameras and then interpolates additional views as needed. At the heart of this process is a super-resolution method that we call iterative Patch- And Depth-based Synthesis (iPADS), which combines patch-based and depth-based synthesis in a novel fashion. Experimental results obtained for both real captured data and synthetic data confirm that our method achieves substantial improvements in super-resolution for side-view images as well as the high-quality and view-coherent rendering of dense and high-resolution light fields.

  16. Time-resolved brightness measurements by streaking

    Science.gov (United States)

    Torrance, Joshua S.; Speirs, Rory W.; McCulloch, Andrew J.; Scholten, Robert E.

    2018-03-01

    Brightness is a key figure of merit for charged particle beams, and time-resolved brightness measurements can elucidate the processes involved in beam creation and manipulation. Here we report on a simple, robust, and widely applicable method for the measurement of beam brightness with temporal resolution by streaking one-dimensional pepperpots, and demonstrate the technique to characterize electron bunches produced from a cold-atom electron source. We demonstrate brightness measurements with 145 ps temporal resolution and a minimum resolvable emittance of 40 nm rad. This technique provides an efficient method of exploring source parameters and will prove useful for examining the efficacy of techniques to counter space-charge expansion, a critical hurdle to achieving single-shot imaging of atomic scale targets.

  17. Generic Learning-Based Ensemble Framework for Small Sample Size Face Recognition in Multi-Camera Networks.

    Science.gov (United States)

    Zhang, Cuicui; Liang, Xuefeng; Matsuyama, Takashi

    2014-12-08

    Multi-camera networks have gained great interest in video-based surveillance systems for security monitoring, access control, etc. Person re-identification is an essential and challenging task in multi-camera networks, which aims to determine whether a given individual has already appeared over the camera network. Individual recognition is often based on faces and requires a large number of samples during the training phase. This is difficult to fulfill due to the limitations of the camera hardware system and the unconstrained image capturing conditions. Conventional face recognition algorithms often encounter the "small sample size" (SSS) problem arising from the small number of training samples compared to the high dimensionality of the sample space. To overcome this problem, interest in the combination of multiple base classifiers has sparked research efforts in ensemble methods. However, existing ensemble methods still leave two questions open: (1) how to define diverse base classifiers from the small data; (2) how to avoid the diversity/accuracy dilemma occurring during ensemble. To address these problems, this paper proposes a novel generic learning-based ensemble framework, which augments the small data by generating new samples based on a generic distribution and introduces a tailored 0-1 knapsack algorithm to alleviate the diversity/accuracy dilemma. More diverse base classifiers can be generated from the expanded face space, and more appropriate base classifiers are selected for ensemble. Extensive experimental results on four benchmarks demonstrate the higher ability of our system to cope with the SSS problem compared to the state-of-the-art system.

  18. Generic Learning-Based Ensemble Framework for Small Sample Size Face Recognition in Multi-Camera Networks

    Directory of Open Access Journals (Sweden)

    Cuicui Zhang

    2014-12-01

    Full Text Available Multi-camera networks have gained great interest in video-based surveillance systems for security monitoring, access control, etc. Person re-identification is an essential and challenging task in multi-camera networks, which aims to determine whether a given individual has already appeared over the camera network. Individual recognition is often based on faces and requires a large number of samples during the training phase. This is difficult to fulfill due to the limitations of the camera hardware system and the unconstrained image capturing conditions. Conventional face recognition algorithms often encounter the “small sample size” (SSS) problem arising from the small number of training samples compared to the high dimensionality of the sample space. To overcome this problem, interest in the combination of multiple base classifiers has sparked research efforts in ensemble methods. However, existing ensemble methods still leave two questions open: (1) how to define diverse base classifiers from the small data; (2) how to avoid the diversity/accuracy dilemma occurring during ensemble. To address these problems, this paper proposes a novel generic learning-based ensemble framework, which augments the small data by generating new samples based on a generic distribution and introduces a tailored 0–1 knapsack algorithm to alleviate the diversity/accuracy dilemma. More diverse base classifiers can be generated from the expanded face space, and more appropriate base classifiers are selected for ensemble. Extensive experimental results on four benchmarks demonstrate the higher ability of our system to cope with the SSS problem compared to the state-of-the-art system.
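    For illustration, a generic 0-1 knapsack selection of base classifiers by dynamic programming is sketched below. Treating each classifier's accuracy as its value and a discretised diversity cost as its weight is only one plausible reading of the "tailored knapsack" step, not the paper's exact formulation; names and the cost definition are assumptions.

```python
def knapsack_select(values, weights, capacity):
    """values, weights: per-classifier integer lists; returns indices of chosen classifiers."""
    n = len(values)
    # dp[i][c]: best total value using the first i classifiers within budget c.
    dp = [[0] * (capacity + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for c in range(capacity + 1):
            dp[i][c] = dp[i - 1][c]
            if weights[i - 1] <= c:
                dp[i][c] = max(dp[i][c], dp[i - 1][c - weights[i - 1]] + values[i - 1])
    # Backtrack to recover which classifiers were selected.
    chosen, c = [], capacity
    for i in range(n, 0, -1):
        if dp[i][c] != dp[i - 1][c]:
            chosen.append(i - 1)
            c -= weights[i - 1]
    return sorted(chosen)

# Example: accuracies scaled to ints as values, diversity costs as weights.
print(knapsack_select([91, 88, 85, 90], [3, 2, 2, 4], capacity=6))
```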

  19. Web Camera Based Eye Tracking to Assess Visual Memory on a Visual Paired Comparison Task

    Directory of Open Access Journals (Sweden)

    Nicholas T. Bott

    2017-06-01

    Full Text Available Background: Web cameras are increasingly part of the standard hardware of most smart devices. Eye movements can often provide a noninvasive “window on the brain,” and the recording of eye movements using web cameras is a burgeoning area of research. Objective: This study investigated a novel methodology for administering a visual paired comparison (VPC) decisional task using a web camera. To further assess this method, we examined the correlation between a standard eye-tracking camera automated scoring procedure [obtaining images at 60 frames per second (FPS)] and a manually scored procedure using a built-in laptop web camera (obtaining images at 3 FPS). Methods: This was an observational study of 54 clinically normal older adults. Subjects completed three in-clinic visits with simultaneous recording of eye movements on a VPC decision task by a standard eye tracker camera and a built-in laptop-based web camera. Inter-rater reliability was analyzed using Siegel and Castellan's kappa formula. Pearson correlations were used to investigate the correlation between VPC performance using a standard eye tracker camera and a built-in web camera. Results: Strong associations were observed on VPC mean novelty preference score between the 60 FPS eye tracker and 3 FPS built-in web camera at each of the three visits (r = 0.88–0.92). Inter-rater agreement of web camera scoring at each time point was high (κ = 0.81–0.88). There were strong relationships on VPC mean novelty preference score between 10, 5, and 3 FPS training sets (r = 0.88–0.94). Significantly fewer data quality issues were encountered using the built-in web camera. Conclusions: Human scoring of a VPC decisional task using a built-in laptop web camera correlated strongly with automated scoring of the same task using a standard high frame rate eye tracker camera. While this method is not suitable for eye tracking paradigms requiring the collection and analysis of fine-grained metrics, such as

  20. Goal-oriented rectification of camera-based document images.

    Science.gov (United States)

    Stamatopoulos, Nikolaos; Gatos, Basilis; Pratikakis, Ioannis; Perantonis, Stavros J

    2011-04-01

    Document digitization with either flatbed scanners or camera-based systems results in document images which often suffer from warping and perspective distortions that deteriorate the performance of current OCR approaches. In this paper, we present a goal-oriented rectification methodology to compensate for undesirable document image distortions, aiming to improve the OCR result. Our approach relies upon a coarse-to-fine strategy. First, a coarse rectification is accomplished with the aid of a computationally low-cost transformation which addresses the projection of a curved surface to a 2-D rectangular area. The projection of the curved surface onto the plane is guided only by the appearance of the textual content in the document image, while incorporating a transformation which does not depend on specific model primitives or camera setup parameters. Second, pose normalization is applied at the word level, aiming to restore all the local distortions of the document image. Experimental results on various document images with a variety of distortions demonstrate the robustness and effectiveness of the proposed rectification methodology, using a consistent evaluation methodology that takes into account OCR accuracy and a newly introduced measure based on a semi-automatic procedure.

  1. Observations of the Perseids 2013 using SPOSH cameras

    Science.gov (United States)

    Margonis, A.; Elgner, S.; Christou, A.; Oberst, J.; Flohrer, J.

    2013-09-01

    Earth is constantly bombarded by debris, most of which disintegrates in the upper atmosphere. The collision of a dust particle, having a mass of approximately 1 g or larger, with the Earth's atmosphere results in a visible streak of light in the night sky, called a meteor. Comets produce new meteoroids each time they come close to the Sun due to sublimation processes. These fresh particles move around the Sun in orbits similar to their parent comet, forming meteoroid streams. For this reason, the intersection of Earth's orbital path with different comets gives rise to a number of meteor showers throughout the year. The Perseids are one of the most prominent annual meteor showers, occurring every summer and having their origin in the Halley-type comet 109P/Swift-Tuttle. The dense core of this stream passes Earth's orbit on the 12th of August, when more than 100 meteors per hour can be seen by a single observer under ideal conditions. The Technical University of Berlin (TUB) and the German Aerospace Center (DLR), together with the Armagh Observatory, organize meteor campaigns every summer observing the activity of the Perseids meteor shower. The observations are carried out using the Smart Panoramic Optical Sensor Head (SPOSH) camera system [2] which has been developed by DLR and Jena-Optronik GmbH under an ESA/ESTEC contract. The camera was designed to image faint, short-lived phenomena on dark planetary hemispheres. The camera is equipped with a highly sensitive back-illuminated CCD chip having a pixel resolution of 1024x1024. The custom-made fish-eye lens offers a 120°x120° field-of-view (168° over the diagonal), making the monitoring of nearly the whole night sky possible (Fig. 1). This year the observations will take place between the 3rd and 10th of August to cover the meteor activity of the Perseids just before their maximum. The SPOSH cameras will be deployed at two remote sites located at high altitudes in the Greek Peloponnese peninsula. The baseline of ∼50km

  2. Growth and Characterization of Nanostructured Glass Ceramic Scintillators for Miniature High-Energy Radiation Sensors

    Science.gov (United States)

    2013-10-01

    The rise time was resolved using a Kerr gating technique with 8 ps resolution. Spectro-temporal dynamics were resolved using a streak camera and a tunable pump at the second/third harmonic (400/267 nm) and XUV. (Figure captions: CeF3-doped glass under UV light irradiation; Fig. 5, radioluminescence (RL) spectra of all the CeF3-doped glasses.)

  3. A Probabilistic Feature Map-Based Localization System Using a Monocular Camera

    Directory of Open Access Journals (Sweden)

    Hyungjin Kim

    2015-08-01

    Full Text Available Image-based localization is one of the most widely researched localization techniques in the robotics and computer vision communities. As enormous image data sets are provided through the Internet, many studies on estimating a location with a pre-built image-based 3D map have been conducted. Most research groups use numerous image data sets that contain sufficient features. In contrast, this paper focuses on image-based localization in the case of insufficient images and features. A more accurate localization method is proposed based on a probabilistic map using 3D-to-2D matching correspondences between a map and a query image. The probabilistic feature map is generated in advance by probabilistic modeling of the sensor system as well as the uncertainties of camera poses. Using the conventional PnP algorithm, an initial camera pose is estimated on the probabilistic feature map. The proposed algorithm is optimized from the initial pose by minimizing Mahalanobis distance errors between features from the query image and the map to improve accuracy. To verify that the localization accuracy is improved, the proposed algorithm is compared with the conventional algorithm in a simulation and real environments.

  4. A Probabilistic Feature Map-Based Localization System Using a Monocular Camera.

    Science.gov (United States)

    Kim, Hyungjin; Lee, Donghwa; Oh, Taekjun; Choi, Hyun-Taek; Myung, Hyun

    2015-08-31

    Image-based localization is one of the most widely researched localization techniques in the robotics and computer vision communities. As enormous image data sets are provided through the Internet, many studies on estimating a location with a pre-built image-based 3D map have been conducted. Most research groups use numerous image data sets that contain sufficient features. In contrast, this paper focuses on image-based localization in the case of insufficient images and features. A more accurate localization method is proposed based on a probabilistic map using 3D-to-2D matching correspondences between a map and a query image. The probabilistic feature map is generated in advance by probabilistic modeling of the sensor system as well as the uncertainties of camera poses. Using the conventional PnP algorithm, an initial camera pose is estimated on the probabilistic feature map. The proposed algorithm is optimized from the initial pose by minimizing Mahalanobis distance errors between features from the query image and the map to improve accuracy. To verify that the localization accuracy is improved, the proposed algorithm is compared with the conventional algorithm in a simulation and real environments.
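    A rough sketch of the two-stage estimate described above is given below: an initial pose from the standard PnP solver, followed by a non-linear refinement of a covariance-weighted (Mahalanobis-style) reprojection cost. The probabilistic map construction itself is not reproduced, and all variable names and data shapes are illustrative.

```python
import cv2
import numpy as np
from scipy.optimize import least_squares

def estimate_pose(map_points_3d, image_points_2d, covariances_2d, K):
    """map_points_3d: (N, 3); image_points_2d: (N, 2); covariances_2d: list of 2x2; K: 3x3."""
    # Stage 1: initial pose from the conventional PnP solver.
    ok, rvec, tvec = cv2.solvePnP(map_points_3d, image_points_2d, K, None)

    # Stage 2: refine by minimising covariance-whitened reprojection residuals.
    def cost(params):
        rv, tv = params[:3].reshape(3, 1), params[3:].reshape(3, 1)
        proj, _ = cv2.projectPoints(map_points_3d, rv, tv, K, None)
        resid = proj.reshape(-1, 2) - image_points_2d
        out = []
        for r, cov in zip(resid, covariances_2d):
            L = np.linalg.cholesky(np.linalg.inv(cov))   # whitening transform
            out.append(L.T @ r)                          # Mahalanobis-weighted residual
        return np.concatenate(out)

    x0 = np.concatenate([rvec.ravel(), tvec.ravel()])
    sol = least_squares(cost, x0)
    return sol.x[:3], sol.x[3:]                           # refined rvec, tvec
```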

  5. Handheld Longwave Infrared Camera Based on Highly-Sensitive Quantum Well Infrared Photodetectors, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to develop a compact handheld longwave infrared camera based on quantum well infrared photodetector (QWIP) focal plane array (FPA) technology. Based on...

  6. Qualification Tests of Micro-camera Modules for Space Applications

    Science.gov (United States)

    Kimura, Shinichi; Miyasaka, Akira

    Visual capability is very important for space-based activities, for which small, low-cost space cameras are desired. Although cameras for terrestrial applications are continually being improved, little progress has been made on cameras used in space, which must be extremely robust to withstand harsh environments. This study focuses on commercial off-the-shelf (COTS) CMOS digital cameras because they are very small and are based on an established mass-market technology. Radiation and ultrahigh-vacuum tests were conducted on a small COTS camera that weighs less than 100 mg (including optics). This paper presents the results of the qualification tests for COTS cameras and for a small, low-cost COTS-based space camera.

  7. Convolutional Neural Network-Based Human Detection in Nighttime Images Using Visible Light Camera Sensors.

    Science.gov (United States)

    Kim, Jong Hyun; Hong, Hyung Gil; Park, Kang Ryoung

    2017-05-08

    Because intelligent surveillance systems have recently undergone rapid growth, research on accurately detecting humans in videos captured at a long distance is growing in importance. The existing research using visible light cameras has mainly focused on methods of human detection for daytime hours when there is outside light, but human detection during nighttime hours when there is no outside light is difficult. Thus, methods that employ additional near-infrared (NIR) illuminators and NIR cameras or thermal cameras have been used. However, in the case of NIR illuminators, there are limitations in terms of the illumination angle and distance. There are also difficulties because the illuminator power must be adaptively adjusted depending on whether the object is close or far away. In the case of thermal cameras, their cost is still high, which makes it difficult to install and use them in a variety of places. Because of this, research has been conducted on nighttime human detection using visible light cameras, but this has focused on objects at a short distance in an indoor environment or the use of video-based methods to capture multiple images and process them, which causes problems related to the increase in the processing time. To resolve these problems, this paper presents a method that uses a single image captured at night on a visible light camera to detect humans in a variety of environments based on a convolutional neural network. Experimental results using a self-constructed Dongguk night-time human detection database (DNHD-DB1) and two open databases (Korea advanced institute of science and technology (KAIST) and computer vision center (CVC) databases), as well as high-accuracy human detection in a variety of environments, show that the method has excellent performance compared to existing methods.

  8. Convolutional Neural Network-Based Human Detection in Nighttime Images Using Visible Light Camera Sensors

    Directory of Open Access Journals (Sweden)

    Jong Hyun Kim

    2017-05-01

    Full Text Available Because intelligent surveillance systems have recently undergone rapid growth, research on accurately detecting humans in videos captured at a long distance is growing in importance. The existing research using visible light cameras has mainly focused on methods of human detection for daytime hours when there is outside light, but human detection during nighttime hours when there is no outside light is difficult. Thus, methods that employ additional near-infrared (NIR) illuminators and NIR cameras or thermal cameras have been used. However, in the case of NIR illuminators, there are limitations in terms of the illumination angle and distance. There are also difficulties because the illuminator power must be adaptively adjusted depending on whether the object is close or far away. In the case of thermal cameras, their cost is still high, which makes it difficult to install and use them in a variety of places. Because of this, research has been conducted on nighttime human detection using visible light cameras, but this has focused on objects at a short distance in an indoor environment or the use of video-based methods to capture multiple images and process them, which causes problems related to the increase in the processing time. To resolve these problems, this paper presents a method that uses a single image captured at night on a visible light camera to detect humans in a variety of environments based on a convolutional neural network. Experimental results using a self-constructed Dongguk night-time human detection database (DNHD-DB1) and two open databases (the Korea Advanced Institute of Science and Technology (KAIST) and Computer Vision Center (CVC) databases), as well as high-accuracy human detection in a variety of environments, show that the method has excellent performance compared to existing methods.
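    As a minimal illustration of the single-image CNN approach, the PyTorch sketch below classifies candidate image regions as human or background; the architecture, input size and training details are assumptions, not the network used by the authors.

```python
import torch
import torch.nn as nn

class NightHumanCNN(nn.Module):
    """Small CNN that scores candidate crops as human vs. background."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, 2)   # two classes: human, background

    def forward(self, x):                    # x: (N, 3, H, W) night-time crops
        f = self.features(x).flatten(1)
        return self.classifier(f)

model = NightHumanCNN()
logits = model(torch.randn(4, 3, 64, 32))    # four candidate regions
probs = logits.softmax(dim=1)                 # per-region class probabilities
```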

  9. A high spatio-temporal resolution optical pyrometer at the ORION laser facility.

    Science.gov (United States)

    Floyd, Emma; Gumbrell, Edward T; Fyrth, Jim; Luis, James D; Skidmore, Jonathan W; Patankar, Siddharth; Giltrap, Samuel; Smith, Roland

    2016-11-01

    A streaked pyrometer has been designed to measure the temperature of ≈100 μm diameter heated targets in the warm dense matter region. The diagnostic has picosecond time resolution. Spatial resolution is limited by the streak camera to 4 μm in one dimension; the imaging system has superior resolution of 1 μm. High light collection efficiency means that the diagnostic can transmit a measurable quantity of thermal emission at temperatures as low as 1 eV to the detector. This is achieved through the use of an f/1.4 objective, and a minimum number of reflecting and refracting surfaces to relay the image over 8 m with no vignetting over a 0.4 mm field of view with 12.5× magnification. All the system optics are highly corrected, to allow imaging with minimal aberrations over a broad spectral range. The detector is a highly sensitive Axis Photonique streak camera with a P820PSU streak tube. For the first time, two of these cameras have been absolutely calibrated at 1 ns and 2 ns sweep speeds under full operational conditions and over 8 spectral bands between 425 nm and 650 nm using a high-stability picosecond white light source. Over this range the cameras had a response which varied between 47 ± 8 and 14 ± 4 photons/count. The calibration of the optical imaging system makes absolute temperature measurements possible. Color temperature measurements are also possible due to the wide spectral range over which the system is calibrated; two different spectral bands can be imaged onto different parts of the photocathode of the same streak camera.

  10. Spectral colors capture and reproduction based on digital camera

    Science.gov (United States)

    Chen, Defen; Huang, Qingmei; Li, Wei; Lu, Yang

    2018-01-01

    The purpose of this work is to develop a method for the accurate reproduction of spectral colors captured by a digital camera. The spectral colors, being the purest colors of any hue, are difficult to reproduce without distortion on digital devices. In this paper, we attempt to achieve accurate hue reproduction of the spectral colors by focusing on two steps of color correction: the capture of the spectral colors and the color characterization of the digital camera. This determines the relationship among the spectral color wavelength, the RGB color space of the digital camera and the CIEXYZ color space. The study also provides a basis for further work on spectral color reproduction on digital devices. Methods such as wavelength calibration of the spectral colors and digital camera characterization were utilized. The spectrum was obtained through a grating spectroscopy system. A photo of a clear and reliable primary spectrum was taken by adjusting the relevant parameters of the digital camera, from which the RGB values of the color spectrum were extracted at 1040 equally divided locations. Two wavelength values were obtained for each location, one calculated using the grating equation and one measured by a spectrophotometer. The polynomial fitting method was used for camera characterization to achieve color correction. After wavelength calibration, the maximum error between the two sets of wavelengths is 4.38 nm. According to the polynomial fitting method, the average color difference of the test samples is 3.76. This satisfies the application needs of the spectral colors in digital devices such as display and transmission.
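    A hedged sketch of polynomial camera characterisation is given below: device RGB values are expanded into a polynomial basis and fitted to reference CIEXYZ values by least squares. The particular basis and the use of CIEXYZ as the target are common choices assumed here, not taken verbatim from the paper.

```python
import numpy as np

def poly_basis(rgb):
    """Expand (N, 3) RGB values into a second-order polynomial basis."""
    r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]
    ones = np.ones_like(r)
    return np.stack([ones, r, g, b, r * g, r * b, g * b,
                     r ** 2, g ** 2, b ** 2], axis=1)

def fit_characterization(device_rgb, reference_xyz):
    """Least-squares fit from expanded device RGB to reference CIEXYZ."""
    A = poly_basis(device_rgb)                        # (N, 10)
    M, *_ = np.linalg.lstsq(A, reference_xyz, rcond=None)
    return M                                          # (10, 3) mapping matrix

def rgb_to_xyz(rgb, M):
    """Apply the fitted characterisation to one or more RGB triplets."""
    return poly_basis(np.atleast_2d(rgb)) @ M
```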

  11. Oct4 is required ~E7.5 for proliferation in the primitive streak.

    Directory of Open Access Journals (Sweden)

    Brian DeVeale

    2013-11-01

    Full Text Available Oct4 is a widely recognized pluripotency factor as it maintains Embryonic Stem (ES) cells in a pluripotent state, and, in vivo, prevents the inner cell mass (ICM) in murine embryos from differentiating into trophectoderm. However, its function in somatic tissue after this developmental stage is not well characterized. Using a tamoxifen-inducible Cre recombinase and floxed alleles of Oct4, we investigated the effect of depleting Oct4 in mouse embryos between the pre-streak and headfold stages, ~E6.0-E8.0, when Oct4 is found in dynamic patterns throughout the embryonic compartment of the mouse egg cylinder. We found that depletion of Oct4 ~E7.5 resulted in a severe phenotype, comprised of craniorachischisis, random heart tube orientation, failed turning, defective somitogenesis and posterior truncation. Unlike in ES cells, depletion of the pluripotency factors Sox2 and Oct4 after E7.0 does not phenocopy, suggesting that ~E7.5 Oct4 is required within a network that is altered relative to the pluripotency network. Oct4 is not required in extraembryonic tissue for these processes, but is required to maintain cell viability in the embryo and normal proliferation within the primitive streak. Impaired expansion of the primitive streak occurs coincident with Oct4 depletion ∼E7.5 and precedes deficient convergent extension which contributes to several aspects of the phenotype.

  12. Image compensation for camera and lighting variability

    Science.gov (United States)

    Daley, Wayne D.; Britton, Douglas F.

    1996-12-01

    With the current trend of integrating machine vision systems in industrial manufacturing and inspection applications comes the issue of camera and illumination stabilization. Unless each application is built around a particular camera and a highly controlled lighting environment, the interchangeability of cameras or fluctuations in lighting become a problem, as each camera usually has a different response. An empirical approach is proposed where color tile data is acquired using the camera of interest, and a mapping is developed to some predetermined reference image using neural networks. A similar analytical approach based on a rough analysis of the imaging systems is also considered for deriving a mapping between cameras. Once a mapping has been determined, all data from one camera is mapped to correspond to the images of the other prior to performing any processing on the data. Instead of writing separate image processing algorithms for the particular image data being received, the image data is adjusted based on each particular camera and lighting situation. All that is required when swapping cameras is the new mapping for the camera being inserted. The image processing algorithms can remain the same as the input data has been adjusted appropriately. The results of utilizing this technique are presented for an inspection application.

  13. A semi-automatic image-based close range 3D modeling pipeline using a multi-camera configuration.

    Science.gov (United States)

    Rau, Jiann-Yeou; Yeh, Po-Chia

    2012-01-01

    The generation of photo-realistic 3D models is an important task for digital recording of cultural heritage objects. This study proposes an image-based 3D modeling pipeline which takes advantage of a multi-camera configuration and multi-image matching technique that does not require any markers on or around the object. Multiple digital single lens reflex (DSLR) cameras are adopted and fixed with invariant relative orientations. Instead of photo-triangulation after image acquisition, calibration is performed to estimate the exterior orientation parameters of the multi-camera configuration, which can be processed fully automatically using coded targets. The calibrated orientation parameters of all cameras are applied to images taken using the same camera configuration. This means that when performing multi-image matching for surface point cloud generation, the orientation parameters will remain the same as the calibrated results, even when the target has changed. Based on this invariant characteristic, the whole 3D modeling pipeline can be performed completely automatically, once the whole system has been calibrated and the software seamlessly integrated. Several experiments were conducted to prove the feasibility of the proposed system. The objects imaged include a human being, eight Buddhist statues, and a stone sculpture. The results for the stone sculpture, obtained with several multi-camera configurations, were compared with a reference model acquired by an ATOS-I 2M active scanner. The best result has an absolute accuracy of 0.26 mm and a relative accuracy of 1:17,333. This demonstrates the feasibility of the proposed low-cost image-based 3D modeling pipeline and its applicability to the large quantity of antiques stored in museums.

  14. Mixel camera--a new push-broom camera concept for high spatial resolution keystone-free hyperspectral imaging.

    Science.gov (United States)

    Høye, Gudrun; Fridman, Andrei

    2013-05-06

    Current high-resolution push-broom hyperspectral cameras introduce keystone errors to the captured data. Efforts to correct these errors in hardware severely limit the optical design, in particular with respect to light throughput and spatial resolution, while at the same time the residual keystone often remains large. The mixel camera solves this problem by combining a hardware component--an array of light mixing chambers--with a mathematical method that restores the hyperspectral data to its keystone-free form, based on the data that was recorded onto the sensor with large keystone. Virtual Camera software, developed specifically for this purpose, was used to compare the performance of the mixel camera to traditional cameras that correct keystone in hardware. The mixel camera can collect at least four times more light than most current high-resolution hyperspectral cameras, and simulations have shown that the mixel camera will be photon-noise limited--even in bright light--with a significantly improved signal-to-noise ratio compared to traditional cameras. A prototype has been built and is being tested.

  15. Electron streaking in the autoionization region of H2

    International Nuclear Information System (INIS)

    Palacios, Alicia; González-Castrillo, Alberto; Martín, Fernando

    2015-01-01

    We use a UV-pump/IR-probe scheme, combining a single attosecond UV pulse and a 750 nm IR pulse, to explore laser-assisted photoionization of the hydrogen molecule in the autoionization region. The electron energy distributions exhibit unusual streaking patterns that are explored for different angles of the electron ejection with respect to the polarization vector and the molecular axis. Moreover, by controlling the time delay between the pulses, we observe that one can suppress the autoionization channel. (paper)

  16. SENSITIVITY TEMPERATURE DEPENDENCE RESEARCH OF TV-CAMERAS BASED ON SILICON MATRIXES

    Directory of Open Access Journals (Sweden)

    Alexey N. Starchenko

    2017-07-01

    Full Text Available Subject of Research. The research is dedicated to the analysis of sensitivity change patterns of cameras based on silicon CMOS matrixes at various ambient temperatures. This information is necessary for the correct application of such cameras for photometric measurements in situ. The paper deals with studies of the sensitivity variations of two digital cameras with different silicon CMOS matrixes in the visible and near-IR regions of the spectrum under temperature change. Method. Due to practical restrictions, the temperature changes were recorded in separate spectral intervals important for practical use of the cameras. The experiments were carried out with the use of a climatic chamber providing temperature control over the range from minus 40 to plus 50 °C in steps of 10 °C. Two cameras were chosen for research: VAC-135-IP with the OmniVision OV9121 matrix and VAC-248-IP with the OnSemiconductor VITA2000 matrix. The two tested devices were placed in the climatic chamber at the same time and illuminated by one radiation source with a color temperature of about 3000 K in order to eliminate a number of methodological errors. Main Results. The temperature dependence of the signals was shown to be linear, and the sensitivities of the matrixes were determined. The results obtained are, in general, consistent with theoretical expectations. The coefficients of thermal sensitivity were computed from these dependencies. The greatest effect of temperature on sensitivity occurs in the 0.7–1.1 μm region, and the temperature coefficients of sensitivity increase with increasing wavelength of the incident radiation. The experiments carried out have shown that it is necessary to take into account the temperature dependence of the sensitivity of silicon matrixes in the red and near-IR regions of the spectrum. The effect is clearly detrimental for cameras with an amplitude resolution of 10-12 bits used for aerospace and space spectrozonal photography. Practical Relevance
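
    A minimal sketch of the kind of analysis described (fitting the measured signal against chamber temperature with a straight line and reporting a thermal sensitivity coefficient) is given below; the numbers are invented placeholders, not the experimental data.

```python
# Fit a linear temperature dependence of the camera signal and report the
# thermal sensitivity coefficient. Measurement values are invented placeholders.
import numpy as np

temps_c = np.array([-40, -30, -20, -10, 0, 10, 20, 30, 40, 50], dtype=float)
signal = np.array([101, 103, 105, 108, 110, 113, 115, 118, 121, 124], dtype=float)

slope, intercept = np.polyfit(temps_c, signal, 1)           # linear model
relative_coeff = slope / np.interp(20.0, temps_c, signal)    # per-degree change vs. 20 degC

print(f"signal(T) ~= {intercept:.1f} + {slope:.3f} * T [counts]")
print(f"thermal sensitivity coefficient: {100 * relative_coeff:.2f} % per degC")
```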

  17. Pedestrian Detection Based on Adaptive Selection of Visible Light or Far-Infrared Light Camera Image by Fuzzy Inference System and Convolutional Neural Network-Based Verification.

    Science.gov (United States)

    Kang, Jin Kyu; Hong, Hyung Gil; Park, Kang Ryoung

    2017-07-08

    A number of studies have been conducted to enhance the pedestrian detection accuracy of intelligent surveillance systems. However, detecting pedestrians under outdoor conditions is a challenging problem due to the varying lighting, shadows, and occlusions. In recent times, a growing number of studies have been performed on visible light camera-based pedestrian detection systems using a convolutional neural network (CNN) in order to make the pedestrian detection process more resilient to such conditions. However, visible light cameras still cannot detect pedestrians during nighttime, and are easily affected by shadows and lighting. There are many studies on CNN-based pedestrian detection through the use of far-infrared (FIR) light cameras (i.e., thermal cameras) to address such difficulties. However, when the solar radiation increases and the background temperature reaches the same level as the body temperature, it remains difficult for the FIR light camera to detect pedestrians due to the insignificant difference between the pedestrian and non-pedestrian features within the images. Researchers have been trying to solve this issue by inputting both the visible light and the FIR camera images into the CNN as the input. This, however, takes a longer time to process, and makes the system structure more complex as the CNN needs to process both camera images. This research adaptively selects a more appropriate candidate between two pedestrian images from visible light and FIR cameras based on a fuzzy inference system (FIS), and the selected candidate is verified with a CNN. Three types of databases were tested, taking into account various environmental factors using visible light and FIR cameras. The results showed that the proposed method performs better than the previously reported methods.
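
    The selection idea can be illustrated with a toy, hedged sketch: crude fuzzy-style membership scores on two environmental cues decide whether the visible-light or the FIR candidate is passed on for CNN verification. The membership functions, cue names and rule weights below are invented for illustration and are much simpler than the paper's fuzzy inference system.

```python
# Toy sketch of choosing between a visible-light and an FIR candidate with
# simple fuzzy-style scores. Membership functions and rule weights are
# invented; in the paper the selected candidate is then verified by a CNN.
def tri(x, left, peak, right):
    """Triangular membership function clipped to [0, 1]."""
    if x <= left or x >= right:
        return 0.0
    if x <= peak:
        return (x - left) / (peak - left)
    return (right - x) / (right - peak)

def choose_camera(ambient_lux, thermal_contrast):
    """Return 'visible' or 'fir' from two crude environmental cues."""
    bright = tri(ambient_lux, 50, 10_000, 100_000)         # daylight-ish
    dark = 1.0 - bright
    good_thermal = tri(thermal_contrast, 0.5, 5.0, 50.0)   # degC difference
    poor_thermal = 1.0 - good_thermal

    visible_score = max(bright, min(dark, poor_thermal))
    fir_score = max(min(dark, good_thermal), min(bright, good_thermal) * 0.5)
    return "visible" if visible_score >= fir_score else "fir"

print(choose_camera(ambient_lux=20_000, thermal_contrast=1.0))   # bright day, low thermal contrast
print(choose_camera(ambient_lux=5, thermal_contrast=8.0))        # night, clear thermal signal
```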

  18. Towards Adaptive Virtual Camera Control In Computer Games

    DEFF Research Database (Denmark)

    Burelli, Paolo; Yannakakis, Georgios N.

    2011-01-01

    Automatic camera control aims to define a framework to control virtual camera movements in dynamic and unpredictable virtual environments while ensuring a set of desired visual properties. We investigate the relationship between camera placement and playing behaviour in games and build a user model of the camera behaviour that can be used to control camera movements based on player preferences. For this purpose, we collect eye gaze, camera and game-play data from subjects playing a 3D platform game, we cluster gaze and camera information to identify camera behaviour profiles and we employ ... camera control in games is discussed.

  19. Rho kinase activity controls directional cell movements during primitive streak formation in the rabbit embryo.

    Science.gov (United States)

    Stankova, Viktoria; Tsikolia, Nikoloz; Viebahn, Christoph

    2015-01-01

    During animal gastrulation, the specification of the embryonic axes is accompanied by epithelio-mesenchymal transition (EMT), the first major change in cell shape after fertilization. EMT takes place in disparate topographical arrangements, such as the circular blastopore of amphibians, the straight primitive streak of birds and mammals or in intermediate gastrulation forms of other amniotes such as reptiles. Planar cell movements are prime candidates to arrange specific modes of gastrulation but there is no consensus view on their role in different vertebrate classes. Here, we test the impact of interfering with Rho kinase-mediated cell movements on gastrulation topography in blastocysts of the rabbit, which has a flat embryonic disc typical for most mammals. Time-lapse video microscopy, electron microscopy, gene expression and morphometric analyses of the effect of inhibiting ROCK activity showed - besides normal specification of the organizer region - a dose-dependent disruption of primitive streak formation; this disruption resulted in circular, arc-shaped or intermediate forms, reminiscent of those found in amphibians, fishes and reptiles. Our results reveal a crucial role of ROCK-controlled directional cell movements during rabbit primitive streak formation and highlight the possibility that temporal and spatial modulation of cell movements were instrumental for the evolution of gastrulation forms. © 2015. Published by The Company of Biologists Ltd.

  20. Selecting a digital camera for telemedicine.

    Science.gov (United States)

    Patricoski, Chris; Ferguson, A Stewart

    2009-06-01

    The digital camera is an essential component of store-and-forward telemedicine (electronic consultation). There are numerous makes and models of digital cameras on the market, and selecting a suitable consumer-grade camera can be complicated. Evaluation of digital cameras includes investigating the features and analyzing image quality. Important features include the camera settings, ease of use, macro capabilities, method of image transfer, and power recharging. Consideration needs to be given to image quality, especially as it relates to color (skin tones) and detail. It is important to know the level of the photographer and the intended application. The goal is to match the characteristics of the camera with the telemedicine program requirements. In the end, selecting a digital camera is a combination of qualitative (subjective) and quantitative (objective) analysis. For the telemedicine program in Alaska in 2008, the camera evaluation and decision process resulted in a specific selection based on the criteria developed for our environment.

  1. Angioid streaks in a case of Camurati–Engelmann disease

    Directory of Open Access Journals (Sweden)

    Betül Tugcu

    2017-01-01

    Full Text Available Camurati–Engelmann disease (CED) is a rare autosomal dominant disease with various phenotypic expressions. The hallmark of the disease is bilateral symmetric diaphyseal hyperostosis of the long bones with progressive involvement of the metaphysis. Ocular manifestations occur rarely and mainly result from bony overgrowth of the orbit and optic canal stenosis. We report a case of CED showing angioid streaks (ASs) in both fundi with no macular involvement and discuss the possible theories of the pathogenesis of AS in this disease.

  2. Perceptual Color Characterization of Cameras

    Directory of Open Access Journals (Sweden)

    Javier Vazquez-Corral

    2014-12-01

    Full Text Available Color camera characterization, mapping outputs from the camera sensors to an independent color space such as XYZ, is an important step in the camera processing pipeline. Until now, this procedure has been primarily solved by using a 3 × 3 matrix obtained via a least-squares optimization. In this paper, we propose to use the spherical sampling method, recently published by Finlayson et al., to perform a perceptual color characterization. In particular, we search for the 3 × 3 matrix that minimizes three different perceptual errors, one pixel-based and two spatially based. For the pixel-based case, we minimize the CIE ΔE error, while for the spatial-based case, we minimize both the S-CIELAB error and the CID error measure. Our results demonstrate an improvement of approximately 3% for the ΔE error, 7% for the S-CIELAB error and 13% for the CID error measure.
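
    The difference between a plain least-squares fit and a perceptually optimized 3 × 3 matrix can be sketched as follows: start from the least-squares solution and then minimize the mean CIE ΔE (1976) directly. This is a simplified stand-in for the spherical sampling search in the paper, and the conversion helpers, white point and training patches below are assumptions for illustration.

```python
# Sketch: optimize a 3x3 camera-to-XYZ matrix by minimizing the mean CIE
# Delta E (1976) instead of a plain least-squares error in XYZ. Training data
# are invented; the paper's actual search uses spherical sampling and also
# spatial (S-CIELAB, CID) error measures.
import numpy as np
from scipy.optimize import minimize

WHITE = np.array([0.9505, 1.0000, 1.0890])   # assumed D65 reference white

def xyz_to_lab(xyz):
    t = xyz / WHITE
    f = np.where(t > (6 / 29) ** 3, np.cbrt(t), t / (3 * (6 / 29) ** 2) + 4 / 29)
    L = 116 * f[:, 1] - 16
    a = 500 * (f[:, 0] - f[:, 1])
    b = 200 * (f[:, 1] - f[:, 2])
    return np.column_stack([L, a, b])

def mean_delta_e(m_flat, rgb, xyz_ref):
    xyz_est = rgb @ m_flat.reshape(3, 3).T
    xyz_est = np.clip(xyz_est, 1e-6, None)           # keep cube roots well defined
    return np.linalg.norm(xyz_to_lab(xyz_est) - xyz_to_lab(xyz_ref), axis=1).mean()

# Invented training patches: camera RGB and reference XYZ.
rng = np.random.default_rng(0)
rgb = rng.uniform(0.05, 1.0, size=(24, 3))
true_m = np.array([[0.41, 0.36, 0.18], [0.21, 0.72, 0.07], [0.02, 0.12, 0.95]])
xyz_ref = rgb @ true_m.T

m0, *_ = np.linalg.lstsq(rgb, xyz_ref, rcond=None)    # least-squares starting point
result = minimize(mean_delta_e, m0.T.ravel(), args=(rgb, xyz_ref), method="Nelder-Mead")
print("mean Delta E after perceptual optimization:", result.fun)
```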

  3. Wired and Wireless Camera Triggering with Arduino

    Science.gov (United States)

    Kauhanen, H.; Rönnholm, P.

    2017-10-01

    Synchronous triggering is an important task that allows simultaneous data capture from multiple cameras. Accurate synchronization enables 3D measurements of moving objects or from a moving platform. In this paper, we describe one wired and four wireless variations of Arduino-based low-cost remote trigger systems designed to provide a synchronous trigger signal for industrial cameras. Our wireless systems utilize 315 MHz or 434 MHz frequencies with noise filtering capacitors. In order to validate the synchronization accuracy, we developed a prototype of a rotating trigger detection system (named RoTriDeS). This system is suitable to detect the triggering accuracy of global shutter cameras. As a result, the wired system indicated an 8.91 μs mean triggering time difference between two cameras. Corresponding mean values for the four wireless triggering systems varied between 7.92 and 9.42 μs. Presented values include both camera-based and trigger-based desynchronization. Arduino-based triggering systems appeared to be feasible, and they have the potential to be extended to more complicated triggering systems.

  4. An evolution of image source camera attribution approaches.

    Science.gov (United States)

    Jahanirad, Mehdi; Wahab, Ainuddin Wahid Abdul; Anuar, Nor Badrul

    2016-05-01

    Camera attribution plays an important role in digital image forensics by providing the evidence and distinguishing characteristics of the origin of the digital image. It allows the forensic analyser to find the possible source camera which captured the image under investigation. However, in real-world applications, these approaches have faced many challenges due to the large set of multimedia data publicly available through photo sharing and social network sites, captured under uncontrolled conditions and subjected to a variety of hardware and software post-processing operations. Moreover, the legal system only accepts the forensic analysis of the digital image evidence if the applied camera attribution techniques are unbiased, reliable, nondestructive and widely accepted by the experts in the field. The aim of this paper is to investigate the evolutionary trend of image source camera attribution approaches from fundamental to practice, in particular, with the application of image processing and data mining techniques. Extracting implicit knowledge from images using intrinsic image artifacts for source camera attribution requires a structured image mining process. In this paper, we attempt to provide an introductory tutorial on the image processing pipeline, to determine the general classification of the features corresponding to different components for source camera attribution. The article also reviews techniques of source camera attribution more comprehensively in the domain of image forensics, in conjunction with the presentation of classifying ongoing developments within the specified area. The classification of the existing source camera attribution approaches is presented based on specific parameters, such as the colour image processing pipeline, hardware- and software-related artifacts and the methods to extract such artifacts. The more recent source camera attribution approaches, which have not yet gained sufficient attention among image forensics

  5. Electronics for the camera of the First G-APD Cherenkov Telescope (FACT) for ground based gamma-ray astronomy

    International Nuclear Information System (INIS)

    Anderhub, H; Biland, A; Boller, A; Braun, I; Commichau, V; Djambazov, L; Dorner, D; Gendotti, A; Grimm, O; Gunten, H P von; Hildebrand, D; Horisberger, U; Huber, B; Kim, K-S; Krähenbühl, T; Backes, M; Köhne, J-H; Krumm, B; Bretz, T; Farnier, C

    2012-01-01

    Within the FACT project, we construct a new type of camera based on Geiger-mode avalanche photodiodes (G-APDs). Compared to photomultipliers, G-APDs are more robust, need a lower operation voltage and have the potential of higher photon-detection efficiency and lower cost, but were never fully tested in the harsh environments of Cherenkov telescopes. The FACT camera consists of 1440 G-APD pixels and readout channels, based on the DRS4 (Domino Ring Sampler) analog pipeline chip and commercial Ethernet components. Preamplifiers, trigger system, digitization, slow control and power converters are integrated into the camera.

  6. Generalized free-space diffuse photon transport model based on the influence analysis of a camera lens diaphragm.

    Science.gov (United States)

    Chen, Xueli; Gao, Xinbo; Qu, Xiaochao; Chen, Duofang; Ma, Xiaopeng; Liang, Jimin; Tian, Jie

    2010-10-10

    The camera lens diaphragm is an important component in a noncontact optical imaging system and has a crucial influence on the images registered on the CCD camera. However, this influence has not been taken into account in the existing free-space photon transport models. To model the photon transport process more accurately, a generalized free-space photon transport model is proposed. It combines Lambertian source theory with an analysis of the influence of the camera lens diaphragm to simulate the photon transport process in free space. In addition, the radiance theorem is also adopted to establish the energy relationship between the virtual detector and the CCD camera. The accuracy and feasibility of the proposed model are validated with a Monte-Carlo-based free-space photon transport model and a physical phantom experiment. A comparison study with our previous hybrid radiosity-radiance theorem based model demonstrates the improved performance and potential of the proposed model for simulating the photon transport process in free space.

  7. Investigation of an Autofocusing Method for Visible Aerial Cameras Based on Image Processing Techniques

    Directory of Open Access Journals (Sweden)

    Zhichao Chen

    2016-01-01

    Full Text Available In order to realize autofocusing in an aerial camera, an autofocusing system is established and its characteristics, such as the working principle, optical-mechanical structure and focus evaluation function, are investigated. The reasons for defocusing in an aviation camera are analyzed, and several autofocusing methods along with appropriate focus evaluation functions are introduced based on image processing techniques. The proposed autofocusing system is designed and implemented using two CMOS detectors. The experimental results showed that the proposed method meets the focusing accuracy requirement of the aviation camera, and a maximum focusing error of less than half of the depth of focus is achieved. The system designed in this paper can find the optical imaging focal plane in real time; as such, this novel design has great potential in practical engineering, especially aerospace applications.
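
    A hedged sketch of how a focus evaluation function is typically used is shown below: a sharpness score is computed for each candidate focus position and the maximum is taken as the in-focus position. The gradient-energy metric and the synthetic frames are illustrative choices; the record does not state which focus evaluation function the authors adopted.

```python
# Sketch of an image-based focus search: score each candidate focus position
# for sharpness and pick the maximum. The gradient-energy metric below is one
# common focus evaluation function (a simplified Tenengrad-style score).
import numpy as np

def gradient_energy(image):
    """Sum of squared central-difference gradients as a sharpness score."""
    img = image.astype(float)
    gx = (np.roll(img, -1, axis=1) - np.roll(img, 1, axis=1)) / 2.0
    gy = (np.roll(img, -1, axis=0) - np.roll(img, 1, axis=0)) / 2.0
    return float(np.sum(gx**2 + gy**2))

def best_focus(frames_by_position):
    """frames_by_position: dict mapping focus position (e.g. motor steps) to image."""
    scores = {pos: gradient_energy(frame) for pos, frame in frames_by_position.items()}
    return max(scores, key=scores.get), scores

# Hypothetical usage with synthetic frames: the sharper frame wins.
rng = np.random.default_rng(1)
sharp = rng.integers(0, 255, size=(64, 64)).astype(float)
blurred = sharp.copy()
for _ in range(5):                      # crude blur by repeated neighbour averaging
    blurred = (blurred + np.roll(blurred, 1, 0) + np.roll(blurred, 1, 1)) / 3.0

position, scores = best_focus({-10: blurred, 0: sharp, 10: blurred})
print("best focus position:", position)
```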

  8. Gamma camera based FDG PET in oncology

    International Nuclear Information System (INIS)

    Park, C. H.

    2002-01-01

    Positron Emission Tomography (PET) was introduced as a research tool in the 1970s and it took about 20 years before PET became a useful clinical imaging modality. In the USA, insurance coverage for PET procedures in the 1990s was, I believe, the turning point for this progress. Initially PET was used in neurology, but recently more than 80% of PET procedures are oncological applications. I firmly believe that, in the 21st century, one cannot manage cancer patients properly without PET, and that PET is a very important medical imaging modality in basic and clinical sciences. PET is grouped into 2 categories: conventional (c) and gamma camera based (CB) PET. CB PET is more readily available to many medical centers, utilizing dual-head gamma cameras and commercially available FDG, at low cost to patients. In fact there are more CB PET than cPET systems in operation in the USA. CB PET is inferior to cPET in its performance, but clinical studies in oncology are feasible without expensive infrastructure such as staffing, rooms and equipment. At Ajou University Hospital, CB PET was installed in late 1997 for the first time in Korea as well as in Asia, and the system has been used successfully and effectively in oncological applications. Ours was the fourth PET operation in Korea, and I believe this may have been instrumental in getting other institutions interested in clinical PET. The following is a brief description of our clinical experience with FDG CB PET in oncology.

  9. Speed of sound and photoacoustic imaging with an optical camera based ultrasound detection system

    Science.gov (United States)

    Nuster, Robert; Paltauf, Guenther

    2017-07-01

    CCD camera based optical ultrasound detection is a promising alternative approach for high resolution 3D photoacoustic imaging (PAI). To fully exploit its potential and to achieve high image resolution, the speed of sound (SOS) in the sample must be accounted for in the image reconstruction algorithm. Hence, in the proposed work the idea and a first implementation are shown of how speed of sound imaging can be added to a previously developed camera based PAI setup. The current setup provides SOS maps with a spatial resolution of 2 mm and an accuracy of the obtained absolute SOS values of about 1%. The proposed dual-modality setup has the potential to provide highly resolved and perfectly co-registered 3D photoacoustic and SOS images.

  10. Nuclear import of Maize fine streak virus proteins in Drosophila S2 cells

    Science.gov (United States)

    Maize fine streak virus (MFSV) is a member of the genus Nucleorhabdovirus, family Rhabdoviridae and is transmitted by the leafhopper Graminella nigrifons. The virus replicates in both its plant host and in its insect vector. Nucleorhabdoviruses replicate in the nucleus and assemble at the inner nu...

  11. Research on the underwater target imaging based on the streak tube laser lidar

    Science.gov (United States)

    Cui, Zihao; Tian, Zhaoshuo; Zhang, Yanchao; Bi, Zongjie; Yang, Gang; Gu, Erdan

    2018-03-01

    A high frame rate streak tube imaging lidar (STIL) for real-time 3D imaging of underwater targets is presented in this paper. The system uses a 532 nm pulsed laser as the light source; the maximum repetition rate is 120 Hz and the pulse width is 8 ns. The system is built on the LabVIEW platform, and system control, synchronous image acquisition, 3D data processing and display are realized through a PC. A 3D imaging experiment of underwater targets was carried out in a flume with an attenuation coefficient of 0.2; images of targets at different depths and of different materials were obtained, with an imaging frame rate of 100 Hz and a maximum detection depth of 31 m. For an underwater target at a distance of 22 m, real-time acquisition of high resolution 3D images was realized with a range resolution of 1 cm and a spatial resolution of 0.3 cm, and the spatial relationship of the targets can be clearly identified in the image. The experimental results show that STIL has good application prospects in underwater terrain detection, underwater search and rescue, and other fields.
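
    For orientation, the conversion between the streak time axis and underwater range that underlies such a system can be sketched in a few lines of Python; the refractive index of water used below is a standard assumed value, not a parameter quoted in the record.

```python
# Convert a streak-camera time axis to underwater range. Assumes light travels
# at c/n in water with n ~= 1.33; the numbers printed are illustrative.
C = 2.998e8          # speed of light in vacuum, m/s
N_WATER = 1.33       # refractive index of water (assumed)

def range_from_time(round_trip_time_s):
    """One-way range for a round-trip delay measured on the streak axis."""
    return (C / N_WATER) * round_trip_time_s / 2.0

def time_resolution_for_range_resolution(delta_range_m):
    """Timing resolution needed for a given range resolution."""
    return 2.0 * delta_range_m * N_WATER / C

print(f"22 m target -> {2 * 22 * N_WATER / C * 1e9:.1f} ns round trip")
print(f"1 cm range resolution -> {time_resolution_for_range_resolution(0.01) * 1e12:.0f} ps timing")
```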

  12. Optical character recognition of camera-captured images based on phase features

    Science.gov (United States)

    Diaz-Escobar, Julia; Kober, Vitaly

    2015-09-01

    Nowadays most digital information is obtained using mobile devices, especially smartphones. In particular, this brings the opportunity for optical character recognition in camera-captured images. For this reason many recognition applications have been recently developed, such as recognition of license plates, business cards, receipts and street signs; document classification, augmented reality, language translation and so on. Camera-captured images are usually affected by geometric distortions, nonuniform illumination, shadow and noise, which make the recognition task difficult for existing systems. It is well known that the Fourier phase contains a lot of important information regardless of the Fourier magnitude. So, in this work we propose a phase-based recognition system exploiting phase-congruency features for illumination/scale invariance. The performance of the proposed system is tested in terms of misclassifications and false alarms with the help of computer simulation.

  13. An Airborne Multispectral Imaging System Based on Two Consumer-Grade Cameras for Agricultural Remote Sensing

    Directory of Open Access Journals (Sweden)

    Chenghai Yang

    2014-06-01

    Full Text Available This paper describes the design and evaluation of an airborne multispectral imaging system based on two identical consumer-grade cameras for agricultural remote sensing. The cameras are equipped with a full-frame complementary metal oxide semiconductor (CMOS sensor with 5616 × 3744 pixels. One camera captures normal color images, while the other is modified to obtain near-infrared (NIR images. The color camera is also equipped with a GPS receiver to allow geotagged images. A remote control is used to trigger both cameras simultaneously. Images are stored in 14-bit RAW and 8-bit JPEG files in CompactFlash cards. The second-order transformation was used to align the color and NIR images to achieve subpixel alignment in four-band images. The imaging system was tested under various flight and land cover conditions and optimal camera settings were determined for airborne image acquisition. Images were captured at altitudes of 305–3050 m (1000–10,000 ft and pixel sizes of 0.1–1.0 m were achieved. Four practical application examples are presented to illustrate how the imaging system was used to estimate cotton canopy cover, detect cotton root rot, and map henbit and giant reed infestations. Preliminary analysis of example images has shown that this system has potential for crop condition assessment, pest detection, and other agricultural applications.
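
    The band-to-band alignment step mentioned above (a second-order transformation between the color and NIR frames) can be sketched as a quadratic polynomial fitted to matched control points and then used to resample one band onto the other's grid. The control points, helper names and synthetic data below are illustrative assumptions, not the authors' implementation.

```python
# Sketch of second-order polynomial band-to-band registration: fit a quadratic
# mapping from colour-image coordinates to NIR-image coordinates using matched
# control points, then resample the NIR band onto the colour grid. Control
# points here are synthetic; real ones would come from tie-point matching.
import numpy as np
from scipy import ndimage

def quad_terms(xy):
    x, y = xy[:, 0], xy[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])

def fit_second_order(xy_color, xy_nir):
    """Least-squares coefficients mapping colour coords -> NIR coords."""
    A = quad_terms(xy_color)
    coeffs, *_ = np.linalg.lstsq(A, xy_nir, rcond=None)
    return coeffs                                     # shape (6, 2)

def warp_nir_to_color(nir_image, coeffs, out_shape):
    yy, xx = np.mgrid[0:out_shape[0], 0:out_shape[1]]
    grid = np.column_stack([xx.ravel(), yy.ravel()]).astype(float)
    src = quad_terms(grid) @ coeffs                   # NIR (x, y) for every colour pixel
    sampled = ndimage.map_coordinates(nir_image.astype(float),
                                      [src[:, 1], src[:, 0]], order=1)
    return sampled.reshape(out_shape)

# Synthetic example: NIR frame shifted and slightly scaled relative to colour frame.
rng = np.random.default_rng(2)
pts_color = rng.uniform(0, 500, size=(30, 2))
pts_nir = pts_color * 1.01 + np.array([3.0, -2.0])

coeffs = fit_second_order(pts_color, pts_nir)
nir = rng.uniform(0, 1, size=(512, 512))
aligned = warp_nir_to_color(nir, coeffs, out_shape=(512, 512))
print("aligned band shape:", aligned.shape)
```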

  14. Extended spectrum SWIR camera with user-accessible Dewar

    Science.gov (United States)

    Benapfl, Brendan; Miller, John Lester; Vemuri, Hari; Grein, Christoph; Sivananthan, Siva

    2017-02-01

    Episensors has developed a series of extended short wavelength infrared (eSWIR) cameras based on high-Cd concentration Hg1-xCdxTe absorbers. The cameras have a bandpass extending to 3 microns cutoff wavelength, opening new applications relative to traditional InGaAs-based cameras. Applications and uses are discussed and examples given. A liquid nitrogen pour-filled version was initially developed. This was followed by a compact Stirling-cooled version with detectors operating at 200 K. Each camera has unique sensitivity and performance characteristics. The cameras' size, weight and power specifications are presented along with images captured with band pass filters and eSWIR sources to demonstrate spectral response beyond 1.7 microns. The soft seal Dewars of the cameras are designed for accessibility, and can be opened and modified in a standard laboratory environment. This modular approach allows user flexibility for swapping internal components such as cold filters and cold stops. The core electronics of the Stirling-cooled camera are based on a single commercial field programmable gate array (FPGA) that also performs on-board non-uniformity corrections, bad pixel replacement, and directly drives any standard HDMI display.

  15. Florida-specific NTCIP management information base (MIB) for closed-circuit television (CCTV) camera : final draft.

    Science.gov (United States)

    2009-01-01

    Description: The following MIB has been developed for use by FDOT. This proposed Florida-Specific NTCIP Management Information Base (MIB) for Closed-Circuit Television (CCTV) Camera is based on the following documentation: NTCIP 120...

  16. Spectrally-Tunable Infrared Camera Based on Highly-Sensitive Quantum Well Infrared Photodetectors, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to develop a SPECTRALLY-TUNABLE INFRARED CAMERA based on quantum well infrared photodetector (QWIP) focal plane array (FPA) technology. This will build on...

  17. Upgrading of analogue cameras using modern PC based computer

    International Nuclear Information System (INIS)

    Pardom, M.F.; Matos, L.

    2002-01-01

    Aim: The use of computers along with analogue cameras enables them to perform tasks involving time-activity parameters. The INFORMENU system converts a modern PC into a dedicated nuclear medicine computer system at a total cost affordable to countries with emerging economies, and is easily adaptable to all existing cameras. Materials and Methods: In collaboration with nuclear medicine physicians, an application including hardware and software was developed by a private firm. The system runs smoothly on Windows 98 and its operation is very easy. The main features are comparable to those of commercial brand computer systems, such as image resolution up to 1024 x 1024, low count loss at high count rates, uniformity correction, integrated graphical and text reporting, and user-defined clinical protocols. Results: The system is used in more than 20 private and public institutions. The count loss is less than 1% in all routine work, the uniformity correction is improved by a factor of 3-5, and the utility of the analogue cameras is improved. Conclusion: The INFORMENU system improves the utility of analogue cameras by permitting the inclusion of dynamic clinical protocols and quantification, helping the development of nuclear medicine practice. The operation and maintenance costs were lowered. The end users improve their knowledge of modern nuclear medicine.

  18. Global calibration of multi-cameras with non-overlapping fields of view based on photogrammetry and reconfigurable target

    Science.gov (United States)

    Xia, Renbo; Hu, Maobang; Zhao, Jibin; Chen, Songlin; Chen, Yueling

    2018-06-01

    Multi-camera vision systems are often needed to achieve large-scale and high-precision measurement because these systems have larger fields of view (FOV) than a single camera. Multiple cameras may have no or narrow overlapping FOVs in many applications, which pose a huge challenge to global calibration. This paper presents a global calibration method for multi-cameras without overlapping FOVs based on photogrammetry technology and a reconfigurable target. Firstly, two planar targets are fixed together and made into a long target according to the distance between the two cameras to be calibrated. The relative positions of the two planar targets can be obtained by photogrammetric methods and used as invariant constraints in global calibration. Then, the reprojection errors of target feature points in the two cameras’ coordinate systems are calculated at the same time and optimized by the Levenberg–Marquardt algorithm to find the optimal solution of the transformation matrix between the two cameras. Finally, all the camera coordinate systems are converted to the reference coordinate system in order to achieve global calibration. Experiments show that the proposed method has the advantages of high accuracy (the RMS error is 0.04 mm) and low cost and is especially suitable for on-site calibration.
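
    A much-simplified, hedged sketch of the refinement idea (minimizing reprojection error over a rigid camera-to-camera transform with Levenberg-Marquardt) is given below using scipy. The pinhole model, intrinsics and synthetic data are placeholders; the paper's cost function additionally involves the constraints from the reconfigurable target.

```python
# Simplified sketch of the refinement step: find the rigid transform from
# camera 1 to camera 2 by minimizing pinhole reprojection error with
# Levenberg-Marquardt (scipy least_squares). Intrinsics and data are placeholders.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

K2 = np.array([[1200.0, 0.0, 640.0],
               [0.0, 1200.0, 512.0],
               [0.0, 0.0, 1.0]])          # assumed intrinsics of camera 2

def project(points_cam, K):
    uvw = points_cam @ K.T
    return uvw[:, :2] / uvw[:, 2:3]

def residuals(params, pts_in_cam1, uv_observed_cam2):
    rotvec, t = params[:3], params[3:]
    pts_cam2 = Rotation.from_rotvec(rotvec).apply(pts_in_cam1) + t
    return (project(pts_cam2, K2) - uv_observed_cam2).ravel()

# Synthetic ground truth: camera 2 is rotated and translated w.r.t. camera 1.
rng = np.random.default_rng(3)
true_rot = Rotation.from_euler("xyz", [0.05, -0.3, 0.02])
true_t = np.array([1.5, 0.0, 0.2])
pts_cam1 = rng.uniform([-0.5, -0.5, 3.0], [0.5, 0.5, 5.0], size=(40, 3))
uv_cam2 = project(true_rot.apply(pts_cam1) + true_t, K2)
uv_cam2 += rng.normal(scale=0.2, size=uv_cam2.shape)     # pixel noise

fit = least_squares(residuals, x0=np.zeros(6), args=(pts_cam1, uv_cam2), method="lm")
print("estimated rotation (deg):", Rotation.from_rotvec(fit.x[:3]).as_euler("xyz", degrees=True))
print("estimated translation:", fit.x[3:])
```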

  19. A mobile device-based imaging spectrometer for environmental monitoring by attaching a lightweight small module to a commercial digital camera.

    Science.gov (United States)

    Cai, Fuhong; Lu, Wen; Shi, Wuxiong; He, Sailing

    2017-11-15

    Spatially-explicit data are essential for remote sensing of ecological phenomena. Lately, recent innovations in mobile device platforms have led to an upsurge in on-site rapid detection. For instance, CMOS chips in smart phones and digital cameras serve as excellent sensors for scientific research. In this paper, a mobile device-based imaging spectrometer module (weighing about 99 g) is developed and equipped on a Single Lens Reflex camera. Utilizing this lightweight module, as well as commonly used photographic equipment, we demonstrate its utility through a series of on-site multispectral imaging, including ocean (or lake) water-color sensing and plant reflectance measurement. Based on the experiments we obtain 3D spectral image cubes, which can be further analyzed for environmental monitoring. Moreover, our system can be applied to many kinds of cameras, e.g., aerial camera and underwater camera. Therefore, any camera can be upgraded to an imaging spectrometer with the help of our miniaturized module. We believe it has the potential to become a versatile tool for on-site investigation into many applications.

  20. Streak-photographic investigation of shock wave emission after laser-induced plasma formation in water

    Science.gov (United States)

    Noack, Joachim; Vogel, Alfred

    1995-05-01

    The shock wave emission after dielectric breakdown in water was investigated to assess potential shock wave effects in plasma mediated tissue ablation and intraocular photodisruption. Of particular interest was the dependence of shock wave pressure as a function of distance from the plasma for different laser pulse energies. We have generated plasmas in water with a Nd:YAG laser system delivering pulses of 6 ns duration. The pulses, with energies between 0.4 and 36 mJ (approximately 180 times threshold), were focused into a cuvette containing distilled water. The shock wave was visualized with streak photography combined with a schlieren technique. An important advantage of this technique is that the shock position as a function of time can be obtained directly from a single streak and hence a single event. Other methods (e.g. flash photography or passage time measurements between fixed locations) in contrast rely on reproducible events. Using the shock wave speed obtained from the streak images, shock wave peak pressures were calculated, providing detailed information on the propagation of the shock. The shock peak pressure as a function of distance r from the optical axis was found to decrease faster than 1/r² at distances up to 100-150 μm. For larger distances it was found to be roughly proportional to 1/r. The maximum shock pressure p at a given distance was found to scale with the square root of the laser pulse energy E for distances of 50-200 μm from the optical axis.
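
    The step from a streak record to peak pressure can be sketched as follows: differentiate the shock-front position to get the local shock speed, then convert speed to pressure through the Rankine-Hugoniot momentum balance with a linear shock-velocity/particle-velocity relation for water. The Hugoniot constants and the position-time samples below are generic assumed values, not the equation of state or data used in the paper.

```python
# Sketch: differentiate the shock-front position read from a streak record to
# get the shock speed, then estimate peak pressure from the Rankine-Hugoniot
# momentum balance p - p0 = rho0 * us * up with a linear us-up Hugoniot for
# water, us = c0 + s * up. Constants are generic assumed values.
import numpy as np

RHO0 = 998.0      # kg/m^3, water density
C0 = 1483.0       # m/s, sound speed in water
S = 2.0           # dimensionless Hugoniot slope (assumed)
P0 = 1.0e5        # Pa, ambient pressure

def peak_pressure_from_shock_speed(us):
    """Peak pressure behind a shock of speed us (m/s) in water."""
    up = (us - C0) / S              # particle velocity from the linear Hugoniot
    return P0 + RHO0 * us * up

# Hypothetical streak readout: shock radius (m) versus time (s).
t = np.array([0, 10, 20, 30, 40, 50]) * 1e-9
r = np.array([0.0, 25.0, 45.0, 62.0, 78.0, 93.0]) * 1e-6

us = np.gradient(r, t)                       # local shock speed
p_peak = peak_pressure_from_shock_speed(us)

for ri, ui, pi in zip(r, us, p_peak):
    print(f"r = {ri*1e6:5.1f} um   us = {ui:7.1f} m/s   p ~ {pi/1e6:7.1f} MPa")
```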

  1. A UAV-Based Low-Cost Stereo Camera System for Archaeological Surveys - Experiences from Doliche (Turkey)

    Science.gov (United States)

    Haubeck, K.; Prinz, T.

    2013-08-01

    The use of Unmanned Aerial Vehicles (UAVs) for surveying archaeological sites is becoming more and more common due to their advantages in rapidity of data acquisition, cost-efficiency and flexibility. One possible usage is the documentation and visualization of historic geo-structures and geo-objects using UAV-attached digital small frame cameras. These monoscopic cameras offer the possibility to obtain close-range aerial photographs, but - when an accurate nadir-waypoint flight is not possible due to choppy or windy weather conditions - they also present the problem that two single aerial images do not always have the overlap required for 3D photogrammetric purposes. In this paper, we present an attempt to replace the monoscopic camera with a calibrated low-cost stereo camera that takes two pictures from slightly different angles at the same time. Our results show that such a geometrically predefined stereo image pair can be used for photogrammetric purposes, e.g. the creation of digital terrain models (DTMs) and orthophotos or the 3D extraction of single geo-objects. Because of the limited geometric baseline of the applied stereo camera and the resulting base-to-height ratio, however, the accuracy of the DTM directly depends on the UAV flight altitude.

  2. Automatic camera tracking for remote manipulators

    International Nuclear Information System (INIS)

    Stoughton, R.S.; Martin, H.L.; Bentz, R.R.

    1984-07-01

    The problem of automatic camera tracking of mobile objects is addressed with specific reference to remote manipulators and using either fixed or mobile cameras. The technique uses a kinematic approach employing 4 x 4 coordinate transformation matrices to solve for the needed camera PAN and TILT angles. No vision feedback systems are used, as the required input data are obtained entirely from position sensors from the manipulator and the camera-positioning system. All hardware requirements are generally satisfied by currently available remote manipulator systems with a supervisory computer. The system discussed here implements linear plus on/off (bang-bang) closed-loop control with a +-2-deg deadband. The deadband area is desirable to avoid operator seasickness caused by continuous camera movement. Programming considerations for camera control, including operator interface options, are discussed. The example problem presented is based on an actual implementation using a PDP 11/34 computer, a TeleOperator Systems SM-229 manipulator, and an Oak Ridge National Laboratory (ORNL) camera-positioning system. 3 references, 6 figures, 2 tables

  3. Automatic camera tracking for remote manipulators

    International Nuclear Information System (INIS)

    Stoughton, R.S.; Martin, H.L.; Bentz, R.R.

    1984-04-01

    The problem of automatic camera tracking of mobile objects is addressed with specific reference to remote manipulators and using either fixed or mobile cameras. The technique uses a kinematic approach employing 4 x 4 coordinate transformation matrices to solve for the needed camera PAN and TILT angles. No vision feedback systems are used, as the required input data are obtained entirely from position sensors from the manipulator and the camera-positioning system. All hardware requirements are generally satisfied by currently available remote manipulator systems with a supervisory computer. The system discussed here implements linear plus on/off (bang-bang) closed-loop control with a ±2° deadband. The deadband area is desirable to avoid operator seasickness caused by continuous camera movement. Programming considerations for camera control, including operator interface options, are discussed. The example problem presented is based on an actual implementation using a PDP 11/34 computer, a TeleOperator Systems SM-229 manipulator, and an Oak Ridge National Laboratory (ORNL) camera-positioning system. 3 references, 6 figures, 2 tables

  4. Two-Phase Algorithm for Optimal Camera Placement

    Directory of Open Access Journals (Sweden)

    Jun-Woo Ahn

    2016-01-01

    Full Text Available As markets for visual sensor networks have grown, interest in the optimal camera placement problem has continued to increase. The most prominent solution to the optimal camera placement problem is based on binary integer programming (BIP). Due to the NP-hard character of the optimal camera placement problem, however, it is difficult to find a solution for a complex, real-world problem using BIP. Many approximation algorithms have been developed to solve this problem. In this paper, a two-phase algorithm is proposed as an approximation algorithm based on BIP that can solve the optimal camera placement problem for a placement space larger than in current studies. This study solves the problem in three-dimensional space for a real-world structure.
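
    The coverage idea behind the BIP formulation can be illustrated with a toy greedy set-cover sketch: pick candidate camera poses until every required sample point is seen by at least one camera. This stands in for, and is much weaker than, the exact BIP and the two-phase algorithm proposed in the paper; the candidates and coverage sets below are invented.

```python
# Toy illustration of the coverage idea behind camera-placement BIP: choose a
# small set of candidate camera poses so every sample point is seen by at
# least one camera. A greedy set-cover heuristic stands in for the exact BIP.
def greedy_placement(coverage):
    """coverage: dict candidate_id -> set of sample points that candidate sees."""
    uncovered = set().union(*coverage.values())
    chosen = []
    while uncovered:
        # Pick the candidate covering the most still-uncovered points.
        best = max(coverage, key=lambda c: len(coverage[c] & uncovered))
        if not coverage[best] & uncovered:
            raise ValueError("remaining points cannot be covered")
        chosen.append(best)
        uncovered -= coverage[best]
    return chosen

# Hypothetical candidates and the sample points each would cover.
coverage = {
    "cam_A": {1, 2, 3, 4},
    "cam_B": {3, 4, 5},
    "cam_C": {5, 6, 7},
    "cam_D": {2, 6},
}
print(greedy_placement(coverage))   # e.g. ['cam_A', 'cam_C']
```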

  5. A Novel Multi-Digital Camera System Based on Tilt-Shift Photography Technology

    Science.gov (United States)

    Sun, Tao; Fang, Jun-yong; Zhao, Dong; Liu, Xue; Tong, Qing-xi

    2015-01-01

    Multi-digital camera systems (MDCS) are constantly being improved to meet the increasing requirement for high-resolution spatial data. This study identifies the insufficiencies of traditional MDCSs and proposes a new category of MDCS based on tilt-shift photography to improve the ability of the MDCS to acquire high-accuracy spatial data. A prototype system, including two or four tilt-shift cameras (TSC, camera model: Nikon D90), is developed to validate the feasibility and correctness of the proposed MDCS. As for the cameras of traditional MDCSs, calibration is also essential for the TSCs of the new MDCS. The study constructs indoor control fields and proposes appropriate calibration methods for the TSC, including a digital distortion model (DDM) approach and a two-step calibration strategy. The characteristics of the TSC, for example its edge distortion, are analyzed in detail via a calibration experiment. Finally, the ability of the new MDCS to acquire high-accuracy spatial data is verified through flight experiments. The results of the flight experiments illustrate that the geo-positioning accuracy of the prototype system reaches 0.3 m at a flight height of 800 m, with a spatial resolution of 0.15 m. In addition, a comparison between the traditional (MADC II) and proposed MDCS demonstrates that the latter (0.3 m) provides spatial data with higher accuracy than the former (only 0.6 m) under the same conditions. We also expect that using higher-accuracy TSCs in the new MDCS would further improve the accuracy of the final photogrammetric products. PMID:25835187

  6. Imaging capabilities of germanium gamma cameras

    International Nuclear Information System (INIS)

    Steidley, J.W.

    1977-01-01

    Quantitative methods of analysis based on the use of a computer simulation were developed and used to investigate the imaging capabilities of germanium gamma cameras. The main advantage of the computer simulation is that the inherent unknowns of clinical imaging procedures are removed from the investigation. The effects of patient scattered radiation were incorporated using a mathematical LSF model which was empirically developed and experimentally verified. Image modifying effects of patient motion, spatial distortions, and count rate capabilities were also included in the model. Spatial domain and frequency domain modeling techniques were developed and used in the simulation as required. The imaging capabilities of gamma cameras were assessed using low contrast lesion source distributions. The results showed that an improvement in energy resolution from 10% to 2% offers significant clinical advantages in terms of improved contrast, increased detectability, and reduced patient dose. The improvements are of greatest significance for small lesions at low contrast. The results of the computer simulation were also used to compare a design of a hypothetical germanium gamma camera with a state-of-the-art scintillation camera. The computer model performed a parametric analysis of the interrelated effects of inherent and technological limitations of gamma camera imaging. In particular, the trade-off between collimator resolution and collimator efficiency for detection of a given low contrast lesion was directly addressed. This trade-off is an inherent limitation of both gamma cameras. The image degrading effects of patient motion, camera spatial distortions, and low count rate were shown to modify the improvements due to better energy resolution. Thus, based on this research, the continued development of germanium cameras to the point of clinical demonstration is recommended

  7. Towards next generation 3D cameras

    Science.gov (United States)

    Gupta, Mohit

    2017-03-01

    We are in the midst of a 3D revolution. Robots enabled by 3D cameras are beginning to autonomously drive cars, perform surgeries, and manage factories. However, when deployed in the real world, these cameras face several challenges that prevent them from measuring 3D shape reliably. These challenges include large lighting variations (bright sunlight to dark night), presence of scattering media (fog, body tissue), and optically complex materials (metal, plastic). Due to these factors, 3D imaging is often the bottleneck in widespread adoption of several key robotics technologies. I will talk about our work on developing 3D cameras based on time-of-flight and active triangulation that addresses these long-standing problems. This includes designing `all-weather' cameras that can perform high-speed 3D scanning in harsh outdoor environments, as well as cameras that recover the shape of objects with challenging material properties. These cameras are, for the first time, capable of measuring detailed 3D shape under such challenging conditions, enabling applications such as robotic inspection and assembly systems.

  8. A practical approach for active camera coordination based on a fusion-driven multi-agent system

    Science.gov (United States)

    Bustamante, Alvaro Luis; Molina, José M.; Patricio, Miguel A.

    2014-04-01

    In this paper, we propose a multi-agent system architecture to manage spatially distributed active (or pan-tilt-zoom) cameras. Traditional video surveillance algorithms are of no use for active cameras, and we have to look at different approaches. Such multi-sensor surveillance systems have to be designed to solve two related problems: data fusion and coordinated sensor-task management. Generally, architectures proposed for the coordinated operation of multiple cameras are based on the centralisation of management decisions at the fusion centre. However, the existence of intelligent sensors capable of decision making brings with it the possibility of conceiving alternative decentralised architectures. This problem is approached by means of a MAS, integrating data fusion as an integral part of the architecture for distributed coordination purposes. This paper presents the MAS architecture and system agents.

  9. Mechanism for propagation of the step leader of streak lightning

    International Nuclear Information System (INIS)

    Golubev, A.I.; Zolotovskil, V.I.; Ivanovskil, A.V.

    1992-01-01

    A hypothetical scheme for the development of the step leader of streak lightning is discussed. The mathematical problem of modeling the propagation of the leader in this scheme is stated. The main parameters of the leader are estimated: the length and propagation velocity of the step, the average propagation velocity, etc. This is compared with data from observations in nature. The propagation of the leader is simulated numerically. Results of the calculation are presented for two 'flashes' of the step leader. 25 refs., 6 figs

  10. Stereo matching based on SIFT descriptor with illumination and camera invariance

    Science.gov (United States)

    Niu, Haitao; Zhao, Xunjie; Li, Chengjin; Peng, Xiang

    2010-10-01

    Stereo matching is the process of finding corresponding points in two or more images. The description of interest points is a critical aspect of point correspondence, which is vital in stereo matching. The SIFT descriptor has been proven to be better in distinctiveness and robustness than other local descriptors. However, the SIFT descriptor does not involve the color information of a feature point, which provides a powerfully distinguishing feature in matching tasks. Furthermore, in a real scene, image colors are affected by various geometric and radiometric factors, such as gamma correction and exposure. These situations are very common in stereo images. For this reason, the color recorded by a camera is not a reliable cue, and the color consistency assumption is no longer valid between stereo images in real scenes. Hence the performance of other SIFT-based stereo matching algorithms can be severely degraded under radiometric variations. In this paper, we present a new improved SIFT stereo matching algorithm that is invariant to various radiometric variations between left and right images. Unlike other improved SIFT stereo matching algorithms, we explicitly employ a color formation model with the parameters of lighting geometry, illuminant color and camera gamma in the SIFT descriptor. Firstly, we transform the input color images to a log-chromaticity color space, so that a linear relationship can be established. Then, we use a log-polar histogram to build three color invariance components for the SIFT descriptor, so that our improved SIFT descriptor is invariant to lighting geometry, illuminant color and camera gamma changes between left and right images. We can then match feature points between two images and use the SIFT descriptor Euclidean distance as a geometric measure in our data sets to make matching more accurate and robust. Experimental results show that our method is superior to other SIFT-based algorithms, including conventional stereo matching algorithms, under various
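
    The log-chromaticity transform that underlies the claimed invariance can be sketched briefly: taking logarithms of band ratios turns per-channel illuminant scalings into additive offsets and a camera gamma into a multiplicative factor. Using G as the reference channel below is one common convention and an assumption, not necessarily the paper's exact formulation.

```python
# Sketch of the log-chromaticity transform: logarithms of band ratios remove
# per-channel multiplicative lighting factors (up to an offset) and turn a
# gamma correction into a simple scale factor.
import numpy as np

def log_chromaticity(rgb_image, eps=1e-12):
    """Convert an (H, W, 3) RGB image to two log-chromaticity channels."""
    rgb = rgb_image.astype(float) + eps           # avoid log of zero
    log_rg = np.log(rgb[..., 0] / rgb[..., 1])    # log(R/G)
    log_bg = np.log(rgb[..., 2] / rgb[..., 1])    # log(B/G)
    return np.dstack([log_rg, log_bg])

# Illustration: an illuminant change plus a gamma correction only shifts and
# scales the log-chromaticity values, preserving their structure.
rng = np.random.default_rng(4)
img = rng.uniform(0.1, 1.0, size=(4, 4, 3))
img_lit = (img * np.array([1.4, 1.0, 0.7])) ** 1.8    # illuminant + gamma change

chroma_a = log_chromaticity(img)
chroma_b = log_chromaticity(img_lit)
# Under this model, chroma_b == gamma * chroma_a + gamma * per-channel offset.
print(np.allclose(chroma_b, 1.8 * chroma_a + 1.8 * np.log(np.array([1.4, 0.7]))))
```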

  11. An image-tube camera for cometary spectrography

    Science.gov (United States)

    Mamadov, O.

    The paper discusses the mounting of an image tube camera. The cathode is of antimony, sodium, potassium, and cesium. The parts used for mounting are of acrylic plastic and a fabric-based laminate. A mounting design that does not include cooling is presented. The aperture ratio of the camera is 1:27. Also discussed is the way that the camera is joined to the spectrograph.

  12. Improvements in picosecond chronography

    International Nuclear Information System (INIS)

    Arthurs, E.G.; Bradley, D.J.; Liddy, Brian; O'Neill, Fergus; Roddie, A.G.; Sibbett, Wilson; Sleat, W.E.

    The durations of laser pulses as short as 1 picosecond have been measured with an electro-optical streak camera. The time resolution limit of the camera system has been directly and unambiguously demonstrated employing a flashlamp pumped mode-locked dye laser to reliably generate tunable-frequency pulses of duration between 1 and 2 psec. An argon laser pumped C.W. mode-locked dye laser has been developed using the streak camera as a diagnostic tool, to produce continuous streams of picosecond pulses. With the high light gain of the camera system, pulses of peak powers < 1 watt can be studied with picosecond time resolution. The build-up of picosecond pulses from the initial photon noise of the mode-locked laser has also been directly recorded for the first time

  13. Upgrading of analogue gamma cameras with PC based computer system

    International Nuclear Information System (INIS)

    Fidler, V.; Prepadnik, M.

    2002-01-01

    Full text: Dedicated nuclear medicine computers for acquisition and processing of images from analogue gamma cameras in developing countries are in many cases faulty and technologically obsolete. The aim of the upgrading project of the International Atomic Energy Agency (IAEA) was to support the development of a PC based computer system which would cost $5,000 in total. Several research institutions from different countries (China, Cuba, India and Slovenia) were financially supported in this development. The basic requirements for the system were: one acquisition card on an ISA bus, image resolution up to 256x256, SVGA graphics, low count loss at high count rates, standard acquisition and clinical protocols incorporated in PIP (Portable Image Processing), on-line energy and uniformity correction, graphic printing and networking. The most functionally stable acquisition system, tested at several international workshops and university clinics, was the Slovenian one, with a complete set of acquisition and clinical protocols, transfer of scintigraphic data from the acquisition card to the PC through a port, count loss less than 1% at a count rate of 120 kc/s, improvement of the integral uniformity index by a factor of 3-5, and reporting, networking and archiving solutions for simple MS networks or server oriented network systems (NT server, etc). More than 300 gamma cameras in 52 countries were digitized and put into routine work. The project of upgrading the analogue gamma cameras greatly promoted nuclear medicine in the developing countries by replacing the old computer systems, improving the technological knowledge of end users through workshops and training courses, and lowering the maintenance costs of the departments. (author)

  14. Calibration method for projector-camera-based telecentric fringe projection profilometry system.

    Science.gov (United States)

    Liu, Haibo; Lin, Huijing; Yao, Linshen

    2017-12-11

    By combining a fringe projection setup with a telecentric lens, a fringe pattern could be projected and imaged within a small area, making it possible to measure the three-dimensional (3D) surfaces of micro-components. This paper focuses on the flexible calibration of the fringe projection profilometry (FPP) system using a telecentric lens. An analytical telecentric projector-camera calibration model is introduced, in which the rig structure parameters remain invariant for all views, and the 3D calibration target can be located on the projector image plane with sub-pixel precision. Based on the presented calibration model, a two-step calibration procedure is proposed. First, the initial parameters, e.g., the projector-camera rig, projector intrinsic matrix, and coordinates of the control points of a 3D calibration target, are estimated using the affine camera factorization calibration method. Second, a bundle adjustment algorithm with various simultaneous views is applied to refine the calibrated parameters, especially the rig structure parameters and the coordinates of the control points of the 3D target. Because the control points are determined during the calibration, there is no need for an accurate 3D reference target, which is costly and extremely difficult to fabricate, particularly for the tiny objects used to calibrate the telecentric FPP system. Real experiments were performed to validate the performance of the proposed calibration method. The test results showed that the proposed approach is very accurate and reliable.

  15. Intellectual streaking: The value of teachers exposing minds (and hearts).

    Science.gov (United States)

    Bearman, Margaret; Molloy, Elizabeth

    2017-12-01

    As teachers we often ask learners to be vulnerable and yet present ourselves as high status, knowledgeable experts, often with pre-prepared scripts. This paper investigates the metaphoric notion of "intellectual streaking" - the nimble exposure of a teacher's thought processes, dilemmas, or failures - as a way of modeling both reflection-in-action and resilience. While there is a tension between credibility and vulnerability, both of which are necessary for trust, we argue that taking a few risks and revealing deficits in knowledge or performance can be illuminating and valuable for all parties.

  16. Automatic multi-camera calibration for deployable positioning systems

    Science.gov (United States)

    Axelsson, Maria; Karlsson, Mikael; Rudner, Staffan

    2012-06-01

    Surveillance with automated positioning and tracking of subjects and vehicles in 3D is desired in many defence and security applications. Camera systems with stereo or multiple cameras are often used for 3D positioning. In such systems, accurate camera calibration is needed to obtain a reliable 3D position estimate. There is also a need for automated camera calibration to facilitate fast deployment of semi-mobile multi-camera 3D positioning systems. In this paper we investigate a method for automatic calibration of the extrinsic camera parameters (relative camera pose and orientation) of a multi-camera positioning system. It is based on estimation of the essential matrix between each camera pair using the 5-point method for intrinsically calibrated cameras. The method is compared to a manual calibration method using real HD video data from a field trial with a multicamera positioning system. The method is also evaluated on simulated data from a stereo camera model. The results show that the reprojection error of the automated camera calibration method is close to or smaller than the error for the manual calibration method and that the automated calibration method can replace the manual calibration.
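
    The core step described above, estimating the essential matrix between an intrinsically calibrated camera pair with the 5-point method and recovering the relative pose, can be sketched with OpenCV as below. This is an illustrative sketch only (the matched point arrays and intrinsic matrix K are assumed inputs), not the authors' implementation.

        # Sketch: relative pose of a camera pair from matched image points
        # using OpenCV's 5-point essential matrix estimation with RANSAC.
        import cv2
        import numpy as np

        def relative_pose(pts1, pts2, K):
            """pts1, pts2: Nx2 float arrays of matched points; K: 3x3 intrinsics."""
            E, inliers = cv2.findEssentialMat(pts1, pts2, cameraMatrix=K,
                                              method=cv2.RANSAC, prob=0.999,
                                              threshold=1.0)
            # Decompose E into rotation R and translation direction t (up to
            # scale), keeping only the RANSAC inliers.
            _, R, t, _ = cv2.recoverPose(E, pts1, pts2, cameraMatrix=K, mask=inliers)
            return R, t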

  17. First measurements of electron-beam transit times and micropulse elongation in a photoelectric injector at the High-Brightness Accelerator FEL (HIBAF)

    Energy Technology Data Exchange (ETDEWEB)

    Lumpkin, A.H.; Carlsten, B.E.; Feldman, R.B.

    1990-01-01

    Key aspects of the dynamics of a photoelectric injector (PEI) on the Los Alamos High-Brightness Accelerator FEL (HIBAF) facility have been investigated using a synchroscan streak camera. By phase-locking the streak camera sweep to the reference 108.3 MHz rf signal, the variations of micropulse temporal elongations (30 to 80% over the drive-laser pulse length) and of transit times (25 ps for a 16°-phase change) were observed for the first time. These results were in good agreement with PARMELA simulations. 2 refs., 8 figs.

  18. Development of underwater camera using high-definition camera

    International Nuclear Information System (INIS)

    Tsuji, Kenji; Watanabe, Masato; Takashima, Masanobu; Kawamura, Shingo; Tanaka, Hiroyuki

    2012-01-01

    In order to reduce the time for core verification or visual inspection of BWR fuels, an underwater camera using a high-definition camera has been developed. As a result of this development, the underwater camera has 2 lights, dimensions of 370 x 400 x 328 mm, and a weight of 20.5 kg. Using the camera, 6 or so spent-fuel IDs can be identified at a time at a distance of 1 or 1.5 m, and a 0.3 mmφ pin-hole is recognized at a distance of 1.5 m with 20 times zoom-up. Noise caused by radiation of less than 15 Gy/h does not affect the images. (author)

  19. Effective data-domain noise and streak reduction for X-ray CT

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Zhi; Zamyatin, Alexander A. [Toshiba Medical Research Institute USA, Inc., Vernon Hills, IL (United States); Akino, Naruomi [Toshiba Medical System Corporation, Tokyo (Japan)

    2011-07-01

    Streaks and noise caused by photon starvation can seriously impair the diagnostic value of CT imaging. Existing processing methods often have several parameters to tune, and the parameters can be ad hoc to the data sets. Iterative methods can achieve better results, however, at the cost of more hardware resources or longer processing time. This paper reports a new scheme of adaptive Gaussian filtering, which is based on the diffusion-derived scale-space concept. In the scale-space view, filtering by Gaussians of different sizes is similar to decomposing the data into a sequence of scales. The scale measure, which is the variance of the filter, should be linearly related to the noise standard deviation instead of the variance of the noise. This is a fundamental departure in the way the filters are used. The new filter has only one parameter that remains stable once tuned. Single-pass processing can usually reach the desired results. (orig.)
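
    A minimal sketch of the idea of linking the Gaussian scale to the estimated noise standard deviation is given below; the proportionality constant, window size and low-noise threshold are illustrative assumptions, not values from the paper.

        # Sketch: adaptive Gaussian smoothing of projection data where the
        # filter scale grows linearly with the per-sample noise standard deviation.
        import numpy as np
        from scipy.ndimage import gaussian_filter1d

        def adaptive_smooth(projection, noise_std, k=0.5):
            """projection: 1D detector row; noise_std: per-sample noise estimate."""
            out = np.empty_like(projection, dtype=float)
            for i, sigma in enumerate(k * noise_std):
                if sigma < 0.3:            # quiet samples are left untouched
                    out[i] = projection[i]
                else:                      # noisy samples get a wider Gaussian
                    lo, hi = max(0, i - 10), min(len(projection), i + 11)
                    local = gaussian_filter1d(projection[lo:hi].astype(float), sigma)
                    out[i] = local[i - lo]
            return out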

  20. Real-time multiple objects tracking on Raspberry-Pi-based smart embedded camera

    Science.gov (United States)

    Dziri, Aziz; Duranton, Marc; Chapuis, Roland

    2016-07-01

    Multiple-object tracking constitutes a major step in several computer vision applications, such as surveillance, advanced driver assistance systems, and automatic traffic monitoring. Because of the number of cameras used to cover a large area, these applications are constrained by the cost of each node, the power consumption, the robustness of the tracking, the processing time, and the ease of deployment of the system. To meet these challenges, the use of low-power and low-cost embedded vision platforms to achieve reliable tracking becomes essential in networks of cameras. We propose a tracking pipeline that is designed for fixed smart cameras and which can handle occlusions between objects. We show that the proposed pipeline reaches real-time processing on a low-cost embedded smart camera composed of a Raspberry-Pi board and a RaspiCam camera. The tracking quality and the processing speed obtained with the proposed pipeline are evaluated on publicly available datasets and compared to the state-of-the-art methods.

  1. Relative camera localisation in non-overlapping camera networks using multiple trajectories

    NARCIS (Netherlands)

    John, V.; Englebienne, G.; Kröse, B.J.A.

    2012-01-01

    In this article we present an automatic camera calibration algorithm using multiple trajectories in a multiple camera network with non-overlapping field-of-views (FOV). Visible trajectories within a camera FOV are assumed to be measured with respect to the camera local co-ordinate system.

  2. Automated Ground-based Time-lapse Camera Monitoring of West Greenland ice sheet outlet Glaciers: Challenges and Solutions

    Science.gov (United States)

    Ahn, Y.; Box, J. E.; Balog, J.; Lewinter, A.

    2008-12-01

    Monitoring Greenland outlet glaciers using remotely sensed data has drawn great attention in the earth science communities for decades, and time series analysis of sensor data has provided important information on glacier flow variability by detecting speed and thickness changes, tracking features and acquiring model input. Thanks to advancements in commercial digital camera technology and increased solid-state storage, we activated automatic ground-based time-lapse camera stations with high spatial/temporal resolution at west Greenland outlets and collected one-hour-interval data continuously for more than one year at some, but not all, sites. We believe that important information on ice dynamics is contained in these data and that terrestrial mono-/stereo-photogrammetry can provide the theoretical and practical fundamentals for data processing along with digital image processing techniques. Time-lapse images over these periods in west Greenland show various phenomena. Problematic are rain, snow, fog, shadows, freezing of water on the camera enclosure window, image over-exposure, camera motion, sensor platform drift, fox chewing of instrument cables, and the pecking of the plastic window by ravens. Other problems include feature identification, camera orientation, image registration, feature matching in image pairs, and feature tracking. Another obstacle is that non-metric digital cameras contain large distortions that must be compensated for precise photogrammetric use. Further, a massive number of images needs to be processed in a way that is sufficiently computationally efficient. We meet these challenges by 1) identifying problems in possible photogrammetric processes, 2) categorizing them based on feasibility, and 3) clarifying limitations and alternatives, while emphasizing displacement computation and analyzing regional/temporal variability. We experiment with mono- and stereo-photogrammetric techniques with the aid of automatic correlation matching for efficiently handling the enormous

  3. CALIBRATION PROCEDURES ON OBLIQUE CAMERA SETUPS

    Directory of Open Access Journals (Sweden)

    G. Kemper

    2016-06-01

    Full Text Available Beside the creation of virtual animated 3D city models and analysis for homeland security and city planning, the accurate determination of geometric features from oblique imagery is an important task today. Due to the huge number of single images, the reduction of control points forces the use of direct referencing devices. This requires a precise camera calibration and additional adjustment procedures. This paper aims to show the workflow of the various calibration steps and presents examples of the calibration flight with the final 3D city model. In difference to most other software, the oblique cameras are not used as co-registered sensors in relation to the nadir one; all camera images enter the AT process as single pre-oriented data. This enables a better post-calibration in order to detect variations in the single camera calibrations and other mechanical effects. The sensor shown (Oblique Imager) is based on 5 Phase One cameras, where the nadir one has 80 MPix and is equipped with a 50 mm lens, while the oblique ones capture images with 50 MPix using 80 mm lenses. The cameras are mounted robustly inside a housing to protect them against physical and thermal deformations. The sensor head also hosts an IMU which is connected to a POS AV GNSS receiver. The sensor is stabilized by a gyro-mount which creates floating antenna-IMU lever arms; these had to be registered together with the raw GNSS-IMU data. The camera calibration procedure was performed based on a special calibration flight with 351 shots of all 5 cameras and registered GPS/IMU data. This specific mission was designed at two different altitudes with additional cross lines at each flying height. The five images from each exposure position have no overlaps, but in the block there are many overlaps, resulting in up to 200 measurements per point. On each photo there were on average 110 well-distributed measured points, which is a satisfying number for the camera calibration. In a first

  4. Securing Embedded Smart Cameras with Trusted Computing

    Directory of Open Access Journals (Sweden)

    Winkler Thomas

    2011-01-01

    Full Text Available Camera systems are used in many applications including video surveillance for crime prevention and investigation, traffic monitoring on highways or building monitoring and automation. With the shift from analog towards digital systems, the capabilities of cameras are constantly increasing. Today's smart camera systems come with considerable computing power, large memory, and wired or wireless communication interfaces. With onboard image processing and analysis capabilities, cameras not only open new possibilities but also raise new challenges. Often overlooked are potential security issues of the camera system. The increasing amount of software running on the cameras turns them into attractive targets for attackers. Therefore, the protection of camera devices and delivered data is of critical importance. In this work we present an embedded camera prototype that uses Trusted Computing to provide security guarantees for streamed videos. With a hardware-based security solution, we ensure integrity, authenticity, and confidentiality of videos. Furthermore, we incorporate image timestamping, detection of platform reboots, and reporting of the system status. This work is not limited to theoretical considerations but also describes the implementation of a prototype system. Extensive evaluation results illustrate the practical feasibility of the approach.
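
    The integrity and authenticity guarantee can be illustrated with a toy frame-chaining sketch; hashlib and hmac are standard Python modules, while the device key, the timestamp format and the hash chaining shown here are assumptions standing in for the TPM-backed signing the prototype uses.

        # Sketch: chaining frame hashes and authenticating them with an HMAC so
        # tampering or re-ordering of streamed frames becomes detectable.
        import hashlib
        import hmac
        import time

        def authenticate_frame(frame_bytes, prev_digest, device_key):
            """frame_bytes: encoded frame; prev_digest: digest of previous frame;
            device_key: per-device secret (bytes), here a stand-in for a TPM key."""
            digest = hashlib.sha256(prev_digest + frame_bytes).digest()
            tag = hmac.new(device_key, digest + str(time.time()).encode(),
                           hashlib.sha256).hexdigest()
            return digest, tag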

  5. Distributed FPGA-based smart camera architecture for computer vision applications

    OpenAIRE

    Bourrasset, Cédric; Maggiani, Luca; Sérot, Jocelyn; Berry, François; Pagano, Paolo

    2013-01-01

    International audience; Smart camera networks (SCN) raise challenging issues in many fields of research, including vision processing, communication protocols, distributed algorithms or power management. Furthermore, application logic in SCN is not centralized but spread among network nodes, meaning that each node must process images to extract significant features and aggregate data to understand the surrounding environment. In this context, smart camera have first embedded general pu...

  6. A pixellated γ-camera based on CdTe detectors clinical interests and performances

    International Nuclear Information System (INIS)

    Chambron, J.; Arntz, Y.; Eclancher, B.; Scheiber, Ch.; Siffert, P.; Hage Hali, M.; Regal, R.; Kazandjian, A.; Prat, V.; Thomas, S.; Warren, S.; Matz, R.; Jahnke, A.; Karman, M.; Pszota, A.; Nemeth, L.

    2000-01-01

    A mobile gamma camera dedicated to nuclear cardiology, based on a 15 cm x 15 cm detection matrix of 2304 CdTe detector elements, 2.83 mm x 2.83 mm x 2 mm, has been developed with European Community support to academic and industrial research centres. The intrinsic properties of the semiconductor crystals - low ionisation energy, high energy resolution, high attenuation coefficient - are potentially attractive for improving γ-camera performance. But their use as γ detectors for medical imaging at high resolution requires the production of high-grade materials and large quantities of sophisticated read-out electronics. The decision was taken to use CdTe rather than CdZnTe, because the manufacturer (Eurorad, France) has large experience in producing high-grade materials, with good homogeneity and stability, and whose transport properties, characterised by the mobility-lifetime product, are at least 5 times greater than those of CdZnTe. The detector matrix is divided into 9 square units; each unit is composed of 256 detectors shared in 16 modules. Each module consists of a thin ceramic plate holding a line of 16 detectors, in four groups of four for easy replacement, and holding a special 16-channel integrated circuit designed by CLRC (UK). Detection and acquisition logic based on a DSP card and a PC has been programmed by Eurorad for spectral and counting acquisition modes. LEAP and LEHR collimators of commercial design, the mobile gantry and the clinical software were provided by Siemens (Germany). The γ-camera head housing, its general mounting and the electric connections were performed by the Phase Laboratory (CNRS, France). The compactness of the γ-camera head, with its thin detector matrix, electronic readout and collimator, facilitates the detection of close γ sources with the advantage of a high spatial resolution. Such equipment is intended for bedside explorations. There is a growing clinical requirement in nuclear cardiology to early assess the extent of an infarct

  7. On camera-based smoke and gas leakage detection

    Energy Technology Data Exchange (ETDEWEB)

    Nyboe, Hans Olav

    1999-07-01

    Gas detectors are found in almost every part of industry and in many homes as well. An offshore oil or gas platform may host several hundred gas detectors. The ability of the common point and open path gas detectors to detect leakages depends on their location relative to the location of a gas cloud. This thesis describes the development of a passive volume gas detector, that is, one that will detect a leakage anywhere in the area monitored. After the consideration of several detection techniques it was decided to use an ordinary monochrome camera as sensor. Because a gas leakage may perturb the index of refraction, parts of the background appear to be displaced from their true positions, and it is necessary to develop algorithms that can deal with small differences between images. The thesis develops two such algorithms. Many image regions can be defined and several feature values can be computed for each region. The value of the features depends on the pattern in the image regions. The classes studied in this work are: reference, gas, smoke and human activity. Tests show that observations belonging to these classes can be classified with fairly high accuracy. The features in the feature set were chosen and developed for this particular application. Basically, the features measure the magnitude of pixel differences, the size of detected phenomena and image distortion. Interesting results from many experiments are presented. Most important, the experiments show that apparent motion caused by a gas leakage or heat convection can be detected by means of a monochrome camera. Small leakages of methane can be detected at a range of about four metres. Other gases, such as butane, whose densities differ more from the density of air than the density of methane does, can be detected further from the camera. Gas leakages large enough to cause condensation have been detected at a camera distance of 20 metres. 59 refs., 42 figs., 13 tabs.

  8. Wheat streak mosaic virus coat protein is a host-specific long-distance transport determinant in oat

    Science.gov (United States)

    Viral determinants involved in systemic infection of hosts by monocot-infecting plant viruses are poorly understood. Wheat streak mosaic virus (WSMV, genus Tritimovirus, family Potyviridae) exclusively infects monocotyledonous crops such as wheat, oat, barley, maize, triticale, and rye. Previously, ...

  9. Time-resolved soft-x-ray studies of energy transport in layered and planar laser-driven targets

    International Nuclear Information System (INIS)

    Stradling, G.L.

    1982-01-01

    New low-energy x-ray diagnostic techniques are used to explore energy-transport processes in laser-heated plasmas. Streak cameras are used to provide 15-psec time-resolution measurements of sub-keV x-ray emission. A very thin (50 μg/cm²) carbon substrate provides a low-energy x-ray transparent window to the transmission photocathode of this soft x-ray streak camera. Active differential vacuum pumping of the instrument is required. The use of high-sensitivity, low secondary-electron energy-spread CsI photocathodes in x-ray streak cameras is also described. Significant increases in sensitivity with only a small and intermittent decrease in dynamic range were observed. These coherent, complementary advances in sub-keV, time-resolved x-ray diagnostic capability are applied to energy-transport investigations of 1.06-μm laser plasmas. Both solid disk targets of a variety of Z's as well as Be-on-Al layered-disk targets were irradiated with 700-psec laser pulses of selected intensity between 3 x 10¹⁴ W/cm² and 1 x 10¹⁵ W/cm².

  10. Epistatic determinism of durum wheat resistance to the wheat spindle streak mosaic virus.

    Science.gov (United States)

    Holtz, Yan; Bonnefoy, Michel; Viader, Véronique; Ardisson, Morgane; Rode, Nicolas O; Poux, Gérard; Roumet, Pierre; Marie-Jeanne, Véronique; Ranwez, Vincent; Santoni, Sylvain; Gouache, David; David, Jacques L

    2017-07-01

    The resistance of durum wheat to the Wheat spindle streak mosaic virus (WSSMV) is controlled by two main QTLs on chromosomes 7A and 7B, with a huge epistatic effect. Wheat spindle streak mosaic virus (WSSMV) is a major disease of durum wheat in Europe and North America. Breeding WSSMV-resistant cultivars is currently the only way to control the virus since no treatment is available. This paper reports studies of the inheritance of WSSMV resistance using two related durum wheat populations obtained by crossing two elite cultivars with a WSSMV-resistant emmer cultivar. In 2012 and 2015, 354 recombinant inbred lines (RIL) were phenotyped using visual notations, ELISA and qPCR and genotyped using locus targeted capture and sequencing. This allowed us to build a consensus genetic map of 8568 markers and identify three chromosomal regions involved in WSSMV resistance. Two major regions (located on chromosomes 7A and 7B) jointly explain, on the basis of epistatic interactions, up to 43% of the phenotypic variation. Flanking sequences of our genetic markers are provided to facilitate future marker-assisted selection of WSSMV-resistant cultivars.

  11. SU-C-18A-02: Image-Based Camera Tracking: Towards Registration of Endoscopic Video to CT

    International Nuclear Information System (INIS)

    Ingram, S; Rao, A; Wendt, R; Castillo, R; Court, L; Yang, J; Beadle, B

    2014-01-01

    Purpose: Endoscopic examinations are routinely performed on head and neck and esophageal cancer patients. However, these images are underutilized for radiation therapy because there is currently no way to register them to a CT of the patient. The purpose of this work is to develop a method to track the motion of an endoscope within a structure using images from standard clinical equipment. This method will be incorporated into a broader endoscopy/CT registration framework. Methods: We developed a software algorithm to track the motion of an endoscope within an arbitrary structure. We computed frame-to-frame rotation and translation of the camera by tracking surface points across the video sequence and utilizing two-camera epipolar geometry. The resulting 3D camera path was used to recover the surrounding structure via triangulation methods. We tested this algorithm on a rigid cylindrical phantom with a pattern spray-painted on the inside. We did not constrain the motion of the endoscope while recording, and we did not constrain our measurements using the known structure of the phantom. Results: Our software algorithm can successfully track the general motion of the endoscope as it moves through the phantom. However, our preliminary data do not show a high degree of accuracy in the triangulation of 3D point locations. More rigorous data will be presented at the annual meeting. Conclusion: Image-based camera tracking is a promising method for endoscopy/CT image registration, and it requires only standard clinical equipment. It is one of two major components needed to achieve endoscopy/CT registration, the second of which is tying the camera path to absolute patient geometry. In addition to this second component, future work will focus on validating our camera tracking algorithm in the presence of clinical imaging features such as patient motion, erratic camera motion, and dynamic scene illumination

  12. Evaluation of banana hybrids for tolerance to black leaf streak (Mycosphaerella fijiensis Morelet) in Puerto Rico

    Science.gov (United States)

    In Puerto Rico, bananas (including plantains) are important agricultural commodities; their combined production totaled 133,500 tons in 2008. Black leaf streak (BLS) and Sigatoka leaf spot diseases, caused by Mycosphaerella fijiensis and M. musicola, respectively, are responsible for significant los...

  13. Human Detection Based on the Generation of a Background Image by Using a Far-Infrared Light Camera

    Directory of Open Access Journals (Sweden)

    Eun Som Jeon

    2015-03-01

    Full Text Available The need for computer vision-based human detection has increased in fields such as security, intelligent surveillance and monitoring systems. However, performance enhancement of human detection based on visible light cameras is limited because of factors such as nonuniform illumination, shadows and low external light in the evening and at night. Consequently, human detection based on thermal (far-infrared light) cameras has been considered as an alternative. However, its performance is influenced by factors such as low image resolution, low contrast and the large noise of thermal images. It is also affected by the high temperature of backgrounds during the day. To solve these problems, we propose a new method for detecting human areas in thermal camera images. Compared to previous works, the proposed research is novel in the following four aspects. First, one background image is generated by median and average filtering. Additional filtering procedures based on maximum gray level, size filtering and region erasing are applied to remove the human areas from the background image. Secondly, candidate human regions in the input image are located by combining the pixel and edge difference images between the input and background images. The thresholds for the difference images are adaptively determined based on the brightness of the generated background image. Noise components are removed by component labeling, a morphological operation and size filtering. Third, detected areas that may have more than two human regions are merged or separated based on the information in the horizontal and vertical histograms of the detected area. This procedure is adaptively operated based on the brightness of the generated background image. Fourth, a further procedure for the separation and removal of the candidate human regions is performed based on the size and the ratio of height to width of the candidate regions, considering the camera viewing direction
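
    A simplified sketch of the first two steps (background generation and adaptive difference thresholding) is shown below; the threshold values, the area limit and the use of OpenCV are illustrative assumptions rather than the authors' parameters.

        # Sketch: median-filtered background from a stack of thermal frames,
        # then candidate regions from an adaptively thresholded difference image.
        import numpy as np
        import cv2

        def candidate_regions(frames, current):
            """frames: list of grayscale uint8 frames; current: grayscale uint8 frame."""
            background = np.median(np.stack(frames), axis=0).astype(np.uint8)
            # brighter (hotter) background -> smaller pixel differences expected
            thresh = 20 if background.mean() > 100 else 35
            diff = cv2.absdiff(current, background)
            _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
            # remove small noise components by connected-component size filtering
            n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
            keep = [i for i in range(1, n) if stats[i, cv2.CC_STAT_AREA] > 50]
            return background, (np.isin(labels, keep).astype(np.uint8) * 255)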

  14. Computer vision camera with embedded FPGA processing

    Science.gov (United States)

    Lecerf, Antoine; Ouellet, Denis; Arias-Estrada, Miguel

    2000-03-01

    Traditional computer vision is based on a camera-computer system in which the image understanding algorithms are embedded in the computer. To circumvent the computational load of vision algorithms, low-level processing and imaging hardware can be integrated in a single compact module where a dedicated architecture is implemented. This paper presents a Computer Vision Camera based on an open architecture implemented in an FPGA. The system is targeted to real-time computer vision tasks where low level processing and feature extraction tasks can be implemented in the FPGA device. The camera integrates a CMOS image sensor, an FPGA device, two memory banks, and an embedded PC for communication and control tasks. The FPGA device is a medium size one equivalent to 25,000 logic gates. The device is connected to two high speed memory banks, an IS interface, and an imager interface. The camera can be accessed for architecture programming, data transfer, and control through an Ethernet link from a remote computer. A hardware architecture can be defined in a Hardware Description Language (like VHDL), simulated and synthesized into digital structures that can be programmed into the FPGA and tested on the camera. The architecture of a classical multi-scale edge detection algorithm based on a Laplacian of Gaussian convolution has been developed to show the capabilities of the system.
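
    For reference, the multi-scale Laplacian-of-Gaussian edge detection mapped onto the FPGA could be prototyped in software roughly as follows; this is a sketch with assumed scales, not the hardware architecture itself.

        # Sketch: multi-scale Laplacian-of-Gaussian responses with a simple
        # zero-crossing test as a software reference for the FPGA design.
        import numpy as np
        from scipy.ndimage import gaussian_laplace

        def log_edges(image, sigmas=(1.0, 2.0, 4.0)):
            edges = []
            for sigma in sigmas:
                response = gaussian_laplace(image.astype(float), sigma)
                # horizontal zero crossings of the LoG response approximate
                # edge locations at this scale
                zc = (np.sign(response[:, :-1]) * np.sign(response[:, 1:])) < 0
                edges.append(zc)
            return edges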

  15. Image features dependant correlation-weighting function for efficient PRNU based source camera identification.

    Science.gov (United States)

    Tiwari, Mayank; Gupta, Bhupendra

    2018-04-01

    For source camera identification (SCI), photo response non-uniformity (PRNU) has been widely used as the fingerprint of the camera. The PRNU is extracted from the image by applying a de-noising filter and then taking the difference between the original image and the de-noised image. However, it is observed that intensity-based features and high-frequency details (edges and texture) of the image affect the quality of the extracted PRNU. This affects the correlation calculation and creates problems in SCI. To solve this problem, we propose a weighting function based on image features. We have experimentally identified the effect of image features (intensity and high-frequency content) on the estimated PRNU, and then develop a weighting function which gives higher weights to image regions that give reliable PRNU and, at the same time, gives comparatively lower weights to the image regions that do not give reliable PRNU. Experimental results show that the proposed weighting function is able to improve the accuracy of SCI to a great extent. Copyright © 2018 Elsevier B.V. All rights reserved.
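
    The standard PRNU pipeline the paper builds on can be sketched as below; the denoising filter choice and the placeholder weight map are assumptions, since the paper's actual weighting function is derived from its experimental analysis.

        # Sketch: noise residual (image minus denoised image) and a weighted
        # normalized correlation against a camera fingerprint of the same size.
        import numpy as np
        import cv2

        def noise_residual(img):
            """img: grayscale uint8 image."""
            denoised = cv2.fastNlMeansDenoising(img, h=3)
            return img.astype(float) - denoised.astype(float)

        def weighted_correlation(residual, fingerprint, img):
            # placeholder weight map: down-weight near-saturated pixels,
            # standing in for the paper's feature-based weighting function
            w = np.where(img > 230, 0.2, 1.0)
            a, b = w * residual, w * fingerprint
            a, b = a - a.mean(), b - b.mean()
            return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))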

  16. Performance evaluation of a hand-held, semiconductor (CdZnTe)-based gamma camera

    CERN Document Server

    Abe, A; Lee, J; Oka, T; Shizukuishi, K; Kikuchi, T; Inoue, T; Jimbo, M; Ryuo, H; Bickel, C

    2003-01-01

    We have designed and developed a small field of view gamma camera, the eZ SCOPE, based on use of a CdZnTe semiconductor. This device utilises proprietary signal processing technology and an interface to a computer-based imaging system. The purpose of this study was to evaluate the performance of the eZ scope in comparison with currently employed gamma camera technology. The detector is a single wafer of 5-mm-thick CdZnTe that is divided into a 16 x 16 array (256 pixels). The sensitive area of the detector is a square of dimension 3.2 cm. Two parallel-hole collimators are provided with the system and have a matching (256 hole) pattern to the CdZnTe detector array: a low-energy, high-resolution parallel-hole (LEHR) collimator fabricated of lead and a low-energy, high-sensitivity parallel-hole (LEHS) collimator fabricated of tungsten. Performance measurements and the data analysis were done according to the procedures of the NEMA standard. We also studied the long-term stability of the system with continuous use...

  17. Design and Implementation of a Novel Portable 360° Stereo Camera System with Low-Cost Action Cameras

    Science.gov (United States)

    Holdener, D.; Nebiker, S.; Blaser, S.

    2017-11-01

    The demand for capturing indoor spaces is rising with the digitalization trend in the construction industry. An efficient solution for measuring challenging indoor environments is mobile mapping. Image-based systems with 360° panoramic coverage allow a rapid data acquisition and can be processed to georeferenced 3D images hosted in cloud-based 3D geoinformation services. For the multiview stereo camera system presented in this paper, a 360° coverage is achieved with a layout consisting of five horizontal stereo image pairs in a circular arrangement. The design is implemented as a low-cost solution based on a 3D printed camera rig and action cameras with fisheye lenses. The fisheye stereo system is successfully calibrated with accuracies sufficient for the applied measurement task. A comparison of 3D distances with reference data delivers maximal deviations of 3 cm on typical distances in indoor space of 2-8 m. Also the automatic computation of coloured point clouds from the stereo pairs is demonstrated.

  18. Photogrammetric Applications of Immersive Video Cameras

    Science.gov (United States)

    Kwiatek, K.; Tokarczyk, R.

    2014-05-01

    The paper investigates immersive videography and its application in close-range photogrammetry. Immersive video involves the capture of a live-action scene that presents a 360° field of view. It is recorded simultaneously by multiple cameras or microlenses, where the principal point of each camera is offset from the rotating axis of the device. This issue causes problems when stitching together individual frames of video separated from particular cameras; however, there are ways to overcome it, and applying immersive cameras in photogrammetry provides new potential. The paper presents two applications of immersive video in photogrammetry. First, the creation of a low-cost mobile mapping system based on Ladybug®3 and a GPS device is discussed. The number of panoramas is much too high for photogrammetric purposes, as the baseline between spherical panoramas is around 1 metre. More than 92 000 panoramas were recorded in the Polish region of Czarny Dunajec, and the measurements from panoramas enable the user to measure outdoor advertising structures and billboards. A new law is being created in order to limit the number of illegal advertising structures in the Polish landscape, and immersive video recorded in a short period of time is a candidate for economical and flexible off-site measurements. The second approach is the generation of 3D video-based reconstructions of heritage sites based on immersive video (structure from immersive video). A mobile camera mounted on a tripod dolly was used to record the interior scene, and the immersive video, separated into thousands of still panoramas, was converted from video into 3D objects using Agisoft Photoscan Professional. The findings from these experiments demonstrate that immersive photogrammetry is a flexible and prompt method of 3D modelling and provides promising features for mobile mapping systems.

  19. Self-Calibration Method Based on Surface Micromaching of Light Transceiver Focal Plane for Optical Camera

    Directory of Open Access Journals (Sweden)

    Jin Li

    2016-10-01

    Full Text Available In remote sensing photogrammetric applications, inner orientation parameter (IOP calibration of remote sensing camera is a prerequisite for determining image position. However, achieving such a calibration without temporal and spatial limitations remains a crucial but unresolved issue to date. The accuracy of IOP calibration methods of a remote sensing camera determines the performance of image positioning. In this paper, we propose a high-accuracy self-calibration method without temporal and spatial limitations for remote sensing cameras. Our method is based on an auto-collimating dichroic filter combined with a surface micromachining (SM point-source focal plane. The proposed method can autonomously complete IOP calibration without the need of outside reference targets. The SM procedure is used to manufacture a light transceiver focal plane, which integrates with point sources, a splitter, and a complementary metal oxide semiconductor sensor. A dichroic filter is used to fabricate an auto-collimation light reflection element. The dichroic filter, splitter, and SM point-source focal plane are integrated into a camera to perform an integrated self-calibration. Experimental measurements confirm the effectiveness and convenience of the proposed method. Moreover, the method can achieve micrometer-level precision and can satisfactorily complete real-time calibration without temporal or spatial limitations.

  20. Characterization of beam dynamics in the APS injector rings using time-resolved imaging techniques

    International Nuclear Information System (INIS)

    Yang, B.X.; Lumpkin, A.H.; Borland, M.

    1997-01-01

    Images taken with streak cameras and gated intensified cameras with both time (longitudinal) and spatial (transverse) resolution reveal a wealth of information about circular accelerators. The authors illustrate a novel technique with a sequence of dual-sweep streak camera images taken at a high-dispersion location in the booster synchrotron, where the horizontal coordinate is strongly correlated with the particle energy and the "top view" of the beam gives a good approximation to the particle density distribution in the longitudinal phase space. A sequence of top-view images taken right after injection clearly shows the beam dynamics in the phase space. We report another example from the positron accumulator ring for the characterization of its beam compression bunching with the 12th harmonic rf

  1. Comparative analysis of virus-derived small RNAs within cassava (Manihot esculenta Crantz) infected with cassava brown streak viruses.

    Science.gov (United States)

    Ogwok, Emmanuel; Ilyas, Muhammad; Alicai, Titus; Rey, Marie E C; Taylor, Nigel J

    2016-04-02

    Infection of plant cells by viral pathogens triggers RNA silencing, an innate antiviral defense mechanism. In response to infection, small RNAs (sRNAs) are produced that associate with Argonaute (AGO)-containing silencing complexes which act to inactivate viral genomes by posttranscriptional gene silencing (PTGS). Deep sequencing was used to compare virus-derived small RNAs (vsRNAs) in cassava genotypes NASE 3, TME 204 and 60444 infected with the positive sense single-stranded RNA (+ssRNA) viruses cassava brown streak virus (CBSV) and Ugandan cassava brown streak virus (UCBSV), the causal agents of cassava brown streak disease (CBSD). An abundance of 21-24nt vsRNAs was detected and mapped, covering the entire CBSV and UCBSV genomes. The 21nt vsRNAs were most predominant, followed by the 22 nt class with a slight bias toward sense compared to antisense polarity, and a bias for adenine and uracil bases present at the 5'-terminus. Distribution and frequency of vsRNAs differed between cassava genotypes and viral genomes. In susceptible genotypes TME 204 and 60444, CBSV-derived sRNAs were seen in greater abundance than UCBSV-derived sRNAs. NASE 3, known to be resistant to UCBSV, accumulated negligible UCBSV-derived sRNAs but high populations of CBSV-derived sRNAs. Transcript levels of cassava homologues of AGO2, DCL2 and DCL4, which are central to the gene-silencing complex, were found to be differentially regulated in CBSV- and UCBSV-infected plants across genotypes, suggesting these proteins play a role in antiviral defense. Irrespective of genotype or viral pathogen, maximum populations of vsRNAs mapped to the cytoplasmic inclusion, P1 and P3 protein-encoding regions. Our results indicate disparity between CBSV and UCBSV host-virus interaction mechanisms, and provide insight into the role of virus-induced gene silencing as a mechanism of resistance to CBSD. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  2. Camera-Based Lock-in and Heterodyne Carrierographic Photoluminescence Imaging of Crystalline Silicon Wafers

    Science.gov (United States)

    Sun, Q. M.; Melnikov, A.; Mandelis, A.

    2015-06-01

    Carrierographic (spectrally gated photoluminescence) imaging of a crystalline silicon wafer using an InGaAs camera and two spread super-bandgap illumination laser beams is introduced in both low-frequency lock-in and high-frequency heterodyne modes. Lock-in carrierographic images of the wafer up to 400 Hz modulation frequency are presented. To overcome the frame rate and exposure time limitations of the camera, a heterodyne method is employed for high-frequency carrierographic imaging which results in high-resolution near-subsurface information. The feasibility of the method is guaranteed by the typical superlinearity behavior of photoluminescence, which allows one to construct a slow enough beat frequency component from nonlinear mixing of two high frequencies. Intensity-scan measurements were carried out with a conventional single-element InGaAs detector photocarrier radiometry system, and the nonlinearity exponent of the wafer was found to be around 1.7. Heterodyne images of the wafer up to 4 kHz have been obtained and qualitatively analyzed. With the help of the complementary lock-in and heterodyne modes, camera-based carrierographic imaging in a wide frequency range has been realized for fundamental research and industrial applications toward in-line nondestructive testing of semiconductor materials and devices.
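
    The heterodyne principle relied on here, a superlinear photoluminescence response mixing two high modulation frequencies into a slow beat, can be checked numerically with a toy script such as the following; the frequencies and sampling rate are illustrative assumptions, while the 1.7 exponent matches the value reported above.

        # Toy illustration: PL ~ I^1.7 mixes f1 and f2 and creates a beat at |f1 - f2|.
        import numpy as np

        fs = 100_000.0                         # sampling rate, Hz (assumed)
        t = np.arange(0, 0.5, 1.0 / fs)
        f1, f2 = 4000.0, 4010.0                # two high modulation frequencies (assumed)
        excitation = (1 + 0.5 * np.sin(2 * np.pi * f1 * t)
                        + 0.5 * np.sin(2 * np.pi * f2 * t))
        pl = excitation ** 1.7                 # superlinear photoluminescence response
        spectrum = np.abs(np.fft.rfft(pl - pl.mean()))
        freqs = np.fft.rfftfreq(len(pl), 1.0 / fs)
        # the dominant low-frequency component appears at the 10 Hz beat
        print("beat component at %.0f Hz" % freqs[spectrum[:1000].argmax()])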

  3. Ultraviolet Imaging with Low Cost Smartphone Sensors: Development and Application of a Raspberry Pi-Based UV Camera

    Directory of Open Access Journals (Sweden)

    Thomas C. Wilkes

    2016-10-01

    Full Text Available Here, we report, for what we believe to be the first time, on the modification of a low cost sensor, designed for the smartphone camera market, to develop an ultraviolet (UV camera system. This was achieved via adaptation of Raspberry Pi cameras, which are based on back-illuminated complementary metal-oxide semiconductor (CMOS sensors, and we demonstrated the utility of these devices for applications at wavelengths as low as 310 nm, by remotely sensing power station smokestack emissions in this spectral region. Given the very low cost of these units, ≈ USD 25, they are suitable for widespread proliferation in a variety of UV imaging applications, e.g., in atmospheric science, volcanology, forensics and surface smoothness measurements.

  4. Laser fusion diagnostics

    International Nuclear Information System (INIS)

    Coleman, L.W.

    1978-01-01

    The current status of the capability of laser fusion diagnostics is reviewed. Optical and infrared streak cameras provide time-resolution measurement capability of less than 10 ps, while x-ray streak cameras presently provide 15 ps time resolution in the range of about 1-30 keV. Time-integrated spatial resolutions of 1 μm are provided with a variety of optical techniques. Ultraviolet holographic interferometry has measured electron densities above 10²¹ cm⁻³ with 1 μm spatial resolution and 15 ps temporal resolution. X-ray microscopes provide 3 μm time-integrated resolution and the x-ray streak pinhole camera has 6 μm spatial resolution. Development of the framing camera has thus far provided 50 μm spatial resolution with 125 ps frame duration, and the third-order reconstruction of zone plate images has provided 3 μm resolution for alpha particles. Time-integrated measurements of x-rays span the range shown. Finally, the new Shiva neutron spectrometer increases the energy-resolution capability of that technique to 25 keV for 14-MeV neutrons. These combined capabilities provide a unique set of diagnostics for the detailed measurement of the interaction of laser light with targets and the subsequent performance of those targets

  5. Bunch length measurements in the SLC damping ring

    International Nuclear Information System (INIS)

    Decker, F.J.; Limberg, T.; Minty, M.; Ross, M.

    1993-05-01

    The synchrotron light of the SLC damping ring was used to measure the bunch length with a streak camera at different times in the damping cycle. There are bunch length oscillations after injection, different equilibrium length during the cycle due to rf manipulations to avoid microwave instability oscillations, and just before extraction there is a longitudinal phase space rotation (bunch muncher) to shorten the bunch length. Measurements under these different conditions are presented and compared with BPM pulse height signals. Calibration and adjustment issues and the connection of the streak camera to the SLC control system are also discussed

  6. Lunar Reconnaissance Orbiter Camera (LROC) instrument overview

    Science.gov (United States)

    Robinson, M.S.; Brylow, S.M.; Tschimmel, M.; Humm, D.; Lawrence, S.J.; Thomas, P.C.; Denevi, B.W.; Bowman-Cisneros, E.; Zerr, J.; Ravine, M.A.; Caplinger, M.A.; Ghaemi, F.T.; Schaffner, J.A.; Malin, M.C.; Mahanti, P.; Bartels, A.; Anderson, J.; Tran, T.N.; Eliason, E.M.; McEwen, A.S.; Turtle, E.; Jolliff, B.L.; Hiesinger, H.

    2010-01-01

    The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) and Narrow Angle Cameras (NACs) are on the NASA Lunar Reconnaissance Orbiter (LRO). The WAC is a 7-color push-frame camera (100 and 400 m/pixel visible and UV, respectively), while the two NACs are monochrome narrow-angle linescan imagers (0.5 m/pixel). The primary mission of LRO is to obtain measurements of the Moon that will enable future lunar human exploration. The overarching goals of the LROC investigation include landing site identification and certification, mapping of permanently polar shadowed and sunlit regions, meter-scale mapping of polar regions, global multispectral imaging, a global morphology base map, characterization of regolith properties, and determination of current impact hazards.

  7. Mobile phone camera benchmarking: combination of camera speed and image quality

    Science.gov (United States)

    Peltoketo, Veli-Tapani

    2014-01-01

    When a mobile phone camera is tested and benchmarked, the significance of quality metrics is widely acknowledged. There are also existing methods to evaluate the camera speed. For example, ISO 15781 defines several measurements to evaluate various camera system delays. However, the speed or rapidity metrics of the mobile phone's camera system have not been used together with the quality metrics, even though camera speed has become an increasingly important camera performance feature. There are several tasks in this work. Firstly, the most important image quality metrics are collected from standards and papers. Secondly, the speed-related metrics of a mobile phone's camera system are collected from standards and papers, and novel speed metrics are also identified. Thirdly, combinations of the quality and speed metrics are validated using mobile phones on the market. The measurements are made through the application programming interfaces of the different operating systems. Finally, the results are evaluated and conclusions are made. The result of this work gives detailed benchmarking results of mobile phone camera systems on the market. The paper also defines a proposal for combined benchmarking metrics, which includes both quality and speed parameters.

  8. A new apparatus for track-analysis in nuclear track emulsion based on a CCD-camera device

    International Nuclear Information System (INIS)

    Ganssauge, E.

    1993-01-01

    A CCD camera-based, image-analyzing system for automatic evaluation of nuclear track emulsion chambers is presented. The stage of a normal microscope moves using three remote-controlled stepping motors with a step size of 0.25 μm. A CCD camera is mounted on top of the microscope in order to register the nuclear emulsion. The camera has a resolution capable of differentiating single emulsion grains (0.6 μm). The camera picture is transformed from analogue to digital signals and stored by a frame grabber. Some background picture elements can be eliminated by applying cuts on grey levels. The central computer processes the picture and correlates the single picture points, the coordinates and the grey levels, such that in the end one has a unique assignment of each picture point to an address on the hard disk for a given plate. After repetition of this procedure for several plates, by means of appropriate software (for instance our vertex program [1]), the coordinates of the points are combined into tracks, and a variety of distributions such as pseudorapidity distributions can be calculated and presented on the terminal. (author)

  9. Construction of Double Right-Border Binary Vector Carrying Non-Host Gene Rxol Resistant to Bacterial Leaf Streak of Rice

    Institute of Scientific and Technical Information of China (English)

    Xu Mei-rong; XIA Zhi-hui; ZHAI Wen-xue; XU Jian-long; ZHOU Yong-li; LI Zhi-kang

    2008-01-01

    Rxol cloned from maize is a non-host gene resistant to bacterial leaf streak of rice. pCAMBIA1305-1 with Rxol was digested with Sca Ⅰ and NgoM Ⅳ, and the double right-border binary vector pMNDRBBin6 was digested with Hpa Ⅰ and Xma Ⅰ. pMNDRBBin6 carrying the gene Rxol was obtained by ligation of blunt and cohesive ends. The results of PCR, restriction enzyme analysis and sequencing indicated that the Rxol gene had been cloned into pMNDRBBin6. This double right-border binary vector, named pMNDRBBin6-Rxol, will play a role in breeding marker-free plants resistant to bacterial leaf streak of rice by genetic transformation.

  10. Nuclear Radiation Degradation Study on HD Camera Based on CMOS Image Sensor at Different Dose Rates.

    Science.gov (United States)

    Wang, Congzheng; Hu, Song; Gao, Chunming; Feng, Chang

    2018-02-08

    In this work, we irradiated a high-definition (HD) industrial camera based on a commercial-off-the-shelf (COTS) CMOS image sensor (CIS) with Cobalt-60 gamma-rays. All components of the camera under test were fabricated without radiation hardening, except for the lens. The irradiation experiments on the HD camera under biased conditions were carried out at 1.0, 10.0, 20.0, 50.0 and 100.0 Gy/h. During the experiment, we found that the tested camera showed remarkable degradation after irradiation, and the degradation differed with dose rate. With the increase of dose rate, images of the same target become brighter. Under the same dose rate, the radiation effect in bright areas is lower than that in dark areas. Under different dose rates, the higher the dose rate, the worse the radiation effect in both bright and dark areas, and the standard deviations of the bright and dark areas become greater. Furthermore, through progressive degradation analysis of the captured images, experimental results demonstrate that the attenuation of signal-to-noise ratio (SNR) versus radiation time is not obvious at a fixed dose rate, and the degradation becomes more and more serious with increasing dose rate. Additionally, the decrease rate of SNR at 20.0, 50.0 and 100.0 Gy/h is far greater than that at 1.0 and 10.0 Gy/h. Even so, we confirm that the HD industrial camera was still working at 10.0 Gy/h during the 8 h of measurements, with a moderate decrease of the SNR (5 dB). The work is valuable and can provide suggestions for camera users in radiation fields.

  11. CamOn: A Real-Time Autonomous Camera Control System

    DEFF Research Database (Denmark)

    Burelli, Paolo; Jhala, Arnav Harish

    2009-01-01

    This demonstration presents CamOn, an autonomous camera control system for real-time 3D games. CamOn employs multiple Artificial Potential Fields (APFs), a robot motion planning technique, to control both the location and orientation of the camera. Scene geometry from the 3D environment contributes to the potential field that is used to determine position and movement of the camera. Composition constraints for the camera are modelled as potential fields for controlling the view target of the camera. CamOn combines the compositional benefits of constraint-based camera systems, and improves...
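
    A minimal 2D illustration of the artificial-potential-field idea behind CamOn is sketched below; the gains, influence distance and gradient-step update are assumptions for illustration, not the system's actual formulation.

        # Sketch: one camera-update step driven by an attractive potential toward
        # a desired viewpoint and repulsive potentials around scene geometry.
        import numpy as np

        def apf_step(cam, goal, obstacles, k_att=1.0, k_rep=0.5, d0=2.0, lr=0.1):
            """cam, goal: 2D positions; obstacles: list of 2D obstacle positions."""
            force = k_att * (goal - cam)                   # attractive component
            for obs in obstacles:
                d = np.linalg.norm(cam - obs)
                if 0 < d < d0:                             # repulsion only when close
                    force += k_rep * (1.0 / d - 1.0 / d0) * (cam - obs) / d**3
            return cam + lr * force                        # move camera along the force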

  12. Normalized Metadata Generation for Human Retrieval Using Multiple Video Surveillance Cameras

    Directory of Open Access Journals (Sweden)

    Jaehoon Jung

    2016-06-01

    Full Text Available Since it is impossible for surveillance personnel to keep monitoring videos from a multiple camera-based surveillance system, an efficient technique is needed to help recognize important situations by retrieving the metadata of an object-of-interest. In a multiple camera-based surveillance system, an object detected in one camera has a different shape in another camera, which is a critical issue for wide-range, real-time surveillance systems. In order to address the problem, this paper presents an object retrieval method that extracts the normalized metadata of an object-of-interest from multiple, heterogeneous cameras. The proposed metadata generation algorithm consists of three steps: (i) generation of a three-dimensional (3D) human model; (ii) human object-based automatic scene calibration; and (iii) metadata generation. More specifically, an appropriately generated 3D human model provides the foot-to-head direction information that is used as the input of the automatic calibration of each camera. The normalized object information is used to retrieve an object-of-interest in a wide-range, multiple-camera surveillance system in the form of metadata. Experimental results show that the 3D human model matches the ground truth, and automatic calibration-based normalization of metadata enables successful retrieval and tracking of a human object in the multiple-camera video surveillance system.

  13. Statistical iterative reconstruction for streak artefact reduction when using multidetector CT to image the dento-alveolar structures.

    Science.gov (United States)

    Dong, J; Hayakawa, Y; Kober, C

    2014-01-01

    When metallic prosthetic appliances and dental fillings exist in the oral cavity, the appearance of metal-induced streak artefacts is unavoidable in CT images. The aim of this study was to develop a method for artefact reduction using statistical reconstruction on multidetector row CT images. Adjacent CT images often depict similar anatomical structures. Therefore, images with weak artefacts were reconstructed using the projection data of an artefact-free image in a neighbouring thin slice. Images with moderate and strong artefacts were then processed continuously in sequence by successive iterative restoration, where the projection data were generated from the adjacent reconstructed slice. First, the basic maximum likelihood-expectation maximization algorithm was applied. Next, the ordered subset-expectation maximization algorithm was examined. Alternatively, a small region of interest setting was designated. Finally, a general-purpose graphics processing unit was applied in both situations. The algorithms reduced the metal-induced streak artefacts on multidetector row CT images when the sequential processing method was applied. The ordered subset-expectation maximization and the small region of interest reduced the processing duration without apparent detriment. The general-purpose graphics processing unit realized high performance. A statistical reconstruction method was applied for streak artefact reduction. The alternative algorithms applied were effective. Both software and hardware tools, such as ordered subset-expectation maximization, a small region of interest and a general-purpose graphics processing unit, achieved fast artefact correction.
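
    For orientation, the basic maximum likelihood-expectation maximization update used as the starting point above can be written for a toy linear system as follows; this is a hedged sketch, and the real CT geometry, ordered subsets and GPU implementation are not shown.

        # Sketch: MLEM for a toy system, A is the system matrix, y the measured
        # projections, x the image estimate updated multiplicatively each iteration.
        import numpy as np

        def mlem(A, y, n_iter=50):
            x = np.ones(A.shape[1])                    # non-negative initial image
            sensitivity = A.sum(axis=0)                # column sums, normalization term
            for _ in range(n_iter):
                projection = A @ x
                ratio = y / np.maximum(projection, 1e-12)
                x *= (A.T @ ratio) / np.maximum(sensitivity, 1e-12)
            return x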

  14. Passive auto-focus for digital still cameras and camera phones: Filter-switching and low-light techniques

    Science.gov (United States)

    Gamadia, Mark Noel

    In order to gain valuable market share in the growing consumer digital still camera and camera phone market, camera manufacturers have to continually add and improve existing features in their latest product offerings. Auto-focus (AF) is one such feature, whose aim is to enable consumers to quickly take sharply focused pictures with little or no manual intervention in adjusting the camera's focus lens. While AF has been a standard feature in digital still and cell-phone cameras, consumers often complain about their cameras' slow AF performance, which may lead to missed photographic opportunities, rendering valuable moments and events as undesired out-of-focus pictures. This dissertation addresses this critical issue to advance the state of the art in the digital band-pass filter based passive AF method. This method is widely used to realize AF in the camera industry, where a focus actuator is adjusted via a search algorithm to locate the in-focus position by maximizing a sharpness measure extracted from a particular frequency band of the incoming image of the scene. There are no known systematic methods for automatically deriving parameters such as the digital pass-bands or the search step-size increments used in existing passive AF schemes. Conventional methods require time-consuming experimentation and tuning in order to arrive at a set of parameters which balance AF performance in terms of speed and accuracy, ultimately causing a delay in product time-to-market. This dissertation presents a new framework for determining an optimal set of passive AF parameters, named Filter-Switching AF, providing an automatic approach to achieve superior AF performance, both in good and low lighting conditions, based on the following performance measures (metrics): speed (total number of iterations), accuracy (offset from truth), power consumption (total distance moved), and user experience (in-focus position overrun). Performance results using three different prototype cameras
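
    A generic band-pass passive AF loop of the kind the dissertation improves on might be sketched as below; the difference-of-Gaussians band-pass, the step schedule and the early-stop rule are illustrative assumptions, not the optimized Filter-Switching parameters.

        # Sketch: a sharpness score from a band-passed image drives a simple
        # sweep-and-stop search over focus-lens positions.
        import numpy as np
        from scipy.ndimage import gaussian_filter

        def sharpness(image, low_sigma=1.0, high_sigma=3.0):
            band = gaussian_filter(image, low_sigma) - gaussian_filter(image, high_sigma)
            return float((band ** 2).sum())            # energy in the pass-band

        def autofocus(capture_at, positions):
            """capture_at(p) returns an image at focus-lens position p (assumed)."""
            best_pos, best_score = positions[0], -1.0
            for p in positions:                        # coarse sweep over lens positions
                score = sharpness(capture_at(p).astype(float))
                if score > best_score:
                    best_pos, best_score = p, score
                elif score < 0.7 * best_score:         # well past the peak, stop early
                    break
            return best_pos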

  15. DistancePPG: Robust non-contact vital signs monitoring using a camera

    Science.gov (United States)

    Kumar, Mayank; Veeraraghavan, Ashok; Sabharwal, Ashutosh

    2015-01-01

    Vital signs such as pulse rate and breathing rate are currently measured using contact probes. However, non-contact methods for measuring vital signs are desirable both in hospital settings (e.g. in the NICU) and for ubiquitous in-situ health tracking (e.g. on mobile phones and computers with webcams). Recently, camera-based non-contact vital sign monitoring has been shown to be feasible. However, camera-based vital sign monitoring is challenging for people with darker skin tone, under low lighting conditions, and/or during movement of an individual in front of the camera. In this paper, we propose distancePPG, a new camera-based vital sign estimation algorithm which addresses these challenges. DistancePPG proposes a new method of combining skin-color change signals from different tracked regions of the face using a weighted average, where the weights depend on the blood perfusion and incident light intensity in the region, to improve the signal-to-noise ratio (SNR) of the camera-based estimate. One of our key contributions is a new automatic method for determining the weights based only on the video recording of the subject. The gains in SNR of camera-based PPG estimated using distancePPG translate into a reduction of the error in vital sign estimation, and thus expand the scope of camera-based vital sign monitoring to potentially challenging scenarios. Further, a dataset will be released, comprising synchronized video recordings of the face and pulse oximeter based ground truth recordings from the earlobe for people with different skin tones, under different lighting conditions and for various motion scenarios. PMID:26137365
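
    The weighted-average combination at the heart of distancePPG can be approximated with the sketch below; using band-limited spectral power as the per-region weight is an assumption standing in for the paper's perfusion- and intensity-based weights.

        # Sketch: combine per-region color-change traces into one PPG signal,
        # weighting each region by a crude SNR proxy in the pulse-rate band.
        import numpy as np

        def combine_regions(region_signals, fs):
            """region_signals: R x T array of mean green-channel traces; fs: frame rate."""
            freqs = np.fft.rfftfreq(region_signals.shape[1], 1.0 / fs)
            band = (freqs > 0.7) & (freqs < 4.0)       # plausible pulse-rate band (assumed)
            centered = region_signals - region_signals.mean(axis=1, keepdims=True)
            spectra = np.abs(np.fft.rfft(centered, axis=1))
            weights = spectra[:, band].max(axis=1)     # per-region signal-strength proxy
            weights = weights / weights.sum()
            return weights @ region_signals            # weighted-average PPG signal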

  16. Low-cost uncooled VOx infrared camera development

    Science.gov (United States)

    Li, Chuan; Han, C. J.; Skidmore, George D.; Cook, Grady; Kubala, Kenny; Bates, Robert; Temple, Dorota; Lannon, John; Hilton, Allan; Glukh, Konstantin; Hardy, Busbee

    2013-06-01

    The DRS Tamarisk® 320 camera, introduced in 2011, is a low cost commercial camera based on the 17 µm pixel pitch 320×240 VOx microbolometer technology. A higher resolution 17 µm pixel pitch 640×480 Tamarisk®640 has also been developed and is now in production serving the commercial markets. Recently, under the DARPA sponsored Low Cost Thermal Imager-Manufacturing (LCTI-M) program and an internal project, DRS is leading a team of industrial experts from FiveFocal, RTI International and MEMSCAP to develop a small form factor uncooled infrared camera for the military and commercial markets. The objective of the DARPA LCTI-M program is to develop a low SWaP (size, weight and power) camera that costs less than US $500 based on a 10,000 units per month production rate. To meet this challenge, DRS is developing several innovative technologies including a small pixel pitch 640×512 VOx uncooled detector, an advanced digital ROIC and low power miniature camera electronics. In addition, DRS and its partners are developing innovative manufacturing processes to reduce production cycle time and costs, including wafer-scale optics and vacuum packaging manufacturing and a 3-dimensional integrated camera assembly. This paper provides an overview of the DRS Tamarisk® project and LCTI-M related uncooled technology development activities. Highlights of recent progress and challenges will also be discussed. It should be noted that BAE Systems and Raytheon Vision Systems are also participants of the DARPA LCTI-M program.

  17. The GISMO-2 Bolometer Camera

    Science.gov (United States)

    Staguhn, Johannes G.; Benford, Dominic J.; Fixsen, Dale J.; Hilton, Gene; Irwin, Kent D.; Jhabvala, Christine A.; Kovacs, Attila; Leclercq, Samuel; Maher, Stephen F.; Miller, Timothy M.; hide

    2012-01-01

    We present the concept for the GISMO-2 bolometer camera, which we build for background-limited operation at the IRAM 30 m telescope on Pico Veleta, Spain. GISMO-2 will operate simultaneously in the 1 mm and 2 mm atmospheric windows. The 1 mm channel uses a 32 x 40 TES-based Backshort Under Grid (BUG) bolometer array, the 2 mm channel operates with a 16 x 16 BUG array. The camera utilizes almost the entire field of view provided by the telescope. The optical design of GISMO-2 was strongly influenced by our experience with the GISMO 2 mm bolometer camera, which is successfully operating at the 30 m telescope. GISMO is accessible to the astronomical community through the regular IRAM call for proposals.

  18. Bio-inspired motion detection in an FPGA-based smart camera module

    International Nuclear Information System (INIS)

    Koehler, T; Roechter, F; Moeller, R; Lindemann, J P

    2009-01-01

    Flying insects, despite their relatively coarse vision and tiny nervous system, are capable of carrying out elegant and fast aerial manoeuvres. Studies of the fly visual system have shown that this is accomplished by the integration of signals from a large number of elementary motion detectors (EMDs) in just a few global flow detector cells. We developed an FPGA-based smart camera module with more than 10 000 single EMDs, which is closely modelled after insect motion-detection circuits with respect to overall architecture, resolution and inter-receptor spacing. Input to the EMD array is provided by a CMOS camera with a high frame rate. Designed as an adaptable solution for different engineering applications and as a testbed for biological models, the EMD detector type and parameters such as the EMD time constants, the motion-detection directions and the angle between correlated receptors are reconfigurable online. This allows a flexible and simultaneous detection of complex motion fields such as translation, rotation and looming, such that various tasks, e.g., obstacle avoidance, height/distance control or speed regulation, can be performed by the same compact device.
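
    The sketch below shows the basic correlation-type (Hassenstein-Reichardt) EMD that such arrays are built from: each detector multiplies a delayed (low-pass-filtered) signal from one receptor with the undelayed signal of its neighbour and subtracts the mirror-image term. The first-order low-pass delay and its time constant are simplifying assumptions; the FPGA module's actual filter types and parameters are reconfigurable, as described above.

```python
import numpy as np

def emd_array(frames, tau=3.0):
    """Correlation-type EMD responses for a 1-D photoreceptor row over time.
    frames: array of shape (T, N) with receptor intensities.
    tau: low-pass time constant in frames (an illustrative value)."""
    T, N = frames.shape
    alpha = 1.0 / tau
    lp = np.zeros(N)                     # low-pass (delayed) channel per receptor
    responses = np.zeros((T, N - 1))     # one detector per adjacent receptor pair
    for t in range(T):
        lp = lp + alpha * (frames[t] - lp)            # first-order low-pass acts as the delay
        # delayed left receptor * undelayed right neighbour, minus the mirror-image term
        responses[t] = lp[:-1] * frames[t, 1:] - frames[t, :-1] * lp[1:]
    return responses                     # positive = motion toward higher index, negative = opposite
```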

  19. High resolution RGB color line scan camera

    Science.gov (United States)

    Lynch, Theodore E.; Huettig, Fred

    1998-04-01

    A color line scan camera family which is available with either 6000, 8000 or 10000 pixels/color channel, utilizes off-the-shelf lenses, interfaces with currently available frame grabbers, includes on-board pixel-by-pixel offset correction, and is configurable and controllable via an RS232 serial port for computer-controlled or stand-alone operation is described in this paper. This line scan camera is based on an available 8000 element monochrome line scan camera designed by AOA for OEM use. The new color version includes improvements such as better packaging and additional user features which make the camera easier to use. The heart of the camera is a tri-linear CCD sensor with on-chip color balancing for maximum accuracy and pinned photodiodes for low lag response. Each color channel is digitized to 12 bits and all three channels are multiplexed together so that the resulting camera output video is either a 12 or 8 bit data stream at a rate of up to 24 Megapixels/sec. Conversion from 12 to 8 bit, or user-defined gamma, is accomplished by on-board user-defined video look-up tables. The camera has two user-selectable operating modes: a low-speed, high-sensitivity mode or a high-speed, reduced-sensitivity mode. The intended uses of the camera include industrial inspection, digital archiving, document scanning, and graphic arts applications.
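
    The 12-to-8 bit conversion via a video look-up table amounts to precomputing one output code for every possible 12-bit input code, so that applying a gamma curve to a whole scan line becomes a single table lookup. The sketch below builds such a LUT in Python; the gamma value of 2.2 is only a typical example, since the camera lets the user load arbitrary tables.

```python
import numpy as np

def build_gamma_lut(gamma=2.2, in_bits=12, out_bits=8):
    """Look-up table mapping 12-bit sensor codes to 8-bit output with a gamma curve.
    gamma=2.2 is a typical display value, used here only as an example."""
    in_max, out_max = (1 << in_bits) - 1, (1 << out_bits) - 1
    codes = np.arange(in_max + 1, dtype=np.float64) / in_max
    return np.round(out_max * codes ** (1.0 / gamma)).astype(np.uint8)

# Applying the LUT to a 12-bit scan line is a single indexing operation:
lut = build_gamma_lut()
line12 = np.random.randint(0, 4096, size=8000, dtype=np.uint16)  # one simulated 8000-pixel line
line8 = lut[line12]
```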

  20. Comparative evaluation of consumer grade cameras and mobile phone cameras for close range photogrammetry

    Science.gov (United States)

    Chikatsu, Hirofumi; Takahashi, Yoji

    2009-08-01

    The authors have been concentrating on developing convenient 3D measurement methods using consumer grade digital cameras, and have concluded that consumer grade digital cameras are expected to become useful photogrammetric devices in various close range application fields. On the other hand, mobile phone cameras with 10 megapixels have appeared on the market in Japan. In these circumstances, we are faced with the epoch-making question of whether mobile phone cameras are able to take the place of consumer grade digital cameras in close range photogrammetric applications. In order to evaluate the potential of mobile phone cameras in close range photogrammetry, a comparative evaluation between mobile phone cameras and consumer grade digital cameras is investigated in this paper with respect to lens distortion, reliability, stability and robustness. Calibration tests for 16 mobile phone cameras and 50 consumer grade digital cameras were conducted indoors using a test target. Furthermore, the practicality of mobile phone cameras for close range photogrammetry was evaluated outdoors. This paper shows that mobile phone cameras are able to take the place of consumer grade digital cameras and to develop the market in digital photogrammetric fields.
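
    Calibration tests of this kind typically recover the interior orientation (focal length, principal point) and the lens distortion coefficients from images of a known target. The sketch below uses OpenCV's checkerboard-based calibration as a stand-in; the checkerboard geometry, the image folder and the square size are hypothetical, since the paper's actual test target is not described here.

```python
import glob
import cv2
import numpy as np

# Checkerboard geometry is an assumption; the paper's actual test target is not specified here.
pattern = (9, 6)                       # inner corners per row and column
square = 25.0                          # square size in mm
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_points, img_points, size = [], [], None
for path in glob.glob("calib_images/*.jpg"):   # hypothetical folder of target photos
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, pattern, None)
    if found:
        obj_points.append(objp)
        img_points.append(corners)
        size = gray.shape[::-1]

# Returns the camera matrix K (focal lengths, principal point) and the
# radial/tangential distortion coefficients, plus the RMS reprojection error.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_points, img_points, size, None, None)
print("RMS reprojection error:", rms)
print("Distortion coefficients:", dist.ravel())
```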

  1. The GCT camera for the Cherenkov Telescope Array

    Science.gov (United States)

    Lapington, J. S.; Abchiche, A.; Allan, D.; Amans, J.-P.; Armstrong, T. P.; Balzer, A.; Berge, D.; Boisson, C.; Bousquet, J.-J.; Bose, R.; Brown, A. M.; Bryan, M.; Buchholtz, G.; Buckley, J.; Chadwick, P. M.; Costantini, H.; Cotter, G.; Daniel, M. K.; De Franco, A.; De Frondat, F.; Dournaux, J.-L.; Dumas, D.; Ernenwein, J.-P.; Fasola, G.; Funk, S.; Gironnet, J.; Graham, J. A.; Greenshaw, T.; Hervet, O.; Hidaka, N.; Hinton, J. A.; Huet, J.-M.; Jankowsky, D.; Jegouzo, I.; Jogler, T.; Kawashima, T.; Kraus, M.; Laporte, P.; Leach, S.; Lefaucheur, J.; Markoff, S.; Melse, T.; Minaya, I. A.; Mohrmann, L.; Molyneux, P.; Moore, P.; Nolan, S. J.; Okumura, A.; Osborne, J. P.; Parsons, R. D.; Rosen, S.; Ross, D.; Rowell, G.; Rulten, C. B.; Sato, Y.; Sayede, F.; Schmoll, J.; Schoorlemmer, H.; Servillat, M.; Sol, H.; Stamatescu, V.; Stephan, M.; Stuik, R.; Sykes, J.; Tajima, H.; Thornhill, J.; Tibaldo, L.; Trichard, C.; Varner, G.; Vink, J.; Watson, J. J.; White, R.; Yamane, N.; Zech, A.; Zink, A.; Zorn, J.; CTA Consortium

    2017-12-01

    The Gamma Cherenkov Telescope (GCT) is one of the designs proposed for the Small Sized Telescope (SST) section of the Cherenkov Telescope Array (CTA). The GCT uses dual-mirror optics, resulting in a compact telescope with good image quality and a large field of view with a smaller, more economical, camera than is achievable with conventional single mirror solutions. The photon counting GCT camera is designed to record the flashes of atmospheric Cherenkov light from gamma and cosmic ray initiated cascades, which last only a few tens of nanoseconds. The GCT optics require that the camera detectors follow a convex surface with a radius of curvature of 1 m and a diameter of 35 cm, which is approximated by tiling the focal plane with 32 modules. The first camera prototype is equipped with multi-anode photomultipliers, each comprising an 8×8 array of 6×6 mm2 pixels to provide the required angular scale, adding up to 2048 pixels in total. Detector signals are shaped, amplified and digitised by electronics based on custom ASICs that provide digitisation at 1 GSample/s. The camera is self-triggering, retaining images where the focal plane light distribution matches predefined spatial and temporal criteria. The electronics are housed in the liquid-cooled, sealed camera enclosure. LED flashers at the corners of the focal plane provide a calibration source via reflection from the secondary mirror. The first GCT camera prototype underwent preliminary laboratory tests last year. In November 2015, the camera was installed on a prototype GCT telescope (SST-GATE) in Paris and was used to successfully record the first Cherenkov light of any CTA prototype, and the first Cherenkov light seen with such a dual-mirror optical system. A second full-camera prototype based on Silicon Photomultipliers is under construction. Up to 35 GCTs are envisaged for CTA.

  2. New Stereo Vision Digital Camera System for Simultaneous Measurement of Cloud Base Height and Atmospheric Visibility

    Science.gov (United States)

    Janeiro, F. M.; Carretas, F.; Palma, N.; Ramos, P. M.; Wagner, F.

    2013-12-01

    Clouds play an important role in many aspects of everyday life. They affect both the local weather as well as the global climate and are an important parameter in climate change studies. Cloud parameters are also important for weather prediction models which make use of actual measurements. It is thus important to have low-cost instrumentation that can be deployed in the field to measure those parameters. This kind of instrument should also be automated and robust since it may be deployed in remote places and be subject to adverse weather conditions. Although clouds are very important in environmental systems, they are also an essential component of airplane safety when visual flight rules (VFR) are enforced, such as in most small aerodromes where it is not economically viable to install instruments for assisted flying. Under VFR there are strict limits on the height of the cloud base, cloud cover and atmospheric visibility that ensure the safety of the pilots and planes. Although there are instruments available in the market to measure those parameters, their relatively high cost makes them unavailable in many local aerodromes. In this work we present a new prototype which has been recently developed and deployed in a local aerodrome as proof of concept. It is composed of two digital cameras that capture photographs of the sky and allow the measurement of the cloud height from the parallax effect. The new developments consist of a new geometry which allows the simultaneous measurement of cloud base height, wind speed at cloud base height and atmospheric visibility, which was not previously possible with only two cameras. The new orientation of the cameras comes at the cost of a more complex geometry to measure the cloud base height. The atmospheric visibility is calculated from the Lambert-Beer law after the measurement of the contrast between a set of dark objects and the background sky. The prototype includes the latest hardware developments that
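
    The two underlying relations are simple enough to sketch. In the idealized fronto-parallel case, cloud-base height follows from stereo parallax as Z = f·B/d, and visibility follows from Lambert-Beer attenuation of the apparent contrast of a dark object against the horizon sky. Both functions below use this textbook geometry and the conventional 5% contrast threshold as assumptions; the prototype's tilted-camera geometry described above is more involved.

```python
import numpy as np

def cloud_base_height(disparity_px, baseline_m, focal_px):
    """Stereo parallax height estimate for two near-vertical cameras separated by a baseline.
    Uses the simple fronto-parallel relation Z = f * B / d; the paper's actual geometry
    is more complex (tilted cameras), so this is only the textbook case."""
    return focal_px * baseline_m / disparity_px

def visibility_from_contrast(apparent_contrast, target_distance_m, threshold=0.05):
    """Meteorological visibility from the apparent contrast of a dark target against the
    background sky, using Lambert-Beer attenuation. The 5% contrast threshold is the
    usual convention, taken here as an assumption."""
    extinction = -np.log(apparent_contrast) / target_distance_m   # per metre
    return -np.log(threshold) / extinction

# Example: 3-pixel disparity, 2 m baseline, 1500-pixel focal length -> 1000 m cloud base
print(cloud_base_height(3.0, 2.0, 1500.0))
```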

  3. Calibration and verification of thermographic cameras for geometric measurements

    Science.gov (United States)

    Lagüela, S.; González-Jorge, H.; Armesto, J.; Arias, P.

    2011-03-01

    Infrared thermography is a technique with an increasing degree of development and applications. Quality assessment of the measurements performed with thermal cameras should be achieved through metrological calibration and verification. Infrared cameras acquire temperature and geometric information, although calibration and verification procedures are only usual for thermal data. Black bodies are used for these purposes. Moreover, the geometric information is important for many fields such as architecture, civil engineering and industry. This work presents a calibration procedure that allows the photogrammetric restitution and a portable artefact to verify the geometric accuracy, repeatability and drift of thermographic cameras. These results allow the incorporation of this information into the quality control processes of the companies. A grid based on burning lamps is used for the geometric calibration of thermographic cameras. The artefact designed for the geometric verification consists of five delrin spheres and seven cubes of different sizes. Metrological traceability for the artefact is obtained from a coordinate measuring machine. Two sets of targets with different reflectivity are fixed to the spheres and cubes to make data processing and photogrammetric restitution possible. Reflectivity was the chosen material property because both the thermographic and visible cameras are able to detect it. Two thermographic cameras from the Flir and Nec manufacturers, and one visible camera from Jai, are calibrated, verified and compared using the calibration grids and the standard artefact. The calibration system based on burning lamps shows its capability to perform the internal orientation of the thermal cameras. Verification results show repeatability better than 1 mm for all cases, being better than 0.5 mm for the visible camera. As should be expected, accuracy also appears higher in the visible camera, and the geometric comparison between thermographic cameras shows slightly better

  4. Portable mini gamma camera for medical applications

    CERN Document Server

    Porras, E; Benlloch, J M; El-Djalil-Kadi-Hanifi, M; López, S; Pavon, N; Ruiz, J A; Sánchez, F; Sebastiá, A

    2002-01-01

    A small, portable and low-cost gamma camera for medical applications has been developed and clinically tested. This camera, based on a scintillator crystal and a Position Sensitive Photo-Multiplier Tube, has a useful field of view of 4.6 cm diameter and provides 2.2 mm of intrinsic spatial resolution. Its mobility and light weight allow it to reach the patient from any desired direction. This camera images small organs with high efficiency and so addresses the demand for devices for specific clinical applications. In this paper, we present the camera and briefly describe the procedures that have led us to choose its configuration and the image reconstruction method. The clinical tests and diagnostic capability are also presented and discussed.

  5. Visual Positioning Indoors: Human Eyes vs. Smartphone Cameras.

    Science.gov (United States)

    Wu, Dewen; Chen, Ruizhi; Chen, Liang

    2017-11-16

    Artificial Intelligence (AI) technologies and their related applications are now developing at a rapid pace. Indoor positioning will be one of the core technologies that enable AI applications because people spend 80% of their time indoors. Humans can locate themselves relative to a visually well-defined object, e.g., a door, based on their visual observations. Can a smartphone camera do a similar job when it points to an object? In this paper, a visual positioning solution was developed based on a single image captured from a smartphone camera pointing to a well-defined object. The smartphone camera simulates the process of human eyes for the purpose of relatively locating itself against a well-defined object. Extensive experiments were conducted with five types of smartphones in three different indoor settings, including a meeting room, a library, and a reading room. Experimental results showed that the average positioning accuracy of the solution based on five smartphone cameras is 30.6 cm, while that for the human-observed solution with 300 samples from 10 different people is 73.1 cm.
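
    Locating a camera from a single image of an object with known dimensions is the classic perspective-n-point (PnP) problem. The sketch below uses OpenCV's solvePnP as one possible way to do it; the door-frame corner coordinates, the pixel measurements and the intrinsic matrix are all made-up example values, not those used in the paper.

```python
import cv2
import numpy as np

# 3D corners of a well-defined object (e.g. a door frame) in metres, and their
# pixel coordinates in the smartphone photo. All numbers are illustrative.
object_pts = np.array([[0, 0, 0], [0.9, 0, 0], [0.9, 2.0, 0], [0, 2.0, 0]], dtype=np.float64)
image_pts = np.array([[410, 980], [760, 965], [775, 210], [420, 230]], dtype=np.float64)

K = np.array([[1500.0, 0, 960], [0, 1500.0, 540], [0, 0, 1]])  # assumed camera intrinsics
dist = np.zeros(5)                                             # assume negligible distortion

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist)
R, _ = cv2.Rodrigues(rvec)
camera_position_world = (-R.T @ tvec).ravel()   # where the phone stands, in object coordinates
print(camera_position_world)
```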

  6. Gamma camera

    International Nuclear Information System (INIS)

    Tschunt, E.; Platz, W.; Baer, U.; Heinz, L.

    1978-01-01

    A gamma camera has a plurality of exchangeable collimators, one of which is mounted in the ray inlet opening of the camera, while the others are placed on separate supports. The supports are swingably mounted upon a column one above the other through about 90° to a collimator exchange position. Each of the separate supports is swingable to a vertically aligned position, with limiting of the swinging movement and positioning of the support at the desired exchange position. The collimators are carried on the supports by means of a series of vertically disposed coil springs. Projections on the camera are movable from above into grooves of the collimator at the exchange position, whereupon the collimator is turned so that it is securely prevented from falling out of the camera head.

  7. A real-time MTFC algorithm of space remote-sensing camera based on FPGA

    Science.gov (United States)

    Zhao, Liting; Huang, Gang; Lin, Zhe

    2018-01-01

    A real-time MTFC algorithm for a space remote-sensing camera based on an FPGA was designed. The algorithm provides real-time image processing to enhance image clarity while the remote-sensing camera is running on-orbit. The image restoration algorithm adopted a modular design. The on-orbit MTF measurement module calculates the edge spread function (ESF), the line spread function, the ESF difference operation, the normalized MTF and the MTFC parameters. The MTFC image filtering and noise suppression module implements the filtering algorithm and effectively suppresses noise. System Generator was used to design the image processing algorithms, simplifying the system design structure and the redesign process. The image gray gradient, point sharpness, edge contrast and mid-to-high frequencies were enhanced. The image SNR after recovery decreased by less than 1 dB compared to the original image. The image restoration system can be widely used in various fields.
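
    MTF compensation boosts the mid-to-high spatial frequencies that the optics attenuate, which in the simplest case reduces to a small high-boost convolution. The sketch below uses a fixed Laplacian-based 3x3 kernel purely as a stand-in; the on-orbit algorithm instead derives its filter from the ESF/LSF/MTF it measures, and the boost factor here is arbitrary.

```python
import numpy as np
from scipy.signal import convolve2d

def mtfc_filter(image, boost=1.5):
    """MTF compensation approximated as a high-boost convolution.
    A fixed Laplacian-based kernel stands in for a filter derived from the measured MTF;
    the boost factor is an illustrative value."""
    laplacian = np.array([[0, -1, 0], [-1, 4, -1], [0, -1, 0]], dtype=np.float64)
    kernel = np.zeros((3, 3))
    kernel[1, 1] = 1.0                                   # identity (pass-through) part
    kernel = kernel + (boost - 1.0) * laplacian / 4.0    # add high-frequency emphasis, DC gain stays 1
    restored = convolve2d(image.astype(np.float64), kernel, mode="same", boundary="symm")
    return np.clip(restored, 0, 255).astype(np.uint8)
```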

  8. CAMERA-BASED SOFTWARE IN REHABILITATION/THERAPY INTERVENTION (extended)

    DEFF Research Database (Denmark)

    Brooks, Anthony Lewis

    2014-01-01

    on specific hardware. Adaptable means that human tracking and created artefact interaction in the camera field of view is relatively easily changed as one desires via a user-friendly GUI. The significance of having both available for contemporary intervention is argued. Conclusions are that the mature, robust...

  9. High-precision real-time 3D shape measurement based on a quad-camera system

    Science.gov (United States)

    Tao, Tianyang; Chen, Qian; Feng, Shijie; Hu, Yan; Zhang, Minliang; Zuo, Chao

    2018-01-01

    Phase-shifting profilometry (PSP) based 3D shape measurement is well established in various applications due to its high accuracy, simple implementation, and robustness to environmental illumination and surface texture. In PSP, higher depth resolution generally requires a higher fringe density of the projected patterns, which, in turn, leads to severe phase ambiguities that must be solved with additional information from phase coding and/or geometric constraints. However, in order to guarantee the reliability of phase unwrapping, available techniques are usually accompanied by an increased number of patterns, reduced fringe amplitude, and complicated post-processing algorithms. In this work, we demonstrate that by using a quad-camera multi-view fringe projection system and carefully arranging the relative spatial positions between the cameras and the projector, it becomes possible to completely eliminate the phase ambiguities in conventional three-step PSP patterns with high fringe density without projecting any additional patterns or embedding any auxiliary signals. Benefiting from the position-optimized quad-camera system, stereo phase unwrapping can be efficiently and reliably performed by flexible phase consistency checks. Besides, the redundant information of multiple phase consistency checks is fully used through a weighted phase difference scheme to further enhance the reliability of phase unwrapping. This paper explains the 3D measurement principle and the basic design of the quad-camera system, and finally demonstrates that in a large measurement volume of 200 mm × 200 mm × 400 mm, the resultant dynamic 3D sensing system can realize real-time 3D reconstruction at 60 frames per second with a depth precision of 50 μm.
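
    For reference, the wrapped phase of conventional three-step PSP comes from a closed-form arctangent of the three camera images, and it is only this wrapped value that carries the 2π ambiguities which the quad-camera geometry then resolves. The sketch below assumes equal phase shifts of 0, 2π/3 and 4π/3 between the projected patterns.

```python
import numpy as np

def wrapped_phase_three_step(i1, i2, i3):
    """Wrapped phase of three-step PSP fringes with phase shifts of 0, 2*pi/3 and 4*pi/3.
    i1, i2, i3: float arrays (camera images of the scene under the three patterns).
    Returns values in (-pi, pi]; unwrapping is handled separately, e.g. by the
    geometric consistency checks described in the abstract."""
    return np.arctan2(np.sqrt(3.0) * (i3 - i2), 2.0 * i1 - i2 - i3)
```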

  10. Molecular confirmation of Maize rayado fino virus as the Brazilian corn streak virus

    OpenAIRE

    Hammond,Rosemarie Wahnbaeck; Bedendo,Ivan Paulo

    2005-01-01

    Maize rayado fino virus (MRFV), present in various countries in Latin America, has shown similarities to the corn streak virus that occurs in Brazil with regard to pathogenic, serological and histological characteristics. In the current report both viruses were compared at the molecular level to confirm the similarities between them. MRFV was identified by nucleic acid hybridization in samples of maize tissues exhibiting symptoms of "corn stunt" disease, collected from two Brazilian States - São Paulo and Minas G...

  11. Nuclear Radiation Degradation Study on HD Camera Based on CMOS Image Sensor at Different Dose Rates

    Directory of Open Access Journals (Sweden)

    Congzheng Wang

    2018-02-01

    In this work, we irradiated a high-definition (HD) industrial camera based on a commercial-off-the-shelf (COTS) CMOS image sensor (CIS) with Cobalt-60 gamma-rays. All components of the camera under test were fabricated without radiation hardening, except for the lens. The irradiation experiments on the HD camera under biased conditions were carried out at 1.0, 10.0, 20.0, 50.0 and 100.0 Gy/h. During the experiment, we found that the tested camera showed remarkable degradation after irradiation, and that the degradation differed between dose rates. With increasing dose rate, images of the same target become brighter. At the same dose rate, the radiation effect in bright areas is lower than that in dark areas. Across different dose rates, the higher the dose rate, the worse the radiation effect in both bright and dark areas, and the standard deviations of the bright and dark areas become greater. Furthermore, through a progressive degradation analysis of the captured images, the experimental results demonstrate that the attenuation of the signal-to-noise ratio (SNR) over radiation time is not obvious at a given dose rate, while the degradation becomes more and more serious with increasing dose rate. Additionally, the decrease rate of the SNR at 20.0, 50.0 and 100.0 Gy/h is far greater than that at 1.0 and 10.0 Gy/h. Even so, we confirm that the HD industrial camera was still working at 10.0 Gy/h during the 8 h of measurements, with a moderate decrease of the SNR (5 dB). The work is valuable and can provide suggestions for camera users in the radiation field.

  12. The making of analog module for gamma camera interface

    International Nuclear Information System (INIS)

    Yulinarsari, Leli; Rl, Tjutju; Susila, Atang; Sukandar

    2003-01-01

    An analog module for a gamma camera interface has been made. For computerization of a 37-PMT planar gamma camera, interface hardware and software between the planar gamma camera and a PC have been developed. With this interface the gamma camera image information (originally an analog signal) is converted to digital signals, so that data acquisition, image quality improvement, data analysis and database processing can be carried out with the help of computers. There are three main gamma camera signals, i.e. X, Y and Z. This analog module digitizes the analog X and Y signals from the gamma camera, which convey position information coming from the gamma camera crystal. Analog-to-digital conversion is performed by two 12-bit ADCs with a conversion time of 800 ns each; the conversion procedure for the X and Y coordinates is synchronized using the Z signal as a suitable strobe for data acceptance.

  13. A novel camera type for very high energy gamma-ray astronomy based on Geiger-mode avalanche photodiodes

    International Nuclear Information System (INIS)

    Anderhub, H; Biland, A; Boller, A; Braun, I; Commichau, S; Commichau, V; Dorner, D; Gendotti, A; Grimm, O; Gunten, H von; Hildebrand, D; Horisberger, U; Kraehenbuehl, T; Kranich, D; Lorenz, E; Lustermann, W; Backes, M; Neise, D; Bretz, T; Mannheim, K

    2009-01-01

    Geiger-mode avalanche photodiodes (G-APD) are promising new sensors for light detection in atmospheric Cherenkov telescopes. In this paper, the design and commissioning of a 36-pixel G-APD prototype camera is presented. The data acquisition is based on the Domino Ring Sampling (DRS2) chip. A sub-nanosecond time resolution has been achieved. Cosmic-ray induced air showers have been recorded using an imaging mirror setup, in a self-triggered mode. This is the first time that such measurements have been carried out with a complete G-APD camera.

  14. CCD camera system for use with a streamer chamber

    International Nuclear Information System (INIS)

    Angius, S.A.; Au, R.; Crawley, G.C.; Djalali, C.; Fox, R.; Maier, M.; Ogilvie, C.A.; Molen, A. van der; Westfall, G.D.; Tickle, R.S.

    1988-01-01

    A system based on three charge-coupled-device (CCD) cameras is described here. It has been used to acquire images from a streamer chamber and consists of three identical subsystems, one for each camera. Each subsystem contains an optical lens, CCD camera head, camera controller, an interface between the CCD and a microprocessor, and a link to a minicomputer for data recording and on-line analysis. Image analysis techniques have been developed to enhance the quality of the particle tracks. Some steps have been made to automatically identify tracks and reconstruct the event. (orig.)

  15. Fuzzy logic control for camera tracking system

    Science.gov (United States)

    Lea, Robert N.; Fritz, R. H.; Giarratano, J.; Jani, Yashvant

    1992-01-01

    A concept utilizing fuzzy theory has been developed for a camera tracking system to provide support for proximity operations and traffic management around the Space Station Freedom. Fuzzy sets and fuzzy logic based reasoning are used in a control system which utilizes images from a camera and generates required pan and tilt commands to track and maintain a moving target in the camera's field of view. This control system can be implemented on a fuzzy chip to provide an intelligent sensor for autonomous operations. Capabilities of the control system can be expanded to include approach, handover to other sensors, caution and warning messages.
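
    As a toy illustration of the idea, the sketch below maps the horizontal pixel error of the tracked target to a pan rate using three fuzzy memberships and centroid defuzzification. The membership breakpoints, the output rates and the single-axis simplification are all assumptions; they are not the rule base of the Space Station Freedom tracking system.

```python
import numpy as np

def fuzzy_pan_rate(error_px, half_width=320.0, dead_zone=80.0):
    """Fuzzy mapping from horizontal pixel error to a pan rate (deg/s).
    Memberships and output rates are illustrative values only."""
    mu_neg = np.clip(-error_px / half_width, 0.0, 1.0)            # target far to the left
    mu_zero = np.clip(1.0 - abs(error_px) / dead_zone, 0.0, 1.0)  # target roughly centred
    mu_pos = np.clip(error_px / half_width, 0.0, 1.0)             # target far to the right
    rates = np.array([-5.0, 0.0, +5.0])                           # pan-left / hold / pan-right consequents
    weights = np.array([mu_neg, mu_zero, mu_pos])
    return float(weights @ rates / (weights.sum() + 1e-9))        # centroid defuzzification

print(fuzzy_pan_rate(150))   # target right of centre -> positive pan rate
```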

  16. FOREX-A Fiber Optics Diagnostic System For Study Of Materials At High Temperatures And Pressures

    Science.gov (United States)

    Smith, D. E.; Roeske, F.

    1983-03-01

    We have successfully fielded a Fiber Optics Radiation EXperiment system (FOREX) designed for measuring material properties at high temperatures and pressures on an underground nuclear test. The system collects light from radiating materials and transmits it through several hundred meters of optical fibers to a recording station consisting of a streak camera with film readout. The use of fiber optics provides a faster time response than can presently be obtained with equalized coaxial cables over comparable distances. Fibers also have significant cost and physical size advantages over coax cables. The streak camera achieves a much higher information density than an equivalent oscilloscope system, and it also serves as the light detector. The result is a wide bandwidth high capacity system that can be fielded at a relatively low cost in manpower, space, and materials. For this experiment, the streak camera had a 120 ns time window with a 1.2 ns time resolution. Dynamic range for the system was about 1000. Beam current statistical limitations were approximately 8% for a 0.3 ns wide data point at one decade above the threshold recording intensity.

  17. Imaging performance comparison between a LaBr3: Ce scintillator based and a CdTe semiconductor based photon counting compact gamma camera.

    Science.gov (United States)

    Russo, P; Mettivier, G; Pani, R; Pellegrini, R; Cinti, M N; Bennati, P

    2009-04-01

    The authors report on the performance of two small field of view, compact gamma cameras working in single photon counting in planar imaging tests at 122 and 140 keV. The first camera is based on a LaBr3:Ce scintillator continuous crystal (49 x 49 x 5 mm3) assembled with a flat panel multianode photomultiplier tube with parallel readout. The second one belongs to the class of semiconductor hybrid pixel detectors, specifically, a CdTe pixel detector (14 x 14 x 1 mm3) with 256 x 256 square pixels and a pitch of 55 μm, read out by a CMOS single photon counting integrated circuit of the Medipix2 series. The scintillation camera was operated with selectable energy window while the CdTe camera was operated with a single low-energy detection threshold of about 20 keV, i.e., without energy discrimination. The detectors were coupled to pinhole or parallel-hole high-resolution collimators. The evaluation of their overall performance in basic imaging tasks is presented through measurements of their detection efficiency, intrinsic spatial resolution, noise, image SNR, and contrast recovery. The scintillation and CdTe cameras showed, respectively, detection efficiencies at 122 keV of 83% and 45%, intrinsic spatial resolutions of 0.9 mm and 75 μm, and total background noises of 40.5 and 1.6 cps. Imaging tests with high-resolution parallel-hole and pinhole collimators are also reported.

  18. Conceptual design of a neutron camera for MAST Upgrade

    Energy Technology Data Exchange (ETDEWEB)

    Weiszflog, M., E-mail: matthias.weiszflog@physics.uu.se; Sangaroon, S.; Cecconello, M.; Conroy, S.; Ericsson, G.; Klimek, I. [Department of Physics and Astronomy, Uppsala University, EURATOM-VR Association, Uppsala (Sweden); Keeling, D.; Martin, R. [CCFE, Culham Science Centre, Abingdon (United Kingdom); Turnyanskiy, M. [ITER Physics Department, EFDA CSU Garching, Boltzmannstraße 2, D-85748 Garching (Germany)

    2014-11-15

    This paper presents two different conceptual designs of neutron cameras for Mega Ampere Spherical Tokamak (MAST) Upgrade. The first one consists of two horizontal cameras, one equatorial and one vertically down-shifted by 65 cm. The second design, viewing the plasma in a poloidal section, also consists of two cameras, one radial and the other one with a diagonal view. Design parameters for the different cameras were selected on the basis of neutron transport calculations and on a set of target measurement requirements taking into account the predicted neutron emissivities in the different MAST Upgrade operating scenarios. Based on a comparison of the cameras’ profile resolving power, the horizontal cameras are suggested as the best option.

  19. Gamma camera

    International Nuclear Information System (INIS)

    Tschunt, E.; Platz, W.; Baer, U.; Heinz, L.

    1978-01-01

    A gamma camera has a plurality of exchangeable collimators, one of which is replaceably mounted in the ray inlet opening of the camera, while the others are placed on separate supports. Supports are swingably mounted upon a column one above the other

  20. A pixellated gamma-camera based on CdTe detectors: clinical interests and performances

    CERN Document Server

    Chambron, J; Eclancher, B; Scheiber, C; Siffert, P; Hage-Ali, M; Regal, R; Kazandjian, A; Prat, V; Thomas, S; Warren, S; Matz, R; Jahnke, A; Karman, M; Pszota, A; Németh, L

    2000-01-01

    A mobile gamma camera dedicated to nuclear cardiology, based on a 15 cm x 15 cm detection matrix of 2304 CdTe detector elements, each 2.83 mm x 2.83 mm x 2 mm, has been developed with European Community support to academic and industrial research centres. The intrinsic properties of the semiconductor crystals - low ionisation energy, high energy resolution, high attenuation coefficient - are potentially attractive for improving gamma-camera performance. But their use as gamma detectors for medical imaging at high resolution requires the production of high-grade materials and large quantities of sophisticated read-out electronics. The decision was taken to use CdTe rather than CdZnTe, because the manufacturer (Eurorad, France) has large experience in producing high-grade material with good homogeneity and stability, whose transport properties, characterised by the mobility-lifetime product, are at least 5 times greater than those of CdZnTe. The detector matrix is divided into 9 square units, each unit is composed ...

  1. Camera Traps Can Be Heard and Seen by Animals

    Science.gov (United States)

    Meek, Paul D.; Ballard, Guy-Anthony; Fleming, Peter J. S.; Schaefer, Michael; Williams, Warwick; Falzon, Greg

    2014-01-01

    Camera traps are electrical instruments that emit sounds and light. In recent decades they have become a tool of choice in wildlife research and monitoring. The variability between camera trap models and the methods used are considerable, and little is known about how animals respond to camera trap emissions. It has been reported that some animals show a response to camera traps, and in research this is often undesirable so it is important to understand why the animals are disturbed. We conducted laboratory based investigations to test the audio and infrared optical outputs of 12 camera trap models. Camera traps were measured for audio outputs in an anechoic chamber; we also measured ultrasonic (n = 5) and infrared illumination outputs (n = 7) of a subset of the camera trap models. We then compared the perceptive hearing range (n = 21) and assessed the vision ranges (n = 3) of mammal species (where data existed) to determine if animals can see and hear camera traps. We report that camera traps produce sounds that are well within the perceptive range of most mammals’ hearing and produce illumination that can be seen by many species. PMID:25354356

  2. Camera traps can be heard and seen by animals.

    Directory of Open Access Journals (Sweden)

    Paul D Meek

    Camera traps are electrical instruments that emit sounds and light. In recent decades they have become a tool of choice in wildlife research and monitoring. The variability between camera trap models and the methods used are considerable, and little is known about how animals respond to camera trap emissions. It has been reported that some animals show a response to camera traps, and in research this is often undesirable so it is important to understand why the animals are disturbed. We conducted laboratory based investigations to test the audio and infrared optical outputs of 12 camera trap models. Camera traps were measured for audio outputs in an anechoic chamber; we also measured ultrasonic (n = 5) and infrared illumination outputs (n = 7) of a subset of the camera trap models. We then compared the perceptive hearing range (n = 21) and assessed the vision ranges (n = 3) of mammal species (where data existed) to determine if animals can see and hear camera traps. We report that camera traps produce sounds that are well within the perceptive range of most mammals' hearing and produce illumination that can be seen by many species.

  3. Ground-based search for the brightest transiting planets with the Multi-site All-Sky CAmeRA: MASCARA

    Science.gov (United States)

    Snellen, Ignas A. G.; Stuik, Remko; Navarro, Ramon; Bettonvil, Felix; Kenworthy, Matthew; de Mooij, Ernst; Otten, Gilles; ter Horst, Rik; le Poole, Rudolf

    2012-09-01

    The Multi-site All-sky CAmeRA MASCARA is an instrument concept consisting of several stations across the globe, with each station containing a battery of low-cost cameras to monitor the near-entire sky at each location. Once all stations have been installed, MASCARA will be able to provide nearly 24-hour coverage of the complete dark sky, down to magnitude 8, at sub-minute cadence. Its purpose is to find the brightest transiting exoplanet systems, expected in the V=4-8 magnitude range - currently not probed by space- or ground-based surveys. The bright/nearby transiting planet systems, which MASCARA will discover, will be the key targets for detailed planet atmosphere observations. We present studies on the initial design of a MASCARA station, including the camera housing, domes, and computer equipment, and on the photometric stability of low-cost cameras, showing that a precision of 0.3-1% per hour can be readily achieved. We plan to roll out the first MASCARA station before the end of 2013. A 5-station MASCARA can within two years discover up to a dozen of the brightest transiting planet systems in the sky.

  4. Semiotic Analysis of Canon Camera Advertisements

    OpenAIRE

    INDRAWATI, SUSAN

    2015-01-01

    Keywords: Semiotic Analysis, Canon Camera, Advertisement. Advertisement is a medium to deliver a message to people with the goal of influencing them to use certain products. Semiotics is applied to develop a correlation among the elements used in an advertisement. In this study, the writer chose the semiotic analysis of Canon camera advertisements as the subject to be analyzed, using a semiotic study based on Peirce's theory. The semiotic approach is employed in interpreting the sign, symbol, icon, and index ...

  5. Regulatory considerations and quality assurance of depleted uranium based radiography cameras

    International Nuclear Information System (INIS)

    Sapkal, Jyotsna A.; Yadav, R.K.B.; Amrota, C.T.; Singh, Pratap; GopaIakrishanan, R.H.; Patil, B.N.; Mane, Nilesh

    2016-01-01

    Radiography cameras using depleted uranium (DU) as the shielding material are used for containment of the iridium (192Ir) source. The DU shielding surrounds a titanium 'S' tube through which the encapsulated 192Ir source, along with the pigtail, travels. As per guidelines, it is required to check the integrity of the DU shielding periodically by monitoring for transferable alpha contamination inside the 'S' tube. This paper describes in brief the method followed for collection of samples from inside the 'S' tube. The samples were analysed for transferable contamination due to gross alpha using an alpha scintillation (ALSCIN) counter. The gross alpha contamination in the 'S' tube was found to be less than the USNRC recommended value for discarding the radiography camera. IAEA recommendations related to transferable contamination and AERB guidelines on the quality assurance (QA) requirements for radiography cameras were also studied.

  6. Beyond leaf color: Comparing camera-based phenological metrics with leaf biochemical, biophysical, and spectral properties throughout the growing season of a temperate deciduous forest

    Science.gov (United States)

    Yang, Xi; Tang, Jianwu; Mustard, John F.

    2014-03-01

    Plant phenology, a sensitive indicator of climate change, influences vegetation-atmosphere interactions by changing the carbon and water cycles from local to global scales. Camera-based phenological observations of the color changes of the vegetation canopy throughout the growing season have become popular in recent years. However, the linkages between camera phenological metrics and leaf biochemical, biophysical, and spectral properties are elusive. We measured key leaf properties including chlorophyll concentration and leaf reflectance on a weekly basis from June to November 2011 in a white oak forest on the island of Martha's Vineyard, Massachusetts, USA. Concurrently, we used a digital camera to automatically acquire daily pictures of the tree canopies. We found that there was a mismatch between the camera-based phenological metric for the canopy greenness (green chromatic coordinate, gcc) and the total chlorophyll and carotenoids concentration and leaf mass per area during late spring/early summer. The seasonal peak of gcc is approximately 20 days earlier than the peak of the total chlorophyll concentration. During the fall, both canopy and leaf redness were significantly correlated with the vegetation index for anthocyanin concentration, opening a new window to quantify vegetation senescence remotely. Satellite- and camera-based vegetation indices agreed well, suggesting that camera-based observations can be used as the ground validation for satellites. Using the high-temporal resolution dataset of leaf biochemical, biophysical, and spectral properties, our results show the strengths and potential uncertainties to use canopy color as the proxy of ecosystem functioning.
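
    The camera-based greenness metric discussed here, the green chromatic coordinate, is simple to reproduce from any RGB frame, and an analogous red chromatic coordinate serves as the autumn redness proxy. The sketch below computes both over an optional region of interest; treating the whole frame as canopy is an assumption made only for illustration.

```python
import numpy as np

def chromatic_coordinates(rgb_image, roi=None):
    """Canopy greenness (gcc) and redness (rcc) chromatic coordinates from an RGB frame.
    gcc = G / (R + G + B) and rcc = R / (R + G + B), computed from digital numbers summed
    over a region of interest. roi is an optional (row_slice, col_slice) tuple."""
    img = rgb_image.astype(np.float64)
    if roi is not None:
        img = img[roi]
    r, g, b = img[..., 0].sum(), img[..., 1].sum(), img[..., 2].sum()
    total = r + g + b
    return g / total, r / total
```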

  7. A clinical gamma camera-based pinhole collimated system for high resolution small animal SPECT imaging

    Energy Technology Data Exchange (ETDEWEB)

    Mejia, J.; Galvis-Alonso, O.Y., E-mail: mejia_famerp@yahoo.com.b [Faculdade de Medicina de Sao Jose do Rio Preto (FAMERP), SP (Brazil). Dept. de Biologia Molecular; Castro, A.A. de; Simoes, M.V. [Faculdade de Medicina de Sao Jose do Rio Preto (FAMERP), SP (Brazil). Dept. de Clinica Medica; Leite, J.P. [Universidade de Sao Paulo (FMRP/USP), Ribeirao Preto, SP (Brazil). Fac. de Medicina. Dept. de Neurociencias e Ciencias do Comportamento; Braga, J. [Instituto Nacional de Pesquisas Espaciais (INPE), Sao Jose dos Campos, SP (Brazil). Div. de Astrofisica

    2010-11-15

    The main objective of the present study was to upgrade a clinical gamma camera to obtain high resolution tomographic images of small animal organs. The system is based on a clinical gamma camera to which we have adapted a special-purpose pinhole collimator and a device for positioning and rotating the target based on a computer-controlled step motor. We developed a software tool to reconstruct the target's three-dimensional distribution of emission from a set of planar projections, based on the maximum likelihood algorithm. We present details on the hardware and software implementation. We imaged phantoms and heart and kidneys of rats. When using pinhole collimators, the spatial resolution and sensitivity of the imaging system depend on parameters such as the detector-to-collimator and detector-to-target distances and pinhole diameter. In this study, we reached an object voxel size of 0.6 mm and spatial resolution better than 2.4 and 1.7 mm full width at half maximum when 1.5- and 1.0-mm diameter pinholes were used, respectively. Appropriate sensitivity to study the target of interest was attained in both cases. Additionally, we show that as few as 12 projections are sufficient to attain good quality reconstructions, a result that implies a significant reduction of acquisition time and opens the possibility for radiotracer dynamic studies. In conclusion, a high resolution single photon emission computed tomography (SPECT) system was developed using a commercial clinical gamma camera, allowing the acquisition of detailed volumetric images of small animal organs. This type of system has important implications for research areas such as Cardiology, Neurology or Oncology. (author)
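
    The reconstruction step described above, recovering a 3D emission distribution from a small number of planar pinhole projections with a maximum-likelihood algorithm, is commonly implemented as MLEM. The sketch below shows the core iteration for a dense system matrix; building the actual pinhole system matrix and handling the 12 rotated views are left out, and the array shapes are placeholders.

```python
import numpy as np

def mlem(system_matrix, projections, n_iter=30):
    """Maximum-likelihood EM (MLEM) reconstruction sketch for SPECT.
    system_matrix: (n_bins, n_voxels) detection probabilities, assumed dense here;
    projections:   measured counts per projection bin, flattened over all views."""
    sensitivity = system_matrix.sum(axis=0)            # A^T 1, per-voxel sensitivity
    x = np.ones(system_matrix.shape[1])                # uniform initial estimate
    for _ in range(n_iter):
        expected = system_matrix @ x                   # forward projection
        ratio = projections / np.maximum(expected, 1e-12)
        x *= (system_matrix.T @ ratio) / np.maximum(sensitivity, 1e-12)
    return x
```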

  8. Radiation-resistant camera tube

    International Nuclear Information System (INIS)

    Kuwahata, Takao; Manabe, Sohei; Makishima, Yasuhiro

    1982-01-01

    Toshiba long ago began manufacturing black-and-white radiation-resistant camera tubes employing non-browning face-plate glass for ITV cameras used in nuclear power plants. Now, in response to the increasing demand in the nuclear power field, the company is developing radiation-resistant single color-camera tubes incorporating a color-stripe filter for color ITV cameras used in radiation environments. Presented here are the results of experiments on the characteristics of materials for single color-camera tubes and the prospects for commercialization of the tubes. (author)

  9. Six-frame picosecond radiation camera based on hydrated electron photoabsorption phenomena

    International Nuclear Information System (INIS)

    Coutts, G.W.; Olk, L.B.; Gates, H.A.; St Leger-Barter, G.

    1977-01-01

    To obtain picosecond photographs of nanosecond radiation sources, a six-frame ultra-high speed radiation camera based on hydrated electron absorption phenomena has been developed. A time-dependent opacity pattern is formed in an acidic aqueous cell by a pulsed radiation source. Six time-resolved picosecond images of this changing opacity pattern are transferred to photographic film with the use of a mode-locked dye laser and six electronically gated microchannel plate image intensifiers. Because the lifetime of the hydrated electron absorption centers can be reduced to picoseconds, the opacity patterns represent time-space pulse profile images

  10. Scent Lure Effect on Camera-Trap Based Leopard Density Estimates.

    Directory of Open Access Journals (Sweden)

    Alexander Richard Braczkowski

    Density estimates for large carnivores derived from camera surveys often have wide confidence intervals due to low detection rates. Such estimates are of limited value to authorities, which require precise population estimates to inform conservation strategies. Using lures can potentially increase detection, improving the precision of estimates. However, by altering the spatio-temporal patterning of individuals across the camera array, lures may violate closure, a fundamental assumption of capture-recapture. Here, we test the effect of scent lures on the precision and veracity of density estimates derived from camera-trap surveys of a protected African leopard population. We undertook two surveys (a 'control' and a 'treatment' survey) on Phinda Game Reserve, South Africa. Survey design remained consistent except that a scent lure was applied at camera-trap stations during the treatment survey. Lures did not affect the maximum movement distances (p = 0.96) or the temporal activity of female (p = 0.12) or male leopards (p = 0.79), and the assumption of geographic closure was met for both surveys (p > 0.05). The numbers of photographic captures were also similar for the control and treatment surveys (p = 0.90). Accordingly, density estimates were comparable between surveys, although estimates derived using non-spatial methods (7.28-9.28 leopards/100 km2) were considerably higher than estimates from spatially-explicit methods (3.40-3.65 leopards/100 km2). The precision of estimates from the control and treatment surveys was also comparable, and this applied to both non-spatial and spatial methods of estimation. Our findings suggest that, at least in the context of leopard research in productive habitats, the use of lures is not warranted.

  11. New camera-based microswitch technology to monitor small head and mouth responses of children with multiple disabilities.

    Science.gov (United States)

    Lancioni, Giulio E; Bellini, Domenico; Oliva, Doretta; Singh, Nirbhay N; O'Reilly, Mark F; Green, Vanessa A; Furniss, Fred

    2014-06-01

    The aim was to assess a new camera-based microswitch technology which did not require the use of color marks on the participants' faces. Two children with extensive multiple disabilities participated. The responses selected for them consisted of small lateral head movements and mouth closing or opening. The intervention was carried out according to a multiple probe design across responses. The technology involved a computer with a CPU using a 2-GHz clock, a USB video camera with a 16-mm lens, a USB cable connecting the camera and the computer, and a special software program written in ISO C++. The new technology was satisfactorily used with both children. Large increases in their responding were observed during the intervention periods (i.e. when the responses were followed by preferred stimulation). The new technology may be an important resource for persons with multiple disabilities and minimal motor behavior.

  12. Wheat streak mosaic virus coat protein is a determinant for vector transmission by the wheat curl mite

    Science.gov (United States)

    Wheat streak mosaic virus (WSMV; genus Tritimovirus; family Potyviridae), is transmitted by the wheat curl mite (Aceria tosichella Keifer). The requirement of coat protein (CP) for WSMV transmission by the wheat curl mite was examined using a series of viable deletion and point mutations. Mite trans...

  13. GRACE star camera noise

    Science.gov (United States)

    Harvey, Nate

    2016-08-01

    Extending results from previous work by Bandikova et al. (2012) and Inacio et al. (2015), this paper analyzes Gravity Recovery and Climate Experiment (GRACE) star camera attitude measurement noise by processing inter-camera quaternions from 2003 to 2015. We describe a correction to star camera data, which will eliminate a several-arcsec twice-per-rev error with daily modulation, currently visible in the auto-covariance function of the inter-camera quaternion, from future GRACE Level-1B product releases. We also present evidence supporting the argument that thermal conditions/settings affect long-term inter-camera attitude biases by at least tens-of-arcsecs, and that several-to-tens-of-arcsecs per-rev star camera errors depend largely on field-of-view.

  14. Note: Tormenta: An open source Python-powered control software for camera based optical microscopy.

    Science.gov (United States)

    Barabas, Federico M; Masullo, Luciano A; Stefani, Fernando D

    2016-12-01

    Until recently, PC control and synchronization of scientific instruments was only possible through closed-source expensive frameworks like National Instruments' LabVIEW. Nowadays, efficient cost-free alternatives are available in the context of a continuously growing community of open-source software developers. Here, we report on Tormenta, a modular open-source software for the control of camera-based optical microscopes. Tormenta is built on Python, works on multiple operating systems, and includes some key features for fluorescence nanoscopy based on single molecule localization.

  15. Gamma cameras - a method of evaluation

    International Nuclear Information System (INIS)

    Oates, L.; Bibbo, G.

    2000-01-01

    Full text: With the sophistication and longevity of the modern gamma camera it is not often that the need arises to evaluate a gamma camera for purchase. We have recently been placed in the position of retiring our two single headed cameras of some vintage and replacing them with a state of the art dual head variable angle gamma camera. The process used for the evaluation consisted of five parts: (1) Evaluation of the technical specification as expressed in the tender document; (2) A questionnaire adapted from the British Society of Nuclear Medicine; (3) Site visits to assess gantry configuration, movement, patient access and occupational health, welfare and safety considerations; (4) Evaluation of the processing systems offered; (5) Whole of life costing based on equally configured systems. The results of each part of the evaluation were expressed using a weighted matrix analysis with each of the criteria assessed being weighted in accordance with their importance to the provision of an effective nuclear medicine service for our centre and the particular importance to paediatric nuclear medicine. This analysis provided an objective assessment of each gamma camera system from which a purchase recommendation was made. Copyright (2000) The Australian and New Zealand Society of Nuclear Medicine Inc

  16. X-ray imaging using digital cameras

    Science.gov (United States)

    Winch, Nicola M.; Edgar, Andrew

    2012-03-01

    The possibility of using the combination of a computed radiography (storage phosphor) cassette and a semiprofessional grade digital camera for medical or dental radiography is investigated. We compare the performance of (i) a Canon 5D Mk II single lens reflex camera with f1.4 lens and full-frame CMOS array sensor and (ii) a cooled CCD-based camera with a 1/3 frame sensor and the same lens system. Both systems are tested with 240 x 180 mm cassettes which are based on either powdered europium-doped barium fluoride bromide or needle structure europium-doped cesium bromide. The modulation transfer function for both systems has been determined and falls to a value of 0.2 at around 2 lp/mm, and is limited by light scattering of the emitted light from the storage phosphor rather than the optics or sensor pixelation. The modulation transfer function for the CsBr:Eu2+ plate is bimodal, with a high frequency wing which is attributed to the light-guiding behaviour of the needle structure. The detective quantum efficiency has been determined using a radioisotope source and is comparatively low at 0.017 for the CMOS camera and 0.006 for the CCD camera, attributed to the poor light harvesting by the lens. The primary advantages of the method are portability, robustness, digital imaging and low cost; the limitations are the low detective quantum efficiency and hence signal-to-noise ratio for medical doses, and restricted range of plate sizes. Representative images taken with medical doses are shown and illustrate the potential use for portable basic radiography.

  17. Effects of Hot Streak and Phantom Cooling on Heat Transfer in a Cooled Turbine Stage Including Particulate Deposition

    Energy Technology Data Exchange (ETDEWEB)

    Bons, Jeffrey [The Ohio State Univ., Columbus, OH (United States); Ameri, Ali [The Ohio State Univ., Columbus, OH (United States)

    2016-01-08

    The objective of this research effort was to develop a validated computational modeling capability for the characterization of the effects of hot streaks and particulate deposition on the heat load of modern gas turbines. This was accomplished with a multi-faceted approach including analytical, experimental, and computational components. A 1-year no cost extension request was approved for this effort, so the total duration was 4 years. The research effort succeeded in its ultimate objective by leveraging extensive experimental deposition studies complemented by computational modeling. Experiments were conducted with hot streaks, vane cooling, and combinations of hot streaks with vane cooling. These studies contributed to a significant body of corporate knowledge of deposition, in combination with particle rebound and deposition studies funded by other agencies, to provide suitable conditions for the development of a new model. The model includes the following physical phenomena: elastic deformation, plastic deformation, adhesion, and shear removal. It also incorporates material property sensitivity to temperature and tangential-normal velocity rebound cross-dependencies observed in experiments. The model is well-suited for incorporation in CFD simulations of complex gas turbine flows due to its algebraic (explicit) formulation. This report contains model predictions compared to coefficient of restitution data available in the open literature as well as deposition results from two different high temperature turbine deposition facilities. While the model comparisons with experiments are in many cases promising, several key aspects of particle deposition remain elusive. The simple phenomenological nature of the model allows for parametric dependencies to be evaluated in a straightforward manner. This effort also included the first-ever full turbine stage deposition model published in the open literature. The simulations included hot streaks and simulated vane cooling

  18. High speed photography, videography, and photonics V; Proceedings of the Meeting, San Diego, CA, Aug. 17-19, 1987

    International Nuclear Information System (INIS)

    Johnson, H.C.

    1988-01-01

    Recent advances in high-speed optical and electrooptic devices are discussed in reviews and reports. Topics examined include data quantification and related technologies, high-speed photographic applications and instruments, flash and cine radiography, and novel ultrafast methods. Also considered are optical streak technology, high-speed videographic and photographic equipment, and X-ray streak cameras. Extensive diagrams, drawings, graphs, sample images, and tables of numerical data are provided

  19. High resolution time- and 2-dimensional space-resolved x-ray imaging of plasmas at NOVA

    International Nuclear Information System (INIS)

    Landen, O.L.

    1992-01-01

    A streaked multiple pinhole camera technique, first used by P. Choi et al. to record time- and 2-D space-resolved soft X-ray images of plasma pinches, has been implemented on laser plasmas at NOVA. The instrument is particularly useful for time-resolved imaging of small sources at photon energies above 2.5 keV, complementing the existing 1-3 keV streaked X-ray microscope capabilities at NOVA

  20. High speed photography, videography, and photonics V; Proceedings of the Meeting, San Diego, CA, Aug. 17-19, 1987

    Science.gov (United States)

    Johnson, Howard C. (Editor)

    1988-01-01

    Recent advances in high-speed optical and electrooptic devices are discussed in reviews and reports. Topics examined include data quantification and related technologies, high-speed photographic applications and instruments, flash and cine radiography, and novel ultrafast methods. Also considered are optical streak technology, high-speed videographic and photographic equipment, and X-ray streak cameras. Extensive diagrams, drawings, graphs, sample images, and tables of numerical data are provided.

  1. Genomic and phylogenetic evidence that Maize rough dwarf and Rice black-streaked dwarf fijiviruses should be classified as different geographic strains of a single species.

    Science.gov (United States)

    Xie, L; Lv, M-F; Yang, J; Chen, J-P; Zhang, H-M

    Maize rough dwarf disease (MRDD) has long been known as one of the most devastating viral diseases of maize worldwide and is caused by single or complex infection by four fijiviruses: Maize rough dwarf virus (MRDV) in Europe and the Middle East, Mal de Rio Cuarto virus (MRCV) in South America, rice black-streaked dwarf virus (RBSDV), and Southern rice black-streaked dwarf virus (SRBSDV or Rice black-streaked dwarf virus 2, RBSDV-2) in East Asia. These are currently classified as four distinct species in the genus Fijivirus, family Reoviridae, but their taxonomic status has been questioned. To help resolve this, the nucleotide sequences of the ten genomic segments of an Italian isolate of MRDV have been determined, providing the first complete genomic sequence of this virus. Its genome has 29144 nucleotides and is similar in organization to those of RBSDV, SRBSDV, and MRCV. The 13 ORFs always share highest identities (81.3-97.2%) with the corresponding ORFs of RBSDV and phylogenetic analyses of the different genome segments and ORFs all confirm that MRDV clusters most closely with RBSDV and that MRCV and SRBSDV are slightly more distantly related. The results suggest that MRDV and RBSDV should be classified as different geographic strains of the same virus species and we suggest the name cereal black-streaked dwarf fijivirus (CBSDV) for consideration.

  2. Nonmedical applications of a positron camera

    International Nuclear Information System (INIS)

    Hawkesworth, M.R.; Parker, D.J.; Fowles, P.; Crilly, J.F.; Jefferies, N.L.; Jonkers, G.

    1991-01-01

    The positron camera in the School of Physics and Space Research, University of Birmingham, is based on position-sensitive multiwire γ-ray detectors developed at the Rutherford Appleton Laboratory. The current characteristics of the camera are discussed with particular reference to its suitability for flow mapping in industrial subjects. The techniques developed for studying the dynamics of processes with time scales ranging from milliseconds to days are described, and examples of recent results from a variety of industrial applications are presented. (orig.)

  3. Quantified, Interactive Simulation of AMCW ToF Camera Including Multipath Effects.

    Science.gov (United States)

    Bulczak, David; Lambers, Martin; Kolb, Andreas

    2017-12-22

    In the last decade, Time-of-Flight (ToF) range cameras have gained increasing popularity in robotics, automotive industry, and home entertainment. Despite technological developments, ToF cameras still suffer from error sources such as multipath interference or motion artifacts. Thus, simulation of ToF cameras, including these artifacts, is important to improve camera and algorithm development. This paper presents a physically-based, interactive simulation technique for amplitude modulated continuous wave (AMCW) ToF cameras, which, among other error sources, includes single bounce indirect multipath interference based on an enhanced image-space approach. The simulation accounts for physical units down to the charge level accumulated in sensor pixels. Furthermore, we present the first quantified comparison for ToF camera simulators. We present bidirectional reflectance distribution function (BRDF) measurements for selected, purchasable materials in the near-infrared (NIR) range, craft real and synthetic scenes out of these materials and quantitatively compare the range sensor data.

  4. Quantified, Interactive Simulation of AMCW ToF Camera Including Multipath Effects

    Directory of Open Access Journals (Sweden)

    David Bulczak

    2017-12-01

    Full Text Available In the last decade, Time-of-Flight (ToF) range cameras have gained increasing popularity in robotics, automotive industry, and home entertainment. Despite technological developments, ToF cameras still suffer from error sources such as multipath interference or motion artifacts. Thus, simulation of ToF cameras, including these artifacts, is important to improve camera and algorithm development. This paper presents a physically-based, interactive simulation technique for amplitude modulated continuous wave (AMCW) ToF cameras, which, among other error sources, includes single bounce indirect multipath interference based on an enhanced image-space approach. The simulation accounts for physical units down to the charge level accumulated in sensor pixels. Furthermore, we present the first quantified comparison for ToF camera simulators. We present bidirectional reflectance distribution function (BRDF) measurements for selected, purchasable materials in the near-infrared (NIR) range, craft real and synthetic scenes out of these materials and quantitatively compare the range sensor data.
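
    Neither record spells out the range equation behind AMCW ToF imaging. Purely as a point of reference (and not as part of the simulator described above), the sketch below shows the common four-phase demodulation; the modulation frequency is an assumed typical value, and sign conventions for the phase differ between sensors.

      import numpy as np

      C = 299_792_458.0  # speed of light, m/s

      def amcw_depth(a0, a90, a180, a270, f_mod=20e6):
          """Depth from four phase-stepped correlation images of an AMCW ToF sensor.

          a0..a270 -- correlation samples at 0, 90, 180 and 270 degrees (numpy arrays)
          f_mod    -- modulation frequency in Hz (20 MHz is an assumed, typical value)
          """
          phase = np.arctan2(a270 - a90, a0 - a180)     # wrapped phase, -pi..pi
          phase = np.mod(phase, 2 * np.pi)              # 0..2*pi
          return C * phase / (4 * np.pi * f_mod)        # metres, within the ambiguity range c/(2*f_mod)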

  5. New continuous recording procedure of holographic information on transient phenomena

    Science.gov (United States)

    Nagayama, Kunihito; Nishihara, H. Keith; Murakami, Terutoshi

    1992-09-01

    A new method for continuous recording of holographic information, 'streak holography,' is proposed. This kind of record can be useful for velocity and acceleration measurement as well as for observing a moving object whose trajectory cannot be predicted in advance. A very high speed camera system has been designed and constructed for streak holography. A ring-shaped 100-mm-diam film has been cut out from high-resolution sheet film and mounted on a thin duralumin disk, which is driven to rotate directly by an air-turbine spindle. The attainable streak velocity is 0.3 mm/μs. A direct film drive mechanism makes it possible to use a relay lens system of extremely small f-number. The feasibility of the camera system has been demonstrated by observing several transient events, such as the forced oscillation of a wire and the free fall of small glass particles, using an argon-ion laser as a light source.

  6. A direct-view customer-oriented digital holographic camera

    Science.gov (United States)

    Besaga, Vira R.; Gerhardt, Nils C.; Maksimyak, Peter P.; Hofmann, Martin R.

    2018-01-01

    In this paper, we propose a direct-view digital holographic camera system consisting mostly of customer-oriented components. The camera system is based on standard photographic units such as camera sensor and objective and is adapted to operate under off-axis external white-light illumination. The common-path geometry of the holographic module of the system ensures direct-view operation. The system can operate in both self-reference and self-interference modes. As a proof of system operability, we present reconstructed amplitude and phase information of a test sample.

  7. Proposal of secure camera-based radiation warning system for nuclear detection

    International Nuclear Information System (INIS)

    Tsuchiya, Ken'ichi; Kurosawa, Kenji; Akiba, Norimitsu; Kakuda, Hidetoshi; Imoto, Daisuke; Hirabayashi, Manato; Kuroki, Kenro

    2016-01-01

    Counter-terrorism against radiological and nuclear threats is a significant issue ahead of the Tokyo 2020 Olympic and Paralympic Games. In terms of cost benefit, it is not easy to build a warning system for nuclear detection to prevent a Dirty Bomb attack (dispersion of radioactive materials using a conventional explosive) or a Silent Source attack (hidden radioactive materials) from occurring. We propose a nuclear detection system using installed secure cameras. We describe a method to estimate radiation dose from the noise pattern caused by radiation in CCD images. Video from a CCD camera was recorded under neutron and gamma-ray irradiations (0.1 mSv-100 mSv) measured with dosimeters. We confirmed that the amount of noise in CCD images increased with radiation exposure. Radiation detection using CMOS sensors in secure cameras or cell phones has been implemented before. In this presentation, however, we propose a warning system that includes neutron detection to search for shielded nuclear materials or radiation exposure devices using criticality. (author)
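
    The abstract does not give the authors' dose estimator. Purely to illustrate the underlying idea that radiation-induced speckle raises the apparent pixel noise, a toy proxy (assumed names and threshold, not the proposed system) could count the pixels whose excess signal over a reference frame exceeds a threshold:

      import numpy as np

      def radiation_hit_fraction(frame, reference_frame, threshold=30):
          """Fraction of pixels showing radiation-like spikes in one CCD video frame.

          frame, reference_frame -- 2-D arrays of pixel values at identical exposure settings
          threshold              -- assumed spike threshold in digital counts
          """
          excess = frame.astype(np.int32) - reference_frame.astype(np.int32)
          return np.count_nonzero(excess > threshold) / excess.size

    A trend of this fraction over many frames would then have to be mapped to dose rate by a calibration such as the dosimeter comparison described in the record.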

  8. Calibration of a Stereo Radiation Detection Camera Using Planar Homography

    Directory of Open Access Journals (Sweden)

    Seung-Hae Baek

    2016-01-01

    Full Text Available This paper proposes a calibration technique for a stereo gamma detection camera. Calibration of the internal and external parameters of a stereo vision camera is a well-known research problem in the computer vision society. However, little stereo calibration work has been reported in radiation measurement research. Since no visual information can be obtained from a stereo radiation camera, it is impossible to use a general stereo calibration algorithm directly. In this paper, we develop a hybrid-type stereo system which is equipped with both radiation and vision cameras. To calibrate the stereo radiation cameras, stereo images of a calibration pattern captured from the vision cameras are transformed into the view of the radiation cameras. The homography transformation is calibrated based on the geometric relationship between the visual and radiation camera coordinates. The accuracy of the stereo parameters of the radiation camera is analyzed by distance measurements to both visible light and gamma sources. The experimental results show that the measurement error is about 3%.
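
    A minimal sketch of the homography step described above, assuming the calibration-pattern corners have already been detected in the vision-camera image and their corresponding positions in the radiation-camera frame are known from the fixed mounting geometry (function and variable names are illustrative, not from the paper):

      import cv2
      import numpy as np

      def vision_to_radiation_homography(pts_vision, pts_radiation):
          """Plane-to-plane mapping from vision-camera pixels to radiation-camera pixels.

          pts_vision, pts_radiation -- Nx2 arrays of corresponding points on the
                                       calibration plane (N >= 4).
          """
          H, _ = cv2.findHomography(np.float32(pts_vision), np.float32(pts_radiation),
                                    method=cv2.RANSAC)
          return H

      # Example: warp a point seen by the vision camera into radiation-camera coordinates.
      # p = np.float32([[[320.0, 240.0]]]); cv2.perspectiveTransform(p, H)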

  9. Gamma camera

    International Nuclear Information System (INIS)

    Schlosser, P.A.; Steidley, J.W.

    1980-01-01

    The design of a collimation system for a gamma camera for use in nuclear medicine is described. When used with a 2-dimensional position-sensitive radiation detector, the novel system can produce images superior to those of conventional cameras. The optimal thickness and positions of the collimators are derived mathematically. (U.K.)

  10. Picosecond camera

    International Nuclear Information System (INIS)

    Decroisette, Michel

    A Kerr cell activated by infrared pulses of a mode-locked Nd:glass laser acts as an ultra-fast periodic shutter with an opening time of a few picoseconds. Associated with an S.T.L. camera, it gives rise to a picosecond camera allowing the study of very fast effects [fr]

  11. Selecting the right digital camera for telemedicine-choice for 2009.

    Science.gov (United States)

    Patricoski, Chris; Ferguson, A Stewart; Brudzinski, Jay; Spargo, Garret

    2010-03-01

    Digital cameras are fundamental tools for store-and-forward telemedicine (electronic consultation). The choice of a camera may significantly impact this consultative process based on the quality of the images, the ability of users to leverage the cameras' features, and other facets of the camera design. The goal of this research was to provide a substantive framework and clearly defined process for reviewing digital cameras and to demonstrate the results obtained when employing this process to review point-and-shoot digital cameras introduced in 2009. The process included a market review, in-house evaluation of features, image reviews, functional testing, and feature prioritization. Seventy-two cameras were identified new on the market in 2009, and 10 were chosen for in-house evaluation. Four cameras scored very high for mechanical functionality and ease-of-use. The final analysis revealed three cameras that had excellent scores for both color accuracy and photographic detail and these represent excellent options for telemedicine: Canon Powershot SD970 IS, Fujifilm FinePix F200EXR, and Panasonic Lumix DMC-ZS3. Additional features of the Canon Powershot SD970 IS make it the camera of choice for our Alaska program.

  12. Calibration Procedures in Mid Format Camera Setups

    Science.gov (United States)

    Pivnicka, F.; Kemper, G.; Geissler, S.

    2012-07-01

    A growing number of mid-format cameras are used for aerial surveying projects. To achieve a reliable and geometrically precise result in the photogrammetric workflow as well, awareness of the sensitive parts is important. The use of direct referencing systems (GPS/IMU), the mounting on a stabilizing camera platform and the specific values of the mid-format camera make a professional setup with various calibration and misalignment operations necessary. An important part is to have a proper camera calibration. Using aerial images over a well-designed test field with 3D structures and/or different flight altitudes enables the determination of calibration values in Bingo software. It will be demonstrated how such a calibration can be performed. The direct referencing device must be mounted to the camera in a solid and reliable way. Besides the mechanical work, especially in mounting the IMU beside the camera, two lever arms have to be measured with mm accuracy: the lever arm from the GPS antenna to the IMU's calibrated centre and the lever arm from the IMU centre to the camera projection centre. The measurement with a total station is not a difficult task, but the definition of the right centres and the need for rotation matrices can cause serious accuracy problems. The benefit of small and medium format cameras is that smaller aircraft can also be used; for this, a gyro-based stabilized platform is recommended. As a consequence, the IMU must be mounted beside the camera on the stabilizer. The advantage is that the IMU can be used to control the platform; the problem is that the IMU-to-GPS-antenna lever arm is floating. We therefore have to deal with an additional data stream, the movements of the stabiliser, to correct the floating lever arm distances. If the post-processing of the GPS-IMU data, taking the floating lever arms into account, delivers the expected result, the lever arms between the IMU and the camera can be applied.
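
    The lever-arm bookkeeping described above can be summarised in a few lines. The sketch below, with assumed variable names, applies a body-frame lever arm to the GNSS antenna position using the IMU attitude, which is where the rotation matrices the author warns about enter:

      import numpy as np

      def camera_position(antenna_pos_map, R_body_to_map, lever_arm_body):
          """Shift the GNSS antenna position to the camera projection centre.

          antenna_pos_map -- antenna position in the mapping frame (3-vector)
          R_body_to_map   -- 3x3 rotation from the IMU body frame to the mapping frame
          lever_arm_body  -- antenna-to-projection-centre offset measured in the body
                             frame (the quantity surveyed with a total station)
          """
          return np.asarray(antenna_pos_map) + R_body_to_map @ np.asarray(lever_arm_body)

    With a stabilised mount, R_body_to_map and the antenna-to-IMU part of the offset change with the stabiliser angles, which is exactly the floating lever-arm problem the abstract describes.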

  13. Reducing the Variance of Intrinsic Camera Calibration Results in the ROS Camera_Calibration Package

    Science.gov (United States)

    Chiou, Geoffrey Nelson

    The intrinsic calibration of a camera is the process in which the internal optical and geometric characteristics of the camera are determined. If accurate intrinsic parameters of a camera are known, the ray in 3D space that every point in the image lies on can be determined. Pairing with another camera allows for the position of the points in the image to be calculated by intersection of the rays. Accurate intrinsics also allow for the position and orientation of a camera relative to some world coordinate system to be calculated. These two reasons for having accurate intrinsic calibration for a camera are especially important in the field of industrial robotics where 3D cameras are frequently mounted on the ends of manipulators. In the ROS (Robot Operating System) ecosystem, the camera_calibration package is the default standard for intrinsic camera calibration. Several researchers from the Industrial Robotics & Automation division at Southwest Research Institute have noted that this package results in large variances in the intrinsic parameters of the camera when calibrating across multiple attempts. There are also open issues on this matter in their public repository that have not been addressed by the developers. In this thesis, we confirm that the camera_calibration package does indeed return different results across multiple attempts, test several possible hypotheses as to why, identify the reason, and provide a simple solution to fix the cause of the issue.
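
    The ROS camera_calibration package wraps OpenCV's chessboard-based routine. A stripped-down, stand-alone version of that intrinsic calibration step (not the package itself, and with an assumed 9x6 board and square size) looks roughly like this:

      import cv2
      import numpy as np

      def calibrate_intrinsics(gray_images, board_size=(9, 6), square_size=0.025):
          """Estimate the camera matrix and distortion from chessboard views (OpenCV)."""
          objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
          objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2)
          objp *= square_size                               # metres; assumed square size
          obj_points, img_points = [], []
          for gray in gray_images:
              found, corners = cv2.findChessboardCorners(gray, board_size)
              if found:
                  obj_points.append(objp)
                  img_points.append(corners)
          rms, K, dist, _, _ = cv2.calibrateCamera(obj_points, img_points,
                                                   gray_images[0].shape[::-1], None, None)
          return rms, K, dist

    Run-to-run variance of the kind reported in the thesis would show up as changes in K and dist for nominally identical image sets.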

  14. Commercialization of radiation tolerant camera

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Bum; Choi, Young Soo; Kim, Sun Ku; Lee, Jong Min; Cha, Bung Hun; Lee, Nam Ho; Byun, Eiy Gyo; Yoo, Seun Wook; Choi, Bum Ki; Yoon, Sung Up; Kim, Hyun Gun; Sin, Jeong Hun; So, Suk Il

    1999-12-01

    In this project, a radiation tolerant camera which tolerates a total dose of 10⁶-10⁸ rad was developed. In order to develop the radiation tolerant camera, the radiation effects on camera components were examined and evaluated, and the camera configuration was studied. Based on the results of the evaluation, the components were selected and the design was carried out. A vidicon tube was selected as the image sensor, and non-browning optics and a camera driving circuit were applied. The controllers needed for the CCTV camera system (lens, light, pan/tilt) were designed on the concept of remote control. Two types of radiation tolerant cameras were fabricated, for use in underwater or normal environments. (author)

  15. Commercialization of radiation tolerant camera

    International Nuclear Information System (INIS)

    Lee, Yong Bum; Choi, Young Soo; Kim, Sun Ku; Lee, Jong Min; Cha, Bung Hun; Lee, Nam Ho; Byun, Eiy Gyo; Yoo, Seun Wook; Choi, Bum Ki; Yoon, Sung Up; Kim, Hyun Gun; Sin, Jeong Hun; So, Suk Il

    1999-12-01

    In this project, a radiation tolerant camera which tolerates a total dose of 10⁶-10⁸ rad was developed. In order to develop the radiation tolerant camera, the radiation effects on camera components were examined and evaluated, and the camera configuration was studied. Based on the results of the evaluation, the components were selected and the design was carried out. A vidicon tube was selected as the image sensor, and non-browning optics and a camera driving circuit were applied. The controllers needed for the CCTV camera system (lens, light, pan/tilt) were designed on the concept of remote control. Two types of radiation tolerant cameras were fabricated, for use in underwater or normal environments. (author)

  16. Cameras in mobile phones

    Science.gov (United States)

    Nummela, Ville; Viinikanoja, Jarkko; Alakarhu, Juha

    2006-04-01

    One of the fastest growing consumer markets today is camera phones. During the past few years total volume has been growing fast, and today millions of mobile phones with cameras are sold. At the same time the resolution and functionality of the cameras have been growing from CIF towards DSC level. From the camera point of view the mobile world is an extremely challenging field. Cameras should have good image quality but in a small size. They also need to be reliable and their construction should be suitable for mass manufacturing. All components of the imaging chain should be well optimized in this environment. Image quality and usability are the most important parameters to the user. The current trend of adding more megapixels to cameras while at the same time using smaller pixels affects both. On the other hand, reliability and miniaturization are key drivers for product development, as is cost. In an optimized solution all parameters are in balance, but the process of finding the right trade-offs is not an easy task. In this paper trade-offs related to optics and their effects on image quality and usability of cameras are discussed. Key development areas from the mobile phone camera point of view are also listed.

  17. Remote hardware-reconfigurable robotic camera

    Science.gov (United States)

    Arias-Estrada, Miguel; Torres-Huitzil, Cesar; Maya-Rueda, Selene E.

    2001-10-01

    In this work, a camera with integrated image processing capabilities is discussed. The camera is based on an imager coupled to an FPGA device (Field Programmable Gate Array) which contains an architecture for real-time computer vision low-level processing. The architecture can be reprogrammed remotely for application specific purposes. The system is intended for rapid modification and adaptation for inspection and recognition applications, with the flexibility of hardware and software reprogrammability. FPGA reconfiguration allows the same ease of upgrade in hardware as a software upgrade process. The camera is composed of a digital imager coupled to an FPGA device, two memory banks, and a microcontroller. The microcontroller is used for communication tasks and FPGA programming. The system implements a software architecture to handle multiple FPGA architectures in the device, and the possibility to download a software/hardware object from the host computer into its internal context memory. System advantages are: small size, low power consumption, and a library of hardware/software functionalities that can be exchanged during run time. The system has been validated with an edge detection and a motion processing architecture, which will be presented in the paper. Applications targeted are in robotics, mobile robotics, and vision based quality control.

  18. Unmanned Ground Vehicle Perception Using Thermal Infrared Cameras

    Science.gov (United States)

    Rankin, Arturo; Huertas, Andres; Matthies, Larry; Bajracharya, Max; Assad, Christopher; Brennan, Shane; Bellutta, Paolo; Sherwin, Gary W.

    2011-01-01

    The ability to perform off-road autonomous navigation at any time of day or night is a requirement for some unmanned ground vehicle (UGV) programs. Because there are times when it is desirable for military UGVs to operate without emitting strong, detectable electromagnetic signals, a passive only terrain perception mode of operation is also often a requirement. Thermal infrared (TIR) cameras can be used to provide day and night passive terrain perception. TIR cameras have a detector sensitive to either mid-wave infrared (MWIR) radiation (3-5 μm) or long-wave infrared (LWIR) radiation (8-12 μm). With the recent emergence of high-quality uncooled LWIR cameras, TIR cameras have become viable passive perception options for some UGV programs. The Jet Propulsion Laboratory (JPL) has used a stereo pair of TIR cameras under several UGV programs to perform stereo ranging, terrain mapping, tree-trunk detection, pedestrian detection, negative obstacle detection, and water detection based on object reflections. In addition, we have evaluated stereo range data at a variety of UGV speeds, evaluated dual-band TIR classification of soil, vegetation, and rock terrain types, analyzed 24 hour water and 12 hour mud TIR imagery, and analyzed TIR imagery for hazard detection through smoke. Since TIR cameras do not currently provide the resolution available from megapixel color cameras, a UGV's daytime safe speed is often reduced when using TIR instead of color cameras. In this paper, we summarize the UGV terrain perception work JPL has performed with TIR cameras over the last decade and describe a calibration target developed by General Dynamics Robotic Systems (GDRS) for TIR cameras and other sensors.

  19. Hardware accelerator design for change detection in smart camera

    Science.gov (United States)

    Singh, Sanjay; Dunga, Srinivasa Murali; Saini, Ravi; Mandal, A. S.; Shekhar, Chandra; Chaudhury, Santanu; Vohra, Anil

    2011-10-01

    Smart Cameras are important components in Human Computer Interaction. In any remote surveillance scenario, smart cameras have to take intelligent decisions to select frames of significant changes to minimize communication and processing overhead. Among the many algorithms for change detection, one based on a clustering scheme was proposed for smart camera systems. However, such an algorithm achieves a low frame rate, far from real-time requirements, on the general purpose processors (like the PowerPC) available on FPGAs. This paper proposes a hardware accelerator capable of detecting changes in a scene in real time, using the clustering-based change detection scheme. The system is designed and simulated using VHDL and implemented on a Xilinx XUP Virtex-IIPro FPGA board. The resulting frame rate is 30 frames per second for QVGA resolution in gray scale.
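
    The abstract does not reproduce the clustering-based detection rule that was put into hardware. A software-only toy version of such a scheme, with assumed parameter values, might keep a few background clusters per pixel and flag pixels that match none of them:

      import numpy as np

      class ClusterChangeDetector:
          """Toy per-pixel clustering background model (illustrative, not the paper's design)."""

          def __init__(self, k=3, match_threshold=20.0, learn_rate=0.05):
              self.k = k
              self.match_threshold = match_threshold
              self.learn_rate = learn_rate
              self.centroids = None                        # shape (k, H, W)

          def apply(self, gray_frame):
              f = gray_frame.astype(np.float32)
              if self.centroids is None:                   # bootstrap from the first frame
                  self.centroids = np.repeat(f[None], self.k, axis=0)
                  return np.zeros(f.shape, dtype=bool)
              dist = np.abs(self.centroids - f[None])      # distance to each cluster
              nearest = np.argmin(dist, axis=0)[None]      # best cluster index per pixel
              best = np.take_along_axis(dist, nearest, axis=0)[0]
              changed = best > self.match_threshold
              # Matched pixels pull their cluster toward the new value; unmatched
              # pixels overwrite their nearest cluster so the model can re-adapt.
              current = np.take_along_axis(self.centroids, nearest, axis=0)[0]
              updated = np.where(changed, f,
                                 (1 - self.learn_rate) * current + self.learn_rate * f)
              np.put_along_axis(self.centroids, nearest, updated[None], axis=0)
              return changed

    A smart camera would then forward a frame only when the count of changed pixels exceeds some significance threshold.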

  20. Practical issues of retrieving isolated attosecond pulses

    International Nuclear Information System (INIS)

    Wang He; Chini, Michael; Khan, Sabih D; Chen, Shouyuan; Gilbertson, Steve; Feng Ximao; Mashiko, Hiroki; Chang Zenghu

    2009-01-01

    The attosecond streaking technique is used for the characterization of isolated extreme ultraviolet (XUV) attosecond pulses. This type of measurement suffers from low photoelectron counts in the streaked spectrogram, and is thus susceptible to shot noise. For the retrieval of few- or mono-cycle attosecond pulses, high-intensity streaking laser fields are required, which cause the energy spectrum of above-threshold ionized (ATI) electrons to overlap with that of the streaked photoelectrons. It is found by using the principal component generalized projections algorithm that the XUV attosecond pulse can accurately be retrieved for simulated and experimental spectrograms with a peak value of 50 or more photoelectron counts. Also, the minimum streaking intensity is found to be more than 50 times smaller than that required by the classical streaking camera for retrieval of pulses with a spectral bandwidth supporting 90 as transform-limited pulse durations. Furthermore, spatial variation of the streaking laser intensity, collection angle of streaked electrons and time delay jitter between the XUV pulse and streaking field can degrade the quality of the streaked spectrogram. We find that even when the XUV and streaking laser focal spots are comparable in size, the streaking electrons are collected from a 4π solid angle, or the delay fluctuates by more than the attosecond pulse duration, the attosecond pulses can still be accurately retrieved. In order to explain the insusceptibility of the streaked spectrogram to these factors, the linearity of the streaked spectrogram with respect to the streaking field is derived under the saddle point approximation.

  1. Do it yourself smartphone fundus camera – DIYretCAM

    Directory of Open Access Journals (Sweden)

    Biju Raju

    2016-01-01

    Full Text Available This article describes a method to make a do-it-yourself smartphone-based fundus camera which can image the central retina as well as the peripheral retina up to the pars plana. It is a cost-effective alternative to the fundus camera.

  2. SCC500: next-generation infrared imaging camera core products with highly flexible architecture for unique camera designs

    Science.gov (United States)

    Rumbaugh, Roy N.; Grealish, Kevin; Kacir, Tom; Arsenault, Barry; Murphy, Robert H.; Miller, Scott

    2003-09-01

    A new 4th generation MicroIR architecture is introduced as the latest in the highly successful Standard Camera Core (SCC) series by BAE SYSTEMS to offer an infrared imaging engine with greatly reduced size, weight, power, and cost. The advanced SCC500 architecture provides great flexibility in configuration to include multiple resolutions, an industry standard Real Time Operating System (RTOS) for customer specific software application plug-ins, and a highly modular construction for unique physical and interface options. These microbolometer based camera cores offer outstanding and reliable performance over an extended operating temperature range to meet the demanding requirements of real-world environments. A highly integrated lens and shutter is included in the new SCC500 product enabling easy, drop-in camera designs for quick time-to-market product introductions.

  3. Major QTL Conferring Resistance to Rice Bacterial Leaf Streak

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Bacterial leaf streak (BLS) is one of the important limiting factors for rice production in southern China and other tropical and sub-tropical areas of Asia. Resistance to BLS was found to be a quantitative trait, and no major resistance gene had been located in rice to date. In the present study, a new major quantitative trait locus (QTL) conferring resistance to BLS was identified from a highly resistant variety, Dular, using Dular/Balilla (DB) and Dular/IR24 (DI) segregating populations, and was designated qBLSR-11-1. This QTL was located between the simple sequence repeat (SSR) markers RM120 and RM441 on chromosome 11 and could account for 18.1-21.7% and 36.3% of the variance in the DB and DI populations, respectively. The genetic pattern of rice resistance to BLS is discussed.

  4. Establishing imaging sensor specifications for digital still cameras

    Science.gov (United States)

    Kriss, Michael A.

    2007-02-01

    Digital Still Cameras, DSCs, have now displaced conventional still cameras in most markets. The heart of a DSC is thought to be the imaging sensor, be it a full-frame CCD, an interline CCD, a CMOS sensor or the newer Foveon buried-photodiode sensor. There is a strong tendency by consumers to consider only the number of mega-pixels in a camera and not to consider the overall performance of the imaging system, including sharpness, artifact control, noise, color reproduction, exposure latitude and dynamic range. This paper will provide a systematic method to characterize the physical requirements of an imaging sensor and supporting system components based on the desired usage. The analysis is based on two software programs that determine the "sharpness", potential for artifacts, sensor "photographic speed", dynamic range and exposure latitude based on the physical nature of the imaging optics, sensor characteristics (including size of pixels, sensor architecture, noise characteristics, surface states that cause dark current, quantum efficiency, effective MTF, and the intrinsic full well capacity in terms of electrons per square centimeter). Examples will be given for consumer, pro-consumer, and professional camera systems. Where possible, these results will be compared to imaging systems currently on the market.
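
    The kind of system analysis described above multiplies the transfer functions of the individual stages. A toy version with assumed lens and pixel parameters (not the author's programs) is:

      import numpy as np

      pixel_pitch_mm = 0.002        # 2 micron pixels, assumed
      fill_factor = 0.9             # assumed effective aperture fraction
      nyquist = 1.0 / (2 * pixel_pitch_mm)                            # cycles/mm

      f = np.linspace(0.0, nyquist, 200)                              # spatial frequency axis
      mtf_lens = np.exp(-(f / 250.0) ** 2)                            # assumed Gaussian-like lens MTF
      mtf_pixel = np.abs(np.sinc(fill_factor * pixel_pitch_mm * f))   # pixel aperture (sinc) MTF
      mtf_system = mtf_lens * mtf_pixel                               # cascade of the two stages

      # Frequency at which the combined response first drops below 0.2:
      f_20 = f[np.argmax(mtf_system < 0.2)] if np.any(mtf_system < 0.2) else nyquist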

  5. Stability Analysis for a Multi-Camera Photogrammetric System

    Directory of Open Access Journals (Sweden)

    Ayman Habib

    2014-08-01

    Full Text Available Consumer-grade digital cameras suffer from geometrical instability that may cause problems when used in photogrammetric applications. This paper provides a comprehensive review of this issue of interior orientation parameter variation over time, it explains the common ways used for coping with the issue, and describes the existing methods for performing stability analysis for a single camera. The paper then points out the lack of coverage of stability analysis for multi-camera systems, suggests a modification of the collinearity model to be used for the calibration of an entire photogrammetric system, and proposes three methods for system stability analysis. The proposed methods explore the impact of the changes in interior orientation and relative orientation/mounting parameters on the reconstruction process. Rather than relying on ground truth in real datasets to check the system calibration stability, the proposed methods are simulation-based. Experiment results are shown, where a multi-camera photogrammetric system was calibrated three times, and stability analysis was performed on the system calibration parameters from the three sessions. The proposed simulation-based methods provided results that were compatible with a real-data based approach for evaluating the impact of changes in the system calibration parameters on the three-dimensional reconstruction.

  6. Advantages of computer cameras over video cameras/frame grabbers for high-speed vision applications

    Science.gov (United States)

    Olson, Gaylord G.; Walker, Jo N.

    1997-09-01

    Cameras designed to work specifically with computers can have certain advantages in comparison to the use of cameras loosely defined as 'video' cameras. In recent years the camera type distinctions have become somewhat blurred, with a great presence of 'digital cameras' aimed more at the home markets. This latter category is not considered here. The term 'computer camera' herein is intended to mean one which has low level computer (and software) control of the CCD clocking. These can often be used to satisfy some of the more demanding machine vision tasks, and in some cases with a higher rate of measurements than video cameras. Several of these specific applications are described here, including some which use recently designed CCDs which offer good combinations of parameters such as noise, speed, and resolution. Among the considerations for the choice of camera type in any given application would be such effects as 'pixel jitter,' and 'anti-aliasing.' Some of these effects may only be relevant if there is a mismatch between the number of pixels per line in the camera CCD and the number of analog to digital (A/D) sampling points along a video scan line. For the computer camera case these numbers are guaranteed to match, which alleviates some measurement inaccuracies and leads to higher effective resolution.

  7. PhC-4 new high-speed camera with mirror scanning

    International Nuclear Information System (INIS)

    Daragan, A.O.; Belov, B.G.

    1979-01-01

    The description of the optical system and the construction of the high-speed PhC-4 photographic camera with mirror scanning of the continuously operating type is given. The optical system of the camera is based on a four-sided rotating mirror, two optical inlets and two working sectors. The PhC-4 camera provides a framing rate of up to 600,000 frames per second. (author)

  8. Divergence-ratio axi-vision camera (Divcam): A distance mapping camera

    International Nuclear Information System (INIS)

    Iizuka, Keigo

    2006-01-01

    A novel distance mapping camera, the divergence-ratio axi-vision camera (Divcam), is proposed. The decay rate of the illuminating light with distance, due to the divergence of the light, is used as a means of mapping distance. Resolutions of 10 mm over a range of meters and 0.5 mm over a range of decimeters were achieved. The special features of this camera are its high-resolution real-time operation, simplicity, compactness, light weight, portability, and yet low fabrication cost. The feasibility of various potential applications is also included
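
    The abstract does not give the Divcam's working equations. As a hedged illustration of how a divergence (inverse-square) decay ratio can encode distance, assume two point-like illuminators separated by a known baseline s along the viewing axis and compare the irradiance each produces on the same object point:

      import numpy as np

      def distance_from_divergence_ratio(i_near, i_far, baseline_s):
          """Distance to the object from the near illuminator, assuming 1/d^2 decay.

          i_near, i_far -- pixel intensities recorded under the near and far source
          baseline_s    -- separation of the two sources along the optical axis (m)

          From i_near / i_far = ((d + s) / d)^2 it follows that
          d = s / (sqrt(i_near / i_far) - 1).
          """
          ratio = np.asarray(i_near, dtype=float) / np.asarray(i_far, dtype=float)
          return baseline_s / (np.sqrt(ratio) - 1.0)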

  9. Multi-Channel Data Recording of Marx switch closures

    International Nuclear Information System (INIS)

    Lockwood, G.J.; Ruggles, L.E.; Ziska, G.R.

    1984-01-01

    The authors have measured the optical signals associated with switch closure on the Demon Marx generator at Sandia National Laboratories. Using the High Speed Multi-Channel Data Recorder (HSMCDR), they have recorded the time histories of the optical signals from the thirty switches in the Marx generator. All thirty switches were fiber-connected to the HSMCDR. The HSMCDR consists of a high speed streak camera and a microcomputer-based video digitizing system. Since the thirty signals are recorded on a single streak, the time sequence can be determined with great accuracy. The appearance of a given signal can be determined to within two samples of the 256 samples that make up the time streak. The authors have found that the light intensity and time history of any given switch varied over a large range from shot to shot. Thus, the ability to record the entire optical signal as a function of time for each switch on every shot is necessary if accurate timing results are required

  10. CameraHRV: robust measurement of heart rate variability using a camera

    Science.gov (United States)

    Pai, Amruta; Veeraraghavan, Ashok; Sabharwal, Ashutosh

    2018-02-01

    The inter-beat interval (the time period of the cardiac cycle) changes slightly with every heartbeat; this variation is measured as Heart Rate Variability (HRV). HRV is presumed to arise from interactions between the parasympathetic and sympathetic nervous systems. Therefore, it is sometimes used as an indicator of the stress level of an individual. HRV also reveals some clinical information about cardiac health. Currently, HRV is accurately measured using contact devices such as a pulse oximeter. However, recent research in the field of non-contact imaging photoplethysmography (iPPG) has made vital sign measurement possible using just a video recording of any exposed skin (such as a person's face). The current signal processing methods for extracting HRV using peak detection perform well for contact-based systems but have poor performance for iPPG signals. The main reason for this poor performance is that current methods are sensitive to the large noise sources often present in iPPG data. Further, current methods are not robust to the motion artifacts that are common in iPPG systems. We developed a new algorithm, CameraHRV, for robustly extracting HRV even at the low SNR common in iPPG recordings. CameraHRV combines spatial combining and frequency demodulation to obtain HRV from the instantaneous frequency of the iPPG signal. CameraHRV outperforms other current methods of HRV estimation. Ground truth data were obtained from an FDA-approved pulse oximeter for validation purposes. CameraHRV on iPPG data showed an error of 6 milliseconds for low-motion and varying skin tone scenarios, an improvement of 14%. In the case of high-motion scenarios like reading, watching and talking, the error was 10 milliseconds.
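
    The paper's exact pipeline is not reproduced in the abstract. The sketch below shows one generic way to obtain an instantaneous heart-rate trace from an iPPG signal by frequency demodulation (band-pass filter plus analytic-signal phase); the band edges and filter order are assumptions, and the camera frame rate must be well above the upper band edge.

      import numpy as np
      from scipy.signal import butter, filtfilt, hilbert

      def instantaneous_heart_rate(ippg, fs):
          """Instantaneous heart rate (bpm) from a camera PPG trace via demodulation.

          ippg -- 1-D array, e.g. the mean green-channel value of a skin region per frame
          fs   -- camera frame rate in Hz
          """
          b, a = butter(3, [0.7 / (fs / 2), 3.0 / (fs / 2)], btype="band")  # ~42-180 bpm
          x = filtfilt(b, a, ippg)
          phase = np.unwrap(np.angle(hilbert(x)))
          inst_freq_hz = np.diff(phase) * fs / (2.0 * np.pi)
          return 60.0 * inst_freq_hz

    HRV could then be summarised, for example, as the spread of the corresponding inter-beat intervals, 60.0 / instantaneous_heart_rate(...).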

  11. A Study of the Usability of Ergonomic Camera Vest Based on Spirometry Parameters

    Directory of Open Access Journals (Sweden)

    Shirazeh Arghami

    2017-12-01

    Full Text Available Background: Being a cameraman is one of those occupations that expose people to musculoskeletal disorders (MSDs). Therefore, control measures should be taken to protect cameramen's health. To solve the given problem, a vest was designed for cameramen to prevent MSDs by reducing the pressure and contact stress while carrying the camera on the shoulder. However, the usability of the vest had to be considered. The aim of this study was to determine the usability of the proposed vest using spirometry parameters as an indicator. Methods: In this experimental study, 120 spirometry experiments were conducted with 40 male volunteer subjects, with and without the designed vest. Data were analyzed using SPSS-16 with the dependent t-test, at the 0.05 significance level. Results: Based on the spirometry results, there is a significant difference in Forced Vital Capacity (FVC), Forced Expiratory Volume (FEV1) and heart rate between activity with and without the vest (p<0.001). Conclusion: The results suggest that the promising impact of this invention on the health of cameramen makes this domestically designed camera vest a good option for mass production.

  12. Thermal Cameras and Applications

    DEFF Research Database (Denmark)

    Gade, Rikke; Moeslund, Thomas B.

    2014-01-01

    Thermal cameras are passive sensors that capture the infrared radiation emitted by all objects with a temperature above absolute zero. This type of camera was originally developed as a surveillance and night vision tool for the military, but recently the price has dropped, significantly opening up a broader field of applications. Deploying this type of sensor in vision systems eliminates the illumination problems of normal greyscale and RGB cameras. This survey provides an overview of the current applications of thermal cameras. Applications include animals, agriculture, buildings, gas detection, industrial, and military applications, as well as detection, tracking, and recognition of humans. Moreover, this survey describes the nature of thermal radiation and the technology of thermal cameras.

  13. A wide field X-ray camera

    International Nuclear Information System (INIS)

    Sims, M.; Turner, M.J.L.; Willingale, R.

    1980-01-01

    A wide field of view X-ray camera based on the Dicke or Coded Mask principle is described. It is shown that this type of instrument is more sensitive than a pin-hole camera, or than a scanning survey of a given region of sky for all wide field conditions. The design of a practical camera is discussed and the sensitivity and performance of the chosen design are evaluated by means of computer simulations. The Wiener Filter and Maximum Entropy methods of deconvolution are described and these methods are compared with each other and cross-correlation using data from the computer simulations. It is shown that the analytic expressions for sensitivity used by other workers are confirmed by the simulations, and that ghost images caused by incomplete coding can be substantially eliminated by the use of the Wiener Filter and the Maximum Entropy Method, with some penalty in computer time for the latter. The cyclic mask configuration is compared with the simple mask camera. It is shown that when the diffuse X-ray background dominates, the simple system is more sensitive and has the better angular resolution. When sources dominate the simple system is less sensitive. It is concluded that the simple coded mask camera is the best instrument for wide field imaging of the X-ray sky. (orig.)
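
    As an illustration of the Wiener-filter deconvolution compared in the paper (not the authors' implementation), a coded-mask detector image can be deconvolved in the Fourier domain with the mask pattern acting as the point-spread function; the regularisation constant stands in for the noise-to-signal power ratio. Plain cross-correlation corresponds to dropping the denominator, which is one way to see why it leaves the ghost images that the Wiener filter suppresses.

      import numpy as np

      def wiener_deconvolve(detector_image, mask_psf, nsr=0.01):
          """Fourier-domain Wiener deconvolution of a coded-aperture exposure.

          detector_image -- recorded counts (2-D array)
          mask_psf       -- mask/aperture pattern treated as the point-spread function
          nsr            -- assumed noise-to-signal power ratio (regularisation constant)
          """
          G = np.fft.fft2(detector_image)
          H = np.fft.fft2(mask_psf, s=detector_image.shape)
          W = np.conj(H) / (np.abs(H) ** 2 + nsr)
          return np.real(np.fft.ifft2(W * G))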

  14. Single-Camera-Based Method for Step Length Symmetry Measurement in Unconstrained Elderly Home Monitoring.

    Science.gov (United States)

    Cai, Xi; Han, Guang; Song, Xin; Wang, Jinkuan

    2017-11-01

    Single-camera-based gait monitoring is unobtrusive, inexpensive, and easy to use for monitoring the daily gait of seniors in their homes. However, most studies require subjects to walk perpendicularly to the camera's optical axis or along specified routes, which limits its application in elderly home monitoring. To build unconstrained monitoring environments, we propose a method to measure the step length symmetry ratio (a useful gait parameter representing gait symmetry without a significant relationship with age) from unconstrained straight walking using a single camera, without strict restrictions on walking directions or routes. According to projective geometry theory, we first develop a calculation formula of the step length ratio for the case of unconstrained straight-line walking. Then, to adapt to general cases, we propose to modify noncollinear footprints, and accordingly provide a general procedure for step length ratio extraction from unconstrained straight walking. Our method achieves a mean absolute percentage error (MAPE) of 1.9547% for 15 subjects' normal and abnormal side-view gaits, and also obtains satisfactory MAPEs for non-side-view gaits (2.4026% for 45°-view gaits and 3.9721% for 30°-view gaits). The performance is much better than that of a well-established monocular gait measurement system suitable only for side-view gaits, which has a MAPE of 3.5538%. Independently of walking directions, our method can accurately estimate step length ratios from unconstrained straight walking. This demonstrates that our method is applicable to elders' daily gait monitoring, providing valuable information for elderly health care, such as abnormal gait recognition and fall risk assessment.
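
    For the special collinear case, the projective-geometry relation the authors build on can be written down directly: with three consecutive footprint images a, b, c on one line and the vanishing point v of that line, cross-ratio invariance with the world point at infinity gives AC/BC = (ac·bv)/(bc·av), hence a step-length ratio AB/BC = AC/BC − 1. The sketch below is illustrative only; the paper additionally corrects non-collinear footprints.

      import numpy as np

      def step_length_ratio(a, b, c, v):
          """Ratio of world step lengths AB/BC from collinear image points a, b, c
          and the vanishing point v of the walking line (all 2-D image coordinates)."""
          a, b, c, v = (np.asarray(p, dtype=float) for p in (a, b, c, v))
          direction = (c - a) / np.linalg.norm(c - a)
          def t(p):                                        # signed position along the line
              return float(np.dot(p - a, direction))
          ac, bc, bv, av = t(c) - t(a), t(c) - t(b), t(v) - t(b), t(v) - t(a)
          ratio_ac_bc = (ac * bv) / (bc * av)              # cross-ratio with the point at infinity
          return ratio_ac_bc - 1.0                         # AB/BC, since AC = AB + BC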

  15. Cost Effective Paper-Based Colorimetric Microfluidic Devices and Mobile Phone Camera Readers for the Classroom

    Science.gov (United States)

    Koesdjojo, Myra T.; Pengpumkiat, Sumate; Wu, Yuanyuan; Boonloed, Anukul; Huynh, Daniel; Remcho, Thomas P.; Remcho, Vincent T.

    2015-01-01

    We have developed a simple and direct method to fabricate paper-based microfluidic devices that can be used for a wide range of colorimetric assay applications. With these devices, assays can be performed within minutes to allow for quantitative colorimetric analysis by use of a widely accessible iPhone camera and an RGB color reader application…
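
    The RGB reader application is not described further here. As a minimal stand-in for the quantitative step, one could average the colour inside the assay spot and interpolate against a calibration series; the names and the choice of averaging all channels are assumptions.

      import numpy as np

      def concentration_from_spot(rgb_image, spot_mask, calib_values, calib_concentrations):
          """Map the mean colour intensity of an assay spot to a concentration.

          rgb_image            -- HxWx3 array from the phone camera
          spot_mask            -- boolean HxW mask selecting the assay spot
          calib_values         -- mean intensities measured for known standards (ascending)
          calib_concentrations -- concentrations of those standards
          """
          mean_intensity = rgb_image[spot_mask].mean()     # average over the spot and channels
          return np.interp(mean_intensity, calib_values, calib_concentrations)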

  16. Gamma camera based Positron Emission Tomography: a study of the viability on quantification

    International Nuclear Information System (INIS)

    Pozzo, Lorena

    2005-01-01

    Positron Emission Tomography (PET) is a Nuclear Medicine imaging modality for diagnostic purposes. Pharmaceuticals labeled with positron emitters are used, and images which represent the in vivo biochemical processes within tissues can be obtained. The positron/electron annihilation photons are detected in coincidence and this information is used for object reconstruction. Presently, there are two types of systems available for this imaging modality: dedicated systems and those based on gamma camera technology. In this work, we utilized PET/SPECT systems, which also allow for traditional Nuclear Medicine studies based on single-photon emitters. There are inherent difficulties which affect the quantification of activity and other indices. They are related to the Poisson nature of radioactivity, to radiation interactions with the patient body and the detector, to noise due to the statistical nature of these interactions and of all the detection processes, as well as to the patient acquisition protocols. Corrections are described in the literature and not all of them are implemented by the manufacturers: scatter, attenuation, randoms, decay, dead time, spatial resolution, and others related to the properties of each piece of equipment. The goal of this work was to assess the methods adopted by two manufacturers, as well as the influence of some technical characteristics of PET/SPECT systems, on the estimation of SUV. Data from a set of phantoms were collected in 3D mode on one camera and in 2D mode on the other. We concluded that quantification is viable in PET/SPECT systems, including the estimation of SUVs. This is only possible if, apart from the above-mentioned corrections, the camera is well tuned and coefficients for sensitivity normalization and partial volume corrections are applied. We also verified that the shapes of the sources used for obtaining these factors play a role in the final results and should be dealt with carefully in clinical quantification. Finally, the choice of the region
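
    The quantity being validated, the standardised uptake value, has a simple body-weight-normalised definition that the abstract takes for granted; for reference (decay correction and unit handling are the user's responsibility in practice):

      def standardized_uptake_value(tissue_activity_kbq_per_ml, injected_dose_mbq, body_weight_kg):
          """SUV (body-weight normalised) = tissue concentration / (injected dose / weight).

          With the dose in MBq and the weight in kg, 1 MBq/kg equals 1 kBq/g, which is
          approximately 1 kBq/mL for soft tissue (density ~1 g/mL), so the ratio below
          is dimensionless.
          """
          dose_concentration = injected_dose_mbq / body_weight_kg    # MBq/kg == kBq/g
          return tissue_activity_kbq_per_ml / dose_concentration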

  17. Radiation camera exposure control

    International Nuclear Information System (INIS)

    Martone, R.J.; Yarsawich, M.; Wolczek, W.

    1976-01-01

    A system and method for governing the exposure of an image generated by a radiation camera to an image sensing camera is disclosed. The exposure is terminated in response to the accumulation of a predetermined quantity of radiation, defining a radiation density, occurring in a predetermined area. An index is produced which represents the value of that quantity of radiation whose accumulation causes the exposure termination. The value of the predetermined radiation quantity represented by the index is sensed so that the radiation camera image intensity can be calibrated to compensate for changes in exposure amounts due to desired variations in radiation density of the exposure, to maintain the detectability of the image by the image sensing camera notwithstanding such variations. Provision is also made for calibrating the image intensity in accordance with the sensitivity of the image sensing camera, and for locating the index for maintaining its detectability and causing the proper centering of the radiation camera image

  18. Fast time-of-flight camera based surface registration for radiotherapy patient positioning.

    Science.gov (United States)

    Placht, Simon; Stancanello, Joseph; Schaller, Christian; Balda, Michael; Angelopoulou, Elli

    2012-01-01

    This work introduces a rigid registration framework for patient positioning in radiotherapy, based on real-time surface acquisition by a time-of-flight (ToF) camera. Dynamic properties of the system are also investigated for future gating/tracking strategies. A novel preregistration algorithm, based on translation and rotation-invariant features representing surface structures, was developed. Using these features, corresponding three-dimensional points were computed in order to determine initial registration parameters. These parameters became a robust input to an accelerated version of the iterative closest point (ICP) algorithm for the fine-tuning of the registration result. Distance calibration and Kalman filtering were used to compensate for ToF-camera dependent noise. Additionally, the advantage of using the feature based preregistration over an "ICP only" strategy was evaluated, as well as the robustness of the rigid-transformation-based method to deformation. The proposed surface registration method was validated using phantom data. A mean target registration error (TRE) for translations and rotations of 1.62 ± 1.08 mm and 0.07° ± 0.05°, respectively, was achieved. There was a temporal delay of about 65 ms in the registration output, which can be seen as negligible considering the dynamics of biological systems. Feature based preregistration allowed for accurate and robust registrations even at very large initial displacements. Deformations affected the accuracy of the results, necessitating particular care in cases of deformed surfaces. The proposed solution is able to solve surface registration problems with an accuracy suitable for radiotherapy cases where external surfaces offer primary or complementary information to patient positioning. The system shows promising dynamic properties for its use in gating/tracking applications. The overall system is competitive with commonly-used surface registration technologies. Its main benefit is the

  19. Fast time-of-flight camera based surface registration for radiotherapy patient positioning

    International Nuclear Information System (INIS)

    Placht, Simon; Stancanello, Joseph; Schaller, Christian; Balda, Michael; Angelopoulou, Elli

    2012-01-01

    Purpose: This work introduces a rigid registration framework for patient positioning in radiotherapy, based on real-time surface acquisition by a time-of-flight (ToF) camera. Dynamic properties of the system are also investigated for future gating/tracking strategies. Methods: A novel preregistration algorithm, based on translation and rotation-invariant features representing surface structures, was developed. Using these features, corresponding three-dimensional points were computed in order to determine initial registration parameters. These parameters became a robust input to an accelerated version of the iterative closest point (ICP) algorithm for the fine-tuning of the registration result. Distance calibration and Kalman filtering were used to compensate for ToF-camera dependent noise. Additionally, the advantage of using the feature based preregistration over an ''ICP only'' strategy was evaluated, as well as the robustness of the rigid-transformation-based method to deformation. Results: The proposed surface registration method was validated using phantom data. A mean target registration error (TRE) for translations and rotations of 1.62 ± 1.08 mm and 0.07 deg. ± 0.05 deg., respectively, was achieved. There was a temporal delay of about 65 ms in the registration output, which can be seen as negligible considering the dynamics of biological systems. Feature based preregistration allowed for accurate and robust registrations even at very large initial displacements. Deformations affected the accuracy of the results, necessitating particular care in cases of deformed surfaces. Conclusions: The proposed solution is able to solve surface registration problems with an accuracy suitable for radiotherapy cases where external surfaces offer primary or complementary information to patient positioning. The system shows promising dynamic properties for its use in gating/tracking applications. The overall system is competitive with commonly-used surface registration
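
    The fine registration step that both records rely on, ICP, alternates correspondence search with a closed-form rigid fit. The closed-form part (Kabsch/SVD) is short enough to show as a hedged sketch; the papers' feature-based preregistration and Kalman filtering are not reproduced here.

      import numpy as np

      def best_rigid_transform(src, dst):
          """Least-squares rotation R and translation t with R @ src_i + t ~ dst_i.

          src, dst -- Nx3 arrays of corresponding 3-D points (for example matched
                      ToF surface points and reference surface points).
          """
          src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
          H = (src - src_c).T @ (dst - dst_c)
          U, _, Vt = np.linalg.svd(H)
          D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
          R = Vt.T @ D @ U.T
          t = dst_c - R @ src_c
          return R, t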

  20. Semantic Information Extraction of Lanes Based on Onboard Camera Videos

    Science.gov (United States)

    Tang, L.; Deng, T.; Ren, C.

    2018-04-01

    In the field of autonomous driving, semantic information of lanes is very important. This paper proposes a method of automatic detection of lanes and extraction of semantic information from onboard camera videos. The proposed method firstly detects the edges of lanes by the grayscale gradient direction, and improves the Probabilistic Hough transform to fit them; then, it uses the vanishing point principle to calculate the lane geometrical position, and uses lane characteristics to extract lane semantic information by the classification of decision trees. In the experiment, 216 road video images captured by a camera mounted onboard a moving vehicle were used to detect lanes and extract lane semantic information. The results show that the proposed method can accurately identify lane semantics from video images.
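
    A bare-bones version of the edge-plus-Hough stage described above can be written with OpenCV; the thresholds are assumptions, and the paper's improved probabilistic Hough fitting, vanishing-point geometry and decision-tree classification are not reproduced.

      import cv2
      import numpy as np

      def detect_lane_segments(frame_bgr):
          """Return candidate lane line segments (x1, y1, x2, y2) from one video frame."""
          gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
          edges = cv2.Canny(gray, 50, 150)                     # gradient-based edge map
          segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=40,
                                     minLineLength=40, maxLineGap=20)
          return [] if segments is None else [tuple(s[0]) for s in segments]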

  1. Adapting Virtual Camera Behaviour

    DEFF Research Database (Denmark)

    Burelli, Paolo

    2013-01-01

    In a three-dimensional virtual environment, aspects such as narrative and interaction completely depend on the camera, since the camera defines the player's point of view. Most research works in automatic camera control aim to take the control of this aspect from the player in order to automatically generate…

  2. CALIBRATION PROCEDURES IN MID FORMAT CAMERA SETUPS

    Directory of Open Access Journals (Sweden)

    F. Pivnicka

    2012-07-01

    Full Text Available A growing number of mid-format cameras are used for aerial surveying projects. To achieve a reliable and geometrically precise result in the photogrammetric workflow as well, awareness of the sensitive parts is important. The use of direct referencing systems (GPS/IMU), the mounting on a stabilizing camera platform and the specific values of the mid-format camera make a professional setup with various calibration and misalignment operations necessary. An important part is to have a proper camera calibration. Using aerial images over a well-designed test field with 3D structures and/or different flight altitudes enables the determination of calibration values in Bingo software. It will be demonstrated how such a calibration can be performed. The direct referencing device must be mounted to the camera in a solid and reliable way. Besides the mechanical work, especially in mounting the IMU beside the camera, two lever arms have to be measured with mm accuracy: the lever arm from the GPS antenna to the IMU's calibrated centre and the lever arm from the IMU centre to the camera projection centre. The measurement with a total station is not a difficult task, but the definition of the right centres and the need for rotation matrices can cause serious accuracy problems. The benefit of small and medium format cameras is that smaller aircraft can also be used; for this, a gyro-based stabilized platform is recommended. As a consequence, the IMU must be mounted beside the camera on the stabilizer. The advantage is that the IMU can be used to control the platform; the problem is that the IMU-to-GPS-antenna lever arm is floating. We therefore have to deal with an additional data stream, the movements of the stabiliser, to correct the floating lever arm distances. If the post-processing of the GPS-IMU data, taking the floating lever arms into account, delivers an expected result, the lever arms between the IMU and

  3. Reflective optical system for time-resolved electron bunch measurements at PITZ

    Energy Technology Data Exchange (ETDEWEB)

    Rosbach, K; Baehr, J [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Roensch-Schulenburg, J [Hamburg Univ. (Germany). Inst. fuer Experimentalphysik

    2011-01-15

    The Photo-Injector Test facility at DESY, Zeuthen site (PITZ), produces pulsed electron beams with low transverse emittance and is equipped with diagnostic devices for measuring various electron bunch properties, including the longitudinal and transverse electron phase space distributions. The longitudinal bunch structure is recorded using a streak camera located outside the accelerator tunnel, connected to the diagnostics in the beam-line stations by an optical system of about 30 m length. This system mainly consists of telescopes of achromatic lenses, which transport the light pulses and image them onto the entrance slit of the streak camera. Due to dispersion in the lenses, the temporal resolution degrades during transport. This article presents general considerations for time-resolving optical systems as well as simulations and measurements of specific candidate systems. It then describes the development of an imaging system based on mirror telescopes which will improve the temporal resolution, with an emphasis on off-axis parabolic mirror systems working at unit magnification. A hybrid system of lenses and mirrors will serve as a proof of principle. (orig.)

  3. Characteristics of pico-second single bunch at the S-band linear accelerator

    International Nuclear Information System (INIS)

    Uesaka, Mitsuru; Kozawa, Takahiro; Kobayashi, Toshiaki; Ueda, Toru; Miya, Kenzo

    1994-01-01

    Measurement of the bunch structure of a pico-second single bunch was performed using a femto-second streak camera at the S-band linear accelerator of the University of Tokyo. The aim of this research is to investigate the feasibility of the generation of a femto-second single bunch at the S-band linac. The details of the bunch structure and energy spectrum of an original single bunch were precisely investigated in several operation modes where the RF phases in accelerating tubes and a prebuncher were varied. The femto-second streak camera was utilized to measure the bunch structure in one shot via Cherenkov radiation emitted by the electrons in the bunch. Next, an experiment on magnetic pulse compression of the original single bunch was carried out. Pulse shapes of the compressed bunches for different energy modulations were also obtained by measuring Cherenkov radiation in one shot using the femto-second streak camera. Prior to the experiment, a numerical tracking analysis to determine operating parameters for the magnetic pulse compression was also done. Measured pulse widths were compared with calculated ones. Finally, a 2 ps (full width at half maximum; FWHM) single bunch with an electric charge of 0.3 nC could be generated by the magnetic pulse compression. ((orig.))

  5. The Camera-Based Assessment Survey System (C-BASS): A towed camera platform for reef fish abundance surveys and benthic habitat characterization in the Gulf of Mexico

    Science.gov (United States)

    Lembke, Chad; Grasty, Sarah; Silverman, Alex; Broadbent, Heather; Butcher, Steven; Murawski, Steven

    2017-12-01

    An ongoing challenge for fisheries management is to provide cost-effective and timely estimates of habitat-stratified fish densities. Traditional approaches use modified commercial fishing gear (such as trawls and baited hooks) that have biases in species selectivity and may also be inappropriate for deployment in some habitat types. Underwater visual and optical approaches offer the promise of more precise and less biased assessments of relative fish abundance, as well as direct estimates of absolute fish abundance. A number of video-based approaches have been developed, and the technology for data acquisition, calibration, and synthesis has been developing rapidly. Beginning in 2012, our group of engineers and researchers at the University of South Florida has been working towards the goal of completing large-scale, video-based surveys in the eastern Gulf of Mexico. This paper discusses design considerations and development of a towed camera system for collection of video-based data on commercially and recreationally important reef fishes and benthic habitat on the West Florida Shelf. Factors considered during development included potential habitat types to be assessed, sea-floor bathymetry, vessel support requirements, personnel requirements, and cost-effectiveness of system components. This region-specific effort has resulted in a towed platform called the Camera-Based Assessment Survey System, or C-BASS, which has proven capable of surveying tens of kilometers of video transects per day and has the ability to deliver cost-effective population estimates of reef fishes together with coincident benthic habitat classification.

  6. Making Ceramic Cameras

    Science.gov (United States)

    Squibb, Matt

    2009-01-01

    This article describes how to make a clay camera. This idea of creating functional cameras from clay allows students to experience ceramics, photography, and painting all in one unit. (Contains 1 resource and 3 online resources.)

  7. Test bed for real-time image acquisition and processing systems based on FlexRIO, CameraLink, and EPICS

    International Nuclear Information System (INIS)

    Barrera, E.; Ruiz, M.; Sanz, D.; Vega, J.; Castro, R.; Juárez, E.; Salvador, R.

    2014-01-01

    Highlights: • The test bed allows for the validation of real-time image processing techniques. • Offers FPGA (FlexRIO) image processing that does not require CPU intervention. • Is fully compatible with the architecture of the ITER Fast Controllers. • Provides flexibility and easy integration in distributed experiments based on EPICS. - Abstract: Image diagnostics are becoming standard in nuclear fusion. At present, images are typically analyzed off-line. However, real-time processing is occasionally required (for instance, for hot-spot detection or pattern recognition tasks), which will be the objective for the next generation of fusion devices. In this paper, a test bed for image generation, acquisition, and real-time processing is presented. The proposed solution is built using a Camera Link simulator, a Camera Link frame-grabber and a PXIe chassis, and offers a software interface with EPICS. The Camera Link simulator (PCIe card PCIe8 DVa C-Link from Engineering Design Team) generates simulated image data (for example, from video-movies stored in fusion databases) using a Camera Link interface to mimic the frame sequences produced by diagnostic cameras. The Camera Link frame-grabber (FlexRIO Solution from National Instruments) includes a field programmable gate array (FPGA) for image acquisition using a Camera Link interface; the FPGA allows for the codification of ad hoc image processing algorithms using LabVIEW/FPGA software. The frame grabber is integrated in a PXIe chassis with a system architecture similar to that of the ITER Fast Controllers, and it provides a software interface with EPICS to program all of its functionalities, capture the images, and perform the required image processing. The use of these four elements allows for the implementation of a test bed system that permits the development and validation of real-time image processing techniques in an architecture that is fully compatible with that of the ITER Fast Controllers

  8. Investigation of high resolution compact gamma camera module based on a continuous scintillation crystal using a novel charge division readout method

    International Nuclear Information System (INIS)

    Dai Qiusheng; Zhao Cuilan; Qi Yujin; Zhang Hualin

    2010-01-01

    The objective of this study is to investigate a high-performance and lower-cost compact gamma camera module for a multi-head small animal SPECT system. A compact camera module was developed using a thin Lutetium Oxyorthosilicate (LSO) scintillation crystal slice coupled to a Hamamatsu H8500 position-sensitive photomultiplier tube (PSPMT). A two-stage charge division readout board based on a novel subtractive resistive readout with a truncated center-of-gravity (TCOG) positioning method was developed for the camera. The performance of the camera was evaluated using a flood 99mTc source with a four-quadrant bar-mask phantom. The preliminary experimental results show that the image shrinkage problem associated with the conventional resistive readout can be effectively overcome by the novel subtractive resistive readout with an appropriate fraction subtraction factor. The response output area (ROA) of the camera shown in the flood image was improved by up to 34%, and an intrinsic detector spatial resolution better than 2 mm was achieved. In conclusion, the utilization of a continuous scintillation crystal and a flat-panel PSPMT equipped with a novel subtractive resistive readout is a feasible approach for developing a high-performance and lower-cost compact gamma camera. (authors)
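
    As a rough illustration of the truncated centre-of-gravity (TCOG) idea, the sketch below subtracts a fixed fraction of the peak charge from every readout node before computing the centroid, which suppresses the signal tails that shrink the response output area in a plain centre-of-gravity estimate. The node layout, charge distribution and subtraction fraction are hypothetical and are not taken from the paper.

        import numpy as np

        def truncated_cog(charges, x_pos, y_pos, subtraction_fraction=0.2):
            """Truncated centre-of-gravity position estimate.

            charges: 2-D array of charge per readout node (rows = y, columns = x).
            x_pos, y_pos: node coordinates in mm.
            subtraction_fraction: fraction of the peak charge removed from every node;
            negative values are clipped to zero before the centroid is formed.
            """
            q = np.clip(charges - subtraction_fraction * charges.max(), 0.0, None)
            total = q.sum()
            x = (q.sum(axis=0) * x_pos).sum() / total
            y = (q.sum(axis=1) * y_pos).sum() / total
            return x, y

        # Toy 8x8 charge distribution peaked near (2.5 mm, -1.0 mm).
        xs = np.linspace(-14, 14, 8)
        ys = np.linspace(-14, 14, 8)
        X, Y = np.meshgrid(xs, ys)
        charges = np.exp(-((X - 2.5) ** 2 + (Y + 1.0) ** 2) / (2 * 6.0 ** 2))
        print(truncated_cog(charges, xs, ys))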

  9. 78 FR 24199 - Streak Products, Inc. v. UTi, United States, Inc.; Notice of Filing of Complaint and Assignment

    Science.gov (United States)

    2013-04-24

    ... FEDERAL MARITIME COMMISSION [Docket No. 13--04] Streak Products, Inc. v. UTi, United States, Inc...,'' against UTi, United States, Inc. (``UTi''), hereinafter ``Respondent.'' Complainant states that it is a... therefore, has violated 46 U.S.C. 41104(2). Complainant also alleges that ``UTi engaged in an unfair or...

  10. Camera Movement in Narrative Cinema

    DEFF Research Database (Denmark)

    Nielsen, Jakob Isak

    2007-01-01

    The first section unearths what characterizes the literature on camera movement. The second section of the dissertation delineates the history of camera movement itself within narrative cinema. Several organizational principles subtending the on-screen effect of camera movement are revealed in section two, but they are not organized into a coherent framework. This is the task that section three meets in proposing a functional taxonomy for camera movement in narrative cinema. Two presumptions subtend the taxonomy: that camera movement actively contributes to the way in which we understand the sound and images on the screen ... in a commentative or valuative manner. 4) Focalization: associating the movement of the camera with the viewpoints of characters or entities in the story world. 5) Reflexive: inviting spectators to engage with the artifice of camera movement. 6) Abstract: visualizing abstract ideas and concepts. In order ...

  11. Development and application of an automatic system for measuring the laser camera

    International Nuclear Information System (INIS)

    Feng Shuli; Peng Mingchen; Li Kuncheng

    2004-01-01

    Objective: To provide an automatic system for measuring the imaging quality of laser cameras. Methods: The procedure was written in the Matlab language on a dedicated imaging workstation (SGI 540). An automatic measurement and analysis system for laser camera imaging quality was developed according to the imaging quality measurement standard for laser cameras of the International Electrotechnical Commission (IEC). The measurement system applied digital signal processing theory, was based on the characteristics of digital images, and put the automatic measurement and analysis of the laser camera into practice using the sample pictures supplied with the camera. Results: All imaging quality parameters of the laser camera, including the H-D and MTF curves, optical density at low, middle and high resolution, the various kinds of geometric distortion, maximum and minimum density, and the dynamic range of the gray scale, could be measured by this system. The system was applied to laser cameras in 20 hospitals in Beijing. The results showed that the system could provide objective and quantitative data and could accurately evaluate the imaging quality of a laser camera, as well as correct results obtained by manual measurement based on the sample pictures supplied with the camera. Conclusion: The automatic measuring system is an effective and objective tool for testing laser camera quality, and it lays a foundation for future research
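
    One of the quantities listed above, the MTF, is commonly obtained as the normalised magnitude of the Fourier transform of a measured line spread function. The sketch below shows that generic relation on synthetic data only; it is not the IEC procedure or the authors' Matlab implementation.

        import numpy as np

        # Synthetic line spread function (LSF) sampled across a printed test line.
        dx = 0.01                                   # sample pitch in mm
        x = np.arange(-2, 2, dx)
        lsf = np.exp(-x**2 / (2 * 0.05**2))         # Gaussian LSF, sigma = 0.05 mm

        # MTF = magnitude of the Fourier transform of the LSF, normalised to 1 at zero frequency.
        mtf = np.abs(np.fft.rfft(lsf))
        mtf /= mtf[0]
        freqs = np.fft.rfftfreq(lsf.size, d=dx)     # spatial frequency in cycles/mm

        # Spatial frequency at which the MTF falls to 50%.
        print(freqs[np.argmax(mtf < 0.5)])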

  12. Design and tests of a portable mini gamma camera

    International Nuclear Information System (INIS)

    Sanchez, F.; Benlloch, J.M.; Escat, B.; Pavon, N.; Porras, E.; Kadi-Hanifi, D.; Ruiz, J.A.; Mora, F.J.; Sebastia, A.

    2004-01-01

    Design optimization, manufacturing, and tests, both laboratory and clinical, of a portable gamma camera for medical applications are presented. This camera, based on a continuous scintillation crystal and a position-sensitive photomultiplier tube, has an intrinsic spatial resolution of ≅2 mm, an energy resolution of 13% at 140 keV, and linearities of 0.28 mm (absolute) and 0.15 mm (differential), with a useful field of view of 4.6 cm diameter. Our camera can image small organs with high efficiency and so it can address the demand for devices for specific clinical applications like thyroid and sentinel node scintigraphy as well as scintimammography and radio-guided surgery. The main advantages of the gamma camera with respect to those previously reported in the literature are high portability, low cost, and low weight (2 kg), with no significant loss of sensitivity and spatial resolution. All the electronic components are packed inside the mini gamma camera, and no external electronic devices are required. The camera is only connected through the universal serial bus port to a portable personal computer (PC), where specific software allows the user to control both the camera parameters and the measuring process, displaying the acquired image on the PC in 'real time'. In this article, we present the camera and describe the procedures that have led us to choose its configuration. Laboratory and clinical tests are presented together with diagnostic capabilities of the gamma camera

  13. World's fastest and most sensitive astronomical camera

    Science.gov (United States)

    2009-06-01

    The next generation of instruments for ground-based telescopes took a leap forward with the development of a new ultra-fast camera that can take 1500 finely exposed images per second even when observing extremely faint objects. The first 240x240 pixel images with the world's fastest high precision faint light camera were obtained through a collaborative effort between ESO and three French laboratories from the French Centre National de la Recherche Scientifique/Institut National des Sciences de l'Univers (CNRS/INSU). Cameras such as this are key components of the next generation of adaptive optics instruments of Europe's ground-based astronomy flagship facility, the ESO Very Large Telescope (VLT). [ESO PR Photo 22a/09: The CCD220 detector. ESO PR Photo 22b/09: The OCam camera. ESO PR Video 22a/09: OCam images.] "The performance of this breakthrough camera is without an equivalent anywhere in the world. The camera will enable great leaps forward in many areas of the study of the Universe," says Norbert Hubin, head of the Adaptive Optics department at ESO. OCam will be part of the second-generation VLT instrument SPHERE. To be installed in 2011, SPHERE will take images of giant exoplanets orbiting nearby stars. A fast camera such as this is needed as an essential component for the modern adaptive optics instruments used on the largest ground-based telescopes. Telescopes on the ground suffer from the blurring effect induced by atmospheric turbulence. This turbulence causes the stars to twinkle in a way that delights poets, but frustrates astronomers, since it blurs the finest details of the images. Adaptive optics techniques overcome this major drawback, so that ground-based telescopes can produce images that are as sharp as if taken from space. Adaptive optics is based on real-time corrections computed from images obtained by a special camera working at very high speeds. Nowadays, this means many hundreds of times each second. The new generation instruments require these

  14. Study of a new architecture of gamma cameras with Cd/ZnTe/CdTe semiconductors; Etude d'une nouvelle architecture de gamma camera a base de semi-conducteurs CdZnTe /CdTe

    Energy Technology Data Exchange (ETDEWEB)

    Guerin, L

    2007-11-15

    This thesis studies new semiconductors for gamma cameras in order to improve image quality in nuclear medicine. Chapter 1 reviews the general principle of gamma imaging, describing the radiotracers, the detection chain and the acquisition types of Anger gamma cameras. The physiological, physical and technological limits of the camera are then highlighted in order to better identify the needs of future gamma cameras. Chapter 2 is dedicated to a bibliographical study. First, the semiconductors used in gamma imaging are presented, in particular CdTe and CdZnTe, distinguishing planar detectors from monolithic pixelated detectors. Second, the classic collimators of gamma cameras, most of them used in clinical routine, are described: their geometry is presented, as well as their characteristics, advantages and drawbacks. Chapter 3 is dedicated to a state of the art of simulation codes dedicated to medical imaging and of reconstruction methods in gamma imaging; these reviews introduce the simulation software and the reconstruction methods used within the framework of this thesis. Chapter 4 presents the new gamma camera architecture proposed during this thesis work. It is structured in three parts. The first part justifies the use of CdZnTe semiconductor detectors, in particular monolithic pixelated detectors, by highlighting their advantages with respect to scintillator-based detection modules. The second part presents gamma cameras based on CdZnTe detectors (prototypes or commercial products) and their associated collimators, as well as the interest of combining CdZnTe detectors with classic collimators. Finally, the third part presents the HiSens architecture in detail. Chapter 5 describes the two simulation packages used within the framework of this thesis to estimate the performances of the Hi

  15. Partially slotted crystals for a high-resolution γ-camera based on a position sensitive photomultiplier

    International Nuclear Information System (INIS)

    Giokaris, N.; Loudos, G.; Maintas, D.; Karabarbounis, A.; Lembesi, M.; Spanoudaki, V.; Stiliaris, E.; Boukis, S.; Gektin, A.; Pedash, V.; Gayshan, V.

    2005-01-01

    Partially slotted crystals have been designed and constructed and have been used to evaluate the performance with respect to the spatial resolution of a γ-camera based on a position-sensitive photomultiplier. It is shown that the resolution obtained with such a crystal is only slightly worse than the one obtained with a fully pixelized one whose cost, however, is much higher

  16. Simple, fast, and low-cost camera-based water content measurement with colorimetric fluorescent indicator

    Science.gov (United States)

    Song, Seok-Jeong; Kim, Tae-Il; Kim, Youngmi; Nam, Hyoungsik

    2018-05-01

    Recently, a simple, sensitive, and low-cost fluorescent indicator has been proposed to determine water contents in organic solvents, drugs, and foodstuffs. The change of water content leads to the change of the indicator's fluorescence color under the ultra-violet (UV) light. Whereas the water content values could be estimated from the spectrum obtained by a bulky and expensive spectrometer in the previous research, this paper demonstrates a simple and low-cost camera-based water content measurement scheme with the same fluorescent water indicator. Water content is calculated over the range of 0-30% by quadratic polynomial regression models with color information extracted from the captured images of samples. Especially, several color spaces such as RGB, xyY, L∗a∗b∗, u‧v‧, HSV, and YCBCR have been investigated to establish the optimal color information features over both linear and nonlinear RGB data given by a camera before and after gamma correction. In the end, a 2nd order polynomial regression model along with HSV in a linear domain achieves the minimum mean square error of 1.06% for a 3-fold cross validation method. Additionally, the resultant water content estimation model is implemented and evaluated in an off-the-shelf Android-based smartphone.
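
    The regression step described above can be sketched in a few lines: a quadratic polynomial is fitted between a single colour feature (here the HSV hue) and known water contents, and the fitted model is then used to predict a new sample. The calibration values, the choice of hue as the only feature, and the use of numpy's polyfit are assumptions for illustration; they are not the authors' pipeline.

        import colorsys
        import numpy as np

        # Hypothetical calibration set: mean linear RGB (0-1) of each sample image and known water content (%).
        rgb_samples = np.array([[0.20, 0.55, 0.80],
                                [0.25, 0.60, 0.70],
                                [0.35, 0.65, 0.55],
                                [0.45, 0.70, 0.40],
                                [0.55, 0.72, 0.30]])
        water_content = np.array([0.0, 7.5, 15.0, 22.5, 30.0])

        # Colour feature: hue from the HSV colour space, one of the candidate spaces in the study.
        hue = np.array([colorsys.rgb_to_hsv(*rgb)[0] for rgb in rgb_samples])

        # Second-order (quadratic) polynomial regression model.
        coeffs = np.polyfit(hue, water_content, deg=2)

        # Predict the water content of a new sample from its mean RGB value.
        new_hue = colorsys.rgb_to_hsv(0.30, 0.62, 0.62)[0]
        print(np.polyval(coeffs, new_hue))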

  17. Gate Simulation of a Gamma Camera

    International Nuclear Information System (INIS)

    Abidi, Sana; Mlaouhi, Zohra

    2008-01-01

    Medical imaging is a very important diagnostic tool because it allows exploration of the interior of the human body. Nuclear imaging is an imaging technique used in nuclear medicine: it determines the distribution of a radiotracer in the body by detecting the radiation it emits with a detection device. Two methods are commonly used: Single Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET). In this work we are interested in the modelling of a gamma camera. The simulation is based on the Monte Carlo method, in particular the GATE simulator (Geant4 Application for Tomographic Emission). We have simulated a clinical gamma camera called GAEDE (GKS-1) and validated these simulations against experiments. The purpose of this work is to assess the performance of this gamma camera, optimize the detector performance and improve image quality. (Author)

  18. Presence capture cameras - a new challenge to the image quality

    Science.gov (United States)

    Peltoketo, Veli-Tapani

    2016-04-01

    Commercial presence capture cameras are coming to the market and a new era of visual entertainment is starting to take shape. Since true presence capture is still a very new technology, the actual technical solutions have only just passed the prototyping phase and vary a lot. Presence capture cameras still have the same quality issues to tackle as previous generations of digital imaging, but also numerous new ones. This work concentrates on the quality challenges of presence capture cameras. A camera system which can record 3D audio-visual reality as it is has to have several camera modules, several microphones and, in particular, technology which can synchronize the output of several sources into a seamless and smooth virtual reality experience. Several traditional quality features are still valid in presence capture cameras. Features like color fidelity, noise removal, resolution and dynamic range form the basis of virtual reality stream quality. However, the co-operation of several cameras brings a new dimension to these quality factors, and new quality features can also be validated. For example, how should the camera streams be stitched together into a 3D experience without noticeable errors, and how should the stitching be validated? This work describes the quality factors which are still valid in presence capture cameras and defines their importance. Moreover, the new challenges of presence capture cameras are investigated from an image and video quality point of view. The work contains considerations of how well current measurement methods can be used with presence capture cameras.

  19. A method for evaluating image quality of monochrome and color displays based on luminance by use of a commercially available color digital camera

    Energy Technology Data Exchange (ETDEWEB)

    Tokurei, Shogo, E-mail: shogo.tokurei@gmail.com, E-mail: junjim@med.kyushu-u.ac.jp [Department of Health Sciences, Graduate School of Medical Sciences, Kyushu University, 3-1-1 Maidashi, Higashi-ku, Fukuoka, Fukuoka 812-8582, Japan and Department of Radiology, Yamaguchi University Hospital, 1-1-1 Minamikogushi, Ube, Yamaguchi 755-8505 (Japan); Morishita, Junji, E-mail: shogo.tokurei@gmail.com, E-mail: junjim@med.kyushu-u.ac.jp [Department of Health Sciences, Faculty of Medical Sciences, Kyushu University, 3-1-1 Maidashi, Higashi-ku, Fukuoka, Fukuoka 812-8582 (Japan)

    2015-08-15

    Purpose: The aim of this study is to propose a method for the quantitative evaluation of image quality of both monochrome and color liquid-crystal displays (LCDs) using a commercially available color digital camera. Methods: The intensities of the unprocessed red (R), green (G), and blue (B) signals of a camera vary depending on the spectral sensitivity of the image sensor used in the camera. For consistent evaluation of image quality for both monochrome and color LCDs, the unprocessed RGB signals of the camera were converted into gray scale signals that corresponded to the luminance of the LCD. Gray scale signals for the monochrome LCD were evaluated by using only the green channel signals of the camera. For the color LCD, the RGB signals of the camera were converted into gray scale signals by employing weighting factors (WFs) for each RGB channel. A line image displayed on the color LCD was simulated on the monochrome LCD by using a software application for subpixel driving in order to verify the WF-based conversion method. Furthermore, the results obtained by different types of commercially available color cameras and a photometric camera were compared to examine the consistency of the authors’ method. Finally, image quality for both the monochrome and color LCDs was assessed by measuring modulation transfer functions (MTFs) and Wiener spectra (WS). Results: The authors’ results demonstrated that the proposed method for calibrating the spectral sensitivity of the camera resulted in a consistent and reliable evaluation of the luminance of monochrome and color LCDs. The MTFs and WS showed different characteristics for the two LCD types owing to difference in the subpixel structure. The MTF in the vertical direction of the color LCD was superior to that of the monochrome LCD, although the WS in the vertical direction of the color LCD was inferior to that of the monochrome LCD as a result of luminance fluctuations in RGB subpixels. Conclusions: The authors
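
    The channel-weighting idea can be sketched as follows: each pixel's linearised RGB triple is reduced to a single grey-scale value by a weighted sum, and the monochrome-LCD case uses only the green channel. The Rec. 709 luma coefficients used here are placeholders; the study derived its own weighting factors from the camera's calibrated spectral sensitivity.

        import numpy as np

        def to_grayscale(rgb_linear, weights=(0.2126, 0.7152, 0.0722)):
            """Weighted sum of linear RGB channels approximating display luminance.

            rgb_linear: array of shape (H, W, 3) of linearised camera signals.
            weights: per-channel weighting factors (Rec. 709 values as placeholders).
            """
            return rgb_linear @ np.asarray(weights, dtype=float)

        frame = np.random.rand(512, 512, 3)                    # stand-in for a captured frame
        gray_color_lcd = to_grayscale(frame)                   # colour LCD: weighted sum of R, G, B
        gray_mono_lcd = to_grayscale(frame, (0.0, 1.0, 0.0))   # monochrome LCD: green channel only
        print(gray_color_lcd.shape, gray_mono_lcd.mean())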

  20. Applications of a shadow camera system for energy meteorology

    Science.gov (United States)

    Kuhn, Pascal; Wilbert, Stefan; Prahl, Christoph; Garsche, Dominik; Schüler, David; Haase, Thomas; Ramirez, Lourdes; Zarzalejo, Luis; Meyer, Angela; Blanc, Philippe; Pitz-Paal, Robert

    2018-02-01

    Downward-facing shadow cameras might play a major role in future energy meteorology. Shadow cameras directly image shadows on the ground from an elevated position. They are used to validate other systems (e.g. all-sky imager based nowcasting systems, cloud speed sensors or satellite forecasts) and can potentially provide short term forecasts for solar power plants. Such forecasts are needed for electricity grids with high penetrations of renewable energy and can help to optimize plant operations. In this publication, two key applications of shadow cameras are briefly presented.

  1. Multiple-camera tracking: UK government requirements

    Science.gov (United States)

    Hosmer, Paul

    2007-10-01

    The Imagery Library for Intelligent Detection Systems (i-LIDS) is the UK government's new standard for Video Based Detection Systems (VBDS). The standard was launched in November 2006 and evaluations against it began in July 2007. With the first four i-LIDS scenarios completed, the Home Office Scientific Development Branch (HOSDB) is looking toward the future of intelligent vision in the security surveillance market by adding a fifth scenario to the standard. The fifth i-LIDS scenario will concentrate on the development, testing and evaluation of systems for the tracking of people across multiple cameras. HOSDB and the Centre for the Protection of National Infrastructure (CPNI) identified a requirement to track targets across a network of CCTV cameras using both live and post-event imagery. The Detection and Vision Systems group at HOSDB was asked to determine the current state of the market and develop an in-depth Operational Requirement (OR) based on government end-user requirements. Using this OR, the i-LIDS team will develop a full i-LIDS scenario to aid the machine vision community in its development of multi-camera tracking systems. By defining a requirement for multi-camera tracking and building this into the i-LIDS standard, the UK government will provide a widely available tool that developers can use to help them turn theory and conceptual demonstrators into front-line applications. This paper will briefly describe the i-LIDS project and then detail the work conducted in building the new tracking aspect of the standard.

  2. Developments of optical fast-gated imaging systems

    International Nuclear Information System (INIS)

    Koehler, H.A.; Kotecki, D.

    1984-08-01

    Several fast-gated imaging systems to measure ultra-fast single-transient data have been developed for time-resolved imaging of pulsed radiation sources. These systems were designed to achieve image recording times of 1 to 3 ms and dynamic ranges of >200:1 to produce large two-dimensional images (greater than or equal to 10⁴ spatial points) of 1 to 2 ns exposure and small two-dimensional images (less than or equal to 200 spatial points) of less than or equal to 0.5 ns exposure. Both MCP-intensified solid-state two-dimensional framing cameras and streak camera/solid-state camera systems were used; the framing camera system provides snapshots with high spatial resolution whereas the streak camera system provides for limited spatial points, each with high temporal resolution. Applications of these systems include electron-beam, x-ray, gamma-ray, and neutron diagnostics. This report reviews the characteristics of the major components of fast-gated imaging systems developed at Lawrence Livermore National Laboratory. System performances are described in view of major experiments, and the diagnostic requirements of new experiments in atomic physics (x-ray lasers) and nuclear physics (fusion) are indicated

  3. Demonstration of the CDMA-mode CAOS smart camera.

    Science.gov (United States)

    Riza, Nabeel A; Mazhar, Mohsin A

    2017-12-11

    Demonstrated is the code division multiple access (CDMA)-mode coded access optical sensor (CAOS) smart camera suited for bright target scenarios. Deploying a silicon CMOS sensor and a silicon point detector within a digital micro-mirror device (DMD)-based spatially isolating hybrid camera design, this smart imager first engages the DMD staring mode with a controlled factor-of-200 optical attenuation of the scene irradiance to provide a classic unsaturated CMOS sensor-based image for target intelligence gathering. Next, the image data provided by this CMOS sensor are used to acquire a more robust, un-attenuated true target image of the focused zone using the time-modulated CDMA mode of the CAOS camera. Using four different bright-light test target scenes, successfully demonstrated is a proof-of-concept visible-band CAOS smart camera operating in the CDMA mode using Walsh-design CAOS pixel codes of up to 4096 bits in length with a maximum 10 kHz code bit rate, giving a 0.4096 s CAOS frame acquisition time. A 16-bit analog-to-digital converter (ADC) with time-domain correlation digital signal processing (DSP) generates the CDMA-mode images with a 3600 CAOS pixel count and a best spatial resolution of one square micro-mirror pixel of 13.68 μm side. The CDMA mode of the CAOS smart camera is suited for applications where robust high dynamic range (DR) imaging is needed for un-attenuated, un-spoiled, bright-light, spectrally diverse targets.
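
    The encode/decode principle can be illustrated with a small simulation: each CAOS pixel is assigned a row of a Walsh-Hadamard matrix, the point detector records the sum of all time-modulated pixel contributions, and time-domain correlation with each code recovers that pixel's irradiance. The bipolar (+1/-1) codes, pixel count and noise level below are illustrative assumptions rather than the camera's actual parameters.

        import numpy as np
        from scipy.linalg import hadamard

        n_pixels = 64                          # number of CAOS pixels encoded simultaneously
        code_len = 64                          # Walsh code length in bits
        codes = hadamard(code_len)[:n_pixels]  # +1/-1 Walsh-Hadamard codes, one row per pixel

        # Hypothetical pixel irradiances to be recovered.
        irradiance = np.random.rand(n_pixels)

        # The point detector sees the sum of all time-modulated pixel contributions plus noise.
        detector_signal = codes.T @ irradiance + 0.01 * np.random.randn(code_len)

        # Time-domain correlation (matched filtering) decodes each pixel independently.
        decoded = codes @ detector_signal / code_len
        print(np.max(np.abs(decoded - irradiance)))   # small residual due to noise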

  4. Analysis of Camera Arrays Applicable to the Internet of Things.

    Science.gov (United States)

    Yang, Jiachen; Xu, Ru; Lv, Zhihan; Song, Houbing

    2016-03-22

    The Internet of Things is built on various sensors and networks. Sensors for stereo capture are essential for acquiring information and have been applied in different fields. In this paper, we focus on camera modeling and analysis, which is very important for stereo display and helps with viewing. We model two kinds of cameras, a parallel and a converged one, and analyze the difference between them in vertical and horizontal parallax. Even though different kinds of camera arrays are used in various applications and analyzed in the research literature, there has been little discussion comparing them. Therefore, we make a detailed analysis of their performance over different shooting distances. From our analysis, we find that the threshold of shooting distance for converged cameras is 7 m. In addition, we design a camera array in our work that can be used both as a parallel camera array and as a converged camera array, and we take some images and videos with it to identify the threshold.

  5. Visual fatigue modeling for stereoscopic video shot based on camera motion

    Science.gov (United States)

    Shi, Guozhong; Sang, Xinzhu; Yu, Xunbo; Liu, Yangdong; Liu, Jing

    2014-11-01

    As three-dimensional television (3-DTV) and 3-D movies become popular, visual discomfort limits further applications of 3D display technology. The causes of visual discomfort from stereoscopic video include conflicts between accommodation and convergence, excessive binocular parallax, fast motion of objects, and so on. Here, a novel method for evaluating visual fatigue is demonstrated. Influence factors including spatial structure, motion scale and comfortable zone are analyzed. According to the human visual system (HVS), people only need to converge their eyes on specific objects for static cameras and backgrounds; relative motion should be considered for different camera conditions, which determine different factor coefficients and weights. Compared with the traditional visual fatigue prediction model, a novel visual fatigue prediction model is presented. The visual fatigue degree is predicted using a multiple linear regression method combined with subjective evaluation. Consequently, each factor can reflect the characteristics of the scene, and the total visual fatigue score can be computed according to the proposed algorithm. Compared with conventional algorithms, which ignore the status of the camera, our approach exhibits reliable performance in terms of correlation with subjective test results.
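
    The regression step can be sketched generically: per-shot factor scores (for example spatial structure, motion scale and comfort-zone occupancy) are fitted against subjective fatigue ratings by ordinary least squares, and the fitted weights then predict a fatigue score for a new shot. The factor values and ratings below are invented; the actual factors and coefficients are defined in the paper.

        import numpy as np

        # Hypothetical per-shot factor scores: [spatial structure, motion scale, comfort-zone occupancy].
        factors = np.array([[0.2, 0.1, 0.9],
                            [0.5, 0.4, 0.7],
                            [0.8, 0.7, 0.4],
                            [0.9, 0.9, 0.2],
                            [0.3, 0.6, 0.6]])
        subjective_fatigue = np.array([1.2, 2.0, 3.4, 4.1, 2.5])   # mean opinion scores

        # Multiple linear regression (ordinary least squares with an intercept term).
        A = np.hstack([factors, np.ones((factors.shape[0], 1))])
        coeffs, *_ = np.linalg.lstsq(A, subjective_fatigue, rcond=None)

        # Predict the fatigue score of a new stereoscopic shot from its factor scores.
        new_shot = np.array([0.6, 0.5, 0.5, 1.0])
        print(coeffs, new_shot @ coeffs)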

  6. Camera-based microswitch technology to monitor mouth, eyebrow, and eyelid responses of children with profound multiple disabilities

    NARCIS (Netherlands)

    Lancioni, G.E.; Bellini, D.; Oliva, D.; Singh, N.N.; O'Reilly, M.F.; Sigafoos, J.; Lang, R.B.; Didden, H.C.M.

    2011-01-01

    A camera-based microswitch technology was recently used to successfully monitor small eyelid and mouth responses of two adults with profound multiple disabilities (Lancioni et al., Res Dev Disab 31:1509-1514, 2010a). This technology, in contrast with the traditional optic microswitches used for

  7. Adaptation Computing Parameters of Pan-Tilt-Zoom Cameras for Traffic Monitoring

    Directory of Open Access Journals (Sweden)

    Ya Lin WU

    2014-01-01

    Full Text Available Closed-circuit television (CCTV) cameras have been widely used in recent years for traffic monitoring and surveillance applications. CCTV cameras can be used to extract real-time traffic parameters automatically by means of image processing and tracking technologies. In particular, pan-tilt-zoom (PTZ) cameras provide flexible view selection as well as a wider observation range, which allows traffic parameters to be calculated accurately. Calibration of the PTZ camera parameters therefore plays an important role in vision-based traffic applications. However, in the specific traffic scenario considered here, locating the license plate of an illegally parked vehicle, the parameters of the PTZ camera have to be updated according to the position and distance of the illegally parked vehicle. The proposed traffic monitoring system uses an ordinary webcam and a PTZ camera. The vanishing point of the traffic lane lines is obtained in the pixel coordinate system from the fixed webcam. The parameters of the PTZ camera are initialized from the monitored distance, the specific objectives and the vanishing point. The pixel coordinates of the illegally parked car are then used to update the PTZ camera parameters, obtain the real-world coordinates of the illegally parked car, and compute its distance. The results show that the error between the measured distance and the real distance is only 0.2064 meter.
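
    The vanishing-point step can be illustrated with a minimal sketch that intersects lane lines, written as a*x + b*y + c = 0, in a least-squares sense. The line coefficients below are made-up examples; a real system would fit them from lane edge pixels detected in the fixed webcam image.

        import numpy as np

        def vanishing_point(lines):
            """Least-squares intersection of image lines given as (a, b, c) with a*x + b*y + c = 0."""
            A = np.array([[a, b] for a, b, _ in lines], dtype=float)
            d = np.array([-c for _, _, c in lines], dtype=float)
            vp, *_ = np.linalg.lstsq(A, d, rcond=None)
            return vp   # (x, y) in pixel coordinates

        # Two hypothetical lane lines fitted from the traffic lane markings.
        lane_lines = [(0.35, -1.0, 420.0),    # y = 0.35*x + 420
                      (-0.28, -1.0, 760.0)]   # y = -0.28*x + 760
        print(vanishing_point(lane_lines))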

  8. Development of camera technology for monitoring nests. Chapter 15

    Science.gov (United States)

    W. Andrew Cox; M. Shane Pruett; Thomas J. Benson; Scott J. Chiavacci; Frank R., III Thompson

    2012-01-01

    Photo and video technology has become increasingly useful in the study of avian nesting ecology. However, researchers interested in using camera systems are often faced with insufficient information on the types and relative advantages of available technologies. We reviewed the literature for studies of nests that used cameras and summarized them based on study...

  9. CCD-camera-based diffuse optical tomography to study ischemic stroke in preclinical rat models

    Science.gov (United States)

    Lin, Zi-Jing; Niu, Haijing; Liu, Yueming; Su, Jianzhong; Liu, Hanli

    2011-02-01

    Stroke, due to ischemia or hemorrhage, is a neurological deficit of the cerebral vasculature and is the third leading cause of death in the United States. More than 80 percent of strokes are ischemic, caused by blockage of an artery in the brain through thrombosis or arterial embolism. Hence, development of an imaging technique to image or monitor cerebral ischemia and the effect of anti-stroke therapy is more than necessary. The near-infrared (NIR) optical tomographic technique has great potential to be utilized as a non-invasive imaging tool (due to its low cost and portability) to image embedded abnormal tissue, such as a dysfunctional area caused by ischemia. Moreover, NIR tomographic techniques have been successfully demonstrated in studies of cerebro-vascular hemodynamics and brain injury. Compared to a fiber-based diffuse optical tomographic system, a CCD-camera-based system is more suitable for pre-clinical animal studies due to its simpler setup and lower cost. In this study, we have utilized the CCD-camera-based technique to image embedded inclusions based on tissue-phantom experimental data. We are then able to obtain good reconstructed images with two recently developed algorithms: (1) the depth compensation algorithm (DCA) and (2) the globally convergent method (GCM). In this study, we will demonstrate the volumetric tomographic reconstruction results obtained from the tissue phantom; the latter has great potential for determining and monitoring the effect of anti-stroke therapies.

  10. First experience with THE AUTOLAP™ SYSTEM: an image-based robotic camera steering device.

    Science.gov (United States)

    Wijsman, Paul J M; Broeders, Ivo A M J; Brenkman, Hylke J; Szold, Amir; Forgione, Antonello; Schreuder, Henk W R; Consten, Esther C J; Draaisma, Werner A; Verheijen, Paul M; Ruurda, Jelle P; Kaufman, Yuval

    2018-05-01

    Robotic camera holders for endoscopic surgery have been available for 20 years but market penetration is low. The current camera holders are controlled by voice, joystick, eyeball tracking, or head movements, and this type of steering has proven to be successful, but excessive disturbance of the surgical workflow has blocked widespread introduction. The AutoLap™ system (MST, Israel) uses a radically different steering concept based on image analysis. This may improve acceptance through smooth, interactive, and fast steering. These two studies were conducted to prove safe and efficient performance of the core technology. A total of 66 laparoscopic procedures of various types were performed with the AutoLap™ by nine experienced surgeons in two multi-center studies: 41 cholecystectomies, 13 fundoplications including hiatal hernia repair, 4 endometriosis surgeries, 2 inguinal hernia repairs, and 6 (bilateral) salpingo-oophorectomies. The use of the AutoLap™ system was evaluated in terms of safety, image stability, setup and procedural time, accuracy of image-based movements, and user satisfaction. Surgical procedures were completed with the AutoLap™ system in 64 cases (97%). The mean overall setup time of the AutoLap™ system was 4 min (04:08 ± 0.10). Procedure times were not prolonged by the use of the system when compared to the literature average. The reported user satisfaction was 3.85 and 3.96 on a scale of 1 to 5 in the two studies. More than 90% of the image-based movements were accurate. No system-related adverse events were recorded while using the system. Safe and efficient use of the core technology of the AutoLap™ system was demonstrated with high image stability and good surgeon satisfaction. The results support further clinical studies that will focus on usability, improved ergonomics and additional image-based features.

  11. Optimization of gamma-ray cameras of Anger type

    International Nuclear Information System (INIS)

    Jatteau, Michel; Lelong, Pierre; Normand, Gerard; Ott, Jean; Pauvert, Joseph; Pergrale, Jean

    1979-01-01

    Most radionuclide imaging equipment used for diagnosis in nuclear medicine includes a scintillation camera of the Anger type. Following a period of camera improvements driven purely by technological advances, further improvement of the camera can nowadays only result from more thorough studies based on numerical approaches and computer simulations. Two important contributions to an optimization study of Anger gamma-ray cameras are presented: the first is related to the collection of light by the photomultiplier tubes, one of the processes which largely determines the performance parameters; the second concerns the computation of the intrinsic geometrical and spectral resolutions, which are two of the main characteristics affecting image quality. The validity of the computer simulation is shown by comparison between theoretical and experimental results, before the simulation programmes are used to study the influence of various parameters. [fr]

  12. Software development and its description for Geoid determination based on Spherical-Cap-Harmonics Modelling using digital-zenith camera and gravimetric measurements hybrid data

    Science.gov (United States)

    Morozova, K.; Jaeger, R.; Balodis, J.; Kaminskis, J.

    2017-10-01

    Over several years the Institute of Geodesy and Geoinformatics (GGI) has been engaged in the design and development of a digital zenith camera. The camera development is now finished and tests with field measurements have been done. In order to check these data and to use them for geoid model determination, the DFHRS (Digital Finite-element Height Reference Surface (HRS)) v4.3 software is used. It is based on parametric modelling of the HRS as a continuous polynomial surface. The HRS, providing the local geoid height N, is a necessary geodetic infrastructure for a GNSS-based determination of physical heights H from ellipsoidal GNSS heights h, by H = h - N. This research and publication deal with the inclusion of the observed vertical deflections from the digital zenith camera into the mathematical model of the DFHRS approach and software v4.3. A first target was to test and validate the mathematical model and software using real data from the above-mentioned zenith camera observations of deflections of the vertical. A second concern of the research was to analyze the results and the improvement of the Latvian quasi-geoid computation compared to the previous version of the HRS computed without zenith-camera-based deflections of the vertical. The further development of the mathematical model and software concerns the use of spherical cap harmonics as the carrier function for DFHRS v5. This enables, in the sense of the strict integrated geodesy approach, which also holds for geodetic network adjustment, both a full gravity field determination and a geoid and quasi-geoid determination. In addition, it allows the inclusion of gravimetric measurements, together with deflections of the vertical from digital zenith cameras, and all other types of observations. The theoretical description of the updated version of the DFHRS software and methods is discussed in this publication.
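
    The height relation quoted above, H = h - N, can be sketched numerically: given a GNSS ellipsoidal height h and a gridded quasi-geoid model, N is interpolated at the point and subtracted. The grid values, grid spacing and test coordinates below are made up for illustration and are not taken from the Latvian model.

        import numpy as np

        def geoid_height(lat, lon, lat0, lon0, dlat, dlon, grid):
            """Bilinear interpolation of the geoid undulation N (m) from a regular grid."""
            i = (lat - lat0) / dlat
            j = (lon - lon0) / dlon
            i0, j0 = int(np.floor(i)), int(np.floor(j))
            fi, fj = i - i0, j - j0
            return ((1 - fi) * (1 - fj) * grid[i0, j0] + (1 - fi) * fj * grid[i0, j0 + 1]
                    + fi * (1 - fj) * grid[i0 + 1, j0] + fi * fj * grid[i0 + 1, j0 + 1])

        # Hypothetical quasi-geoid heights on a 1-degree grid (values are illustrative only).
        grid = np.array([[20.1, 20.4, 20.8],
                         [20.3, 20.6, 21.0],
                         [20.5, 20.9, 21.3]])
        N = geoid_height(lat=56.95, lon=24.10, lat0=56.0, lon0=23.0, dlat=1.0, dlon=1.0, grid=grid)

        h_gnss = 36.42          # ellipsoidal height from GNSS (m)
        H = h_gnss - N          # physical height via H = h - N
        print(round(N, 3), round(H, 3))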

  13. Temperature measurement with industrial color camera devices

    Science.gov (United States)

    Schmidradler, Dieter J.; Berndorfer, Thomas; van Dyck, Walter; Pretschuh, Juergen

    1999-05-01

    This paper discusses color-camera-based temperature measurement. Usually, visual imaging and infrared image sensing are treated as two separate disciplines. We will show that a well-selected color camera device can be a cheaper, more robust and more sophisticated solution for optical temperature measurement in several cases. Herein, only implementation fragments and important restrictions for the sensing element will be discussed. Our aim is to draw the reader's attention to the use of visual image sensors for measuring thermal radiation and temperature, and to give reasons for the need for improved technologies in infrared camera devices. With AVL-List, our industrial partner, we successfully used the proposed sensor to perform temperature measurements of flames inside the combustion chamber of diesel engines, which finally led to the presented insights.

  14. Decision Support System to Choose Digital Single Lens Camera with Simple Additive Weighting Method

    Directory of Open Access Journals (Sweden)

    Tri Pina Putri

    2016-11-01

    Full Text Available One of the technologies that is evolving today is the Digital Single Lens Reflex (DSLR) camera. The number of products makes it difficult for users to choose the appropriate camera based on their criteria. Users may utilize several aids to help them choose the intended camera, such as magazines, the internet, and other media. This paper discusses a web-based decision support system for choosing cameras by using the SAW (Simple Additive Weighting) method in order to make the decision process more effective and efficient. This system is expected to give recommendations about the camera which is appropriate to the user's needs and criteria based on the cost, the resolution, the features, the ISO, and the sensor. The system was implemented using PHP and MySQL. Based on the results of a questionnaire distributed to 20 respondents, 60% of respondents agree that this decision support system can help users to choose the appropriate DSLR camera in accordance with the user's needs, 60% of respondents agree that this decision support system makes choosing a DSLR camera more effective, and 75% of respondents agree that this system is more efficient. In addition, 60.55% of respondents agree that this system has met the 5 Es Usability Framework.
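
    The SAW method itself is compact enough to sketch: each criterion column is normalised (benefit criteria against the column maximum, cost criteria against the column minimum), the normalised values are multiplied by the criterion weights, and the weighted sums rank the alternatives. The camera values and weights below are invented for illustration and do not reproduce the paper's data.

        import numpy as np

        # Criteria: cost, resolution (MP), feature score, max ISO, sensor score.
        # Rows are candidate DSLR cameras (illustrative values only).
        X = np.array([[7.5e6, 24.2, 7, 25600, 8],
                      [9.9e6, 26.2, 8, 40000, 9],
                      [6.2e6, 24.1, 6, 25600, 7]], dtype=float)
        weights = np.array([0.30, 0.25, 0.15, 0.15, 0.15])
        is_cost = np.array([True, False, False, False, False])   # cost criteria are minimised

        # SAW normalisation: benefit -> x / max(column), cost -> min(column) / x.
        R = np.where(is_cost, X.min(axis=0) / X, X / X.max(axis=0))

        scores = R @ weights
        print(scores.round(3), "best camera index:", int(np.argmax(scores)))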

  15. VUV testing of science cameras at MSFC: QE measurement of the CLASP flight cameras

    Science.gov (United States)

    Champey, P.; Kobayashi, K.; Winebarger, A.; Cirtain, J.; Hyde, D.; Robertson, B.; Beabout, B.; Beabout, D.; Stewart, M.

    2015-08-01

    The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras were built and tested for the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint MSFC, National Astronomical Observatory of Japan (NAOJ), Instituto de Astrofisica de Canarias (IAC) and Institut D'Astrophysique Spatiale (IAS) sounding rocket mission. The CLASP camera design includes a frame-transfer e2v CCD57-10 512 × 512 detector, dual channel analog readout and an internally mounted cold block. At the flight CCD temperature of -20C, the CLASP cameras exceeded the low-noise performance requirements (UV, EUV and X-ray science cameras at MSFC.

  16. Automatic inference of geometric camera parameters and intercamera topology in uncalibrated disjoint surveillance cameras

    NARCIS (Netherlands)

    Hollander, R.J.M. den; Bouma, H.; Baan, J.; Eendebak, P.T.; Rest, J.H.C. van

    2015-01-01

    Person tracking across non-overlapping cameras and other types of video analytics benefit from spatial calibration information that allows an estimation of the distance between cameras and a relation between pixel coordinates and world coordinates within a camera. In a large environment with many

  17. Theory and applications of smart cameras

    CERN Document Server

    2016-01-01

    This book presents an overview of smart camera systems, considering practical applications but also reviewing fundamental aspects of the underlying technology. It introduces in a tutorial style the principles of sensing and signal processing, and also describes topics such as wireless connection to the Internet of Things (IoT), which is expected to be the biggest market for smart cameras. It is an excellent guide to the fundamentals of smart camera technology, and the chapters complement each other well as the authors have worked as a team under the auspices of GFP (Global Frontier Project), the largest-scale funded research in Korea. This is the third of three books based on the Integrated Smart Sensors research project, which describe the development of innovative devices, circuits, and system-level enabling technologies. The aim of the project was to develop common platforms on which various devices and sensors can be loaded, and to create systems offering significant improvements in information processi...

  18. Nuclear test experimental science

    International Nuclear Information System (INIS)

    Struble, G.L.; Middleton, C.; Bucciarelli, G.; Carter, J.; Cherniak, J.; Donohue, M.L.; Kirvel, R.D.; MacGregor, P.; Reid, S.

    1989-01-01

    This report discusses research being conducted at Lawrence Livermore Laboratory under the following topics: prompt diagnostics; experimental modeling, design, and analysis; detector development; streak-camera data systems; weapons supporting research

  19. Nuclear test experimental science

    Energy Technology Data Exchange (ETDEWEB)

    Struble, G.L.; Middleton, C.; Bucciarelli, G.; Carter, J.; Cherniak, J.; Donohue, M.L.; Kirvel, R.D.; MacGregor, P.; Reid, S. (eds.)

    1989-01-01

    This report discusses research being conducted at Lawrence Livermore Laboratory under the following topics: prompt diagnostics; experimental modeling, design, and analysis; detector development; streak-camera data systems; weapons supporting research.

  20. Evaluation of mobile phone camera benchmarking using objective camera speed and image quality metrics

    Science.gov (United States)

    Peltoketo, Veli-Tapani

    2014-11-01

    When a mobile phone camera is tested and benchmarked, the significance of image quality metrics is widely acknowledged. There are also existing methods to evaluate the camera speed. However, the speed or rapidity metrics of the mobile phone's camera system have not been combined with the quality metrics, even though camera speed has become an increasingly important camera performance feature. There are several tasks in this work. First, the most important image quality and speed-related metrics of a mobile phone's camera system are collected from standards and papers, and novel speed metrics are also identified. Second, combinations of the quality and speed metrics are validated using mobile phones on the market. The measurements are made against the application programming interfaces of different operating systems. Finally, the results are evaluated and conclusions are drawn. The paper defines a solution for combining different image quality and speed metrics into a single benchmarking score. A proposal for the combined benchmarking metric is evaluated using measurements of 25 mobile phone cameras on the market. The paper is a continuation of previous benchmarking work, expanded with visual noise measurement and updates for the latest mobile phone versions.
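
    One simple way to fold quality and speed metrics into a single score, shown here purely as an illustration and not as the paper's actual formula, is to min-max normalise each metric across the tested devices, invert the time-based metrics so that faster is better, and take a weighted sum.

        import numpy as np

        # Hypothetical measurements for four phones:
        # columns = [MTF50 (higher better), visual noise (lower better),
        #            shot-to-shot time in s (lower better), autofocus time in s (lower better)].
        metrics = np.array([[0.42, 1.8, 0.55, 0.30],
                            [0.38, 2.4, 0.80, 0.45],
                            [0.50, 1.5, 0.40, 0.25],
                            [0.35, 2.9, 1.10, 0.60]])
        higher_is_better = np.array([True, False, False, False])
        weights = np.array([0.4, 0.2, 0.2, 0.2])

        # Min-max normalisation per metric; "lower is better" columns are inverted.
        mn, mx = metrics.min(axis=0), metrics.max(axis=0)
        norm = (metrics - mn) / (mx - mn)
        norm = np.where(higher_is_better, norm, 1.0 - norm)

        benchmark_score = norm @ weights
        print(benchmark_score.round(3))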

  1. Application of single-image camera calibration for ultrasound augmented laparoscopic visualization.

    Science.gov (United States)

    Liu, Xinyang; Su, He; Kang, Sukryool; Kane, Timothy D; Shekhar, Raj

    2015-03-01

    Accurate calibration of laparoscopic cameras is essential for enabling many surgical visualization and navigation technologies such as the ultrasound-augmented visualization system that we have developed for laparoscopic surgery. In addition to accuracy and robustness, there is a practical need for a fast and easy camera calibration method that can be performed on demand in the operating room (OR). Conventional camera calibration methods are not suitable for the OR use because they are lengthy and tedious. They require acquisition of multiple images of a target pattern in its entirety to produce satisfactory result. In this work, we evaluated the performance of a single-image camera calibration tool (rdCalib; Percieve3D, Coimbra, Portugal) featuring automatic detection of corner points in the image, whether partial or complete, of a custom target pattern. Intrinsic camera parameters of a 5-mm and a 10-mm standard Stryker® laparoscopes obtained using rdCalib and the well-accepted OpenCV camera calibration method were compared. Target registration error (TRE) as a measure of camera calibration accuracy for our optical tracking-based AR system was also compared between the two calibration methods. Based on our experiments, the single-image camera calibration yields consistent and accurate results (mean TRE = 1.18 ± 0.35 mm for the 5-mm scope and mean TRE = 1.13 ± 0.32 mm for the 10-mm scope), which are comparable to the results obtained using the OpenCV method with 30 images. The new single-image camera calibration method is promising to be applied to our augmented reality visualization system for laparoscopic surgery.

  2. Application of single-image camera calibration for ultrasound augmented laparoscopic visualization

    Science.gov (United States)

    Liu, Xinyang; Su, He; Kang, Sukryool; Kane, Timothy D.; Shekhar, Raj

    2015-03-01

    Accurate calibration of laparoscopic cameras is essential for enabling many surgical visualization and navigation technologies such as the ultrasound-augmented visualization system that we have developed for laparoscopic surgery. In addition to accuracy and robustness, there is a practical need for a fast and easy camera calibration method that can be performed on demand in the operating room (OR). Conventional camera calibration methods are not suitable for the OR use because they are lengthy and tedious. They require acquisition of multiple images of a target pattern in its entirety to produce satisfactory result. In this work, we evaluated the performance of a single-image camera calibration tool (rdCalib; Percieve3D, Coimbra, Portugal) featuring automatic detection of corner points in the image, whether partial or complete, of a custom target pattern. Intrinsic camera parameters of a 5-mm and a 10-mm standard Stryker® laparoscopes obtained using rdCalib and the well-accepted OpenCV camera calibration method were compared. Target registration error (TRE) as a measure of camera calibration accuracy for our optical tracking-based AR system was also compared between the two calibration methods. Based on our experiments, the single-image camera calibration yields consistent and accurate results (mean TRE = 1.18 ± 0.35 mm for the 5-mm scope and mean TRE = 1.13 ± 0.32 mm for the 10-mm scope), which are comparable to the results obtained using the OpenCV method with 30 images. The new single-image camera calibration method is promising to be applied to our augmented reality visualization system for laparoscopic surgery.
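
    For reference, the conventional multi-image calibration that both records above compare against can be sketched with the OpenCV API: checkerboard corners are detected in each view, refined to sub-pixel accuracy, and passed to cv2.calibrateCamera to estimate the intrinsic matrix and distortion coefficients. The checkerboard geometry and image directory below are placeholders; this is a generic sketch, not the calibration protocol used in the study.

        import glob
        import cv2
        import numpy as np

        pattern_size = (9, 6)      # inner corners per row and column (assumed)
        square_size = 5.0          # checker square size in mm (assumed)

        objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
        objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2) * square_size

        obj_points, img_points, image_size = [], [], None
        for path in glob.glob("calib_images/*.png"):          # placeholder image location
            gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
            found, corners = cv2.findChessboardCorners(gray, pattern_size)
            if found:
                corners = cv2.cornerSubPix(
                    gray, corners, (11, 11), (-1, -1),
                    (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001))
                obj_points.append(objp)
                img_points.append(corners)
                image_size = gray.shape[::-1]

        assert obj_points, "no checkerboard views were found"
        rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_points, img_points, image_size, None, None)
        print("RMS reprojection error:", rms)
        print(K)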

  3. Comparison of the effectiveness of three retinal camera technologies for malarial retinopathy detection in Malawi

    Science.gov (United States)

    Soliz, Peter; Nemeth, Sheila C.; Barriga, E. Simon; Harding, Simon P.; Lewallen, Susan; Taylor, Terrie E.; MacCormick, Ian J.; Joshi, Vinayak S.

    2016-03-01

    The purpose of this study was to test the suitability of three available camera technologies (desktop, portable, and iPhone-based) for imaging comatose children who presented with clinical symptoms of malaria. Ultimately, the results of the project would form the basis for the design of a future camera to screen for malarial retinopathy (MR) in a resource-challenged environment. The desktop, portable, and iPhone-based cameras were represented by the Topcon, Pictor Plus, and Peek cameras, respectively. These cameras were tested on N=23 children presenting with symptoms of cerebral malaria (CM) at a malaria clinic, Queen Elizabeth Teaching Hospital in Malawi, Africa. Each patient was dilated for a binocular indirect ophthalmoscopy (BIO) exam by an ophthalmologist, followed by imaging with all three cameras. Each case was graded according to an internationally established protocol and compared to the BIO as the clinical ground truth. The reader used three principal retinal lesions as markers for MR: hemorrhages, retinal whitening, and vessel discoloration. The study found that the mid-priced Pictor Plus hand-held camera performed considerably better than the lower-priced mobile-phone-based camera, and slightly better than the higher-priced table-top camera. When comparing the readings of digital images against the clinical reference standard (BIO), the Pictor Plus camera had a sensitivity and specificity for MR of 100% and 87%, respectively. This compares to a sensitivity and specificity of 87% and 75% for the iPhone-based camera and 100% and 75% for the desktop camera. The drawback of all the cameras was their limited field of view, which did not allow a complete view of the periphery where vessel discoloration occurs most frequently. The consequence was that vessel discoloration was not addressed in this study. None of the cameras offered real-time image quality assessment to ensure high-quality images to afford the best possible opportunity for reading by a remotely located

  4. The ECM moves during primitive streak formation--computation of ECM versus cellular motion.

    Directory of Open Access Journals (Sweden)

    Evan A Zamir

    2008-10-01

    Full Text Available Galileo described the concept of motion relativity--motion with respect to a reference frame--in 1632. He noted that a person below deck would be unable to discern whether the boat was moving. Embryologists, while recognizing that embryonic tissues undergo large-scale deformations, have failed to account for relative motion when analyzing cell motility data. A century of scientific articles has advanced the concept that embryonic cells move ("migrate") in an autonomous fashion such that, as time progresses, the cells and their progeny assemble an embryo. In sharp contrast, the motion of the surrounding extracellular matrix scaffold has been largely ignored/overlooked. We developed computational/optical methods that measure the extent to which embryonic cells move relative to the extracellular matrix. Our time-lapse data show that epiblastic cells largely move in concert with a sub-epiblastic extracellular matrix during stages 2 and 3 in primitive streak quail embryos. In other words, there is little cellular motion relative to the extracellular matrix scaffold--both components move together as a tissue. The extracellular matrix displacements exhibit bilateral vortical motion, convergence to the midline, and extension along the presumptive vertebral axis--all patterns previously attributed solely to cellular "migration." Our time-resolved data pose new challenges for understanding how extracellular chemical (morphogen) gradients, widely hypothesized to guide cellular trajectories at early gastrulation stages, are maintained in this dynamic extracellular environment. We conclude that models describing primitive streak cellular guidance mechanisms must be able to account for sub-epiblastic extracellular matrix displacements.
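
    The core computation is simply a change of reference frame: subtract the local extracellular matrix (ECM) displacement from each cell displacement. The sketch below illustrates this with a nearest-fiducial approximation; all positions are invented for illustration, and the published method uses a full interpolated displacement field rather than this shortcut.

        # Cell motion relative to the ECM: subtract the local ECM displacement
        # (here approximated by the nearest fiducial) from each cell displacement.
        import numpy as np

        # Tracked positions at two time points, in microns (illustrative values).
        cells_t0 = np.array([[10.0,  5.0], [22.0, 14.0]])
        cells_t1 = np.array([[12.0,  6.5], [24.5, 15.0]])
        ecm_t0   = np.array([[ 5.0,  2.0], [30.0,  3.0], [28.0, 20.0], [ 6.0, 18.0]])
        ecm_t1   = np.array([[ 7.0,  3.5], [32.0,  4.0], [30.5, 21.0], [ 8.0, 19.5]])

        dists   = np.linalg.norm(cells_t0[:, None, :] - ecm_t0[None, :, :], axis=2)
        nearest = dists.argmin(axis=1)
        ecm_disp_at_cells = (ecm_t1 - ecm_t0)[nearest]

        relative_motion = (cells_t1 - cells_t0) - ecm_disp_at_cells
        print("cell displacement relative to the ECM (um):\n", relative_motion)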

  5. Software for fast cameras and image handling on MAST

    International Nuclear Information System (INIS)

    Shibaev, S.

    2008-01-01

    The rapid progress in fast imaging gives new opportunities for fusion research. The data obtained by fast cameras play an important and ever-increasing role in the analysis and understanding of plasma phenomena. The fast cameras produce a huge amount of data, which creates considerable problems for acquisition, analysis, and storage. We use a number of fast cameras on the Mega-Amp Spherical Tokamak (MAST). They cover several spectral ranges: broadband visible, infra-red, and narrow-band filtered for spectroscopic studies. These cameras are controlled by programs developed in-house. The programs provide full camera configuration and image acquisition in the MAST shot cycle. Despite the great variety of image sources, all images should be stored in a single format. This simplifies development of data handling tools and hence the data analysis. A universal file format has been developed for MAST images which supports storage in both raw and compressed forms, using either lossless or lossy compression. A number of access and conversion routines have been developed for all languages used on MAST. Two movie-style display tools have been developed: one Windows-native and one Qt-based for Linux. The camera control programs run as autonomous data acquisition units with the full camera configuration set and stored locally. This allows easy porting of the code to other data acquisition systems. The software developed for MAST fast cameras has been adapted for several other tokamaks, where it is in regular use.
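
    To illustrate what a self-describing movie container of this kind involves, here is a minimal header pack/parse sketch; the field layout, magic string, and codec encoding are invented for illustration and are not the actual MAST image format.

        # Hypothetical movie-file header: magic, frame size, frame count, codec flag.
        # The layout is illustrative only, not the actual MAST image format.
        import struct

        HEADER_FMT = "<4sHHIB"   # little-endian: magic, width, height, n_frames, codec
        HEADER_SIZE = struct.calcsize(HEADER_FMT)

        def write_header(f, width, height, n_frames, codec=0):
            # codec: 0 = raw, 1 = lossless, 2 = lossy (illustrative encoding)
            f.write(struct.pack(HEADER_FMT, b"MIMG", width, height, n_frames, codec))

        def read_header(f):
            magic, width, height, n_frames, codec = struct.unpack(HEADER_FMT, f.read(HEADER_SIZE))
            assert magic == b"MIMG", "not a recognised image file"
            return width, height, n_frames, codec

        with open("shot12345.img", "wb") as f:
            write_header(f, 640, 480, 1000, codec=1)
        with open("shot12345.img", "rb") as f:
            print(read_header(f))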

  6. Delay line clipping in a scintillation camera system

    International Nuclear Information System (INIS)

    Hatch, K.F.

    1979-01-01

    The present invention provides a novel base line restoring circuit and a novel delay line clipping circuit in a scintillation camera system. Single and double delay line clipped signal waveforms are generated, increasing the operational frequency and the fidelity of data detection of the camera system by reducing base line distortion such as undershooting, overshooting, and capacitive build-up. The camera system includes a set of photomultiplier tubes and associated amplifiers which generate sequences of pulses. These pulses are pulse-height analyzed for detecting a scintillation having an energy level which falls within a predetermined energy range. Data pulses are combined to provide the coordinates and energy of photopeak events. The amplifiers are biased out of saturation over all ranges of pulse energy level and count rate. Single delay line clipping circuitry is provided for narrowing the pulse width of the decaying electrical data pulses, which increases operating speed without the occurrence of data loss. (JTA)
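
    Numerically, single delay-line clipping amounts to subtracting a delayed, attenuated copy of the pulse so that the long exponential tail cancels and the pulse is shortened. The sketch below demonstrates the idea; the pulse shape, decay constant, and delay are illustrative assumptions, not values from the patent.

        # Single delay-line clipping: subtract a delayed copy, attenuated by the
        # decay over the delay, so the exponential tail cancels and the pulse narrows.
        import numpy as np

        dt = 1.0                       # ns per sample (assumed)
        t = np.arange(0, 2000, dt)
        tau = 250.0                    # pulse decay constant, ns (assumed)
        pulse = np.exp(-t / tau)       # idealised decaying data pulse

        delay = 100                    # clipping delay in samples (assumed)
        delayed = np.concatenate([np.zeros(delay), pulse[:-delay]])
        clipped = pulse - np.exp(-delay * dt / tau) * delayed

        width = lambda s: (s > 0.1 * s.max()).sum() * dt     # width at 10% of peak
        print("width before: %.0f ns, after: %.0f ns" % (width(pulse), width(clipped)))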

  7. β-Catenin Regulates Primitive Streak Induction through Collaborative Interactions with SMAD2/SMAD3 and OCT4

    DEFF Research Database (Denmark)

    Funa, Nina Sofi Ayumi; Schachter, Karen; Lerdrup, Mads

    2015-01-01

    Canonical Wnt and Nodal signaling are both required for induction of the primitive streak (PS), which guides organization of the early embryo. The Wnt effector β-catenin is thought to function in these early lineage specification decisions via transcriptional activation of Nodal signaling. Here, we...... specification. This study provides mechanistic insight into how Wnt signaling controls early cell lineage decisions....

  8. Non-contact measurement of rotation angle with solo camera

    Science.gov (United States)

    Gan, Xiaochuan; Sun, Anbin; Ye, Xin; Ma, Liqun

    2015-02-01

    For the purpose of measuring a rotation angle around the axis of an object, a non-contact rotation angle measurement method based on a single camera was proposed. The intrinsic parameters of the camera were calibrated using a chessboard, following planar calibration theory. The translation matrix and rotation matrix between the object coordinate frame and the camera coordinate frame were calculated from the relationship between the corners' positions on the object and their coordinates in the image. The rotation angle between the measured object and the camera could then be resolved from the rotation matrix. A precise angle dividing table (PADT) was chosen as the reference to verify the angle measurement error of this method. Test results indicated that the rotation angle measurement error of this method did not exceed +/- 0.01 degree.
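
    A hedged sketch of the pipeline described: intrinsics from a prior chessboard calibration, pose from 2-D/3-D corner correspondences, and the rotation angle read off the rotation matrix. All numerical values, and the choice of reading the in-plane angle about the camera z-axis, are illustrative assumptions.

        # Recover the object-to-camera rotation from known object corners and their
        # image coordinates, then read off a rotation angle from the rotation matrix.
        # K, dist and the point sets are illustrative values, not from the paper.
        import cv2
        import numpy as np

        K = np.array([[1200.0, 0, 640], [0, 1200.0, 480], [0, 0, 1]])   # from chessboard calibration
        dist = np.zeros(5)

        obj_pts = np.array([[0, 0, 0], [50, 0, 0], [50, 50, 0], [0, 50, 0]], np.float32)   # mm
        img_pts = np.array([[612, 455], [702, 460], [698, 548], [608, 543]], np.float32)   # px

        ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, K, dist)
        R, _ = cv2.Rodrigues(rvec)                    # 3x3 rotation matrix

        # In-plane rotation about the camera z-axis (assumes it dominates the pose).
        angle_deg = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
        print("rotation about z: %.3f deg" % angle_deg)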

  9. Optimum color filters for CCD digital cameras

    Science.gov (United States)

    Engelhardt, Kai; Kunz, Rino E.; Seitz, Peter; Brunner, Harald; Knop, Karl

    1993-12-01

    As part of the ESPRIT II project No. 2103 (MASCOT) a high performance prototype color CCD still video camera was developed. Intended for professional usage such as in the graphic arts, the camera provides a maximum resolution of 3k X 3k full color pixels. A high colorimetric performance was achieved through specially designed dielectric filters and optimized matrixing. The color transformation was obtained by computer simulation of the camera system and non-linear optimization, which minimized the perceivable color errors as measured in the 1976 CIELUV uniform color space for a set of about 200 carefully selected test colors. The color filters were designed to allow perfect colorimetric reproduction in principle and, at the same time, imperceptible color noise, with special attention to fabrication tolerances. The camera system includes a special real-time digital color processor which carries out the color transformation. The transformation can be selected from a set of sixteen matrices optimized for different illuminants and output devices. Because the actual filter design was based on slightly incorrect data, the prototype camera showed a mean colorimetric error of 2.7 j.n.d. (CIELUV) in experiments. Using correct input data in the redesign of the filters, a mean colorimetric error of only 1 j.n.d. (CIELUV) seems to be feasible, implying that such an optimized color camera can achieve a colorimetric performance so high that the reproduced colors in an image cannot be distinguished from the original colors in a scene, even in direct comparison.
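
    To make the matrixing step concrete, the sketch below fits a 3x3 color-correction matrix by ordinary least squares on a set of test colors; the published design instead minimized perceptual (CIELUV) errors with a non-linear optimizer, and all numerical values here are simulated.

        # Fit a 3x3 matrix M so that M @ rgb approximates the target XYZ of each test
        # color (plain least squares; the real design minimised CIELUV errors).
        import numpy as np

        rng = np.random.default_rng(0)
        camera_rgb = rng.uniform(0.05, 1.0, size=(200, 3))     # simulated camera responses
        true_M = np.array([[0.41, 0.36, 0.18],
                           [0.21, 0.72, 0.07],
                           [0.02, 0.12, 0.95]])
        target_xyz = camera_rgb @ true_M.T + rng.normal(0, 0.002, (200, 3))   # "measured" XYZ

        M = np.linalg.lstsq(camera_rgb, target_xyz, rcond=None)[0].T
        residual = np.abs(camera_rgb @ M.T - target_xyz).mean()
        print("fitted matrix:\n", M.round(3))
        print("mean absolute XYZ residual: %.4f" % residual)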

  10. Improving Situational Awareness in camera surveillance by combining top-view maps with camera images

    NARCIS (Netherlands)

    Kooi, F.L.; Zeeders, R.

    2009-01-01

    The goal of the experiment described is to improve today's camera surveillance in public spaces. Three designs with the camera images combined on a top-view map were compared to each other and to the current situation in camera surveillance. The goal was to test which design makes spatial

  11. A fast, noniterative approach for accelerated high-temporal resolution cine-CMR using dynamically interleaved streak removal in the power-spectral encoded domain with low-pass filtering (DISPEL) and modulo-prime spokes (MoPS).

    Science.gov (United States)

    Kawaji, Keigo; Patel, Mita B; Cantrell, Charles G; Tanaka, Akiko; Marino, Marco; Tamura, Satoshi; Wang, Hui; Wang, Yi; Carroll, Timothy J; Ota, Takeyoshi; Patel, Amit R

    2017-07-01

    To introduce a pair of accelerated non-Cartesian acquisition principles that, when combined, exploit the periodicity of k-space acquisition and thereby enable acquisition of high-temporal-resolution cine cardiac magnetic resonance (CMR). The mathematical formulation of a noniterative, undersampled non-Cartesian cine acquisition and reconstruction is presented. First, a low-pass filtering step that exploits streaking artifact redundancy is provided (i.e., Dynamically Interleaved Streak removal in the Power-spectrum Encoded domain with Low-pass filtering [DISPEL]). Next, an effective radial acquisition for the DISPEL approach that exploits the properties of prime numbers is described (i.e., Modulo-Prime Spokes [MoPS]). Both DISPEL and MoPS are examined using numerical simulation of a digital heart phantom to show that high-temporal-resolution cine-CMR is feasible without removing physiologic motion, in contrast to aperiodic interleaving using Golden Angles. The combined high-temporal-resolution cine approach is next examined in 11 healthy subjects for a time-volume curve assessment of left ventricular systolic and diastolic performance against a conventional Cartesian cine-CMR reference. The DISPEL method was first shown, using simulation under different streak cycles, to allow separation of undersampled radial streaking artifacts from physiologic motion given a sufficiently frequent streak-cycle interval. Radial interleaving with MoPS is next shown to allow interleaves with pseudo-Golden-Angle variants and to be more compatible with DISPEL than irrational and nonperiodic rotation angles, including the Golden-Angle-derived rotations. In the in vivo data, the proposed method showed no statistical difference in systolic performance, while diastolic parameters sensitive to the cine's temporal resolution differed significantly from the conventional cine. We demonstrate high-temporal-resolution cine-CMR using DISPEL and MoPS, whose streaking artifacts are separated from physiologic motion.
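
    For orientation only, the sketch below contrasts golden-angle spoke ordering with an exactly periodic interleave built from a prime number of spokes; it illustrates the periodicity idea that DISPEL relies on, but it is not the exact MoPS construction from the paper, and the spoke count is an arbitrary choice.

        # Radial view-angle ordering: golden-angle increments vs. an exactly periodic
        # interleave with a prime spoke count. Illustrative only; not the MoPS scheme.
        import numpy as np

        n_spokes = 13                              # prime spoke count per interleave (assumed)
        golden = 180.0 * (np.sqrt(5) - 1) / 2      # ~111.25 deg golden-angle increment

        golden_angles  = (np.arange(n_spokes) * golden) % 180.0        # aperiodic ordering
        uniform_angles = np.arange(n_spokes) * 180.0 / n_spokes        # periodic, repeats every interleave

        print("golden-angle spokes (deg):  ", np.sort(golden_angles).round(1))
        print("periodic prime spokes (deg):", np.sort(uniform_angles).round(1))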

  12. VUV Testing of Science Cameras at MSFC: QE Measurement of the CLASP Flight Cameras

    Science.gov (United States)

    Champey, Patrick R.; Kobayashi, Ken; Winebarger, A.; Cirtain, J.; Hyde, D.; Robertson, B.; Beabout, B.; Beabout, D.; Stewart, M.

    2015-01-01

    The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras were built and tested for the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint National Astronomical Observatory of Japan (NAOJ) and MSFC sounding rocket mission. The CLASP camera design includes a frame-transfer e2v CCD57-10 512x512 detector, dual channel analog readout electronics and an internally mounted cold block. At the flight operating temperature of -20 °C, the CLASP cameras achieved the low-noise performance requirements (less than or equal to 25 e- read noise and less than or equal to 10 e-/sec/pix dark current), in addition to maintaining a stable gain of approximately 2.0 e-/DN. The e2v CCD57-10 detectors were coated with Lumogen-E to improve quantum efficiency (QE) at the Lyman-alpha wavelength. A vacuum ultra-violet (VUV) monochromator and a NIST-calibrated photodiode were employed to measure the QE of each camera. Four flight-like cameras were tested in a high-vacuum chamber, which was configured to run several tests intended to verify the QE, gain, read noise, dark current and residual non-linearity of the CCD. We present and discuss the QE measurements performed on the CLASP cameras. We also discuss the high-vacuum system outfitted for testing of UV and EUV science cameras at MSFC.
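
    The QE arithmetic implied by this setup is simply detected electrons per incident photon, with the photon flux inferred from the calibrated photodiode. The sketch below shows that calculation; every numerical value (photodiode current, responsivity, illuminated area, camera signal) is an illustrative assumption, not a CLASP measurement.

        # QE = (signal electrons per pixel per second) / (incident photons per pixel per second),
        # with the photon flux taken from a calibrated photodiode. All values are illustrative.
        h, c = 6.626e-34, 2.998e8          # J*s, m/s

        wavelength    = 121.6e-9           # Lyman-alpha, m
        diode_current = 2.0e-12            # photodiode photocurrent, A (assumed)
        diode_resp    = 0.12               # photodiode responsivity at this wavelength, A/W (assumed)
        n_pixels      = 1.0e4              # illuminated pixels in the beam footprint (assumed)

        incident_power  = diode_current / diode_resp              # W on the footprint
        photon_rate     = incident_power * wavelength / (h * c)   # photons/s
        photons_per_pix = photon_rate / n_pixels

        signal_dn_per_s = 100.0            # camera signal rate per pixel, DN/s (assumed)
        gain_e_per_dn   = 2.0              # measured camera gain, e-/DN
        electrons_per_pix = signal_dn_per_s * gain_e_per_dn

        print("estimated QE = %.2f" % (electrons_per_pix / photons_per_pix))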

  13. Advanced CCD camera developments

    Energy Technology Data Exchange (ETDEWEB)

    Condor, A. [Lawrence Livermore National Lab., CA (United States)

    1994-11-15

    Two charge coupled device (CCD) camera systems are introduced and discussed, describing briefly the hardware involved and the data obtained in their various applications. The Advanced Development Group of the Defense Sciences Engineering Division has been actively designing, manufacturing, and fielding state-of-the-art CCD camera systems for over a decade. These systems were originally developed for the nuclear test program to record data from underground nuclear tests. Today, new and interesting applications for these systems have surfaced, and development is continuing in the area of advanced CCD camera systems, including a new CCD camera that will allow experimenters to replace film for x-ray imaging at the JANUS, USP, and NOVA laser facilities.

  14. Mycosphaerella fijiensis, the black leaf streak pathogen of banana: progress towards understanding pathogen biology and detection, disease development, and the challenges of control.

    Science.gov (United States)

    Churchill, Alice C L

    2011-05-01

    Banana (Musa spp.) is grown throughout the tropical and subtropical regions of the world. The fruits are a key staple food in many developing countries and a source of income for subsistence farmers. Bananas are also a major, multibillion-dollar export commodity for consumption primarily in developed countries, where few banana cultivars are grown. The fungal pathogen Mycosphaerella fijiensis causes black leaf streak disease (BLSD; aka black Sigatoka leaf spot) on the majority of edible banana cultivars grown worldwide. The fact that most of these cultivars are sterile and unsuitable for the breeding of resistant lines necessitates the extensive use of fungicides as the primary means of disease control. BLSD is a significant threat to the food security of resource-poor populations who cannot afford fungicides, and increases the environmental and health hazards where large-acreage monocultures of banana (Cavendish subgroup, AAA genome) are grown for export. Mycosphaerella fijiensis M. Morelet is a sexual, heterothallic fungus having Pseudocercospora fijiensis (M. Morelet) Deighton as the anamorph stage. It is a haploid, hemibiotrophic ascomycete within the class Dothideomycetes, order Capnodiales and family Mycosphaerellaceae. Its taxonomic placement is based on DNA phylogeny, morphological analyses and cultural characteristics. Mycosphaerella fijiensis is a leaf pathogen that causes reddish-brown streaks running parallel to the leaf veins, which aggregate to form larger, dark-brown to black compound streaks. These streaks eventually form fusiform or elliptical lesions that coalesce, form a water-soaked border with a yellow halo and, eventually, merge to cause extensive leaf necrosis. The disease does not kill the plants immediately, but weakens them by decreasing the photosynthetic capacity of leaves, causing a reduction in the quantity and quality of fruit, and inducing the premature ripening of fruit harvested from infected plants. Although Musa spp. are the

  15. Multi-MGy Radiation Hardened Camera for Nuclear Facilities

    International Nuclear Information System (INIS)

    Girard, Sylvain; Boukenter, Aziz; Ouerdane, Youcef; Goiffon, Vincent; Corbiere, Franck; Rolando, Sebastien; Molina, Romain; Estribeau, Magali; Avon, Barbara; Magnan, Pierre; Paillet, Philippe; Duhamel, Olivier; Gaillardin, Marc; Raine, Melanie

    2015-01-01

    There is an increasing interest in developing cameras for surveillance systems to monitor nuclear facilities or nuclear waste storages. In particular, for today's and the next generation of nuclear facilities, increased safety requirements following the Fukushima Daiichi disaster have to be considered. For some applications, radiation tolerance needs to reach doses in the MGy(SiO2) range, whereas the most tolerant commercial or prototype products based on solid-state image sensors withstand doses of only a few kGy. The objective of this work is to present the radiation hardening strategy developed by our research groups to enhance the tolerance to ionizing radiation of the various subparts of these imaging systems by working simultaneously at the component and system design levels. Developing a radiation-hardened camera implies combining several radiation-hardening strategies. In our case, we decided not to use the simplest one, the shielding approach. This approach is efficient but limits camera miniaturization and is not compatible with future integration into remote-handling or robotic systems. The hardening-by-component strategy therefore appears mandatory to avoid the failure of one of the camera subparts at doses lower than the MGy. Concerning the image sensor itself, the technology used is a CMOS Image Sensor (CIS) designed by the ISAE team, with custom pixel designs used to mitigate the total ionizing dose (TID) effects that occur well below the MGy range in classical image sensors (e.g. Charge Coupled Devices (CCD), Charge Injection Devices (CID) and classical Active Pixel Sensors (APS)), such as the complete loss of functionality, the dark current increase and the gain drop. At the conference we will present a comparative study of the radiation responses of these radiation-hardened pixels and of conventional ones, demonstrating the efficiency of the choices made. The targeted strategy to develop the complete radiation hard camera

  16. Multi-MGy Radiation Hardened Camera for Nuclear Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Girard, Sylvain; Boukenter, Aziz; Ouerdane, Youcef [Universite de Saint-Etienne, Lab. Hubert Curien, UMR-CNRS 5516, F-42000 Saint-Etienne (France); Goiffon, Vincent; Corbiere, Franck; Rolando, Sebastien; Molina, Romain; Estribeau, Magali; Avon, Barbara; Magnan, Pierre [ISAE, Universite de Toulouse, F-31055 Toulouse (France); Paillet, Philippe; Duhamel, Olivier; Gaillardin, Marc; Raine, Melanie [CEA, DAM, DIF, F-91297 Arpajon (France)

    2015-07-01

    There is an increasing interest in developing cameras for surveillance systems to monitor nuclear facilities or nuclear waste storages. In particular, for today's and the next generation of nuclear facilities, increased safety requirements following the Fukushima Daiichi disaster have to be considered. For some applications, radiation tolerance needs to reach doses in the MGy(SiO2) range, whereas the most tolerant commercial or prototype products based on solid-state image sensors withstand doses of only a few kGy. The objective of this work is to present the radiation hardening strategy developed by our research groups to enhance the tolerance to ionizing radiation of the various subparts of these imaging systems by working simultaneously at the component and system design levels. Developing a radiation-hardened camera implies combining several radiation-hardening strategies. In our case, we decided not to use the simplest one, the shielding approach. This approach is efficient but limits camera miniaturization and is not compatible with future integration into remote-handling or robotic systems. The hardening-by-component strategy therefore appears mandatory to avoid the failure of one of the camera subparts at doses lower than the MGy. Concerning the image sensor itself, the technology used is a CMOS Image Sensor (CIS) designed by the ISAE team, with custom pixel designs used to mitigate the total ionizing dose (TID) effects that occur well below the MGy range in classical image sensors (e.g. Charge Coupled Devices (CCD), Charge Injection Devices (CID) and classical Active Pixel Sensors (APS)), such as the complete loss of functionality, the dark current increase and the gain drop. At the conference we will present a comparative study of the radiation responses of these radiation-hardened pixels and of conventional ones, demonstrating the efficiency of the choices made. The targeted strategy to develop the complete radiation hard camera

  17. A solid state lightning propagation speed sensor

    Science.gov (United States)

    Mach, Douglas M.; Rust, W. David

    1989-01-01

    A device to measure the propagation speeds of cloud-to-ground lightning has been developed. The lightning propagation speed (LPS) device consists of eight solid state silicon photodetectors mounted behind precision horizontal slits in the focal plane of a 50-mm lens on a 35-mm camera. Although the LPS device produces results similar to those obtained from a streaking camera, the LPS device has the advantages of smaller size, lower cost, mobile use, and easier data collection and analysis. The maximum accuracy of the LPS is 0.2 microseconds, compared with about 0.8 microseconds for the streaking camera. It is found that the return stroke propagation speed for triggered lightning is different from that for natural lightning if measurements are taken over channel segments less than 500 m. It is suggested that there are no significant differences between the propagation speeds of positive and negative flashes. Also, differences between natural and triggered dart leaders are discussed.
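
    The speed estimate follows directly from the slit geometry: each photodetector views a known height interval of the channel, and the speed over a segment is that height difference divided by the difference in trigger times. The sketch below shows the arithmetic with invented heights and times.

        # Propagation speed from successive photodetector triggers behind horizontal slits.
        # Channel heights and trigger times below are invented for illustration.
        import numpy as np

        slit_heights_m = np.array([0, 150, 300, 450, 600, 750, 900, 1050])    # height seen by each of the 8 slits
        trigger_us     = np.array([0.0, 1.4, 2.9, 4.3, 5.9, 7.2, 8.8, 10.1])  # detector trigger times, microseconds

        segment_speeds = np.diff(slit_heights_m) / (np.diff(trigger_us) * 1e-6)   # m/s
        print("per-segment speeds (1e8 m/s):", (segment_speeds / 1e8).round(2))
        print("mean speed: %.2e m/s" % segment_speeds.mean())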

  18. Calibration of robot tool centre point using camera-based system

    Directory of Open Access Journals (Sweden)

    Gordić Zaviša

    2016-01-01

    Full Text Available Robot Tool Centre Point (TCP) calibration is a problem of great importance for a number of industrial applications, and it is well known both in theory and in practice. Although various techniques have been proposed for solving this problem, they mostly require tool jogging or long processing time, both of which affect process performance by extending cycle time. This paper presents an innovative way of TCP calibration using a set of two cameras. The robot tool is placed in an area where images in two orthogonal planes are acquired using the cameras. Using robust pattern recognition, even a deformed tool can be identified in the images, and information about its current position and orientation is forwarded to the control unit for calibration. Compared to other techniques, test results show a significant reduction in procedure complexity and calibration time. These improvements enable more frequent TCP checking and recalibration during production, thus improving product quality.
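
    A much-simplified sketch of how two orthogonal, calibrated views can be combined into a 3-D tool-tip estimate is shown below: each camera constrains two coordinates in its own image plane, and the shared coordinate is averaged. The pixel scales, detected tip pixels, and the assumption of a common origin are all illustrative; the actual system relies on full pattern recognition and calibration.

        # Combine two orthogonal camera views into a 3-D tool-centre-point estimate.
        # Camera A looks along +Z and sees (X, Y); camera B looks along +X and sees (Y, Z).
        # Pixel-to-mm scales, tip pixels, and the shared origin are illustrative assumptions.
        import numpy as np

        scale_a = 0.10                      # mm per pixel, camera A (from calibration, assumed)
        scale_b = 0.12                      # mm per pixel, camera B (from calibration, assumed)

        tip_a = np.array([412.0, 306.0])    # detected tip pixel in camera A (x, y)
        tip_b = np.array([288.0, 519.0])    # detected tip pixel in camera B (y, z)

        x, y_from_a = tip_a * scale_a       # camera A gives X and Y
        y_from_b, z = tip_b * scale_b       # camera B gives Y and Z

        tcp = np.array([x, (y_from_a + y_from_b) / 2.0, z])   # average the redundant Y estimate
        print("estimated TCP (mm):", tcp.round(2))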

  19. Holographic interferometry using a digital photo-camera

    International Nuclear Information System (INIS)

    Sekanina, H.; Hledik, S.

    2001-01-01

    The possibilities of running digital holographic interferometry using commonly available compact digital zoom photo-cameras are studied. The recently developed holographic setup, suitable especially for digital photo-cameras equipped with a non-detachable objective lens, is used. The method described enables a simple and straightforward way of both recording and reconstructing digital holographic interferograms. The feasibility of the new method is verified by digital reconstruction of the acquired interferograms, using a numerical code based on the fast Fourier transform. The experimental results obtained are presented and discussed. (authors)
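
    As a toy illustration of FFT-based reconstruction, the sketch below synthesises an off-axis hologram (object wave plus a tilted plane reference), Fourier-transforms the recorded intensity, and checks that the +1 diffraction order appears at the reference carrier frequency, well separated from the zero order. The object, reference tilt, and array size are invented, and the propagation and filtering steps of a full reconstruction are omitted.

        # Toy off-axis hologram: record |object + tilted reference|^2, then FFT and
        # locate the +1 order at the carrier frequency. Purely illustrative values.
        import numpy as np

        N = 512
        y, x = np.mgrid[0:N, 0:N]
        obj = np.exp(1j * 0.002 * ((x - N / 2) ** 2 + (y - N / 2) ** 2) / N)   # weak synthetic object phase
        ref = np.exp(2j * np.pi * (0.15 * x + 0.05 * y))                       # tilted plane-wave reference

        hologram = np.abs(obj + ref) ** 2                  # intensity recorded by the photo-camera
        spectrum = np.fft.fftshift(np.fft.fft2(hologram))

        # Zero order at the centre; one off-axis order near the carrier (0.15, 0.05 cycles/pixel).
        r1, c1 = N // 2 + int(0.05 * N), N // 2 + int(0.15 * N)
        print("|F| at centre     : %.3g" % abs(spectrum[N // 2, N // 2]))
        print("|F| at +1 order   : %.3g" % abs(spectrum[r1, c1]))
        print("|F| at background : %.3g" % abs(spectrum[N // 4, N // 4]))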

  20. Solid state video cameras

    CERN Document Server

    Cristol, Y

    2013-01-01

    Solid State Video Cameras reviews the state of the art in the field of solid-state television cameras as compiled from the patent literature. Organized into 10 chapters, the book begins with the basic array types of solid-state imagers and appropriate read-out circuits and methods. Documents relating to improvement of picture quality, such as spurious signal suppression, uniformity correction, or resolution enhancement, are also cited. The last part considers solid-state color cameras.