WorldWideScience

Sample records for range image sensor

  1. High dynamic range imaging sensors and architectures

    CERN Document Server

    Darmont, Arnaud

    2013-01-01

    Illumination is a crucial element in many applications, matching the luminance of the scene with the operational range of a camera. When luminance cannot be adequately controlled, a high dynamic range (HDR) imaging system may be necessary. These systems are increasingly used in automotive on-board systems, road traffic monitoring, and other industrial, security, and military applications. This book provides readers with an intermediate discussion of HDR image sensors and techniques for industrial and non-industrial applications. It describes various sensor and pixel architectures capable …

  2. Introduction to sensors for ranging and imaging

    CERN Document Server

    Brooker, Graham

    2009-01-01

    "This comprehensive text-reference provides a solid background in active sensing technology. It is concerned with active sensing, starting with the basics of time-of-flight sensors (operational principles, components), and going through the derivation of the radar range equation and the detection of echo signals, both fundamental to the understanding of radar, sonar and lidar imaging. Several chapters cover signal propagation of both electromagnetic and acoustic energy, target characteristics, stealth, and clutter. The remainder of the book introduces the range measurement process, active imaging …

  3. Active resonant subwavelength grating for scannerless range imaging sensors.

    Energy Technology Data Exchange (ETDEWEB)

    Kemme, Shanalyn A.; Nellums, Robert O.; Boye, Robert R.; Peters, David William

    2006-11-01

    In this late-start LDRD, we will present a design for a wavelength-agile, high-speed modulator that enables a long-term vision for the THz Scannerless Range Imaging (SRI) sensor. It takes the place of the currently utilized SRI micro-channel plate, which is limited to photocathode-sensitive wavelengths (primarily in the visible and near-IR regimes). Two of Sandia's successful technologies--subwavelength diffractive optics and THz sources and detectors--are poised to extend the capabilities of the SRI sensor. The goal is to drastically broaden the SRI's sensing waveband--all the way to the THz regime--so the sensor can see through image-obscuring, scattering environments like smoke and dust. Surface properties, such as reflectivity, emissivity, and scattering roughness, vary greatly with the illuminating wavelength. Thus, objects that are difficult to image at the SRI sensor's present near-IR wavelengths may be imaged more easily at the considerably longer THz wavelengths (0.1 to 1 mm). The proposed component is an active Resonant Subwavelength Grating (RSG). Sandia invested considerable effort in a passive RSG two years ago, which resulted in a highly efficient (reflectivity greater than gold), wavelength-specific reflector. For this late-start LDRD proposal, we will transform the passive RSG design into an active laser-line reflector.

  4. A low-noise wide dynamic range CMOS image sensor with low and high temperatures resistance

    Science.gov (United States)

    Mizobuchi, Koichi; Adachi, Satoru; Tejada, Jose; Oshikubo, Hiromichi; Akahane, Nana; Sugawa, Shigetoshi

    2008-02-01

    A temperature-resistant 1/3-inch SVGA (800×600 pixels), 5.6 μm pixel pitch, wide-dynamic-range (WDR) CMOS image sensor has been developed using a lateral over-flow integration capacitor (LOFIC) in each pixel. The sensor chips are fabricated in a 0.18 μm 2P3M process with a fully optimized front-end-of-line (FEOL) and back-end-of-line (BEOL) for lower dark current. By implementing a low electric field potential design for the photodiodes, reducing process damage, recovering crystal defects, and terminating interface states in the FEOL and BEOL, the dark current is improved to 12 e-/pixel·s at 60 °C, a 50% reduction from the previous very-low-dark-current (VLDC) FEOL, and its contribution to the temporal noise is reduced accordingly. Furthermore, design optimizations of the readout circuits, especially a signal- and noise-hold circuit and a programmable-gain amplifier (PGA), are also implemented. The measured temporal noise is 2.4 e- rms at 60 fps (36 MHz operation). The dynamic range (DR) is extended to 100 dB with a 237 ke- full-well capacity. To secure the temperature resistance, the sensor chip also receives an inorganic cap over the micro-lenses and a metal hermetically sealed package assembly. Image samples at low and high temperatures show significant improvement in image quality.
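The 100 dB figure follows directly from the quoted full-well capacity and noise floor; as a quick check:

```python
import math

# Dynamic range of an image sensor: ratio of full-well capacity to the
# temporal noise floor, expressed in dB (figures taken from the abstract above).
full_well_e = 237_000   # full-well capacity, electrons
noise_e_rms = 2.4       # temporal read noise, electrons rms

dr_db = 20 * math.log10(full_well_e / noise_e_rms)
print(f"dynamic range = {dr_db:.1f} dB")  # 99.9 dB, i.e. the quoted 100 dB
```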

  5. An Analog Gamma Correction Scheme for High Dynamic Range CMOS Logarithmic Image Sensors

    Science.gov (United States)

    Cao, Yuan; Pan, Xiaofang; Zhao, Xiaojin; Wu, Huisi

    2014-01-01

    In this paper, a novel analog gamma correction scheme for a logarithmic image sensor, designed to minimize the quantization noise of high dynamic range applications, is presented. The proposed implementation exploits a non-linear voltage-controlled-oscillator (VCO) based analog-to-digital converter (ADC) to perform the gamma correction during the analog-to-digital conversion. As a result, the quantization noise does not increase, while the high dynamic range of the logarithmic image sensor is preserved. Moreover, by combining the gamma correction with the analog-to-digital conversion, the silicon area and overall power consumption can be greatly reduced. The proposed gamma correction scheme is validated by the reported simulation results and by experimental measurements of our test structure, fabricated in a standard 0.35 μm complementary-metal-oxide-semiconductor (CMOS) process. PMID:25517692
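A small sketch (not the paper's circuit) of why folding gamma into the quantizer helps: applying gamma digitally after a linear ADC collapses many input codes onto few output codes, while gamma applied during conversion (non-uniform quantization, as with the VCO-based ADC) keeps every output code reachable. The bit depth and gamma value are illustrative.

```python
import numpy as np

gamma = 0.45                            # illustrative gamma exponent
x = np.linspace(0, 1, 1_000_000)        # normalized pixel signal

# Path A: linear 8-bit ADC, then digital gamma on the codes
linear_codes = np.round(x * 255)
after = np.round(((linear_codes / 255) ** gamma) * 255)

# Path B: gamma folded into the quantizer itself (non-uniform steps)
during = np.round((x ** gamma) * 255)

# The digital-gamma path loses output codes; the in-quantizer path keeps all 256
print(len(np.unique(after)), len(np.unique(during)))
```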

  6. Applications of the Integrated High-Performance CMOS Image Sensor to Range Finders - from Optical Triangulation to the Automotive Field.

    Science.gov (United States)

    Wu, Jih-Huah; Pen, Cheng-Chung; Jiang, Joe-Air

    2008-03-13

    With their significant features, the applications of complementary metal-oxide-semiconductor (CMOS) image sensors cover a very extensive range, from industrial automation to traffic applications such as aiming systems, blind guidance, active/passive range finders, etc. In this paper CMOS image sensor-based active and passive range finders are presented. The measurement scheme of the proposed active/passive range finders is based on a simple triangulation method. The designed range finders chiefly consist of a CMOS image sensor and some light sources such as lasers or LEDs. The implementation cost of our range finders is quite low. Image processing software to adjust the exposure time (ET) of the CMOS image sensor to enhance the performance of triangulation-based range finders was also developed. An extensive series of experiments were conducted to evaluate the performance of the designed range finders. From the experimental results, the distance measurement resolutions achieved by the active range finder and the passive range finder can be better than 0.6% and 0.25% within the measurement ranges of 1 to 8 m and 5 to 45 m, respectively. Feasibility tests on applications of the developed CMOS image sensor-based range finders to the automotive field were also conducted. The experimental results demonstrated that our range finders are well-suited for distance measurements in this field.

  7. Initial Comparison of the Lightning Imaging Sensor (LIS) with Lightning Detection and Ranging (LDAR)

    Science.gov (United States)

    Ushio, Tomoo; Driscoll, Kevin; Heckman, Stan; Boccippio, Dennis; Koshak, William; Christian, Hugh

    1999-01-01

    The mapping of the lightning optical pulses detected by the Lightning Imaging Sensor (LIS) is compared with the radiation sources located by Lightning Detection and Ranging (LDAR) and the National Lightning Detection Network (NLDN) for three thunderstorms observed during overpasses on 15 August 1998. The comparison involves 122 flashes, including 42 ground and 80 cloud flashes. For ground flashes, the LIS recorded the subsequent strokes and changes inside the cloud. For cloud flashes, the LIS recorded those with higher-altitude sources and a larger number of sources. The discrepancies between the LIS and LDAR flash locations are about 4.3 km for cloud flashes and 12.2 km for ground flashes. The reason for these differences remains unexplained.

  8. Test of the Practicality and Feasibility of EDoF-Empowered Image Sensors for Long-Range Biometrics

    Directory of Open Access Journals (Sweden)

    Sheng-Hsun Hsieh

    2016-11-01

    Full Text Available For many practical applications of image sensors, how to extend the depth-of-field (DoF) is an important research topic; if successfully implemented, it could benefit various applications, from photography to biometrics. In this work, we examine the feasibility and practicality of a well-known “extended DoF” (EDoF) technique, or “wavefront coding,” by building a real-time long-range iris recognition system and performing large-scale iris recognition. The keys to successful long-range iris recognition are a long DoF and image quality that remains invariant across object distance, requirements strict enough to test the practicality and feasibility of EDoF-empowered image sensors. Besides image sensor modification, we also explored the possibility of varying enrollment/testing pairs. With 512 iris images from 32 Asian subjects as the database, a 400-mm focal length, and f/6.3 optics over a 3 m working distance, our results show that a sophisticated coding design scheme plus homogeneous enrollment/testing setups can effectively overcome the blurring caused by phase modulation and omit Wiener-based restoration. In our experiments, based on 3328 iris images in total, the EDoF factor reaches 3.71 times that of the original system without a loss of recognition accuracy.

  9. A contest of sensors in close range 3D imaging: performance evaluation with a new metric test object

    Directory of Open Access Journals (Sweden)

    M. Hess

    2014-06-01

    Full Text Available An independent means of 3D image quality assessment is introduced, addressed to non-professional users of sensors and freeware, a field largely characterized by closed-source tools and by the absence of quality metrics for processing steps such as alignment. A performance evaluation of commercially available, state-of-the-art close-range 3D imaging technologies is demonstrated with the help of a newly developed Portable Metric Test Artefact. The use of this test object provides quality control through a quantitative assessment of 3D imaging sensors. It enables users to specify precisely which spatial resolution and geometry recording they expect as the outcome of their 3D digitizing process. This will lead to the creation of high-quality 3D digital surrogates and 3D digital assets. The paper is presented in the form of a competition of teams, from which a possible winner will emerge.

  10. A Very Low Dark Current Temperature-Resistant, Wide Dynamic Range, Complementary Metal Oxide Semiconductor Image Sensor

    Science.gov (United States)

    Mizobuchi, Koichi; Adachi, Satoru; Tejada, Jose; Oshikubo, Hiromichi; Akahane, Nana; Sugawa, Shigetoshi

    2008-07-01

    A very low dark current (VLDC), temperature-resistant approach best suited to a wide dynamic range (WDR) complementary metal oxide semiconductor (CMOS) image sensor with a lateral over-flow integration capacitor (LOFIC) has been developed. By implementing a low electric field photodiode without trading off full-well capacity, reducing plasma damage, re-crystallizing, and terminating silicon-silicon dioxide interface states in the front end of line and back end of line (FEOL and BEOL) of a 0.18 µm, two-polycrystalline-silicon, three-metal (2P3M) process, the dark current is reduced to 11 e-/s/pixel (0.35 e-/s/µm², normalized to pixel area) at 60 °C, the lowest value ever reported. For further robustness at low and high temperatures, 1/3-in., 5.6-µm pitch, 800×600 pixel sensor chips with low-noise readout circuits, comprising a signal-and-noise hold circuit and a programmable gain amplifier (PGA), have also been given an inorganic cap layer on the micro-lenses and covered with a metal hermetically sealed package assembly. Image sensing performance results in 2.4 e- rms temporal noise and a 100 dB dynamic range (DR) with a 237 ke- full-well capacity. The operating temperature range is extended from -40 to 85 °C while retaining good image quality.
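The area-normalized dark current quoted above can be checked from the pixel pitch:

```python
# Pixel-area normalization of dark current, reproducing the figures quoted
# above: 11 e-/s/pixel on a 5.6 µm pitch pixel.
pitch_um = 5.6
dark_per_pixel = 11.0                      # e-/s/pixel at 60 °C

dark_per_um2 = dark_per_pixel / pitch_um**2
print(f"{dark_per_um2:.2f} e-/s/um^2")     # 0.35 e-/s/µm², as reported
```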

  11. Applications of the Integrated High-Performance CMOS Image Sensor to Range Finders — from Optical Triangulation to the Automotive Field

    Science.gov (United States)

    Wu, Jih-Huah; Pen, Cheng-Chung; Jiang, Joe-Air

    2008-01-01

    With their significant features, the applications of complementary metal-oxide semiconductor (CMOS) image sensors cover a very extensive range, from industrial automation to traffic applications such as aiming systems, blind guidance, active/passive range finders, etc. In this paper CMOS image sensor-based active and passive range finders are presented. The measurement scheme of the proposed active/passive range finders is based on a simple triangulation method. The designed range finders chiefly consist of a CMOS image sensor and some light sources such as lasers or LEDs. The implementation cost of our range finders is quite low. Image processing software to adjust the exposure time (ET) of the CMOS image sensor to enhance the performance of triangulation-based range finders was also developed. An extensive series of experiments were conducted to evaluate the performance of the designed range finders. From the experimental results, the distance measurement resolutions achieved by the active range finder and the passive range finder can be better than 0.6% and 0.25% within the measurement ranges of 1 to 8 m and 5 to 45 m, respectively. Feasibility tests on applications of the developed CMOS image sensor-based range finders to the automotive field were also conducted. The experimental results demonstrated that our range finders are well-suited for distance measurements in this field. PMID:27879789
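The triangulation relation behind such range finders can be sketched as follows; the focal length, baseline, and pixel pitch are illustrative assumptions, not the paper's actual optics:

```python
# Geometry of an active-triangulation range finder: a laser offset from the
# camera by a baseline b projects a spot; its lateral position x on the image
# sensor gives the range d = f*b/x. All component values here are assumed.
f_mm, b_mm, pixel_um = 16.0, 100.0, 5.6

def range_mm(spot_px: float) -> float:
    """Range d = f*b/x, with x the lateral spot offset on the sensor."""
    x_mm = spot_px * pixel_um / 1000.0
    return f_mm * b_mm / x_mm

d = range_mm(200)                       # spot lands 200 pixels off-axis
step = range_mm(200) - range_mm(201)    # range change for a one-pixel shift
print(f"d = {d:.0f} mm, resolution ~ {100 * step / d:.1f}% of range")
```

With these assumed optics, a one-pixel shift at ~1.4 m corresponds to about 0.5% of range, the same order as the sub-0.6% resolution reported above.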

  12. A Dynamic Range Enhanced Readout Technique with a Two-Step TDC for High Speed Linear CMOS Image Sensors

    Directory of Open Access Journals (Sweden)

    Zhiyuan Gao

    2015-11-01

    Full Text Available This paper presents a dynamic range (DR) enhanced readout technique with a two-step time-to-digital converter (TDC) for high speed linear CMOS image sensors. A multi-capacitor, self-regulated capacitive trans-impedance amplifier (CTIA) structure is employed to extend the dynamic range. The gain of the CTIA is automatically adjusted by switching different capacitors onto the integration node asynchronously according to the output voltage. A column-parallel ADC based on a two-step TDC is utilized to improve the conversion rate. The conversion is divided into a coarse phase and a fine phase. An error calibration scheme is also proposed to correct quantization errors caused by propagation delay skew within −Tclk to +Tclk. A linear CMOS image sensor pixel array is designed in a 0.13 μm CMOS process to verify this DR-enhanced high speed readout technique. The post-simulation results indicate that the dynamic range of the readout circuit is 99.02 dB, and the ADC achieves 60.22 dB SNDR and 9.71-bit ENOB at a conversion rate of 2 MS/s after calibration, a 14.04 dB and 2.4-bit improvement over the SNDR and ENOB without calibration.
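A two-step TDC of the kind described can be sketched as a coarse clock count plus fine interpolation of the residue; the clock period and fine LSB below are illustrative, not the paper's values:

```python
# Two-step time-to-digital conversion sketch: a coarse counter running on a
# reference clock, then fine interpolation within one clock period.
T_CLK = 10.0          # coarse clock period, ns (assumed)
FINE_LSB = 0.625      # fine resolution, ns -> 16 fine steps per period

def tdc(t_ns: float) -> tuple[int, int]:
    """Return (coarse, fine) codes for an input time interval."""
    coarse = int(t_ns // T_CLK)                      # whole clock periods
    fine = int((t_ns - coarse * T_CLK) // FINE_LSB)  # residue, interpolated
    return coarse, fine

coarse, fine = tdc(37.3)
t_hat = coarse * T_CLK + fine * FINE_LSB
print(coarse, fine, t_hat)   # 3 11 36.875 -> input quantized to the fine LSB
```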

  13. Long range image enhancement

    CSIR Research Space (South Africa)

    Duvenhage, B

    2015-11-01

    Full Text Available and Vision Computing, Auckland, New Zealand, 23-24 November 2015 Long Range Image Enhancement Bernardt Duvenhage Council for Scientific and Industrial Research South Africa Email: bduvenhage@csir.co.za Abstract Turbulent pockets of air...

  14. Short-range test of the universality of gravitational constant G at the millimeter scale using a digital image sensor

    Science.gov (United States)

    Ninomiya, K.; Akiyama, T.; Hata, M.; Hatori, M.; Iguri, T.; Ikeda, Y.; Inaba, S.; Kawamura, H.; Kishi, R.; Murakami, H.; Nakaya, Y.; Nishio, H.; Ogawa, N.; Onishi, J.; Saiba, S.; Sakuta, T.; Tanaka, S.; Tanuma, R.; Totsuka, Y.; Tsutsui, R.; Watanabe, K.; Murata, J.

    2017-09-01

    The composition dependence of the gravitational constant G is measured at the millimeter scale to test the weak equivalence principle, which may be violated at short range through new Yukawa interactions such as the dilaton exchange force. A torsion balance on a turning table, with two identical tungsten targets surrounded by two different attractor materials (copper and aluminum), is used to measure the gravitational torque by means of digital measurements from a position sensor. Values of the ratios G̃_Al-W/G̃_Cu-W − 1 and G̃_Cu-W/G_N − 1 were (0.9 ± 1.1_stat ± 4.8_sys) × 10^−2 and (0.2 ± 0.9_stat ± 2.1_sys) × 10^−2, respectively; these were obtained at a center-to-center separation of 1.7 cm and a surface-to-surface separation of 4.5 mm between target and attractor, and are consistent with the universality of G. A weak equivalence principle (WEP) violation parameter of η_Al-Cu(r ∼ 1 cm) = (0.9 ± 1.1_stat ± 4.9_sys) × 10^−2 at the shortest range of around 1 cm was also obtained.
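Short-range tests of this kind parametrize a possible violation as a Yukawa correction to the Newtonian potential; a standard form (not specific to this experiment's analysis) is:

```latex
V(r) = -\frac{G\, m_1 m_2}{r}\left(1 + \alpha\, e^{-r/\lambda}\right)
```

A composition-dependent coupling α, as arises for dilaton exchange, makes the effective G̃ differ between attractor materials, which is what the ratio G̃_Al-W/G̃_Cu-W probes at ranges r comparable to λ.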

  15. Analog Encoding Voltage—A Key to Ultra-Wide Dynamic Range and Low Power CMOS Image Sensor

    Directory of Open Access Journals (Sweden)

    Orly Yadid-Pecht

    2013-03-01

    Full Text Available Usually, Wide Dynamic Range (WDR) sensors that autonomously adjust their integration time to fit intra-scene illumination levels use a separate digital memory unit. This memory contains the data needed for the dynamic range. Motivated by the demands for low power and reduced chip area, we propose a different implementation of the aforementioned WDR algorithm, replacing the external digital memory with an analog in-pixel memory. This memory holds the effective integration time, represented by an analog encoding voltage (AEV). In addition, we present a “ranging” scheme for configuring the pixel integration time, in which the effective integration time is set during the first half of the frame. This enables a substantial simplification of the pixel control during the rest of the frame and thus allows a significantly larger DR extension. Furthermore, we present the implementation of the “ranging” and AEV concepts in two different designs, targeted to reach five and eight decades of DR, respectively. We describe the operation of both systems in detail and provide post-layout simulation results for the second solution. The simulations show that the second design reaches a DR of up to 170 dB. We also provide a comparative analysis of the number of operations per pixel required by our solution and by other widespread WDR algorithms. Based on the calculated results, we conclude that the proposed two designs, using the “ranging” and AEV concepts, are attractive, since they obtain a wide dynamic range at high operation speed and low power consumption.
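The decade and dB figures above are related by the usual sensor convention that one decade of intensity ratio corresponds to 20 dB:

```python
# Converting between decades of illumination ratio and dB of dynamic range:
# DR_dB = 20 * log10(I_max / I_min), so one decade = 20 dB.
def decades_to_db(decades: float) -> float:
    return 20.0 * decades

def db_to_decades(db: float) -> float:
    return db / 20.0

# The five- and eight-decade targets, and the simulated 170 dB result:
print(decades_to_db(5), decades_to_db(8), db_to_decades(170))  # 100.0 160.0 8.5
```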

  16. Image processing occupancy sensor

    Science.gov (United States)

    Brackney, Larry J.

    2016-09-27

    A system and method of detecting occupants in a building automation system environment using image-based occupancy detection and position determination. In one example, the system includes an image processing occupancy sensor that detects the number and position of occupants within a space that has controllable building elements such as lighting and ventilation diffusers. Based on the position and location of the occupants, the system can finely control the elements to optimize conditions for the occupants and to optimize energy usage, among other advantages.
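A toy sketch of image-based occupancy detection in this spirit (the patented system is considerably more sophisticated): frame differencing against a background image, with the centroid of changed pixels taken as the occupant position.

```python
import numpy as np

# Detect an occupant as a region of pixels that differ from the background
# frame, and report (pixel count, centroid). Real systems add background
# modelling, noise filtering, and per-person segmentation.
def detect_occupant(background: np.ndarray, frame: np.ndarray,
                    threshold: int = 30):
    diff = np.abs(frame.astype(int) - background.astype(int)) > threshold
    if diff.sum() == 0:
        return None                       # room appears empty
    rows, cols = np.nonzero(diff)
    return int(diff.sum()), (float(rows.mean()), float(cols.mean()))

bg = np.zeros((8, 8), dtype=np.uint8)
fr = bg.copy()
fr[2:4, 5:7] = 200                        # a bright "occupant" blob
print(detect_occupant(bg, fr))            # (4, (2.5, 5.5))
```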

  17. Semiconductor Sensors for a Wide Temperature Range

    OpenAIRE

    Nikolay GORBACHUK; Mikhail LARIONOV; Aleksey FIRSOV; Nikolay SHATIL

    2014-01-01

    Prototype sensors are described that are applicable for pressure, position, temperature, and field measurements in the temperature range of 4.2 to 300 K. The strain gauges utilize a silicon substrate and thin-film technology. The tensosensitivity of the strain sensors is 40 µV/mln-1 or better, depending on the metrological characteristics of the semiconductor films, orientation, and current. The temperature sensors (thermistors) make use of a germanium powder bulk. The temperature coefficient of resistance is within 50-100 %/K at 4.2 K.

  18. Wide Operational Range Thermal Sensor

    Science.gov (United States)

    Goebel, John H. (Inventor); McMurray, Robert E., Jr. (Inventor)

    2005-01-01

    Bolometer system and method for detecting, at BLIP levels, the presence of radiation over a broad range of wavelengths in the infrared spectrum and in a temperature range from 20 K to as high as room temperature. The radiation is received by a Si crystal having a region doped with one or more of In, Ga, S, Se, Te, B, Al, P, As, and Sb in a concentration ratio in a range such as 5 × 10^−11 to 5 × 10^−6. The change in electrical resistance ΔR due to receipt of the radiation is measured through a change in voltage difference or current within the crystal, and the quantity ΔR is converted to an estimate of the amount of radiation received. Optionally, incident radiation having an energy high enough to promote photoconductivity is removed before detection.
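A minimal readout model for such a bolometer: absorbed power heats the element, shifting its resistance through the temperature coefficient, so a measured ΔR converts back to power via the thermal conductance. All numeric values below are illustrative assumptions, not the patent's parameters.

```python
# Bolometer power estimate from a resistance change:
#   dR = alpha * R0 * dT   and   P = G_th * dT
# => P = G_th * dR / (alpha * R0)
alpha = -0.02        # temperature coefficient of resistance, 1/K (assumed)
r0 = 1.0e6           # element resistance at the operating point, ohm (assumed)
g_th = 1.0e-7        # thermal conductance to the bath, W/K (assumed)

def absorbed_power(delta_r: float) -> float:
    delta_t = delta_r / (alpha * r0)     # temperature rise inferred from dR
    return g_th * delta_t

print(absorbed_power(-2.0e4))   # a -20 kohm shift -> ~1e-7 W absorbed
```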

  19. Photon-counting image sensors

    CERN Document Server

    Teranishi, Nobukazu; Theuwissen, Albert; Stoppa, David; Charbon, Edoardo

    2017-01-01

    The field of photon-counting image sensors is advancing rapidly with the development of various solid-state image sensor technologies including single photon avalanche detectors (SPADs) and deep-sub-electron read noise CMOS image sensor pixels. This foundational platform technology will enable opportunities for new imaging modalities and instrumentation for science and industry, as well as new consumer applications. Papers discussing various photon-counting image sensor technologies and selected new applications are presented in this all-invited Special Issue.

  20. Capacitive Proximity Sensor Has Longer Range

    Science.gov (United States)

    Vranish, John M.

    1992-01-01

    Capacitive proximity sensor on robot arm detects nearby object via capacitive effect of object on frequency of oscillator. Sensing element part of oscillator circuit operating at about 20 kHz. Total capacitance between sensing element and ground constitutes tuning capacitance of oscillator. Sensor circuit includes shield driven by replica of alternating voltage applied to sensing element. Driven shield concentrates sensing electrostatic field in exterior region to enhance sensitivity to object. Sensitivity and dynamic range have a corresponding 12-to-1 improvement.
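How the detuning works can be sketched with a simple oscillator model in which frequency varies inversely with total capacitance; the component values are assumptions chosen to land near the quoted ~20 kHz:

```python
import math

# An object near the sensing element adds capacitance, lowering the
# oscillator frequency. Modelled here as f = 1 / (2*pi*R*C); R and the base
# capacitance are illustrative assumptions.
R = 80_000.0          # ohms (assumed)
C_BASE = 100e-12      # stray + element capacitance, farads (assumed)

def osc_freq(c_object: float) -> float:
    return 1.0 / (2 * math.pi * R * (C_BASE + c_object))

f0 = osc_freq(0.0)      # no object nearby
f1 = osc_freq(5e-12)    # object adds 5 pF
print(f"{f0:.0f} Hz -> {f1:.0f} Hz")   # ~20 kHz, shifting down as C grows
```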

  1. UAV sensor systems for close-range operations

    Science.gov (United States)

    Larroque, Clement-Serge; Thompson, Karl S.; Hickman, Duncan

    2002-07-01

    Although UAV systems have received much interest over the last few years, much of it has focused on either relatively large platforms with complex on-board equipment or micro systems (typically 6" in every dimension). The operational use of low-cost, lightweight UAVs as over-the-hill reconnaissance systems is a new concept offering additional flexibility, providing local knowledge and helping maintain operational tempo. An extensive modeling trade-off study has been performed for different sensor technologies and combinations. The model considered configurations including cooled and uncooled IR sensors, visible-band CCD sensors, and image intensifiers. These mathematical models provide an evaluation of sensor performance for both navigation and the gathering of reconnaissance imagery, through resolution-element calculations (Johnson criteria) and signal-to-noise ratios. Based upon this analysis, a system specification is presented that exploits next-generation sensor technologies. Results obtained from a number of UAV trials are reported and used to provide model verification and validation of both the operational concepts and the sensor system modeling activities. Considering the sensor system itself, the low-altitude, close-range environment ensures a high ground-resolved distance and signal-to-noise ratio with low-cost sensors. Coupled with up-to-date image processing software, imagery provided directly to section-level units via a simple standard image interface allows a reduced response time. Finally, future modeling and trials activities are discussed in the framework of the lightweight UAV system roadmap.
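A Johnson-criteria style calculation of the kind used in such trade-off models; the target size, range, and sensor IFOV below are assumed values, and the cycle thresholds are the commonly cited textbook ones, not this study's:

```python
# Johnson criteria: the number of resolvable cycles across a target predicts
# the probability of detection / recognition / identification.
def cycles_on_target(target_size_m: float, range_m: float,
                     ifov_mrad: float) -> float:
    """Resolvable cycles = target angular subtense / (2 pixels per cycle * IFOV)."""
    subtense_mrad = target_size_m / range_m * 1000.0
    return subtense_mrad / (2.0 * ifov_mrad)

# Assumed: 2.3 m vehicle, 1 km slant range, 0.25 mrad IFOV
n = cycles_on_target(target_size_m=2.3, range_m=1000.0, ifov_mrad=0.25)
# classic 50%-probability thresholds: ~1 cycle detect, ~4 recognize, ~6.4 identify
print(f"{n:.1f} cycles -> recognition {'possible' if n >= 4 else 'unlikely'}")
```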

  2. Semiconductor Sensors for a Wide Temperature Range

    Directory of Open Access Journals (Sweden)

    Nikolay GORBACHUK

    2014-01-01

    Full Text Available Prototype sensors are described that are applicable for pressure, position, temperature, and field measurements in the temperature range of 4.2 to 300 K. The strain gauges utilize a silicon substrate and thin-film technology. The tensosensitivity of the strain sensors is 40 µV/mln-1 or better, depending on the metrological characteristics of the semiconductor films, orientation, and current. The temperature sensors (thermistors) make use of a germanium powder bulk. The temperature coefficient of resistance is within 50-100 %/K at 4.2 K. The magnetic field sensors use GaAs films that offer weak temperature dependence of parameters at high sensitivity (up to 300-400 mV/T).

  3. CMOS sensors for atmospheric imaging

    Science.gov (United States)

    Pratlong, Jérôme; Burt, David; Jerram, Paul; Mayer, Frédéric; Walker, Andrew; Simpson, Robert; Johnson, Steven; Hubbard, Wendy

    2017-09-01

    Recent European atmospheric imaging missions have seen a move towards the use of CMOS sensors for the visible and NIR parts of the spectrum. These applications have particular challenges that are completely different from those that have driven the development of commercial sensors for applications such as cell-phone or SLR cameras. This paper will cover the design and performance of general-purpose image sensors that are to be used in the MTG (Meteosat Third Generation) and MetImage satellites and the technology challenges that they have presented. We will discuss how CMOS imagers have been designed with 4T pixel sizes of up to 250 μm square, achieving good charge transfer efficiency, or low lag, with signal levels up to 2M electrons and with high line rates. In both devices a low noise analogue read-out chain is used with correlated double sampling to suppress the readout noise and give a maximum dynamic range that is significantly larger than in standard commercial devices. Radiation hardness is a particular challenge for CMOS detectors, and both of these sensors have been designed to be fully radiation hard with high latch-up and single-event-upset tolerances, which is now silicon-proven on MTG. We will also cover the impact of ionising radiation on these devices. Because with such large pixels the photodiodes have a large open area, front illumination technology is sufficient to meet the detection efficiency requirements, but with thicker-than-standard epitaxial silicon to give improved IR response (note that this makes latch-up protection even more important). However, with narrow-band illumination, reflections from the front and back of the dielectric stack on top of the sensor produce Fabry-Perot étalon effects, which have been minimised with process modifications. We will also cover the addition of precision narrow-band filters inside the MTG package to provide a complete imaging subsystem. Control of reflected light is also critical in obtaining the …

  4. Precipitable water and surface humidity over global oceans from special sensor microwave imager and European Center for Medium Range Weather Forecasts

    Science.gov (United States)

    Liu, W. T.; Tang, Wenqing; Wentz, Frank J.

    1992-01-01

    Global fields of precipitable water W from the Special Sensor Microwave Imager (SSM/I) were compared with those from the European Center for Medium-Range Weather Forecasts (ECMWF) model. They agree over most ocean areas; both data sets capture the two annual cycles examined and the interannual anomalies during an ENSO episode. They show significant differences in the dry air masses over the eastern tropical-subtropical oceans, particularly in the Southern Hemisphere. In these regions, comparisons with radiosonde data indicate that overestimation by the ECMWF model accounts for a large part of the differences. As a check on the W differences, surface-level specific humidity Q derived from W, using a statistical relation, was compared with Q from the ECMWF model. The differences in Q were found to be consistent with the differences in W, indirectly validating the Q-W relation. In both W and Q, the SSM/I was able to discern clearly the equatorial extension of the tongues of dry air in the eastern tropical ocean, while both the ECMWF and climatological fields have reduced spatial gradients and weaker intensity.
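The statistical Q-W relation the abstract relies on can be illustrated with a simple polynomial fit; the (W, Q) pairs below are made up for the sketch, and the published relation is a higher-order polynomial fit to radiosonde climatology, not this one:

```python
import numpy as np

# Sketch: fit a quadratic mapping from column precipitable water W to
# surface-level specific humidity Q, then evaluate it at a new W.
W = np.array([10., 20., 30., 40., 50., 60.])    # precipitable water, kg/m^2 (made up)
Q = np.array([4.0, 7.5, 11., 14., 16.5, 18.5])  # surface humidity, g/kg (made up)

coeffs = np.polyfit(W, Q, deg=2)                # least-squares quadratic Q(W)
q_hat = float(np.polyval(coeffs, 35.0))         # estimate Q at W = 35 kg/m^2
print(round(q_hat, 2))                          # roughly 12 g/kg for this toy data
```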

  5. An Over 90 dB Intra-Scene Single-Exposure Dynamic Range CMOS Image Sensor Using a 3.0 μm Triple-Gain Pixel Fabricated in a Standard BSI Process

    Directory of Open Access Journals (Sweden)

    Isao Takayanagi

    2018-01-01

    Full Text Available To respond to the high demand for high dynamic range imaging suitable for moving objects with few artifacts, we have developed a single-exposure dynamic range image sensor by introducing a triple-gain pixel and a low noise dual-gain readout circuit. The developed 3 μm pixel is capable of three conversion gains. Introducing a new split-pinned photodiode structure, the linear full well reaches 40 ke−. Readout noise under the highest pixel gain condition is 1 e− with a low noise readout circuit. Merging two signals, one with high pixel gain and high analog gain, and the other with low pixel gain and low analog gain, a single-exposure high dynamic range (SEHDR) signal is obtained. Using this technology, a 1/2.7”, 2M-pixel CMOS image sensor has been developed and characterized. The image sensor also employs an on-chip linearization function, yielding a 16-bit linear signal at 60 fps, and an intra-scene dynamic range of higher than 90 dB was successfully demonstrated. This SEHDR approach inherently mitigates the artifacts from moving objects or time-varying light sources that can appear in the multiple-exposure high dynamic range (MEHDR) approach.

  6. An Over 90 dB Intra-Scene Single-Exposure Dynamic Range CMOS Image Sensor Using a 3.0 μm Triple-Gain Pixel Fabricated in a Standard BSI Process.

    Science.gov (United States)

    Takayanagi, Isao; Yoshimura, Norio; Mori, Kazuya; Matsuo, Shinichiro; Tanaka, Shunsuke; Abe, Hirofumi; Yasuda, Naoto; Ishikawa, Kenichiro; Okura, Shunsuke; Ohsawa, Shinji; Otaka, Toshinori

    2018-01-12

    To respond to the high demand for high dynamic range imaging suitable for moving objects with few artifacts, we have developed a single-exposure dynamic range image sensor by introducing a triple-gain pixel and a low noise dual-gain readout circuit. The developed 3 μm pixel is capable of three conversion gains. Introducing a new split-pinned photodiode structure, the linear full well reaches 40 ke-. Readout noise under the highest pixel gain condition is 1 e- with a low noise readout circuit. Merging two signals, one with high pixel gain and high analog gain, and the other with low pixel gain and low analog gain, a single-exposure high dynamic range (SEHDR) signal is obtained. Using this technology, a 1/2.7", 2M-pixel CMOS image sensor has been developed and characterized. The image sensor also employs an on-chip linearization function, yielding a 16-bit linear signal at 60 fps, and an intra-scene dynamic range of higher than 90 dB was successfully demonstrated. This SEHDR approach inherently mitigates the artifacts from moving objects or time-varying light sources that can appear in the multiple-exposure high dynamic range (MEHDR) approach.
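The dual-gain merge at the heart of a single-exposure HDR readout can be sketched as follows; the gain ratio and saturation threshold are assumptions for illustration, not the paper's values:

```python
import numpy as np

# Merge a high-gain and a low-gain readout of the same exposure into one
# linear signal: use the high-gain sample where it is unsaturated, otherwise
# rescale the low-gain sample by the conversion-gain ratio.
GAIN_RATIO = 8.0          # high-gain / low-gain conversion gain (assumed)
SAT_CODE = 4000.0         # high-gain saturation threshold, DN (assumed)

def merge_hdr(high: np.ndarray, low: np.ndarray) -> np.ndarray:
    return np.where(high < SAT_CODE, high, low * GAIN_RATIO)

high = np.array([120.0, 3500.0, 4095.0, 4095.0])   # clips on bright pixels
low = np.array([15.0, 437.5, 700.0, 4000.0])       # still linear when high clips
print(merge_hdr(high, low))   # bright pixels come from the rescaled low-gain path
```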

  7. Close-range sensors for small unmanned bottom vehicles

    Science.gov (United States)

    Bernstein, Charles L.

    1999-07-01

    The Surf Zone Reconnaissance Project is developing sensors for small, autonomous, Underwater Bottom-crawling Vehicles (UBVs) for the detection and classification of mines and obstacles on the ocean bottom in depths between 0 and 20 feet. The challenge is to exploit many target features by using a suite of small, inexpensive, and low-power sensors. The goal is to enable the UBVs to detect and classify objects on the sea floor autonomously. A unique aspect of the project is that sensing can occur at very short ranges. The goal for detection range is two to four meters, and classification may involve direct contact between the sensor and the target. The techniques under development include mechanical impulse response, surface profiling, magnetic anomaly sensing, pulse-induction sensing, shape tracing, and imaging. Initial studies have confirmed the usefulness of several of these techniques. Specific behaviors, termed microbehaviors, are being developed to support each sensor by exploiting the vehicle's mobility. Project plans for FY99 and FY00 include construction of a prototype sensor suite and collection of a signature database. The database will be used to develop fusion, detection, and classification algorithms that will be demonstrated over the next four years for the Office of Naval Research.

  8. Visual Image Sensor Organ Replacement

    Science.gov (United States)

    Maluf, David A.

    2014-01-01

    This innovation is a system that augments human vision through a technique called "Sensing Super-position" using a Visual Instrument Sensory Organ Replacement (VISOR) device. The VISOR device translates the output of visual and other sensors (e.g., thermal) into sounds to enable very difficult sensing tasks. Three-dimensional spatial brightness and multi-spectral maps of a sensed image are processed using real-time image processing techniques (e.g. histogram normalization) and transformed into a two-dimensional map of an audio signal as a function of frequency and time. Because the human hearing system is capable of learning to process and interpret extremely complicated and rapidly changing auditory patterns, the translation of images into sounds reduces the risk of accidentally filtering out important clues. The VISOR device was developed to augment the current state-of-the-art head-mounted (helmet) display systems. It provides the ability to sense beyond the human visible light range, to increase human sensing resolution, to use wider angle visual perception, and to improve the ability to sense distances. It also allows compensation for movement by the human or changes in the scene being viewed.

  9. Sampling Number Effects in 2D and Range Imaging of Range-gated Acquisition

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Seong-Ouk; Park, Seung-Kyu; Baik, Sung-Hoon; Cho, Jai-Wan; Jeong, Kyung-Min [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    In this paper, we analyzed how the number of sampling images affects the 2D image and the range image produced from acquired RGI images, using an RGI vision system. The results show that 2D image quality depended not so much on the number of sampling images as on how well efficient RGI images were extracted. The number of RGI images was, however, important for making a range image, because range image quality is proportional to the number of RGI images. Image acquisition in a monitoring area of the nuclear industry is an important function for safety inspection and for preparing appropriate control plans. To overcome the non-visualization problem caused by airborne obstacle particles, vision systems need extra functions, such as active illumination through the disturbing airborne particles. One such powerful active vision system is the range-gated imaging system, which can acquire image data in raining or smoking environments. Range-gated imaging (RGI) is a direct active visualization technique using a highly sensitive image sensor and a high-intensity illuminant. Currently, the range-gated imaging technique, providing 2D and 3D images, is one of the emerging active vision technologies. The range-gated imaging system obtains vision information by summing time-sliced vision images. In the RGI system, a high-intensity illuminant flashes for an ultra-short time and a highly sensitive image sensor is gated with an ultra-short exposure time to capture only the illumination light. Here, the illuminant illuminates objects by flashing strong light through the airborne disturbance particles. Thus, in contrast to passive conventional vision systems, the RGI active vision technology is robust in low-visibility environments.
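
    As a sketch of why range image quality grows with the number of gated slices, the per-pixel range can be estimated as an intensity-weighted centroid over the gate ranges. The centroid rule and function names here are illustrative assumptions, not the system's actual algorithm:

    ```python
    import numpy as np

    def range_from_slices(slices, gate_ranges):
        """Estimate a range image from a stack of range-gated images.

        slices: (K, H, W) stack of gated intensity images, one per gate.
        gate_ranges: length-K sequence of gate centre ranges in metres.
        More slices sample the scene depth more finely, so the centroid
        estimate improves as K grows.
        """
        s = np.asarray(slices, dtype=float)
        r = np.asarray(gate_ranges, dtype=float).reshape(-1, 1, 1)
        total = s.sum(axis=0)
        total[total == 0] = np.nan  # pixels with no return: range undefined
        return (s * r).sum(axis=0) / total
    ```

    Summing the same stack along the gate axis without the range weights yields the corresponding 2D intensity image.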

  10. Close-range sensors for small unmanned bottom vehicles: update

    Science.gov (United States)

    Bernstein, Charles L.

    2000-07-01

    The Surf Zone Reconnaissance Project is developing sensors for small, autonomous, Underwater Bottom-crawling Vehicles. The objective is to enable small, crawling robots to autonomously detect and classify mines and obstacles on the ocean bottom in depths between 0 and 10 feet. We have identified a promising set of techniques that will exploit the electromagnetic, shape, texture, image, and vibratory-modal features of these mines. During FY99 and FY00 we have worked toward refining these techniques. Signature data sets have been collected for a standard target set to facilitate the development of sensor fusion and target detection and classification algorithms. Specific behaviors, termed microbehaviors, are developed to utilize the robot's mobility to position and operate the sensors. A first generation, close-range sensor suite, composed of 5 sensors, will be completed and tested on a crawling platform in FY00, and will be further refined and demonstrated in FY01 as part of the Mine Countermeasures 6.3 core program sponsored by the Office of Naval Research.

  11. Image-based occupancy sensor

    Science.gov (United States)

    Polese, Luigi Gentile; Brackney, Larry

    2015-05-19

    An image-based occupancy sensor includes a motion detection module that receives and processes an image signal to generate a motion detection signal, a people detection module that receives the image signal and processes the image signal to generate a people detection signal, a face detection module that receives the image signal and processes the image signal to generate a face detection signal, and a sensor integration module that receives the motion detection signal from the motion detection module, receives the people detection signal from the people detection module, receives the face detection signal from the face detection module, and generates an occupancy signal using the motion detection signal, the people detection signal, and the face detection signal, with the occupancy signal indicating vacancy or occupancy, with an occupancy indication specifying that one or more people are detected within the monitored volume.
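
    A minimal sketch of the sensor-integration step the abstract describes, assuming a simple OR-style fusion rule (the patent does not specify the actual integration logic, and the function name is invented):

    ```python
    def integrate_occupancy(motion, people, face):
        """Combine the three per-frame detection signals into a single
        occupancy signal: 'occupied' if any detector fires, else 'vacant'."""
        return "occupied" if (motion or people or face) else "vacant"
    ```

    A practical integrator would also debounce the signals over time so a single missed frame does not flip the output to vacant.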

  12. Uncooled thermal imaging sensor for UAV applications

    Science.gov (United States)

    Cochrane, Derick M.; Manning, Paul A.; Wyllie, Tim A.

    2001-10-01

    Research by DERA aimed at unmanned air vehicle (UAV) size reduction and control automation has led to a unique solution for a short range reconnaissance UAV system. Known as OBSERVER, the UAV conventionally carries a lightweight visible band sensor payload producing imagery with a large 40°x90° field of regard (FOR) to maximize spatial awareness and target detection ranges. Images taken from three CCD camera units, set at elevations from plan view up to the near horizon, are 'stitched' together to produce the large contiguous sensor footprint. This paper describes the design of a thermal imaging (TI) sensor which has been developed to be compatible with the OBSERVER UAV system. The sensor is based on UK uncooled thermal imaging technology research and offers a compact and lightweight solution operating in the 8-12 μm waveband without the need for cryogenic cooling. Infra-red radiation is gathered using two lead scandium tantalate (PST) hybrid thermal detectors, each with a 384 × 288 pixel resolution, known as the Very Large Array (VLA). The TI system is designed to match the imaging format of the visible band sensor. In order to practically achieve this with adequate resolution performance, a dual field of view (FOV) optical system is used within a pitchable gimbal. This combines the advantages of a wide angle 40°x30° FOV for target detection and a narrow angle 13°x10° FOV 'foveal patch' to improve target recognition ranges. The gimbal system can be steered in elevation to give the full 90° coverage as with the visible band sensor footprint. The concept of operation is that targets can be detected over the large FOV and then the air vehicle is maneuvered so as to bring the target into the foveal patch view for recognition at an acceptable stand-off range.

  13. Calibration and control for range imaging in mobile robot navigation

    Energy Technology Data Exchange (ETDEWEB)

    Dorum, O.H. [Norges Tekniske Hoegskole, Trondheim (Norway). Div. of Computer Systems and Telematics; Hoover, A. [University of South Florida, Tampa, FL (United States). Dept. of Computer Science and Engineering; Jones, J.P. [Oak Ridge National Lab., TN (United States)

    1994-06-01

    This paper addresses some issues in the development of sensor-based systems for mobile robot navigation which use range imaging sensors as the primary source for geometric information about the environment. In particular, we describe a model of scanning laser range cameras which takes into account the properties of the mechanical system responsible for image formation and a calibration procedure which yields improved accuracy over previous models. In addition, we describe an algorithm which takes the limitations of these sensors into account in path planning and path execution. In particular, range imaging sensors are characterized by a limited field of view and a standoff distance -- a minimum distance nearer than which surfaces cannot be sensed. These limitations can be addressed by enriching the concept of configuration space to include information about what can be sensed from a given configuration, and using this information to guide path planning and path following.
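
    The enriched configuration-space test could look like the following sketch, assuming a 2D pose and hypothetical standoff, maximum-range, and field-of-view values (none of these numbers come from the paper):

    ```python
    import math

    def sensable(sensor_xy, heading_rad, point_xy,
                 standoff=0.5, max_range=10.0, half_fov=math.radians(30)):
        """True if point_xy can be sensed from the given pose: the point
        must lie beyond the standoff distance, within the maximum range,
        and inside the sensor's field of view."""
        dx = point_xy[0] - sensor_xy[0]
        dy = point_xy[1] - sensor_xy[1]
        dist = math.hypot(dx, dy)
        if not (standoff <= dist <= max_range):
            return False
        bearing = math.atan2(dy, dx)
        # wrap the bearing error into [-pi, pi] before comparing to the FOV
        off_axis = abs((bearing - heading_rad + math.pi) % (2 * math.pi) - math.pi)
        return off_axis <= half_fov
    ```

    A path planner can then prefer configurations from which the surfaces of interest return True, which is the spirit of the enriched configuration space the paper proposes.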

  14. Imaging Sensors: Artificial and Natural

    Indian Academy of Sciences (India)

    Imaging Sensors: Artificial and Natural. Vikram Dhar. General Article, Volume 4, Issue 2, February 1999, pp. 27-36. Permanent link: http://www.ias.ac.in/article/fulltext/reso/004/02/0027-0036

  15. The emerging versatility of a scannerless range imager

    Energy Technology Data Exchange (ETDEWEB)

    Sackos, J.; Bradley, B.; Nellums, B.; Diegert, C.

    1996-04-01

    Sandia National Laboratories is nearing the completion of the initial development of a unique type of range imaging sensor. This innovative imaging optical radar is based on an active flood-light scene illuminator and an image intensified CCD camera receiver. It is an all solid-state device (no moving parts) and offers significant size, performance, reliability, simplicity, and affordability advantages over other types of 3-D sensor technologies, including: scanned laser radar, stereo vision, and structured lighting. The sensor is based on low cost, commercially available hardware, and is very well suited for affordable application to a wide variety of military and commercial uses, including: munition guidance, target recognition, robotic vision, automated inspection, driver enhanced vision, collision avoidance, site security and monitoring, terrain mapping, and facility surveying. This paper reviews the sensor technology and its development for the advanced conventional munition guidance application, and discusses a few of the many other emerging applications for this new innovative sensor technology.

  16. Thresholded Range Aggregation in Sensor Networks

    DEFF Research Database (Denmark)

    Yiu, Man Lung; Lin, Zhifeng; Mamoulis, Nikos

    2010-01-01

    The recent advances in wireless sensor technologies (e.g., Mica, Telos motes) enable the economic deployment of lightweight sensors for capturing data from their surrounding environment, serving various monitoring tasks such as forest wildfire alarming and volcano activity monitoring. We propose a novel query...

  17. Image sensors for radiometric measurements in the ocean

    Digital Repository Service at National Institute of Oceanography (India)

    Desa, E.S.; Desa, B.A.E.

    image sensors for use in obtaining high resolution spectra of the upward and downward irradiance light fields in the ocean. Image sensors studied here have practical dynamic ranges of the order of 10⁴. Given this, it is possible to work...

  18. 3D imaging without range information

    Science.gov (United States)

    Rogers, J. D.; Myatt, D. R.

    2010-04-01

    Three-dimensional (3D) imaging technologies have considerable potential for aiding military operations in areas such as reconnaissance, mission planning and situational awareness through improved visualisation and user-interaction. This paper describes the development of fast 3D imaging capabilities from low-cost, passive sensors. The two systems discussed here are capable of passive depth perception and recovering 3D structure from a single electro-optic sensor attached to an aerial vehicle that is, for example, circling a target. Based on this example, the proposed method has been shown to produce high quality results when positional data of the sensor is known, and also in the more challenging case when the sensor geometry must be estimated from the input imagery alone. The methods described exploit prior knowledge concerning the type of sensor that is used to produce a more robust output.

  19. Passive Wireless Temperature Sensors with Enhanced Sensitivity and Range Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal describes the development of passive surface acoustic wave (SAW) temperature sensors with enhanced sensitivity and detection range for NASA application...

  20. Vertical Silicon Nanowires for Image Sensor Applications

    OpenAIRE

    Park, Hyunsung

    2014-01-01

    Conventional image sensors achieve color imaging using absorptive organic dye filters. These face considerable challenges however in the trend toward ever higher pixel densities and advanced imaging methods such as multispectral imaging and polarization-resolved imaging. In this dissertation, we investigate the optical properties of vertical silicon nanowires with the goal of image sensor applications. First, we demonstrate a multispectral imaging system that uses a novel filter that consists...

  1. Temperature Sensors Integrated into a CMOS Image Sensor

    NARCIS (Netherlands)

    Abarca Prouza, A.N.; Xie, S.; Markenhof, Jules; Theuwissen, A.J.P.

    2017-01-01

    In this work, a novel approach is presented for measuring relative temperature variations inside the pixel array of a CMOS image sensor itself. This approach can give important information when compensation for dark (current) fixed pattern noise (FPN) is needed. The test image sensor consists of

  2. Progress in sensor performance testing, modeling and range prediction using the TOD method : An overview

    NARCIS (Netherlands)

    Bijl, P.; Hogervorst, M.A.; Toet, A.

    2017-01-01

    The Triangle Orientation Discrimination (TOD) methodology includes i) a widely applicable, accurate end-to-end EO/IR sensor test, ii) an image-based sensor system model and iii) a Target Acquisition (TA) range model. The method has been extensively validated against TA field performance for a wide

  3. POTENTIALS OF IMAGE BASED ACTIVE RANGING TO CAPTURE DYNAMIC SCENES

    Directory of Open Access Journals (Sweden)

    B. Jutzi

    2012-09-01

    Obtaining a 3D description of man-made and natural environments is a basic task in Computer Vision and Remote Sensing. To this end, laser scanning is currently one of the dominating techniques for gathering reliable 3D information. The scanning principle inherently needs a certain time interval to acquire the 3D point cloud. On the other hand, new active sensors provide the possibility of capturing range information by images with a single measurement. With this new technique, image-based active ranging is possible, which allows capturing dynamic scenes, e.g. walking pedestrians in a yard or moving vehicles. Unfortunately, most of these range imaging sensors have strong technical limitations and are not yet sufficient for airborne data acquisition. It can be seen from the recent development of highly specialized far-range imaging sensors (so-called flash-light lasers) that most of the limitations could be alleviated soon, so that future systems will be equipped with improved image size and potentially expanded operating range. The presented work is a first step towards the development of methods capable of applying range images in outdoor environments. To this end, an experimental setup was assembled for investigating these possibilities. With this setup a measurement campaign was carried out, and first results are presented within this paper.

  4. Interferometric fiber optic sensors for biomedical applications of optoacoustic imaging.

    Science.gov (United States)

    Lamela, Horacio; Gallego, Daniel; Gutierrez, Rebeca; Oraevsky, Alexander

    2011-03-01

    We present a non-metallic interferometric silica optical fiber ultrasonic wideband sensor for optoacoustic imaging applications. The ultrasonic sensitivity of this sensor has been characterized over the frequency range from 1 to 10 MHz. A comparative analysis has been carried out between this sensor and an array of piezoelectric transducers using optoacoustic signals generated from an optical absorbent embedded in a tissue mimicking phantom. Also, a two dimensional reconstructed image of the phantom using the fiber interferometric sensor is presented and compared to the image obtained using the Laser Optoacoustic Imaging System, LOIS-64B. The feasibility of our fiber optic based sensor for wideband ultrasonic detection is demonstrated. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Wide Stiffness Range Cavity Optomechanical Sensors for Atomic Force Microscopy

    CERN Document Server

    Liu, Yuxiang; Aksyuk, Vladimir; Srinivasan, Kartik

    2012-01-01

    We report on progress in developing compact sensors for atomic force microscopy (AFM), in which the mechanical transducer is integrated with near-field optical readout on a single chip. The motion of a nanoscale, doubly-clamped cantilever was transduced by an adjacent high quality factor silicon microdisk cavity. In particular, we show that displacement sensitivity on the order of 1 fm/√Hz can be achieved while the cantilever stiffness is varied over four orders of magnitude (≈0.01 N/m to ≈290 N/m). The ability to transduce both very soft and very stiff cantilevers extends the domain of applicability of this technique, potentially ranging from interrogation of microbiological samples (soft cantilevers) to imaging with high resolution (stiff cantilevers). Along with mechanical frequencies (> 250 kHz) that are much higher than those used in conventional AFM probes of similar stiffness, these results suggest that our cavity optomechanical sensors may have application in a wide variety of hig...

  6. FIRST EXPERIENCES WITH KINECT V2 SENSOR FOR CLOSE RANGE 3D MODELLING

    Directory of Open Access Journals (Sweden)

    E. Lachat

    2015-02-01

    RGB-D cameras, also known as range imaging cameras, are a recent generation of sensors. As they are suitable for measuring distances to objects at high frame rate, such sensors are increasingly used for 3D acquisitions, and more generally for applications in robotics and computer vision. This kind of sensor became popular especially after the Kinect v1 (Microsoft) arrived on the market in November 2010. In July 2014, Microsoft released a new sensor, the Kinect for Windows v2, based on a different technology from its first device. However, because it was initially developed for video games, the quality assessment of this new device for 3D modelling represents a major investigation axis. In this paper, first experiences with the Kinect v2 sensor are reported, and its ability for close range 3D modelling is investigated. For this purpose, error sources on output data as well as a calibration approach are presented.

  7. A Biologically Inspired CMOS Image Sensor

    CERN Document Server

    Sarkar, Mukul

    2013-01-01

    Biological systems are a source of inspiration in the development of small autonomous sensor nodes. The two major types of optical vision systems found in nature are the single-aperture human eye and the compound eye of insects. The latter are among the most compact and smallest vision sensors. The compound eye is composed of individual lenses, each with its own photoreceptor array. The visual system of insects allows them to fly with limited intelligence and brain processing power. A CMOS image sensor replicating the perception of vision in insects is discussed and designed in this book for industrial (machine vision) and medical applications. The CMOS metal layer is used to create an embedded micro-polarizer able to sense polarization information. This polarization information is shown to be useful in applications like real-time material classification and autonomous agent navigation. Further, the sensor is equipped with in-pixel analog and digital memories which allow variation of the dynamic range and in-pixel b...

  8. Performance of an Ultrasonic Ranging Sensor in Apple Tree Canopies

    Directory of Open Access Journals (Sweden)

    Alexandre Escolà

    2011-02-01

    Electronic canopy characterization is an important issue in tree crop management. Ultrasonic and optical sensors are the most used for this purpose. The objective of this work was to assess the performance of an ultrasonic sensor under laboratory and field conditions in order to provide reliable estimations of distance measurements to apple tree canopies. To this purpose, a methodology has been designed to analyze sensor performance in relation to foliage ranging and to interferences with adjacent sensors when working simultaneously. Results show that the average error in distance measurement using the ultrasonic sensor in laboratory conditions is ±0.53 cm. However, the increase of variability in field conditions reduces the accuracy of this kind of sensors when estimating distances to canopies. The average error in such situations is ±5.11 cm. When analyzing interferences of adjacent sensors 30 cm apart, the average error is ±17.46 cm. When sensors are separated 60 cm, the average error is ±9.29 cm. The ultrasonic sensor tested has been proven to be suitable to estimate distances to the canopy in field conditions when sensors are 60 cm apart or more and could, therefore, be used in a system to estimate structural canopy parameters in precision horticulture.
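
    The underlying measurement is plain ultrasonic time-of-flight ranging; a minimal sketch (the speed-of-sound constant is an assumption, and the paper's calibration and interference handling are omitted):

    ```python
    SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degC (assumed)

    def distance_m(echo_round_trip_s):
        """Distance is half the echo round-trip time multiplied by the
        speed of sound; errors in either term map directly into the
        cm-level ranging errors reported above."""
        return SPEED_OF_SOUND * echo_round_trip_s / 2.0
    ```

    Cross-talk between adjacent sensors corrupts the round-trip time itself, which is why the reported error grows when sensors are mounted only 30 cm apart.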

  9. Performance of an Ultrasonic Ranging Sensor in Apple Tree Canopies

    Science.gov (United States)

    Escolà, Alexandre; Planas, Santiago; Rosell, Joan Ramon; Pomar, Jesús; Camp, Ferran; Solanelles, Francesc; Gracia, Felip; Llorens, Jordi; Gil, Emilio

    2011-01-01

    Electronic canopy characterization is an important issue in tree crop management. Ultrasonic and optical sensors are the most used for this purpose. The objective of this work was to assess the performance of an ultrasonic sensor under laboratory and field conditions in order to provide reliable estimations of distance measurements to apple tree canopies. To this purpose, a methodology has been designed to analyze sensor performance in relation to foliage ranging and to interferences with adjacent sensors when working simultaneously. Results show that the average error in distance measurement using the ultrasonic sensor in laboratory conditions is ±0.53 cm. However, the increase of variability in field conditions reduces the accuracy of this kind of sensors when estimating distances to canopies. The average error in such situations is ±5.11 cm. When analyzing interferences of adjacent sensors 30 cm apart, the average error is ±17.46 cm. When sensors are separated 60 cm, the average error is ±9.29 cm. The ultrasonic sensor tested has been proven to be suitable to estimate distances to the canopy in field conditions when sensors are 60 cm apart or more and could, therefore, be used in a system to estimate structural canopy parameters in precision horticulture. PMID:22163749

  11. Platinum sensors versus KTY and NTC in low temperature range

    Energy Technology Data Exchange (ETDEWEB)

    Wienand, K. [Heraeus Sensor-Nite GmbH, Kleinostheim (Germany); Gerwen, P. van [Heraeus Sensor-Nite N.V., Leuven (Netherlands); Reinwald, H.J. [Heraeus Sensor-Nite Int., Freiberg (Germany)

    2001-07-01

    On the automotive electronics market, negative temperature coefficient sensors (NTC) and silicon spreading resistance sensors (KTY) have increasingly been used above all in the temperature range between -40 and +150 C. The latest demands of the automotive industry show that these tight temperature limits will no longer meet the requirements in the future. Moreover, the automotive industry is more frequently expanding the temperature measuring range to between -55 C and 180 C, for example in engine oil. This trend can also be seen in the commercial vehicle field, for example with retarders which also heat the oil to a great extent. Due to these increasingly more demanding conditions, platinum (Pt) sensors are being used more and more, as they have a number of advantages compared with NTCs or KTYs. The pros and cons of using these three sensor types are explained in more detail in the following. (orig.)

  12. Current-mode CMOS hybrid image sensor

    Science.gov (United States)

    Benyhesan, Mohammad Kassim

    Digital imaging is growing rapidly, making Complementary Metal-Oxide-Semiconductor (CMOS) image sensor-based cameras indispensable in many modern devices like cell phones, surveillance devices, personal computers, and tablets. For various purposes, wireless portable image systems are widely deployed in many indoor and outdoor places such as hospitals, urban areas, streets, highways, forests, mountains, and towers. However, the increased demand for high-resolution image sensors and improved processing features is expected to increase the power consumption of CMOS sensor-based camera systems. Increased power consumption translates into a reduced battery lifetime. The increased power consumption might not be a problem if there is access to a nearby charging station. On the other hand, the problem arises if the image sensor is located in widely spread areas, unfavorable to human intervention and difficult to reach. Given the limitation of energy sources available for a wireless CMOS image sensor, an energy harvesting technique presents a viable solution to extend the sensor lifetime. Energy can be harvested from sunlight or the artificial light surrounding the sensor itself. In this thesis, we propose a current-mode CMOS hybrid image sensor capable of energy harvesting and image capture. The proposed sensor is based on a hybrid pixel that can be programmed to perform the task of an image sensor and the task of a solar cell to harvest energy. The basic idea is to design a pixel that can be configured to exploit its internal photodiode to perform two functions: image sensing and energy harvesting. As a proof of concept, a 40 × 40 array of hybrid pixels has been designed and fabricated in a standard 0.5 μm CMOS process. Measurement results show that up to 39 μW of power can be harvested from the array under 130 klux illumination, with an energy efficiency of 220 nJ/pixel/frame. The proposed image sensor is a current-mode image sensor which has several

  13. Thermal luminescence spectroscopy chemical imaging sensor.

    Science.gov (United States)

    Carrieri, Arthur H; Buican, Tudor N; Roese, Erik S; Sutter, James; Samuels, Alan C

    2012-10-01

    The authors present a pseudo-active chemical imaging sensor model embodying irradiative transient heating, temperature nonequilibrium thermal luminescence spectroscopy, differential hyperspectral imaging, and artificial neural network technologies integrated together. We elaborate on various optimizations, simulations, and animations of the integrated sensor design and apply it to the terrestrial chemical contamination problem, where the interstitial contaminant compounds of detection interest (analytes) comprise liquid chemical warfare agents, their various derivative condensed phase compounds, and other material of a life-threatening nature. The sensor must measure and process a dynamic pattern of absorptive-emissive middle infrared molecular signature spectra of subject analytes to perform its chemical imaging and standoff detection functions successfully.

  14. Panoramic imaging perimeter sensor design and modeling

    Energy Technology Data Exchange (ETDEWEB)

    Pritchard, D.A.

    1993-12-31

    This paper describes the conceptual design and preliminary performance modeling of a 360-degree imaging sensor. This sensor combines automatic perimeter intrusion detection with immediate visual assessment and is intended to be used for fast deployment around fixed or temporary high-value assets. The sensor requirements, compiled from various government agencies, are summarized. The conceptual design includes longwave infrared and visible linear array technology. An auxiliary millimeter-wave sensing technology is also considered for use during periods of infrared and visible obscuration. The infrared detectors proposed for the sensor design are similar to the Standard Advanced Dewar Assembly Types Three A and B (SADA-IIIA/B). An overview of the sensor and processor is highlighted. The infrared performance of this sensor design has been predicted using existing thermal imaging system models and is described in the paper. Future plans for developing a prototype are also presented.

  15. Onboard Image Processing System for Hyperspectral Sensor.

    Science.gov (United States)

    Hihara, Hiroki; Moritani, Kotaro; Inoue, Masao; Hoshi, Yoshihiro; Iwasaki, Akira; Takada, Jun; Inada, Hitomi; Suzuki, Makoto; Seki, Taeko; Ichikawa, Satoshi; Tanii, Jun

    2015-09-25

    Onboard image processing systems for a hyperspectral sensor have been developed to maximize image data transmission efficiency given the large volume and high speed of the downlink. Since more than 100 channels are required for hyperspectral sensors on Earth observation satellites, fast and small-footprint lossless image compression is essential for reducing the size and weight of the sensor system. A fast lossless image compression algorithm has been developed and implemented in the onboard circuitry that corrects the sensitivity and linearity of the Complementary Metal Oxide Semiconductor (CMOS) sensors, in order to maximize the compression ratio. The image compression method is based on the Fast, Efficient, Lossless Image compression System (FELICS), a hierarchical predictive coding method with resolution scaling. To improve the image decorrelation and entropy coding of FELICS, we apply two-dimensional interpolation prediction and adaptive Golomb-Rice coding. The method supports progressive decompression using resolution scaling while maintaining superior speed and low complexity. The coding efficiency and compression speed enlarge the effective capacity of the signal transmission channels, which allows onboard hardware to be reduced by multiplexing sensor signals into a smaller number of compression circuits. The circuitry is embedded into the data formatter of the sensor system without adding size, weight, power consumption, or fabrication cost.
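    The prediction-plus-Golomb-Rice idea can be sketched as follows. This is a simplified illustration, not the flight implementation: the single-neighbour predictor and fixed Rice parameter stand in for FELICS's two-dimensional interpolation prediction and adaptive parameter selection.

```python
def golomb_rice_bits(value, k):
    """Golomb-Rice code for a non-negative integer: unary-coded
    quotient, a '0' terminator, then the k-bit binary remainder."""
    q, r = value >> k, value & ((1 << k) - 1)
    return "1" * q + "0" + (format(r, "0{}b".format(k)) if k else "")

def zigzag(e):
    """Map a signed prediction residual to a non-negative index:
    0, -1, 1, -2, 2, ... -> 0, 1, 2, 3, 4, ..."""
    return (e << 1) if e >= 0 else -(e << 1) - 1

def encode_row(pixels, k=2):
    """Predictive coding of one image row: predict each pixel from its
    left neighbour, then Golomb-Rice code the zigzag-mapped residual."""
    bits, prev = [], 0
    for px in pixels:
        bits.append(golomb_rice_bits(zigzag(px - prev), k))
        prev = px
    return "".join(bits)
```

Because smooth image rows produce small residuals, most codewords stay near the minimum length of k + 1 bits, which is what makes this family of coders both fast and compact in hardware.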

  16. RSA/Legacy Wind Sensor Comparison. Part 2; Eastern Range

    Science.gov (United States)

    Short, David A.; Wheeler, Mark M.

    2006-01-01

    This report describes a comparison of data from ultrasonic and propeller-and-vane anemometers on 5 wind towers at Kennedy Space Center and Cape Canaveral Air Force Station. The ultrasonic sensors are scheduled to replace the Legacy propeller-and-vane sensors under the Range Standardization and Automation (RSA) program. Because previous studies have noted differences between peak wind speeds reported by mechanical and ultrasonic wind sensors, the latter having no moving parts, the 30th and 45th Weather Squadrons wanted to understand possible differences between the two sensor types. The period-of-record was 13-30 May 2005. A total of 357,626 readings of 1-minute average and peak wind speed/direction from each sensor type were used. Statistics of differences in speed and direction were used to identify 15 out of 19 RSA sensors having the most consistent performance, with respect to the Legacy sensors. RSA average wind speed data from these 15 showed a small positive bias of 0.38 kts. A slightly larger positive bias of 0.94 kts was found in the RSA peak wind speed.

  17. RSA/Legacy Wind Sensor Comparison. Part 1; Western Range

    Science.gov (United States)

    Short, David A.; Wheeler, Mark M.

    2006-01-01

    This report describes a comparison of data from ultrasonic and cup-and-vane anemometers on 5 wind towers at Vandenberg AFB. The ultrasonic sensors are scheduled to replace the Legacy cup-and-vane sensors under the Range Standardization and Automation (RSA) program. Because previous studies have noted differences between peak wind speeds reported by mechanical and ultrasonic wind sensors, the latter having no moving parts, the 30th and 45th Weather Squadrons wanted to understand possible differences between the two sensor types. The period-of-record was 13-30 May 2005. A total of 153,961 readings of 1-minute average and peak wind speed/direction from each sensor type were used. Statistics of differences in speed and direction were used to identify 18 out of 34 RSA sensors having the most consistent performance, with respect to the Legacy sensors. Data from these 18 were used to form a composite comparison. A small positive bias in the composite RSA average wind speed increased from +0.5 kts at 15 kts, to +1 kt at 25 kts. A slightly larger positive bias in the RSA peak wind speed increased from +1 kt at 15 kts, to +2 kts at 30 kts.

  18. Range imager performance comparison in homodyne and heterodyne operating modes

    Science.gov (United States)

    Conroy, Richard M.; Dorrington, Adrian A.; Künnemeyer, Rainer; Cree, Michael J.

    2009-01-01

    Range imaging cameras measure depth simultaneously for every pixel in a given field of view. In most implementations the basic operating principles are the same. A scene is illuminated with an intensity-modulated light source and the reflected signal is sampled using a gain-modulated imager. Previously we presented a unique heterodyne range imaging system that employed a bulky, power-hungry image intensifier as the high-speed gain-modulation mechanism. In this paper we present a new range imager using an internally modulated image sensor that is designed to operate in heterodyne mode, but can also operate in homodyne mode. We discuss homodyne and heterodyne range imaging, and the merits of the various types of hardware used to implement these systems. Following this we describe in detail the hardware and firmware components of our new ranger. We experimentally compare the two operating modes and demonstrate that heterodyne operation is less sensitive to some of the limitations suffered in homodyne mode, resulting in better linearity and ranging precision characteristics. We conclude by showing various qualitative examples that demonstrate the system's three-dimensional measurement performance.
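    The homodyne depth-recovery step can be illustrated with the standard four-bucket phase estimate for amplitude-modulated continuous-wave (AMCW) ranging. This is a generic sketch under assumed parameters (20 MHz modulation, ideal noise-free samples), not the authors' exact hardware:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def homodyne_range(a0, a1, a2, a3, f_mod):
    """Recover distance from four correlation samples taken at 0, 90,
    180 and 270 degree phase offsets. The measured phase is proportional
    to the round-trip time, so distance = C * phase / (4 * pi * f_mod)."""
    phase = math.atan2(a1 - a3, a0 - a2) % (2 * math.pi)
    return C * phase / (4 * math.pi * f_mod)

# Simulated example: a target at 5 m with 20 MHz modulation.
f_mod = 20e6
phi = 4 * math.pi * f_mod * 5.0 / C            # true phase shift
samples = [math.cos(phi - i * math.pi / 2) for i in range(4)]
```

Here `homodyne_range(*samples, f_mod)` returns 5 m; note the unambiguous range at 20 MHz is C / (2 * f_mod), about 7.5 m, beyond which the phase wraps.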

  19. Imaging in scattering media using correlation image sensors and sparse convolutional coding

    KAUST Repository

    Heide, Felix

    2014-10-17

    Correlation image sensors have recently become popular low-cost devices for time-of-flight, or range, cameras. They usually operate under the assumption of a single light path contributing to each pixel. We show that a more thorough analysis of the data from correlation sensors can be used to analyze the light transport in much more complex environments, including applications for imaging through scattering and turbid media. The key to our method is a new convolutional sparse coding approach for recovering transient (light-in-flight) images from correlation image sensors. This approach is enabled by an analysis of sparsity in complex transient images, and the derivation of a new physically-motivated model for transient images with drastically improved sparsity.

  20. Smart CMOS image sensor for lightning detection and imaging

    OpenAIRE

    Rolando, Sébastien; Goiffon, Vincent; Magnan, Pierre; Corbière, Franck; Molina, Romain; Tulet, Michel; Bréart-de-Boisanger, Michel; Saint-Pé, Olivier; Guiry, Saïprasad; Larnaudie, Franck; Leone, Bruno; Perez-Cuevas, Leticia; Zayer, Igor

    2013-01-01

    We present a CMOS image sensor dedicated to lightning detection and imaging. The detector has been designed to evaluate the potentiality of an on-chip lightning detection solution based on a smart sensor. This evaluation is performed in the frame of the predevelopment phase of the lightning detector that will be implemented in the Meteosat Third Generation Imager satellite for the European Space Agency. The lightning detection process is performed by a smart detector combining an in-pixel fra...

  1. A Wide Range Temperature Sensor Using SOI Technology

    Science.gov (United States)

    Patterson, Richard L.; Elbuluk, Malik E.; Hammoud, Ahmad

    2009-01-01

    Silicon-on-insulator (SOI) technology is becoming widely used in integrated circuit chips for its advantages over the conventional silicon counterpart. The decrease in leakage current combined with lower power consumption allows electronics to operate over a broader temperature range. This paper describes the performance of an SOI-based temperature sensor under extreme temperatures and thermal cycling. The sensor consisted of a temperature-to-frequency relaxation oscillator circuit utilizing an SOI precision timer chip. The circuit was evaluated under extreme temperature exposure and thermal cycling between -190 C and +210 C. The results indicate that the sensor performed well over the entire test temperature range and was able to re-start at extreme temperatures.
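    A temperature-to-frequency sensor of this kind is typically read back through a calibration table. A minimal sketch of that read-out step follows; the calibration values are hypothetical placeholders, not the paper's data:

```python
import numpy as np

def freq_to_temp(freq_hz, cal_freq_hz, cal_temp_c):
    """Convert a measured oscillator frequency to temperature by linear
    interpolation over calibration pairs (frequencies must be ascending)."""
    return float(np.interp(freq_hz, cal_freq_hz, cal_temp_c))

# Hypothetical calibration spanning the -190 C to +210 C test range.
cal_freq = [1000.0, 2000.0, 3000.0]   # Hz, measured at known temperatures
cal_temp = [-190.0, 10.0, 210.0]      # degrees C
```

With this table, a reading of 1500 Hz maps to -90 C; a real sensor would use many more calibration points to capture the oscillator's non-linearity.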

  2. UV-sensitive scientific CCD image sensors

    Science.gov (United States)

    Vishnevsky, Grigory I.; Kossov, Vladimir G.; Iblyaminova, A. F.; Lazovsky, Leonid Y.; Vydrevitch, Michail G.

    1997-06-01

    Probing the interaction of laser irradiation with substances contained in the environment has long been a recognized technique for contamination detection and identification. Near- and mid-range-IR laser irradiation is traditionally used for this purpose. However, as many works presented at recent ecological monitoring conferences show, systems using laser irradiation in the near-UV range (250-500 nm) are growing rapidly alongside the traditional ones. The availability of CCD imagers is one of the prerequisites for this, allowing the development of multi-channel computer-based spectral research systems. To identify and analyze contaminating impurities in the environment, methods such as laser fluorescence analysis, UV absorption and differential spectroscopy, and Raman scattering are commonly used. These methods are used to identify a large number of impurities (petrol, toluene, xylene isomers, SO2, acetone, methanol), to detect and identify food pathogens in real time, to measure concentrations of NH3, SO2, and NO in combustion outbursts, to detect oil products in water, to analyze contaminants in ground water, to profile the ozone distribution in the atmosphere, and to monitor various chemical processes including radioactive materials manufacturing, heterogeneous catalytic reactions, and polymer production. A multi-element image sensor with enhanced UV sensitivity, low optical non-uniformity, low intrinsic noise, and high dynamic range is a key element of all the above systems. Thus, so-called Virtual Phase (VP) CCDs, which possess all these features, seem promising for ecological monitoring spectral measurement systems. A family of VP CCDs with different architectures and pixel counts has been developed and is being manufactured. All CCDs in this family are supported by a precise slow-scan digital image acquisition system that can be used in various image processing systems in astronomy, biology, medicine, ecology, etc. An image is displayed directly on a PC

  3. Passive ranging using an infrared search and track sensor

    NARCIS (Netherlands)

    Visser, M. de; Schwering, P.B.W.; Groot, J.F. de; Hendricks, E.A.

    2006-01-01

    We present new techniques for passive ranging with a dual-band IR search and track (IRST) sensor aboard a ship. Three distance estimation methods are described: the atmospheric propagation model, the apparent surface of the target, and target motion analysis (TMA). These methods are tested on the

  4. Enhanced dynamic range x-ray imaging.

    Science.gov (United States)

    Haidekker, Mark A; Morrison, Logan Dain-Kelley; Sharma, Ajay; Burke, Emily

    2017-03-01

    X-ray images can suffer from excess contrast. Often, image exposure is chosen to visually optimize the region of interest, but at the expense of over- and underexposed regions elsewhere in the image. When image values are interpreted quantitatively as projected absorption, both over- and underexposure lead to the loss of quantitative information. We propose to combine multiple exposures into a composite that uses only pixels from those exposures in which they are neither under- nor overexposed. The composite image is created in analogy to visible-light high dynamic range photography. We present the mathematical framework for the recovery of absorbance from such composite images and demonstrate the method with biological and non-biological samples. We also show with an aluminum step-wedge that accurate recovery of step thickness from the absorbance values is possible, thereby highlighting the quantitative nature of the presented method. Due to the higher amount of detail encoded in an enhanced dynamic range x-ray image, we expect that the number of retaken images can be reduced, and overall patient exposure reduced. We also envision that the method can improve dual energy absorptiometry and even computed tomography by reducing the number of low-exposure ("photon-starved") projections. Copyright © 2017 Elsevier Ltd. All rights reserved.
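    The pixel-selection and absorbance-recovery idea can be sketched as follows. This is an illustrative reimplementation, not the authors' code; the validity thresholds and the simple per-pixel averaging are assumptions:

```python
import numpy as np

def composite_absorbance(images, exposures, lo=0.05, hi=0.95):
    """Fuse several x-ray exposures into one absorbance map, using each
    pixel only where it is neither under- nor overexposed. Images are
    normalized to [0, 1]; `exposures` are relative exposure factors, so
    a pixel's absorbance is -log(I / t), averaged over valid exposures."""
    acc = np.zeros(images[0].shape, dtype=float)
    cnt = np.zeros_like(acc)
    for img, t in zip(images, exposures):
        valid = (img > lo) & (img < hi)
        acc[valid] += -np.log(img[valid] / t)
        cnt[valid] += 1
    return np.where(cnt > 0, acc / np.maximum(cnt, 1), np.nan)

# Synthetic check: a thin and a thick region, two exposures. The thin
# region exceeds the validity threshold at the long exposure and is
# therefore recovered from the short exposure alone.
a_true = np.array([[0.02, 0.7]])
imgs = [np.clip(t * np.exp(-a_true), 0.0, 1.0) for t in (0.5, 1.0)]
```

Running `composite_absorbance(imgs, (0.5, 1.0))` recovers both absorbance values, even though no single exposure covers both regions well.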

  5. Millimeter-wave sensor image enhancement

    Science.gov (United States)

    Wilson, William J.; Suess, Helmut

    1989-01-01

    Images from an airborne, scanning radiometer operating at a frequency of 98 GHz have been analyzed. The millimeter-wave images were obtained in 1985-1986 using the JPL millimeter-wave imaging sensor. The goal of this study was to enhance the information content of these images and make their interpretation easier. A visual-interpretative approach was used for information extraction from the images. This included application of nonlinear transform techniques for noise reduction and for color, contrast, and edge enhancement. Results of using the techniques on selected millimeter-wave images are discussed.

  6. Compact hyperspectral image sensor based on a novel hyperspectral encoder

    Science.gov (United States)

    Hegyi, Alex N.; Martini, Joerg

    2015-06-01

    A novel hyperspectral imaging sensor is demonstrated that can enable breakthrough applications of hyperspectral imaging in domains not previously accessible. Our technology consists of a planar hyperspectral encoder combined with a traditional monochrome image sensor. The encoder adds negligibly to the sensor's overall size, weight, power requirement, and cost (SWaP-C); therefore, the new imager can be incorporated wherever image sensors are currently used, such as in cell phones and other consumer electronics. In analogy to Fourier spectroscopy, the technique maintains a high optical throughput because narrow-band spectral filters are unnecessary. Unlike conventional Fourier techniques that rely on Michelson interferometry, our hyperspectral encoder is robust to vibration and amenable to planar integration. The device can be viewed within a computational optics paradigm: the hardware is uncomplicated and serves to increase the information content of the acquired data, and the complexity of the system, that is, the decoding of the spectral information, is shifted to computation. Consequently, system tradeoffs, for example, between spectral resolution and imaging speed or spatial resolution, are selectable in software. Our prototype demonstration of the hyperspectral imager is based on a commercially-available silicon CCD. The prototype encoder was inserted within the camera's ~1 cu. in. housing. The prototype can image about 49 independent spectral bands distributed from 350 nm to 1250 nm, but the technology may be extendable over a wavelength range from ~300 nm to ~10 microns, with suitable choice of detector.
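    The "shift complexity to computation" step can be illustrated with a ridge-regularized least-squares inversion of a known encoder response matrix. This is a generic sketch: the abstract does not disclose the actual encoder or decoding algorithm, so the matrix model and regularizer here are assumptions:

```python
import numpy as np

def decode_spectrum(m, E, lam=1e-6):
    """Recover a spectrum s from encoded detector measurements m = E @ s,
    where E is a calibrated encoder response matrix, via ridge regression:
    s = (E^T E + lam*I)^-1 E^T m. lam trades noise robustness for bias."""
    n = E.shape[1]
    return np.linalg.solve(E.T @ E + lam * np.eye(n), E.T @ m)

# Example: a stand-in 60 x 20 encoder matrix and a 20-band spectrum.
rng = np.random.default_rng(1)
E = rng.standard_normal((60, 20))
s_true = rng.random(20)
s_hat = decode_spectrum(E @ s_true, E)
```

Because the decoding is pure software, the spectral-resolution versus speed trade-offs mentioned in the abstract amount to choosing the shape of E and the regularization, not changing hardware.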

  7. Disocclusion of 3d LIDAR Point Clouds Using Range Images

    Science.gov (United States)

    Biasutti, P.; Aujol, J.-F.; Brédif, M.; Bugeau, A.

    2017-05-01

    This paper proposes a novel framework for the disocclusion of mobile objects in 3D LiDAR scenes acquired via street-based Mobile Mapping Systems (MMS). Most of the existing lines of research tackle this problem directly in 3D space. This work promotes an alternative approach by using a 2D range image representation of the 3D point cloud, taking advantage of the fact that the problem of disocclusion has been intensively studied in the 2D image processing community over the past decade. First, the point cloud is turned into a 2D range image by exploiting the sensor's topology. Using the range image, a semi-automatic segmentation procedure based on depth histograms is performed in order to select the occluding object to be removed. A variational image inpainting technique is then used to reconstruct the area occluded by that object. Finally, the range image is unprojected as a 3D point cloud. Experiments on real data prove the effectiveness of this procedure both in terms of accuracy and speed.
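    The first step, turning the cloud into a 2D range image, can be sketched as a spherical projection. The image size and field-of-view bounds below are typical spinning-LiDAR values assumed for illustration, not the paper's exact parameters:

```python
import numpy as np

def cloud_to_range_image(points, h=64, w=1024, fov_up=3.0, fov_down=-25.0):
    """Project an (N, 3) point cloud to an h x w range image: azimuth maps
    to column, elevation to row, and each pixel stores the nearest depth."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    r = np.linalg.norm(points, axis=1)
    yaw = np.arctan2(y, x)                      # azimuth in [-pi, pi]
    pitch = np.arcsin(z / r)                    # elevation
    fu, fd = np.radians(fov_up), np.radians(fov_down)
    u = (((1.0 - (yaw + np.pi) / (2 * np.pi)) * w).astype(int)) % w
    v = np.clip((fu - pitch) / (fu - fd) * (h - 1), 0, h - 1).astype(int)
    img = np.full((h, w), np.inf)
    np.minimum.at(img, (v, u), r)               # keep nearest return per pixel
    return img
```

Inpainting then operates on `img` like any 2D image, and unprojection inverts the same yaw/pitch mapping to return to 3D.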

  8. Compressive Sensing Image Sensors-Hardware Implementation

    Directory of Open Access Journals (Sweden)

    Shahram Shirani

    2013-04-01

    The compressive sensing (CS) paradigm uses simultaneous sensing and compression to provide an efficient image acquisition technique. The main advantages of the CS method include high-resolution imaging using low-resolution sensor arrays and faster image acquisition. Since the imaging philosophy in CS imagers is different from conventional imaging systems, new physical structures have been developed for cameras that use the CS technique. In this paper, a review of different hardware implementations of CS encoding in optical and electrical domains is presented. Considering the recent advances in CMOS (complementary metal-oxide-semiconductor) technologies and the feasibility of performing on-chip signal processing, important practical issues in the implementation of CS in CMOS sensors are emphasized. In addition, CS coding for video capture is discussed.
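    The reconstruction side of the CS pipeline can be sketched with a minimal orthogonal matching pursuit decoder, one of several standard CS recovery algorithms. The random Gaussian sensing matrix here is a stand-in for an imager's physical measurement pattern:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedily recover a k-sparse x from
    compressive measurements y = A @ x by picking, at each step, the
    column of A most correlated with the current residual."""
    residual, support = y.astype(float), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ x_s
    x = np.zeros(A.shape[1])
    x[support] = x_s
    return x

# A 3-sparse signal of length 256 recovered from 128 random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((128, 256)) / np.sqrt(128)
x_true = np.zeros(256)
x_true[[10, 80, 200]] = [1.5, -2.0, 1.0]
y = A @ x_true
```

The point of the hardware review above is precisely how to realize the rows of A optically or electrically; the decoder itself runs off-sensor.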

  9. Range-Image Acquisition for Discriminated Objects in a Range-gated Robot Vision System

    Energy Technology Data Exchange (ETDEWEB)

    Park, Seung-Kyu; Ahn, Yong-Jin; Park, Nak-Kyu; Baik, Sung-Hoon; Choi, Young-Soo; Jeong, Kyung-Min [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    The imaging capability of a surveillance vision system in harsh low-visibility environments, such as fire and detonation areas, is a key function for monitoring the safety of facilities. 2D and range image data acquired from low-visibility environments are important for assessing safety and preparing appropriate countermeasures. Passive vision systems, such as conventional cameras and binocular stereo vision systems, usually cannot acquire image information when the reflected light is highly scattered and absorbed by airborne particles such as fog. In addition, the resolution of images captured even through low-density airborne particles is decreased because the image is blurred and dimmed by scattering, emission, and absorption. Active vision systems, such as structured light vision and projected stereo vision, are usually more robust in harsh environments than passive vision systems, but their performance still decreases in proportion to the particle density. The Range-Gated Imaging (RGI) system provides 2D and range image data from several RGI images and, moreover, yields clear images of low-visibility fog and smoke environments by summing time-sliced images. Range-gated (RG) imaging is an emerging technology in the field of surveillance for security applications, especially for visualizing otherwise invisible night and fog environments. Although RG imaging was first demonstrated in the 1960s, it is now becoming more practical by virtue of the rapid development of optical and sensor technologies. In particular, the system can be adopted in robot vision by virtue of its compact, portable configuration. In contrast to passive vision systems, this technology enables operation even in harsh environments like fog and smoke. During the past decades, this technology has been applied to target recognition and to harsh environments such as fog and underwater vision. Also, this technology has been
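    The core timing relation behind range gating is simple: light returning from range R arrives after the round-trip time 2R/c, so opening the camera only during a chosen delay window images only a chosen depth slice. A generic sketch (not the specific system parameters of this work):

```python
C = 299_792_458.0  # speed of light, m/s

def gate_for_slice(r_near, r_far):
    """Gate delay and width (seconds) that image only the range slice
    [r_near, r_far]: a return from range R arrives at t = 2R/C."""
    delay = 2 * r_near / C
    width = 2 * (r_far - r_near) / C
    return delay, width

def slice_for_gate(delay, width):
    """Inverse mapping: the range slice imaged by a given gate timing."""
    return C * delay / 2, C * (delay + width) / 2
```

For example, imaging the 30-45 m slice requires a gate delay of about 200 ns and a gate width of about 100 ns; summing many such time-sliced images gives the clear composite described above.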

  10. Imaging Sensors: Artificial and Natural

    Indian Academy of Sciences (India)

    nature, is that of 'smart' skins. These consist of semiconductor sensors embedded in the skin of the robot, vehicle or aircraft which will measure pressure, temperature, pH etc., monitoring its 'health' and enhancing its survivability. Obviously, all these sensory data have to be fused in the robot's brain (without overloading it!)

  11. Wide Dynamic Range CMOS Potentiostat for Amperometric Chemical Sensor

    Directory of Open Access Journals (Sweden)

    Wei-Song Wang

    2010-03-01

    Presented is a single-ended potentiostat topology with a new interface connection between the sensor electrodes and the potentiostat circuit, which avoids deviation of the cell voltage and linearly converts the cell current into a voltage signal. However, harmonic distortion increases when detecting low-level sensor currents, degrading potentiostat linearity and thereby limiting the detectable current and dynamic range. To alleviate this, a fully differential potentiostat with a wide output voltage swing, compared to the single-ended potentiostat, is also designed. The two proposed potentiostats were implemented in a TSMC 0.18-μm CMOS process for biomedical applications. Measurement results show that the fully differential potentiostat performs better in terms of linearity when measuring currents from 500 pA to 10 μA. Moreover, its dynamic range reaches 86 dB.
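    The quoted figures are mutually consistent: the span from 500 pA to 10 μA gives the stated 86 dB under the 20·log10 amplitude convention. A back-of-envelope check, not circuit analysis:

```python
import math

def dynamic_range_db(i_min, i_max):
    """Dynamic range in dB of a measurable current span, using the
    20*log10 convention for amplitude-like quantities."""
    return 20 * math.log10(i_max / i_min)
```

Here `dynamic_range_db(500e-12, 10e-6)` evaluates to about 86.0 dB, matching the abstract.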

  12. Progress in sensor performance testing, modeling and range prediction using the TOD method: an overview

    Science.gov (United States)

    Bijl, Piet; Hogervorst, Maarten A.; Toet, Alexander

    2017-05-01

    The Triangle Orientation Discrimination (TOD) methodology includes i) a widely applicable, accurate end-to-end EO/IR sensor test, ii) an image-based sensor system model and iii) a Target Acquisition (TA) range model. The method has been extensively validated against TA field performance for a wide variety of well- and under-sampled imagers, systems with advanced image processing techniques such as dynamic super resolution and local adaptive contrast enhancement, and sensors showing smear or noise drift, for both static and dynamic test stimuli and as a function of target contrast. Recently, significant progress has been made in various directions. Dedicated visual and NIR test charts for lab and field testing are available and thermal test benches are on the market. Automated sensor testing using an objective synthetic human observer is within reach. Both an analytical and an image-based TOD model have recently been developed and are being implemented in the European Target Acquisition model ECOMOS and in the EOSTAR TDA. Further, the methodology is being applied for design optimization of high-end security camera systems. Finally, results from a recent perception study suggest that DRI ranges for real targets can be predicted by replacing the relevant distinctive target features by TOD test patterns of the same characteristic size and contrast, enabling a new TA modeling approach. This paper provides an overview.

  13. Dual fluorescence sensor for trace oxygen and temperature with unmatched range and sensitivity.

    Science.gov (United States)

    Baleizão, Carlos; Nagl, Stefan; Schäferling, Michael; Berberan-Santos, Mário N; Wolfbeis, Otto S

    2008-08-15

    An optical dual sensor for oxygen and temperature is presented that is highly oxygen sensitive and covers a broad temperature range. Dual sensing is based on luminescence lifetime measurements. The novel sensor contains two luminescent compounds incorporated into polymer films. The temperature-sensitive dye (ruthenium tris-1,10-phenanthroline) has a highly temperature-dependent luminescence and is incorporated in poly(acrylonitrile) to avoid cross-sensitivity to oxygen. Fullerene C70 was used as the oxygen-sensitive probe owing to its strong thermally activated delayed fluorescence at elevated temperatures that is extremely oxygen sensitive. The cross-sensitivity of C70 to temperature is accounted for by means of the temperature sensor. C70 is incorporated into a highly oxygen-permeable polymer, either ethyl cellulose or organosilica. The two luminescent probes have different emission spectra and decay times, and their emissions can be discriminated using both parameters. Spatially resolved sensing is achieved by means of fluorescence lifetime imaging. The response times of the sensor to oxygen are short. The dual sensor exhibits a temperature operation range between at least 0 and 120 degrees C, and detection limits for oxygen in the ppbv range, operating for oxygen concentrations up to at least 50 ppmv. These ranges outperform all dual oxygen and temperature sensors reported so far. The dual sensor presented in this study is especially appropriate for measurements under extreme conditions such as high temperatures and ultralow oxygen levels. This dual sensor is a key step forward in a number of scientifically or commercially important applications including food packaging, for monitoring of hyperthermophilic microorganisms, in space technology, and safety and security applications in terms of detection of oxygen leaks.

  14. Estimation of Image Sensor Fill Factor Using a Single Arbitrary Image

    Directory of Open Access Journals (Sweden)

    Wei Wen

    2017-03-01

    Achieving a high fill factor is a bottleneck problem for capturing high-quality images. There are hardware and software solutions to overcome this problem, but they assume the fill factor is known. However, it is kept as an industrial secret by most image sensor manufacturers due to its direct effect on the assessment of sensor quality. In this paper, we propose a method to estimate the fill factor of a camera sensor from an arbitrary single image. The virtual response function of the imaging process and the sensor irradiance are estimated from the generation of virtual images. The global intensity values of the virtual images are then obtained by fusing the virtual images into a single high dynamic range radiance map. A non-linear function is inferred from the original and global intensity values of the virtual images, and the fill factor is estimated from the conditional minimum of that function. The method is verified using images from two datasets. The results show that our method estimates the fill factor correctly, with high stability and accuracy, from a single arbitrary image, as indicated by the low standard deviation of the estimated fill factors across images and cameras.

  15. Efficient ranging-sensor navigation methods for indoor aircraft

    Science.gov (United States)

    Sobers, David Michael, Jr.

    Unmanned Aerial Vehicles (UAVs) are often used for reconnaissance, search and rescue, damage assessment, exploration, and other tasks that are dangerous or prohibitively difficult for humans to perform. Often, these tasks include traversing indoor environments where radio links are unreliable, hindering the use of remote pilot links or ground-based control, and effectively eliminating Global Positioning System (GPS) signals as a potential localization method. As a result, any vehicle capable of indoor flight must be able to stabilize itself and perform all guidance, navigation, and control (GNC) tasks without dependence on a radio link, which may be available only intermittently. Stability and control of rotorcraft UAVs is usually achieved by either a passive stability system, such as a Bell stabilizer bar, or by actively measuring body accelerations and angular rates with an onboard Inertial Measurement Unit (IMU) and using that data for feedback control. However, neither active nor passive attitude stabilization methods provide position control by themselves. Therefore, GNC methods must either be tolerant to position drift or have some means of estimating and controlling position, which requires an external reference in order to measure and correct errors in the position estimate. GPS signals are often the most convenient method for providing this external position reference. As a result, most UAVs utilize GPS for localization and to bound error on position drift. Unfortunately, the availability of GPS signals in unknown environments is not assured, especially during indoor operation. As a result, other sensors must be used to provide position information relative to the environment. This research covers a description of different ranging sensors and methods for incorporating them into the overall guidance, navigation, and control system. 
Various sensors are analyzed to determine their performance characteristics and suitability for indoor navigation, including

  16. Perimeter Coverage Scheduling in Wireless Sensor Networks Using Sensors with a Single Continuous Cover Range

    Directory of Open Access Journals (Sweden)

    Hung Ka-Shun

    2010-01-01

    In the target monitoring problem, it is generally assumed that the whole target object can be monitored by a single sensor if the target falls within its sensing range. Unfortunately, this assumption becomes invalid when the target object is so large that a sensor can only monitor part of it. In this paper, we study the perimeter coverage problem, in which the perimeter of a large object must be monitored but each sensor can only cover a single continuous portion of the perimeter. We describe how to schedule the sensors so as to maximize the network lifetime. We formally prove that the perimeter coverage scheduling problem is NP-hard in general, although polynomial-time solutions exist in some special cases. We further identify sufficient conditions for a scheduling algorithm to be a 2-approximation solution to the general problem, and propose a simple distributed 2-approximation solution with a small message overhead.

  17. Short-Range Ultra-Wideband Imaging with Multiple-Input Multiple-Output Arrays

    NARCIS (Netherlands)

    Zhuge, X.

    2010-01-01

    Compact, cost-efficient and high-resolution imaging sensors are especially desirable in the field of short-range observation and surveillance. Such sensors are of great value in fields of security, rescue and medical applications. Systems can be formed for various practical purposes, such as

  18. Dynamic Range and Sensitivity Requirements of Satellite Ocean Color Sensors: Learning from the Past

    Science.gov (United States)

    Hu, Chuanmin; Feng, Lian; Lee, Zhongping; Davis, Curtiss O.; Mannino, Antonio; McClain, Charles R.; Franz, Bryan A.

    2012-01-01

    Sensor design and mission planning for satellite ocean color measurements requires careful consideration of the signal dynamic range and sensitivity (specifically here signal-to-noise ratio or SNR) so that small changes of ocean properties (e.g., surface chlorophyll-a concentrations or Chl) can be quantified while most measurements are not saturated. Past and current sensors used different signal levels, formats, and conventions to specify these critical parameters, making it difficult to make cross-sensor comparisons or to establish standards for future sensor design. The goal of this study is to quantify these parameters under uniform conditions for widely used past and current sensors in order to provide a reference for the design of future ocean color radiometers. Using measurements from the Moderate Resolution Imaging Spectroradiometer onboard the Aqua satellite (MODISA) under various solar zenith angles (SZAs), typical (L(sub typical)) and maximum (L(sub max)) at-sensor radiances from the visible to the shortwave IR were determined. The L(sub typical) values at an SZA of 45 deg were used as constraints to calculate SNRs of 10 multiband sensors at the same L(sub typical) radiance input and 2 hyperspectral sensors at a similar radiance input. The calculations were based on clear-water scenes with an objective method of selecting pixels with minimal cross-pixel variations to assure target homogeneity. Among the widely used ocean color sensors that have routine global coverage, MODISA ocean bands (1 km) showed 2-4 times higher SNRs than the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) (1 km) and comparable SNRs to the Medium Resolution Imaging Spectrometer (MERIS)-RR (reduced resolution, 1.2 km), leading to different levels of precision in the retrieved Chl data product. MERIS-FR (full resolution, 300 m) showed SNRs lower than MODISA and MERIS-RR with the gain in spatial resolution.
SNRs of all MODISA ocean bands and SeaWiFS bands (except the SeaWiFS near-IR bands
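
The study's core operation, estimating SNR from homogeneous clear-water pixels at a fixed input radiance, can be sketched as follows (illustrative Python; the radiance and noise values are made up, not MODISA figures):

```python
import numpy as np

def estimate_snr(radiance_block):
    """SNR from a spatially homogeneous scene block: with a uniform
    target, cross-pixel variation is dominated by sensor noise, so
    SNR ~ mean signal / standard deviation."""
    block = np.asarray(radiance_block, dtype=float)
    return block.mean() / block.std(ddof=1)

# Synthetic homogeneous clear-water block at a nominal input radiance of
# 80 (arbitrary units) with additive sensor noise of sigma = 0.4; the
# numbers are illustrative, not MODISA values.
rng = np.random.default_rng(0)
block = 80.0 + 0.4 * rng.standard_normal((50, 50))
print(estimate_snr(block))  # close to the nominal 80 / 0.4 = 200
```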

  19. Cell phones as imaging sensors

    Science.gov (United States)

    Bhatti, Nina; Baker, Harlyn; Marguier, Joanna; Berclaz, Jérôme; Süsstrunk, Sabine

    2010-04-01

    Camera phones are ubiquitous, and consumers have been adopting them faster than any other technology in modern history. When connected to a network, though, they are capable of more than just picture taking: Suddenly, they gain access to the power of the cloud. We exploit this capability by providing a series of image-based personal advisory services. These are designed to work with any handset over any cellular carrier using commonly available Multimedia Messaging Service (MMS) and Short Message Service (SMS) features. Targeted at the unsophisticated consumer, these applications must be quick and easy to use, not requiring download capabilities or preplanning. Thus, all application processing occurs in the back-end system (i.e., as a cloud service) and not on the handset itself. Presenting an image to an advisory service in the cloud, a user receives information that can be acted upon immediately. Two of our examples involve color assessment - selecting cosmetics and home décor paint palettes; the third provides the ability to extract text from a scene. In the case of the color imaging applications, we have shown that our service rivals the advice quality of experts. The result of this capability is a new paradigm for mobile interactions - image-based information services exploiting the ubiquity of camera phones.

  20. Quality metrics for sensor images

    Science.gov (United States)

    Ahumada, AL

    1993-01-01

    Methods are needed for evaluating the quality of augmented visual displays (AVID). Computational quality metrics will help summarize, interpolate, and extrapolate the results of human performance tests with displays. The FLM Vision group at NASA Ames has been developing computational models of visual processing and using them to develop computational metrics for similar problems. For example, display modeling systems use metrics for comparing proposed displays, halftone optimization methods use metrics to evaluate the difference between the halftone and the original, and image compression methods minimize the predicted visibility of compression artifacts. The visual discrimination models take as input two arbitrary images A and B and compute an estimate of the probability that a human observer will report that A is different from B. If A is an image that one desires to display and B is the actual displayed image, such an estimate can be regarded as an image quality metric reflecting how well B approximates A. There are additional complexities associated with the problem of evaluating the quality of radar and IR enhanced displays for AVID tasks. One important problem is the question of whether intruding obstacles are detectable in such displays. Although the discrimination model can handle detection situations by making B the original image A plus the intrusion, this detection model makes the inappropriate assumption that the observer knows where the intrusion will be. Effects of signal uncertainty need to be added to our models. A pilot needs to make decisions rapidly. The models need to predict not just the probability of a correct decision, but the probability of a correct decision by the time the decision needs to be made. That is, the models need to predict latency as well as accuracy. Luce and Green have generated models for auditory detection latencies. Similar models are needed for visual detection. 
Most image quality models are designed for static imagery
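
The discrimination models described above take two images A and B and return the probability that an observer reports them as different. A minimal signal-detection sketch of that idea (not the NASA Ames model itself; the quadratic pooling rule and noise parameter are assumptions):

```python
import math
import numpy as np

def discrimination_probability(A, B, sigma=1.0):
    """Toy metric: pool pixelwise differences into a detectability
    index d' and map it to the probability that an observer responds
    "different" in a two-alternative task. The pooling rule and noise
    level sigma are assumptions, not the NASA Ames model."""
    A = np.asarray(A, dtype=float)
    B = np.asarray(B, dtype=float)
    d_prime = math.sqrt(float(np.sum(((A - B) / sigma) ** 2)))
    # Percent correct for detectability d' with an unbiased observer.
    return 0.5 * (1.0 + math.erf(d_prime / (2.0 * math.sqrt(2.0))))

same = np.zeros((8, 8))
print(discrimination_probability(same, same))        # 0.5 (chance)
print(discrimination_probability(same, same + 1.0))  # ~1.0 (easily seen)
```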

  1. Lightning Imaging Sensor (LIS) on TRMM Science Data V4

    Data.gov (United States)

    National Aeronautics and Space Administration — The Lightning Imaging Sensor (LIS) Science Data was collected by the Lightning Imaging Sensor (LIS), which was an instrument on the Tropical Rainfall Measurement...

  2. Lightning Imaging Sensor (LIS) on TRMM Backgrounds V4

    Data.gov (United States)

    National Aeronautics and Space Administration — The Lightning Imaging Sensor (LIS) Backgrounds was collected by the Lightning Imaging Sensor (LIS), which was an instrument on the Tropical Rainfall Measurement...

  3. Target detection and recognition techniques of line imaging ladar sensor

    Science.gov (United States)

    Sun, Zhi-hui; Deng, Jia-hao; Yan, Xiao-wei

    2009-07-01

    A line imaging ladar sensor using a linear diode laser array and a linear avalanche photodiode (APD) array is developed for precise terminal guidance and intelligent proximity fuzing applications. The detection principle of line imaging ladar is discussed in detail, and the design method of the line imaging ladar sensor system is given. Taking a military tank target as an example, simulated tank height and intensity images are obtained with the line imaging ladar simulation system. The subsystems of the line imaging ladar sensor, including the transmitter and receiver, are designed. A multi-pulse coherent averaging algorithm and a correlation detection method are adopted to improve the SNR of the echo and to estimate time-of-flight, respectively. Experimental results show that the power SNR can be improved by a factor of N (the number of coherently averaged pulses) and that the maximum range error is 0.25 m. Several joint transform correlation (JTC) techniques are discussed to improve noncooperative target recognition capability in height images with complex backgrounds. Simulation results show that binary JTC, non-zero-order modified fringe-adjusted JTC, and non-zero-order amplitude-modulated JTC can improve target recognition performance effectively.
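
The quoted N-fold improvement in power SNR from multi-pulse coherent averaging can be illustrated with a short simulation (hedged sketch; the pulse shape and noise level are arbitrary):

```python
import numpy as np

def power_snr(signal, noisy):
    """Power SNR of a noisy observation against the known clean signal."""
    noise = noisy - signal
    return np.mean(signal ** 2) / np.mean(noise ** 2)

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 500)
pulse = np.exp(-((t - 0.5) ** 2) / (2 * 0.01 ** 2))  # idealized echo pulse

# N echoes of the same pulse buried in independent unit-variance noise;
# coherent averaging leaves the pulse intact and shrinks the noise power
# by a factor of N.
N = 64
echoes = pulse + rng.standard_normal((N, t.size))
gain = power_snr(pulse, echoes.mean(axis=0)) / power_snr(pulse, echoes[0])
print(gain)  # close to N = 64
```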

  4. Smart CMOS image sensor for lightning detection and imaging.

    Science.gov (United States)

    Rolando, Sébastien; Goiffon, Vincent; Magnan, Pierre; Corbière, Franck; Molina, Romain; Tulet, Michel; Bréart-de-Boisanger, Michel; Saint-Pé, Olivier; Guiry, Saïprasad; Larnaudie, Franck; Leone, Bruno; Perez-Cuevas, Leticia; Zayer, Igor

    2013-03-01

    We present a CMOS image sensor dedicated to lightning detection and imaging. The detector has been designed to evaluate the potential of an on-chip lightning detection solution based on a smart sensor. This evaluation is performed as part of the predevelopment phase of the lightning detector that will be implemented in the Meteosat Third Generation Imager satellite for the European Space Agency. The lightning detection process is performed by a smart detector combining an in-pixel frame-to-frame difference comparison with an adjustable threshold and on-chip digital processing allowing an efficient localization of a faint lightning pulse on the entire large format array at a frequency of 1 kHz. A CMOS prototype sensor with a 256×256 pixel array and a 60 μm pixel pitch has been fabricated using a 0.35 μm 2P 5M technology and tested to validate the selected detection approach.
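
The detection logic, a frame-to-frame difference compared against an adjustable threshold, reduces to a few lines (a sketch of the principle only; the real sensor implements this in-pixel at 1 kHz):

```python
import numpy as np

def detect_events(prev_frame, frame, threshold):
    """Frame-to-frame difference against an adjustable threshold --
    the detection logic that the smart sensor evaluates in-pixel."""
    diff = frame.astype(np.int32) - prev_frame.astype(np.int32)
    return diff > threshold                 # boolean event map

# Two synthetic 8-bit frames: a static bright background plus one faint
# transient pulse in a single pixel of the second frame.
rng = np.random.default_rng(2)
prev_frame = rng.integers(100, 110, size=(256, 256), dtype=np.uint8)
frame = prev_frame.copy()
frame[40, 80] += 60                         # the "lightning" pulse
events = detect_events(prev_frame, frame, threshold=30)
print(np.argwhere(events))  # the single event pixel: [[40 80]]
```

Because the comparison is against the previous frame rather than a fixed reference, a bright but static background cancels out, which is what lets a faint pulse be localized on a large-format array.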

  5. High Resolution, Range/Range-Rate Imager Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Visidyne proposes to develop a design for a small, lightweight, high resolution, in x, y, and z Doppler imager to assist in the guidance, navigation and control...

  6. A protein-dye hybrid system as a narrow range tunable intracellular pH sensor.

    Science.gov (United States)

    Anees, Palapuravan; Sudheesh, Karivachery V; Jayamurthy, Purushothaman; Chandrika, Arunkumar R; Omkumar, Ramakrishnapillai V; Ajayaghosh, Ayyappanpillai

    2016-11-18

    Accurate monitoring of pH variations inside cells is important for the early diagnosis of diseases such as cancer. Even though a variety of different pH sensors are available, construction of a custom-made sensor array for measuring minute variations in a narrow biological pH window, using easily available constituents, is a challenge. Here we report two-component hybrid sensors derived from a protein and organic dye nanoparticles whose sensitivity range can be tuned by choosing different ratios of the components, to monitor the minute pH variations in a given system. The dye interacts noncovalently with the protein at lower pH and covalently at higher pH, triggering two distinguishable fluorescent signals at 700 and 480 nm, respectively. The pH sensitivity region of the probe can be tuned for every unit of the pH window resulting in custom-made pH sensors. These narrow range tunable pH sensors have been used to monitor pH variations in HeLa cells using the fluorescence imaging technique.
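
A ratiometric readout of the two emission channels is the natural way to turn such a probe into a pH number. The sketch below assumes a Henderson-Hasselbalch style calibration; the pKa and slope are hypothetical placeholders, not values from the paper:

```python
import math

def ph_from_ratio(I480, I700, pKa=7.0, hill=1.0):
    """Map the 480 nm / 700 nm intensity ratio to pH assuming a
    Henderson-Hasselbalch style response centred on an effective pKa
    set by the dye:protein ratio. pKa and hill are hypothetical
    calibration constants, not values from the paper."""
    return pKa + math.log10(I480 / I700) / hill

# At the window midpoint the covalent (480 nm) and noncovalent (700 nm)
# channels are equal; a tenfold ratio change moves the reading by 1 pH.
print(ph_from_ratio(1.0, 1.0))    # 7.0
print(ph_from_ratio(10.0, 1.0))   # 8.0
```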

  7. Robust Dehaze Algorithm for Degraded Image of CMOS Image Sensors

    Directory of Open Access Journals (Sweden)

    Chen Qu

    2017-09-01

    Full Text Available The CMOS (Complementary Metal-Oxide-Semiconductor) is a new type of solid-state image sensor device widely used in object tracking, object recognition, intelligent navigation fields, and so on. However, images captured by outdoor CMOS sensor devices are usually affected by suspended atmospheric particles (such as haze), causing a reduction in image contrast, color distortion problems, and so on. In view of this, we propose a novel dehazing approach based on a local consistent Markov random field (MRF) framework. The neighboring clique in traditional MRF is extended to the non-neighboring clique, which is defined on local consistent blocks based on two clues, where both the atmospheric light and transmission map satisfy the character of local consistency. In this framework, our model can strengthen the restriction of the whole image while incorporating more sophisticated statistical priors, resulting in more expressive power of modeling, thus, solving inadequate detail recovery effectively and alleviating color distortion. Moreover, the local consistent MRF framework can obtain details while maintaining better results for dehazing, which effectively improves the image quality captured by the CMOS image sensor. Experimental results verified that the method proposed has the combined advantages of detail recovery and color preservation.

  8. Robust Dehaze Algorithm for Degraded Image of CMOS Image Sensors.

    Science.gov (United States)

    Qu, Chen; Bi, Du-Yan; Sui, Ping; Chao, Ai-Nong; Wang, Yun-Fei

    2017-09-22

    The CMOS (Complementary Metal-Oxide-Semiconductor) is a new type of solid-state image sensor device widely used in object tracking, object recognition, intelligent navigation fields, and so on. However, images captured by outdoor CMOS sensor devices are usually affected by suspended atmospheric particles (such as haze), causing a reduction in image contrast, color distortion problems, and so on. In view of this, we propose a novel dehazing approach based on a local consistent Markov random field (MRF) framework. The neighboring clique in traditional MRF is extended to the non-neighboring clique, which is defined on local consistent blocks based on two clues, where both the atmospheric light and transmission map satisfy the character of local consistency. In this framework, our model can strengthen the restriction of the whole image while incorporating more sophisticated statistical priors, resulting in more expressive power of modeling, thus, solving inadequate detail recovery effectively and alleviating color distortion. Moreover, the local consistent MRF framework can obtain details while maintaining better results for dehazing, which effectively improves the image quality captured by the CMOS image sensor. Experimental results verified that the method proposed has the combined advantages of detail recovery and color preservation.
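
Dehazing methods of this kind build on the standard atmospheric scattering model I = J·t + A·(1 − t). Given the atmospheric light A and the transmission map t, whose robust estimation is precisely what the MRF framework addresses, scene recovery is a direct inversion (illustrative sketch, not the paper's algorithm):

```python
import numpy as np

def recover_scene(I, A, t, t_min=0.1):
    """Invert the scattering model I = J*t + A*(1 - t) for the scene
    radiance J. A and t are taken as given here; estimating them
    robustly is what the MRF framework above addresses. t is clamped
    so dense-haze pixels do not amplify noise."""
    t = np.maximum(t, t_min)[..., None]
    return (I - A) / t + A

# Round trip on synthetic data: haze a known scene, then recover it.
rng = np.random.default_rng(3)
J = rng.random((4, 4, 3))                 # true scene radiance
A = np.array([0.9, 0.9, 0.9])             # atmospheric light per channel
t = np.full((4, 4), 0.6)                  # uniform transmission
I = J * t[..., None] + A * (1 - t[..., None])
assert np.allclose(recover_scene(I, A, t), J)
```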

  9. CMOS Image Sensors for High Speed Applications.

    Science.gov (United States)

    El-Desouki, Munir; Deen, M Jamal; Fang, Qiyin; Liu, Louis; Tse, Frances; Armstrong, David

    2009-01-01

    Recent advances in deep submicron CMOS technologies and improved pixel designs have enabled CMOS-based imagers to surpass charge-coupled devices (CCD) imaging technology for mainstream applications. The parallel outputs that CMOS imagers can offer, in addition to complete camera-on-a-chip solutions due to being fabricated in standard CMOS technologies, result in compelling advantages in speed and system throughput. Since there is a practical limit on the minimum pixel size (4∼5 μm) due to limitations in the optics, CMOS technology scaling can allow for an increased number of transistors to be integrated into the pixel to improve both detection and signal processing. Such smart pixels truly show the potential of CMOS technology for imaging applications allowing CMOS imagers to achieve the image quality and global shuttering performance necessary to meet the demands of ultrahigh-speed applications. In this paper, a review of CMOS-based high-speed imager design is presented and the various implementations that target ultrahigh-speed imaging are described. This work also discusses the design, layout and simulation results of an ultrahigh acquisition rate CMOS active-pixel sensor imager that can take 8 frames at a rate of more than a billion frames per second (fps).

  10. CMOS Image Sensors for High Speed Applications

    Directory of Open Access Journals (Sweden)

    M. Jamal Deen

    2009-01-01

    Full Text Available Recent advances in deep submicron CMOS technologies and improved pixel designs have enabled CMOS-based imagers to surpass charge-coupled devices (CCD) imaging technology for mainstream applications. The parallel outputs that CMOS imagers can offer, in addition to complete camera-on-a-chip solutions due to being fabricated in standard CMOS technologies, result in compelling advantages in speed and system throughput. Since there is a practical limit on the minimum pixel size (4~5 μm) due to limitations in the optics, CMOS technology scaling can allow for an increased number of transistors to be integrated into the pixel to improve both detection and signal processing. Such smart pixels truly show the potential of CMOS technology for imaging applications allowing CMOS imagers to achieve the image quality and global shuttering performance necessary to meet the demands of ultrahigh-speed applications. In this paper, a review of CMOS-based high-speed imager design is presented and the various implementations that target ultrahigh-speed imaging are described. This work also discusses the design, layout and simulation results of an ultrahigh acquisition rate CMOS active-pixel sensor imager that can take 8 frames at a rate of more than a billion frames per second (fps).

  11. Adaptive Optics for Satellite Imaging and Space Debris Ranging

    Science.gov (United States)

    Bennet, F.; D'Orgeville, C.; Price, I.; Rigaut, F.; Ritchie, I.; Smith, C.

    Earth's space environment is becoming crowded and at risk of a Kessler syndrome, and will require careful management in the future. Modern low noise high speed detectors allow for wavefront sensing and adaptive optics (AO) in extreme circumstances such as imaging small orbiting bodies in Low Earth Orbit (LEO). The Research School of Astronomy and Astrophysics (RSAA) at the Australian National University has been developing AO systems for telescopes between 1 and 2.5 m diameter to image and range orbiting satellites and space debris. Strehl ratios in excess of 30% can be achieved for targets in LEO with an AO loop running at 2 kHz, allowing the resolution of small features. The system developed at RSAA consists of a high speed EMCCD Shack-Hartmann wavefront sensor, a deformable mirror (DM), a realtime computer (RTC), and an imaging camera. The system works best as a laser guide star system but will also function as a natural guide star AO system, with the target itself being the guide star. In both circumstances tip-tilt is provided by the target on the imaging camera. The fast tip-tilt modes are not corrected optically, and are instead removed by taking images at a moderate speed (>30 Hz) and using a shift and add algorithm. This algorithm can also incorporate lucky imaging to further improve the final image quality. A similar AO system for space debris ranging is also in development in collaboration with Electro Optic Systems (EOS) and the Space Environment Management Cooperative Research Centre (SERC), at the Mount Stromlo Observatory in Canberra, Australia. The system is designed for an AO corrected upward propagated 1064 nm pulsed laser beam, from which time of flight information is used to precisely range the target. A 1.8 m telescope is used for both propagation and collection of laser light. A laser guide star, Shack-Hartmann wavefront sensor, and DM are used for high order correction, with tip-tilt correction provided by reflected sunlight from the target.
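
The shift-and-add tip-tilt removal mentioned above can be sketched with FFT cross-correlation registration (a generic implementation, not RSAA's code):

```python
import numpy as np

def shift_and_add(frames):
    """Register each frame to the first by FFT cross-correlation and
    average them, removing integer-pixel tip-tilt in software."""
    ref = frames[0]
    F_ref = np.fft.fft2(ref)
    out = np.zeros_like(ref, dtype=float)
    for frame in frames:
        # Circular cross-correlation of ref with frame; its peak gives
        # the shift that realigns frame with ref.
        xcorr = np.fft.ifft2(F_ref * np.conj(np.fft.fft2(frame)))
        dy, dx = np.unravel_index(np.argmax(np.abs(xcorr)), xcorr.shape)
        out += np.roll(frame, (dy, dx), axis=(0, 1))
    return out / len(frames)

# Noisy copies of a point-like target, each with a different tip-tilt shift.
rng = np.random.default_rng(4)
ref = np.zeros((32, 32))
ref[10, 12] = 1.0
frames = [np.roll(ref, (dy, dx), axis=(0, 1)) + 0.05 * rng.standard_normal((32, 32))
          for dy, dx in [(0, 0), (3, -2), (-1, 4)]]
stacked = shift_and_add(frames)
peak = tuple(int(i) for i in np.unravel_index(np.argmax(stacked), stacked.shape))
print(peak)  # the stacked image peaks at the target position (10, 12)
```

Lucky imaging extends the same loop by keeping only the sharpest frames before averaging.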

  12. IR sensors and imagers in networked operations

    Science.gov (United States)

    Breiter, Rainer; Cabanski, Wolfgang

    2005-05-01

    "Network-centric Warfare" is a common slogan describing an overall concept of networked operation of sensors, information and weapons to gain command and control superiority. Referring to IR sensors, integration and fusion of different channels like day/night or SAR images or the ability to spread image data among various users are typical requirements. Looking at concrete implementations, the German Army future infantryman program IdZ is an example where a group of ten soldiers builds a unit, with every soldier equipped with a personal digital assistant (PDA) for information display and a day photo camera, and a high performance thermal imager for every unit. The challenge in networked operation of such a unit is bringing the information together and distributing it over a capable network. AIM's thermal reconnaissance and targeting sight HuntIR, which was selected for the IdZ program, accordingly provides these capabilities via an optional wireless interface. Besides the global approach of Network-centric Warfare, network technology can also be an interesting solution for digital image data distribution and signal processing behind the FPA, replacing analog video networks or specific point to point interfaces. The resulting architecture can provide capabilities of data fusion from e.g. IR dual-band or IR multicolor sensors. AIM has participated in a German/UK collaboration program to produce a demonstrator for day/IR video distribution via Gigabit Ethernet for vehicle applications. In this study Ethernet technology was chosen for network implementation and a set of electronics was developed for capturing video data of IR and day imagers and Gigabit Ethernet video distribution. The demonstrator setup follows the requirements of current and future vehicles having a set of day and night imager cameras and a crew station with several members. Replacing the analog video path by a digital video network also makes it easy to implement embedded training by simply feeding the network with

  13. Wide range pressure sensor based on a piezoelectric bimorph microcantilever

    OpenAIRE

    MORTET, Vincent; PETERSEN, Rainer; HAENEN, Ken; D'OLIESLAEGER, Marc

    2006-01-01

    Since the development of the atomic force microscope, interest in microfabricated cantilevers has grown. Cantilevers are excellent micromechanical sensors. In this work, we use a commercially available piezoelectric bimorph cantilever as pressure and temperature sensor. The piezoelectric layer acts as both sensor and actuator. The sensor detects the change in the resonance frequencies due to the drag force of the surrounding gas. The frequency shift of the resonant modes is measured as a func...

  14. SENSOR CORRECTION AND RADIOMETRIC CALIBRATION OF A 6-BAND MULTISPECTRAL IMAGING SENSOR FOR UAV REMOTE SENSING

    Directory of Open Access Journals (Sweden)

    J. Kelcey

    2012-07-01

    Full Text Available The increased availability of unmanned aerial vehicles (UAVs) has resulted in their frequent adoption for a growing range of remote sensing tasks which include precision agriculture, vegetation surveying and fine-scale topographic mapping. The development and utilisation of UAV platforms requires broad technical skills covering the three major facets of remote sensing: data acquisition, data post-processing, and image analysis. In this study, UAV image data acquired by a miniature 6-band multispectral imaging sensor was corrected and calibrated using practical image-based data post-processing techniques. Data correction techniques included dark offset subtraction to reduce sensor noise, flat-field derived per-pixel look-up-tables to correct vignetting, and implementation of the Brown-Conrady model to correct lens distortion. Radiometric calibration was conducted with an image-based empirical line model using pseudo-invariant features (PIFs). Sensor corrections and radiometric calibration improve the quality of the data, aiding quantitative analysis and generating consistency with other calibrated datasets.
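
The correction chain described, dark offset subtraction, flat-field division, then an empirical line fit through pseudo-invariant features, can be sketched as follows (illustrative values, not the paper's data):

```python
import numpy as np

def correct_and_calibrate(raw, dark, flat, pif_dn, pif_reflectance):
    """Dark-offset subtraction, per-pixel flat-field division, then an
    empirical line model (linear gain/offset) fitted through
    pseudo-invariant features (PIFs)."""
    corrected = (raw - dark) / flat                        # sensor corrections
    gain, offset = np.polyfit(pif_dn, pif_reflectance, 1)  # empirical line fit
    return gain * corrected + offset

# Illustrative numbers only: a 2x2 image, a uniform dark frame, a
# vignetting map, and two PIF targets of known reflectance.
raw = np.array([[120.0, 200.0], [160.0, 220.0]])
dark = np.full_like(raw, 20.0)
flat = np.array([[1.0, 0.8], [0.9, 1.1]])
refl = correct_and_calibrate(raw, dark, flat,
                             pif_dn=np.array([100.0, 300.0]),
                             pif_reflectance=np.array([0.05, 0.45]))
```

With two PIFs the empirical line is exactly determined; with more, `np.polyfit` gives the least-squares line. (The Brown-Conrady lens-distortion step is geometric and omitted here.)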

  15. Vertically integrated thin film color sensor arrays for imaging applications.

    Science.gov (United States)

    Knipp, Dietmar; Street, Robert A; Stiebig, Helmut; Krause, Mathias; Lu, Jeng-Ping; Ready, Steve; Ho, Jackson

    2006-04-17

    Large area color sensor arrays based on vertically integrated thin-film sensors were realized. The complete color information of each color pixel is detected at the same position of the sensor array without using optical filters. The sensor arrays consist of amorphous silicon thin film color sensors integrated on top of amorphous silicon readout transistors. The spectral sensitivity of the sensors is controlled by the applied bias voltage. The operating principle of the color sensor arrays is described. Furthermore, the image quality and the pixel cross talk of the sensor arrays are analyzed by measurements of the line spread function and the modulation transfer function.

  16. Calibration methods of force sensors in the micro-Newton range

    Science.gov (United States)

    Nafari, Alexandra; Alavian Ghavanini, Farzan; Bring, Martin; Svensson, Krister; Enoksson, Peter

    2007-10-01

    A micromachined capacitive force sensor operating in the micro-Newton range has been calibrated using both dynamic and static methods. Both calibrations are non-destructive, accurate and traceable to Système International (SI) fundamental units. The dynamic calibration is a differential mass loading resonant method where the resonance frequency with and without an added mass is measured. This gives enough information to compute the spring constant. In this paper, we evaluate the resonant mass loading method for more complex MEMS devices. Analytical calculations and finite element analysis have been performed to investigate the dynamic properties of the sensor, e.g. modal interference. The frequency response was measured with the third harmonic method where the third harmonic of the current through the sensor was measured. To detect and analyse the resonance mode of the structure during excitation, a scanning laser Doppler vibrometer was used. Two designs of a capacitive nanoindenter force sensor with flexure-type springs have been evaluated using these methods. The quality of the resonant calibration method has been tested using static mass loading in combination with transmission electron microscopy imaging of the sensor displacement. This shows that the resonant method can be extended to calibrate more complex structures than plain cantilevers. Both calibration methods used are traceable to SI fundamental units as they are based on masses weighed on a calibrated scale. The masses used do not need to be fixed or glued in any way, making the calibration non-destructive.
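
The differential mass-loading method rests on measuring the resonance frequency before (f0) and after (f1) adding a known mass Δm; since f = √(k/m_eff)/(2π) in both cases, the spring constant follows without knowing the effective mass (sketch with made-up values):

```python
import math

def spring_constant(f0, f1, delta_m):
    """Differential mass loading: resonances at f0 (bare cantilever) and
    f1 (with added mass delta_m) give the spring constant without
    knowing the effective mass, from f = sqrt(k/m_eff) / (2*pi):
    k = 4*pi^2 * delta_m / (1/f1^2 - 1/f0^2)."""
    return 4.0 * math.pi ** 2 * delta_m / (1.0 / f1 ** 2 - 1.0 / f0 ** 2)

# Self-consistency check with made-up values: k = 10 N/m, m_eff = 1e-9 kg.
k, m_eff, dm = 10.0, 1e-9, 0.5e-9
f0 = math.sqrt(k / m_eff) / (2.0 * math.pi)
f1 = math.sqrt(k / (m_eff + dm)) / (2.0 * math.pi)
print(spring_constant(f0, f1, dm))  # recovers k = 10 N/m
```

Traceability comes from Δm being weighed on a calibrated scale, exactly as the abstract notes.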

  17. 3D CAPTURING PERFORMANCES OF LOW-COST RANGE SENSORS FOR MASS-MARKET APPLICATIONS

    Directory of Open Access Journals (Sweden)

    G. Guidi

    2016-06-01

    Full Text Available Since the advent of the first Kinect as motion controller device for the Microsoft XBOX platform (November 2010), several similar active and low-cost range sensing devices have been introduced on the mass-market for several purposes, including gesture based interfaces, 3D multimedia interaction, robot navigation, finger tracking, 3D body scanning for garment design and proximity sensors for automotive. However, given their capability to generate a real time stream of range images, these have also been used in some projects as general purpose range devices, with performances that for some applications might be satisfying. This paper shows the working principle of the various devices, analyzing them in terms of systematic errors and random errors to explore their applicability in standard 3D capturing problems. Five actual devices have been tested featuring three different technologies: i) Kinect V1 by Microsoft, Structure Sensor by Occipital, and Xtion PRO by ASUS, all based on different implementations of the Primesense sensor; ii) F200 by Intel/Creative, implementing the Realsense pattern projection technology; iii) Kinect V2 by Microsoft, equipped with the Canesta TOF camera. A critical analysis of the results tries first of all to compare the devices, and secondly to identify the range of applications for which they could actually work as a viable solution.

  18. 3D Capturing Performances of Low-Cost Range Sensors for Mass-Market Applications

    Science.gov (United States)

    Guidi, G.; Gonizzi, S.; Micoli, L.

    2016-06-01

    Since the advent of the first Kinect as motion controller device for the Microsoft XBOX platform (November 2010), several similar active and low-cost range sensing devices have been introduced on the mass-market for several purposes, including gesture based interfaces, 3D multimedia interaction, robot navigation, finger tracking, 3D body scanning for garment design and proximity sensors for automotive. However, given their capability to generate a real time stream of range images, these have also been used in some projects as general purpose range devices, with performances that for some applications might be satisfying. This paper shows the working principle of the various devices, analyzing them in terms of systematic errors and random errors to explore their applicability in standard 3D capturing problems. Five actual devices have been tested featuring three different technologies: i) Kinect V1 by Microsoft, Structure Sensor by Occipital, and Xtion PRO by ASUS, all based on different implementations of the Primesense sensor; ii) F200 by Intel/Creative, implementing the Realsense pattern projection technology; iii) Kinect V2 by Microsoft, equipped with the Canesta TOF camera. A critical analysis of the results tries first of all to compare the devices, and secondly to identify the range of applications for which they could actually work as a viable solution.
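
For a flat test target, the systematic and random error components analyzed in the paper can be separated from repeated captures: random error as the per-pixel standard deviation over time, systematic error as the residual of the time-averaged map from a best-fit plane (a generic procedure, not the authors' exact protocol):

```python
import numpy as np

def plane_fit_errors(depth_maps):
    """Random error: mean per-pixel std across repeated captures of a
    flat target. Systematic error: RMS residual of the time-averaged
    map from a least-squares best-fit plane."""
    stack = np.asarray(depth_maps, dtype=float)
    random_err = stack.std(axis=0, ddof=1).mean()
    mean_map = stack.mean(axis=0)
    h, w = mean_map.shape
    yy, xx = np.mgrid[0:h, 0:w]
    A = np.column_stack([xx.ravel(), yy.ravel(), np.ones(h * w)])
    coef, *_ = np.linalg.lstsq(A, mean_map.ravel(), rcond=None)
    residual = mean_map.ravel() - A @ coef
    return random_err, float(np.sqrt(np.mean(residual ** 2)))

# 50 simulated captures of a tilted wall with 1 mm of sensor noise.
rng = np.random.default_rng(5)
true_plane = 1000.0 + 0.5 * np.arange(20)[None, :]   # depth in mm
captures = [true_plane + rng.standard_normal((20, 20)) for _ in range(50)]
rand_e, sys_e = plane_fit_errors(captures)
# rand_e recovers the simulated ~1 mm noise; sys_e is near zero because
# the underlying surface really is a plane.
```

A real depth camera would show a nonzero systematic term (e.g. the depth-distortion "bowl" of pattern-projection sensors) that averaging cannot remove.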

  19. Hyperspectral Foveated Imaging Sensor for Objects Identification and Tracking Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Optical tracking and identification sensors have numerous NASA and non-NASA applications. For example, airborne or spaceborne imaging sensors are used to visualize...

  20. X-ray examination apparatus with an imaging arrangement having a plurality of image sensors

    NARCIS (Netherlands)

    Slump, Cornelis H.

    1995-01-01

    An imaging arrangement including a multi-sensor for use in an x-ray examination apparatus is described that combines a plurality of partially overlapping sub-images, resulting in an increased effective sensor area when compared to a single sensor-image. Thus an imaging arrangement is provided

  1. X-ray examination apparatus with an imaging arrangement having a plurality of image sensors

    NARCIS (Netherlands)

    Slump, Cornelis H.; Harms, M.O.

    1999-01-01

    An imaging arrangement including a multi-sensor for use in an x-ray examination apparatus is described that combines a plurality of partially overlapping sub-images, resulting in an increased effective sensor area when compared to a single sensor-image. Thus an imaging arrangement is provided

  2. Wide range pressure sensor based on a piezoelectric bimorph microcantilever

    Science.gov (United States)

    Mortet, V.; Petersen, R.; Haenen, K.; D'Olieslaeger, M.

    2006-03-01

    Since the development of the atomic force microscope, interest in microfabricated cantilevers has grown. Cantilevers are excellent micromechanical sensors. In this work, we use a commercially available piezoelectric bimorph cantilever as pressure and temperature sensor. The piezoelectric layer acts as both sensor and actuator. The sensor detects the change in the resonance frequencies due to the drag force of the surrounding gas. The frequency shift of the resonant modes is measured as a function of the pressure and the temperature. The results show that both pressure and temperature can be measured simultaneously using the piezoelectric bimorph cantilever's resonant frequencies.

  3. The CAOS camera platform: ushering in a paradigm change in extreme dynamic range imager design

    Science.gov (United States)

    Riza, Nabeel A.

    2017-02-01

    Multi-pixel imaging devices such as CCD, CMOS and Focal Plane Array (FPA) photo-sensors dominate the imaging world. These Photo-Detector Array (PDA) devices certainly have their merits including increasingly high pixel counts and shrinking pixel sizes; nevertheless, they are also hampered by limitations in instantaneous dynamic range, inter-pixel crosstalk, quantum full well capacity, signal-to-noise ratio, sensitivity, spectral flexibility, and in some cases, imager response time. Recently invented is the Coded Access Optical Sensor (CAOS) camera platform, which works in unison with current PDA technology to counter fundamental limitations of PDA-based imagers while providing high enough imaging spatial resolution and pixel counts. Engineering the CAOS camera platform using, for example, the Texas Instruments (TI) Digital Micromirror Device (DMD) ushers in a paradigm change in advanced imager design, particularly for extreme dynamic range applications.
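
The code-division idea behind CAOS can be illustrated in miniature: each agile pixel is modulated with its own orthogonal code (here ±1 Hadamard codes, realizable on a DMD as the difference of two on/off exposures), the coded contributions sum on one point photodetector, and each pixel's irradiance is recovered by correlation. This is a conceptual sketch, not Riza's implementation:

```python
import numpy as np

def hadamard(n):
    """Sylvester construction of an n x n Hadamard matrix (n a power of 2)."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

# Four "agile pixels" spanning a wide dynamic range, each tagged with its
# own +/-1 code row; the detector sees only the coded sum at each step.
pixels = np.array([3.0, 0.001, 150.0, 7.5])
H = hadamard(4)
measurements = H @ pixels            # one point-detector reading per code step
recovered = H.T @ measurements / 4   # correlation decode: H^T H = 4 I
assert np.allclose(recovered, pixels)
```

Because every pixel is read through the same high-dynamic-range point detector and separated electronically, the weak 0.001 pixel survives alongside the 150 pixel, which is the platform's central claim.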

  4. Imaging sensor constellation for tomographic chemical cloud mapping.

    Science.gov (United States)

    Cosofret, Bogdan R; Konno, Daisei; Faghfouri, Aram; Kindle, Harry S; Gittins, Christopher M; Finson, Michael L; Janov, Tracy E; Levreault, Mark J; Miyashiro, Rex K; Marinelli, William J

    2009-04-01

    A sensor constellation capable of determining the location and detailed concentration distribution of chemical warfare agent simulant clouds has been developed and demonstrated on government test ranges. The constellation is based on the use of standoff passive multispectral infrared imaging sensors to make column density measurements through the chemical cloud from two or more locations around its periphery. A computed tomography inversion method is employed to produce a 3D concentration profile of the cloud from the 2D line density measurements. We discuss the theoretical basis of the approach and present results of recent field experiments where controlled releases of chemical warfare agent simulants were simultaneously viewed by three chemical imaging sensors. Systematic investigations of the algorithm using synthetic data indicate that for complex functions, 3D reconstruction errors are less than 20% even in the case of a limited three-sensor measurement network. Field data results demonstrate the capability of the constellation to determine 3D concentration profiles that account for ~86% of the total known mass of material released.
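
The 2D-to-3D inversion can be illustrated with a toy planar analogue: recover a gridded concentration field from line-density sums along two orthogonal view directions via least squares. With only two views the solution is non-unique (hence the constellation's three or more sensors), so the sketch only guarantees consistency with the measured line densities:

```python
import numpy as np

def reconstruct(row_sums, col_sums, shape):
    """Least-squares inversion of line-density (column-density) sums
    measured along two orthogonal viewing directions."""
    h, w = shape
    n = h * w
    A = np.zeros((h + w, n))
    for i in range(h):
        A[i, i * w:(i + 1) * w] = 1.0          # integral along row i
    for j in range(w):
        A[h + j, j::w] = 1.0                   # integral along column j
    b = np.concatenate([row_sums, col_sums])
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x.reshape(shape)

# A small synthetic "cloud" and its two orthogonal line-density profiles.
cloud = np.zeros((4, 4))
cloud[1, 2] = 5.0
cloud[2, 2] = 3.0
rec = reconstruct(cloud.sum(axis=1), cloud.sum(axis=0), cloud.shape)
# The reconstruction reproduces every measured line density, though it
# spreads mass among all fields consistent with just two views.
assert np.allclose(rec.sum(axis=1), cloud.sum(axis=1))
assert np.allclose(rec.sum(axis=0), cloud.sum(axis=0))
```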

  5. Development of integrated semiconductor optical sensors for functional brain imaging

    Science.gov (United States)

    Lee, Thomas T.

    Optical imaging of neural activity is a widely accepted technique for imaging brain function in the field of neuroscience research, and has been used to study the cerebral cortex in vivo for over two decades. Maps of brain activity are obtained by monitoring intensity changes in back-scattered light, called Intrinsic Optical Signals (IOS), that correspond to fluctuations in blood oxygenation and volume associated with neural activity. Current imaging systems typically employ bench-top equipment including lamps and CCD cameras to study animals using visible light. Such systems require the use of anesthetized or immobilized subjects with craniotomies, which imposes limitations on the behavioral range and duration of studies. The ultimate goal of this work is to overcome these limitations by developing a single-chip semiconductor sensor using arrays of sources and detectors operating at near-infrared (NIR) wavelengths. A single-chip implementation, combined with wireless telemetry, will eliminate the need for immobilization or anesthesia of subjects and allow in vivo studies of free behavior. NIR light offers additional advantages because it experiences less absorption in animal tissue than visible light, which allows for imaging through superficial tissues. This, in turn, reduces or eliminates the need for traumatic surgery and enables long-term brain-mapping studies in freely-behaving animals. This dissertation concentrates on key engineering challenges of implementing the sensor. This work shows the feasibility of using a GaAs-based array of vertical-cavity surface emitting lasers (VCSELs) and PIN photodiodes for IOS imaging. I begin with in-vivo studies of IOS imaging through the skull in mice, and use these results along with computer simulations to establish minimum performance requirements for light sources and detectors. I also evaluate the performance of a current commercial VCSEL for IOS imaging, and conclude with a proposed prototype sensor.

  6. New fabrication techniques for high dynamic range tunneling sensors

    Science.gov (United States)

    Chang, David T.; Stratton, Fred P.; Kubena, Randall L.; Vickers-Kirby, Deborah J.; Joyce, Richard J.; Schimert, Thomas R.; Gooch, Roland W.

    2000-08-01

We have developed high dynamic range (10⁵–10⁶ g's) tunneling accelerometers [1,2] that may be ideal for smart munitions applications by employing both surface and bulk micromachining processing techniques. The highly miniaturized surface-micromachined devices can be manufactured at very low cost and integrated on chip with the control electronics. Bulk-micromachined devices with Si as the cantilever material should have reduced long-term bias drift as well as better stability at higher temperatures. Fully integrated sensors may provide advantages in minimizing microphonics for high-g applications. Previously, we described initial test results using electrostatic forces generated by a self-test electrode located under a Au cantilever [3]. In this paper, we describe more recent testing of Ni and Au cantilever devices on a shaker table using a novel, low input voltage (5 V) servo controller on both printed wiring board and surface-mount control circuitry. In addition, we report our initial test results for devices packaged using a low-temperature wafer-level vacuum packaging technique for low-cost manufacturing.

  7. Optical Sensor for Diverse Organic Vapors at ppm Concentration Ranges

    Directory of Open Access Journals (Sweden)

    Dora M. Paolucci

    2011-03-01

Full Text Available A broadly responsive optical organic vapor sensor is described that responds to low concentrations of organic vapors without significant interference from water vapor. Responses to several classes of organic vapors are highlighted, and trends within classes are presented. The relationship between molecular properties (vapor pressure, boiling point, polarizability, and refractive index) and sensor response is discussed.

  8. Optical fiber voltage sensors for broad temperature ranges

    Science.gov (United States)

    Rose, A. H.; Day, G. W.

    1992-01-01

    We describe the development of an optical fiber ac voltage sensor for aircraft and spacecraft applications. Among the most difficult specifications to meet for this application is a temperature stability of +/- 1 percent from -65 C to +125 C. This stability requires a careful selection of materials, components, and optical configuration with further compensation using an optical-fiber temperature sensor located near the sensing element. The sensor is a polarimetric design, based on the linear electro-optic effect in bulk Bi4Ge3O12. The temperature sensor is also polarimetric, based on the temperature dependence of the birefringence of bulk SiO2. The temperature sensor output is used to automatically adjust the calibration of the instrument.

  9. Shack-Hartmann wavefront sensor with large dynamic range by adaptive spot search method.

    Science.gov (United States)

    Shinto, Hironobu; Saita, Yusuke; Nomura, Takanori

    2016-07-10

A Shack-Hartmann wavefront sensor (SHWFS) that consists of a microlens array and an image sensor has been used to measure the wavefront aberrations of human eyes. However, a conventional SHWFS has a finite dynamic range depending on the diameter of each microlens, and the dynamic range cannot be easily expanded without a decrease of the spatial resolution. In this study, an adaptive spot search method to expand the dynamic range of an SHWFS is proposed. In the proposed method, spots are searched with the help of their approximate displacements, measured with low spatial resolution and large dynamic range. With the proposed method, a wavefront can be correctly measured even if a spot moves beyond its detection area. The adaptive spot search method is realized by using a special microlens array that generates both spots and discriminable patterns. The proposed method enables expanding the dynamic range of an SHWFS with a single shot and short processing time. The performance of the proposed method is compared with that of a conventional SHWFS by optical experiments. Furthermore, the dynamic range of the proposed method is quantitatively evaluated by numerical simulations.
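
    The slope-from-displacement principle and the dynamic-range limit described above can be sketched as follows; lenslet pitch, focal length, and spot displacements are illustrative assumptions, not values from the paper:

    ```python
    import numpy as np

    # Hypothetical lenslet parameters (assumptions, not from the record).
    pitch = 150e-6                 # lenslet diameter (m)
    f = 5e-3                       # lenslet focal length (m)

    # Spot displacements behind three lenslets (m), e.g. from centroiding.
    dx = np.array([2e-6, 10e-6, 100e-6])

    # Local wavefront slope under each lenslet (rad, small-angle approximation).
    slopes = dx / f

    # Conventional dynamic range: each spot must stay inside its own lenslet
    # cell, so |dx| < pitch/2, i.e. |slope| < pitch / (2 * f).
    max_slope = pitch / (2 * f)
    for s in slopes:
        print(s, "measurable" if abs(s) < max_slope else "needs spot search")
    ```

    The third spot has left its own cell, which is exactly the case the adaptive spot search is designed to recover.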

  10. Characteristics of different frequency ranges in scanning electron microscope images

    Energy Technology Data Exchange (ETDEWEB)

    Sim, K. S., E-mail: kssim@mmu.edu.my; Nia, M. E.; Tan, T. L.; Tso, C. P.; Ee, C. S. [Faculty of Engineering and Technology, Multimedia University, 75450 Melaka (Malaysia)

    2015-07-22

We demonstrate a new approach to characterizing the frequency range of general scanning electron microscope (SEM) images. First, pure frequency images are generated from low to high frequency, and then each type of frequency image is magnified. By comparing the edge percentage of the SEM image to the self-generated frequency images, we can define the frequency ranges of the SEM images. Characterizing the frequency ranges of SEM images benefits their further processing and analysis, such as noise filtering and contrast enhancement.

  11. Extended-Range Passive RFID and Sensor Tags

    Science.gov (United States)

    Fink, Patrick W.; Kennedy, Timothy F.; Lin, Gregory Y.; Barton, Richard

    2012-01-01

Extended-range passive radio-frequency identification (RFID) tags and related sensor tags are undergoing development. A tag of this type incorporates a retroreflective antenna array, so that it reflects significantly more signal power back toward an interrogating radio transceiver than does a comparable passive RFID tag of prior design, which does not incorporate a retroreflective antenna array. Therefore, for a given amount of power radiated by the transmitter in the interrogating transceiver, a tag of this type can be interrogated at a distance greater than that of the comparable passive RFID or sensor tag of prior design. The retroreflective antenna array is, more specifically, a Van Atta array, named after its inventor and first published in a patent issued in 1959. In its simplest form, a Van Atta array comprises two antenna elements connected by a transmission line so that the signal received by each antenna element is reradiated by the other antenna element (see Figure 1). The phase relationships among the received and reradiated signals are such as to produce constructive interference of the reradiated signals; that is, to concentrate the reradiated signal power in a direction back toward the source. Hence, an RFID tag equipped with a Van Atta antenna array automatically tracks the interrogating transceiver. The effective gain of a Van Atta array is the same as that of a traditional phased antenna array having the same number of antenna elements. Additional pairs of antenna elements connected by equal-length transmission lines can be incorporated into a Van Atta array to increase its directionality. Like some RFID tags heretofore commercially available, an RFID or sensor tag of the present developmental type includes one-port surface-acoustic-wave (SAW) devices.
In simplified terms, the mode of operation of a basic one-port SAW device as used heretofore in an RFID device is the following: An interrogating radio signal is converted, at an input end, from
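
    The retroreflective behavior of the Van Atta pairing described above can be checked numerically: routing each element's received signal to its mirrored partner makes the reradiated phases cancel exactly in the direction of the interrogator. A minimal sketch; element count, spacing, and frequency are assumptions:

    ```python
    import numpy as np

    # Toy linear Van Atta array: N elements, pairs (n, N-1-n) joined by
    # equal-length lines, so each element reradiates its partner's signal.
    N = 8
    wavelength = 0.03                                   # 10 GHz (assumed)
    k = 2 * np.pi / wavelength
    x = (np.arange(N) - (N - 1) / 2) * wavelength / 2   # half-wave spacing, centered

    def reradiated_power(theta_in, theta_out):
        # Element n receives phase k*x[n]*sin(theta_in); that signal is
        # reradiated from the mirrored position -x[n] toward theta_out.
        phase = k * x * np.sin(theta_in) - k * x * np.sin(theta_out)
        return abs(np.exp(1j * phase).sum()) ** 2

    theta0 = np.deg2rad(25)                  # interrogator direction
    back = reradiated_power(theta0, theta0)  # back toward the source: N**2 = 64
    off = reradiated_power(theta0, np.deg2rad(-10))
    print(back, off)
    ```

    The fully constructive sum back toward the source holds for any incidence angle, which is why the tag "automatically tracks" the interrogating transceiver.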

  12. ISAR imaging using the instantaneous range instantaneous Doppler method

    CSIR Research Space (South Africa)

    Wazna, TM

    2015-10-01

    Full Text Available In Inverse Synthetic Aperture Radar (ISAR) imaging, the Range Instantaneous Doppler (RID) method is used to compensate for the nonuniform rotational motion of the target that degrades the Doppler resolution of the ISAR image. The Instantaneous Range...

  13. Modeling and simulation of TDI CMOS image sensors

    Science.gov (United States)

    Nie, Kai-ming; Yao, Su-ying; Xu, Jiang-tao; Gao, Jing

    2013-09-01

In this paper, a mathematical model of TDI CMOS image sensors was established at the behavioral level in MATLAB, based on the principle of a TDI CMOS image sensor using temporal oversampling with a rolling shutter in the along-track direction. The geometric perspective and light energy transmission relationships between the scene and the image on the sensor are included in the proposed model. A graphical user interface (GUI) for the model was also built. A high-resolution satellite image was used to model the virtual scene being photographed, and the effectiveness of the proposed model was verified by computer simulations based on that image. In order to guide the design of TDI CMOS image sensors, the impacts of several sensor parameters, including pixel pitch, pixel photosensitive size, and integration time, on sensor performance were studied through the proposed model. These impacts were quantified by the sensor's along-track modulation transfer function (MTF), which was calculated by the slanted-edge method. The simulation results indicated that a TDI CMOS image sensor achieves better performance with a smaller pixel photosensitive size and a shorter integration time. The proposed model is useful in the process of researching and developing a TDI CMOS image sensor.
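
    The finding that a smaller photosensitive size improves the along-track MTF follows from the pixel-aperture MTF: a rectangular aperture of width w averages the scene over w, giving MTF(f) = |sinc(f·w)|. A minimal sketch with assumed geometry (not the paper's parameters):

    ```python
    import numpy as np

    # Pixel-aperture MTF for a rectangular photosensitive area of width w.
    # np.sinc is the normalized sinc: np.sinc(x) = sin(pi*x) / (pi*x).
    pitch = 10e-6                  # pixel pitch (m), assumed
    nyquist = 1 / (2 * pitch)      # sampling Nyquist frequency (cycles/m)

    for w in (10e-6, 7e-6, 5e-6):  # photosensitive widths (m), assumed
        mtf = abs(np.sinc(nyquist * w))
        print(f"w = {w * 1e6:.0f} um -> MTF at Nyquist = {mtf:.3f}")
    ```

    Shortening the integration time reduces an analogous along-track smear term, consistent with the record's conclusion that smaller photosensitive size and shorter integration time both help.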

  14. High dynamic range electric field sensor for electromagnetic pulse detection

    National Research Council Canada - National Science Library

    Lin, Che-Yun; Wang, Alan X; Lee, Beom Suk; Zhang, Xingyu; Chen, Ray T

    2011-01-01

    ...) polymer Y-fed directional coupler for electromagnetic wave detection. This electrode-less, all optical, wideband electrical field sensor is fabricated using standard processing for E-O polymer photonic devices...

  15. Multiple-Event, Single-Photon Counting Imaging Sensor

    Science.gov (United States)

    Zheng, Xinyu; Cunningham, Thomas J.; Sun, Chao; Wang, Kang L.

    2011-01-01

The single-photon counting imaging sensor is typically an array of silicon Geiger-mode avalanche photodiodes that are monolithically integrated with CMOS (complementary metal oxide semiconductor) readout, signal processing, and addressing circuits located in each pixel and the peripheral area of the chip. The major problem is its single-event method of registering photon counts. A single-event single-photon counting imaging array allows registration of at most one photon count in each of its pixels during a frame time, i.e., the interval between two successive pixel reset operations. Since the frame time cannot be made arbitrarily short, this leads to very low dynamic range and makes the sensor useful only in very low flux environments. The second problem of the prior technique is a limited fill factor resulting from consumption of chip area by the monolithically integrated CMOS readout in pixels. The resulting low photon collection efficiency substantially undermines any benefit gained from the very sensitive single-photon counting detection. The single-photon counting imaging sensor developed in this work has a novel multiple-event architecture, which allows each of its pixels to register one million or more photon-counting events during a frame time. Because of the consequently boosted dynamic range, the imaging array is capable of performing single-photon counting from ultra-low-light through high-flux environments. On the other hand, since the multiple-event architecture is implemented in a hybrid structure, back-illumination and a close-to-unity fill factor can be realized, and maximized quantum efficiency can also be achieved in the detector array.

  16. Scannerless laser range imaging using loss modulation

    Energy Technology Data Exchange (ETDEWEB)

    Sandusky, John V [Albuquerque, NM

    2011-08-09

A scannerless 3-D imaging apparatus is disclosed which utilizes an amplitude-modulated cw light source to illuminate a field of view containing a target of interest. Backscattered light from the target is passed through one or more loss modulators which are modulated at the same frequency as the light source, but with a phase delay δ which can be fixed or variable. The backscattered light is demodulated by the loss modulator and detected with a CCD, CMOS or focal plane array (FPA) detector to construct a 3-D image of the target. The scannerless 3-D imaging apparatus, which can operate in the eye-safe wavelength region 1.4-1.7 μm and which can be constructed as a flash LADAR, has applications for vehicle collision avoidance, autonomous rendezvous and docking, robotic vision, industrial inspection and measurement, 3-D cameras, and facial recognition.
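
    The principle behind such amplitude-modulated range imaging is the standard phase-to-range relation R = c·φ/(4π·f_mod); a minimal sketch with an assumed modulation frequency, not the apparatus's actual parameters:

    ```python
    import numpy as np

    C = 299_792_458.0      # speed of light (m/s)
    f_mod = 10e6           # modulation frequency (Hz), assumed

    # Round-trip phase of the amplitude envelope returning from range R:
    # phi = 2*pi*f_mod*(2R/C), so R = C*phi / (4*pi*f_mod).
    def range_from_phase(phi):
        return C * phi / (4 * np.pi * f_mod)

    # Phase wraps at 2*pi, giving an unambiguous range of C / (2*f_mod).
    R_max = C / (2 * f_mod)                 # about 15 m at 10 MHz
    print(range_from_phase(np.pi), R_max)   # pi maps to half the interval
    ```

    Raising f_mod improves range resolution but shrinks the unambiguous interval, a standard trade-off in this class of sensors.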

  17. Displacement Damage Effects in Pinned Photodiode CMOS Image Sensors

    OpenAIRE

    Virmontois, Cédric; Goiffon, Vincent; Corbière, Franck; Magnan, Pierre; Girard, Sylvain; Bardoux, Alain

    2012-01-01

This paper investigates the effects of displacement damage in Pinned Photodiode (PPD) CMOS Image Sensors (CIS) using proton and neutron irradiations. The displacement damage dose (DDD) ranges from 12 TeV/g to 1.2×10⁶ TeV/g. Particle fluences up to 5×10¹⁴ n·cm⁻² are investigated to observe electro-optic degradation in harsh environments. The dark current is also investigated, and it would appear that it is possible to use dark current spectroscopy in PPD CIS. The dark current random telegr...

  18. Virtual View Image over Wireless Visual Sensor Network

    Directory of Open Access Journals (Sweden)

    Gamantyo Hendrantoro

    2011-12-01

Full Text Available In general, visual sensors are applied to build virtual view images. When the number of visual sensors increases, the quantity and quality of the information improve. However, generating view images is a challenging task in a Wireless Visual Sensor Network environment due to energy restrictions, computational complexity, and bandwidth limitations. Hence this paper presents a new method of virtual view image generation from selected cameras on a Wireless Visual Sensor Network. The aim of the paper is to meet bandwidth and energy limitations without reducing information quality. The experiment results showed that this method could minimize the number of transmitted images while retaining sufficient information.

  19. Special Sensor Microwave Imager/Sounder (SSMIS) Sensor Data Record (SDR) in netCDF

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Special Sensor Microwave Imager/Sounder (SSMIS) is a series of passive microwave conically scanning imagers and sounders onboard the DMSP satellites beginning...

  20. Enhanced Strain Measurement Range of an FBG Sensor Embedded in Seven-Wire Steel Strands

    Directory of Open Access Journals (Sweden)

    Jae-Min Kim

    2017-07-01

Full Text Available FBG sensors offer many advantages, such as insensitivity to electromagnetic waves, small size, high durability, and high sensitivity. However, their maximum strain measurement range is lower than the yield strain (about 1.0%) of steel strands when embedded in them. This study proposes a new FBG sensing technique in which an FBG sensor is recoated with polyimide and protected by a polyimide tube in an effort to enhance the maximum strain measurement range of FBG sensors embedded in strands. The validation test results showed that the proposed FBG sensing technique has a maximum strain measurement range of 1.73% on average, which is 1.73 times higher than the yield strain of the strands. It was confirmed that recoating the FBG sensor with polyimide and protecting the FBG sensor using a polyimide tube could effectively enhance the maximum strain measurement range of FBG sensors embedded in strands.

  1. Wavelength encoded fiber sensor for extreme temperature range

    Science.gov (United States)

    Barrera, D.; Finazzi, V.; Coviello, G.; Bueno, A.; Sales, S.; Pruneri, V.

    2010-09-01

We have successfully created Chemical Composition Gratings (CCGs) in two different types of optical fiber: standard telecommunications Germanium-doped fibers and photosensitive Germanium/Boron co-doped fibers. We have performed temperature cycles to analyze the sensing properties and the degradation or hysteresis of the CCG sensors. The results show that CCG sensors based on Germanium/Boron co-doped photosensitive fiber have an almost linear response and negligible hysteresis effects, with a response of almost 100°C/s.

  2. A 14-megapixel 36 x 24-mm2 image sensor

    Science.gov (United States)

    Meynants, Guy; Scheffer, Danny; Dierickx, Bart; Alaerts, Andre

    2004-06-01

We will present a 3044 x 4556 pixel CMOS image sensor with a pixel array of 36 x 24 mm², equal to the size of 35 mm film. Though primarily developed for digital photography, the compatibility of the device with standard optics for film cameras makes the device also attractive for machine vision applications as well as many scientific and high-resolution applications. The sensor makes use of a standard rolling-shutter 3-transistor active pixel in standard 0.35 μm CMOS technology. On-chip double sampling is used to reduce fixed pattern noise. The pixel is 8 μm in size, has a 60,000-electron full-well charge and a conversion gain of 18.5 μV/electron. The product of quantum efficiency and fill factor of the monochrome device is 40%. Temporal noise is 35 electrons, offering a dynamic range of 65.4 dB. Dark current is 4.2 mV/s at 30 °C. Fixed pattern noise is less than 1.5 mV RMS over the entire focal plane and less than 1 mV RMS in local windows of 32 x 32 pixels. The sensor is read out over 4 parallel outputs at 15 MHz each, offering 3.2 images/second. The device runs at 3.3 V and consumes 200 mW.

  3. FASTICA based denoising for single sensor Digital Cameras images

    OpenAIRE

    Shawetangi kala; Raj Kumar Sahu

    2012-01-01

    Digital color cameras use a single sensor equipped with a color filter array (CFA) to capture scenes in color. Since each sensor cell can record only one color value, the other two missing components at each position need to be interpolated. The color interpolation process is usually called color demosaicking (CDM). The quality of demosaicked images is degraded due to the sensor noise introduced during the image acquisition process. Many advanced denoising algorithms, which are designed for ...
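
    The color demosaicking (CDM) step described above can be illustrated with the simplest interpolator, bilinear demosaicking; a minimal numpy sketch in which the RGGB Bayer layout and the test image are assumptions, not details from the record:

    ```python
    import numpy as np

    def conv2(img, k):
        # Same-size 2-D filtering with zero padding (kernels here are
        # symmetric, so correlation and convolution coincide).
        ph, pw = k.shape[0] // 2, k.shape[1] // 2
        p = np.pad(img, ((ph, ph), (pw, pw)))
        out = np.zeros(img.shape)
        for i in range(k.shape[0]):
            for j in range(k.shape[1]):
                out += k[i, j] * p[i:i + img.shape[0], j:j + img.shape[1]]
        return out

    def bilinear_demosaic(raw):
        # RGGB Bayer layout assumed: R at (even, even), B at (odd, odd),
        # G elsewhere. Missing samples are averages of their neighbors.
        h, w = raw.shape
        r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1
        b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1
        g_mask = 1 - r_mask - b_mask
        k_rb = np.array([[.25, .5, .25], [.5, 1., .5], [.25, .5, .25]])
        k_g = np.array([[0., .25, 0.], [.25, 1., .25], [0., .25, 0.]])
        return np.dstack([conv2(raw * r_mask, k_rb),
                          conv2(raw * g_mask, k_g),
                          conv2(raw * b_mask, k_rb)])

    # Sanity check on a flat gray scene: away from the zero-padded borders,
    # every reconstructed channel should equal the input level.
    raw = np.full((8, 8), 100.0)
    rgb = bilinear_demosaic(raw)
    print(rgb[2:-2, 2:-2].min(), rgb[2:-2, 2:-2].max())   # 100.0 100.0
    ```

    Because each missing value is a neighborhood average, sensor noise spreads across channels during interpolation, which is why denoising tailored to CFA data (as in the record) differs from denoising full-color images.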

  4. High Dynamic Range Digital Imaging of Spacecraft

    Science.gov (United States)

    Karr, Brian A.; Chalmers, Alan; Debattista, Kurt

    2014-01-01

    The ability to capture engineering imagery with a wide degree of dynamic range during rocket launches is critical for post launch processing and analysis [USC03, NNC86]. Rocket launches often present an extreme range of lightness, particularly during night launches. Night launches present a two-fold problem: capturing detail of the vehicle and scene that is masked by darkness, while also capturing detail in the engine plume.

  5. Optimal Broadband Noise Matching to Inductive Sensors: Application to Magnetic Particle Imaging.

    Science.gov (United States)

    Zheng, Bo; Goodwill, Patrick W; Dixit, Neerav; Xiao, Di; Zhang, Wencong; Gunel, Beliz; Lu, Kuan; Scott, Greig C; Conolly, Steven M

    2017-10-01

    Inductive sensor-based measurement techniques are useful for a wide range of biomedical applications. However, optimizing the noise performance of these sensors is challenging at broadband frequencies, owing to the frequency-dependent reactance of the sensor. In this work, we describe the fundamental limits of noise performance and bandwidth for these sensors in combination with a low-noise amplifier. We also present three equivalent methods of noise matching to inductive sensors using transformer-like network topologies. Finally, we apply these techniques to improve the noise performance in magnetic particle imaging, a new molecular imaging modality with excellent detection sensitivity. Using a custom noise-matched amplifier, we experimentally demonstrate an 11-fold improvement in noise performance in a small animal magnetic particle imaging scanner.
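
    The transformer-style matching idea can be illustrated numerically: an amplifier's noise contribution is minimized at the source resistance R_opt = e_n/i_n, and an ideal transformer of turns ratio n scales the sensor impedance seen by the amplifier by n². A minimal sketch with assumed noise densities and sensor values, not figures from the paper:

    ```python
    import numpy as np

    # LNA noise densities (assumed, typical of a low-noise op-amp):
    e_n = 1e-9       # voltage noise, V/sqrt(Hz)
    i_n = 2e-12      # current noise, A/sqrt(Hz)
    R_opt = e_n / i_n            # optimum source resistance: 500 ohm

    # Inductive sensor reactance magnitude at the operating frequency (assumed):
    L, f = 10e-6, 1e6
    Z_s = 2 * np.pi * f * L      # about 63 ohm

    # An ideal transformer of turns ratio n presents n**2 * Z_s to the
    # amplifier; setting n**2 * Z_s = R_opt gives the noise-optimal ratio.
    n = np.sqrt(R_opt / Z_s)
    print(R_opt, Z_s, n)         # n is about 2.8 here
    ```

    With a broadband sensor the reactance, and hence the optimal ratio, changes with frequency, which is the core difficulty the paper's matching networks address.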

  6. A compact 3D omnidirectional range sensor of high resolution for robust reconstruction of environments.

    Science.gov (United States)

    Marani, Roberto; Renò, Vito; Nitti, Massimiliano; D'Orazio, Tiziana; Stella, Ettore

    2015-01-22

    In this paper, an accurate range sensor for the three-dimensional reconstruction of environments is designed and developed. Following the principles of laser profilometry, the device exploits a set of optical transmitters able to project a laser line on the environment. A high-resolution and high-frame-rate camera assisted by a telecentric lens collects the laser light reflected by a parabolic mirror, whose shape is designed ad hoc to achieve a maximum measurement error of 10 mm when the target is placed 3 m away from the laser source. Measurements are derived by means of an analytical model, whose parameters are estimated during a preliminary calibration phase. Geometrical parameters, analytical modeling and image processing steps are validated through several experiments, which indicate the capability of the proposed device to recover the shape of a target with high accuracy. Experimental measurements show Gaussian statistics, having standard deviation of 1.74 mm within the measurable range. Results prove that the presented range sensor is a good candidate for environmental inspections and measurements.

  8. Imaging Extracellular Protein Concentration with Nanoplasmonic Sensors

    OpenAIRE

    Byers, Jeff M.; Christodoulides, Joseph A.; Delehanty, James B.; Raghu, Deepa; Raphael, Marc P.

    2015-01-01

Extracellular protein concentrations and gradients cue a wide range of cellular responses, such as cell motility and division. Spatio-temporal quantification of these concentrations as produced by cells has proven challenging. As a result, artificial gradients must be introduced to the cell culture to correlate signal and response. Here we demonstrate a label-free nanoplasmonic imaging technique that can directly map protein concentrations as secreted by single cells in real time and which ...

  9. Visual Control of Robots Using Range Images

    Directory of Open Access Journals (Sweden)

    Fernando Torres

    2010-08-01

Full Text Available In recent years, 3D-vision systems based on the time-of-flight (ToF) principle have gained importance for obtaining 3D information from the workspace. In this paper, an analysis of the use of 3D ToF cameras to guide a robot arm is performed. To do so, an adaptive method for simultaneous visual servo control and camera calibration is presented. Using this method, a robot arm is guided by range information obtained from a ToF camera. Furthermore, the self-calibration method obtains the adequate integration time for the range camera in order to precisely determine the depth information.

  10. Testing accuracy of long-range ultrasonic sensors for olive tree canopy measurements.

    Science.gov (United States)

    Gamarra-Diezma, Juan Luis; Miranda-Fuentes, Antonio; Llorens, Jordi; Cuenca, Andrés; Blanco-Roldán, Gregorio L; Rodríguez-Lizana, Antonio

    2015-01-28

Ultrasonic sensors are often used to adjust spray volume by allowing the calculation of the crown volume of tree crops. The special conditions of the olive tree require the use of long-range sensors, which are less accurate and faster than the most commonly used sensors. The main objectives of the study were to determine the suitability of the sensor in terms of sound cone determination, angle errors, crosstalk errors and field measurements. Several laboratory tests were performed to check the suitability of a commercial long-range ultrasonic sensor: experimental determination of the sound cone diameter at several distances for several target materials; determination of how the angle of incidence of the sound wave on the target and the distance affect measurement accuracy for several materials; and determination of the importance of the errors due to interference between sensors for different sensor spacings and distances for two different materials. Furthermore, sensor accuracy was tested under real field conditions. The results show that the studied sensor is appropriate for olive trees: the sound cone is narrower for an olive tree than for the other studied materials, the olive tree canopy does not strongly influence sensor accuracy with respect to distance and angle, the interference errors are insignificant at high sensor spacings, and the sensor's field distance measurements were sufficiently accurate.
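
    Ultrasonic ranging itself reduces to a pulse-echo time-of-flight calculation; a minimal sketch in which the temperature model for the speed of sound is a standard approximation, not taken from the study:

    ```python
    # Pulse-echo ultrasonic ranging: the pulse travels to the target and back,
    # so distance = speed_of_sound * time_of_flight / 2. The speed of sound in
    # air is temperature dependent: c ~ 331.3 + 0.606 * T_celsius (m/s).
    def distance_m(tof_s: float, temp_c: float = 20.0) -> float:
        c = 331.3 + 0.606 * temp_c
        return c * tof_s / 2.0

    # A 29 ms echo at 20 C corresponds to roughly 5 m, a plausible
    # sensor-to-canopy distance for olive trees.
    print(distance_m(0.029))
    ```

    The long listening window needed for such distances is one reason long-range ultrasonic sensors respond differently from the short-range sensors usually used for canopy measurement.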

  11. Building accurate geometric models from abundant range imaging information

    Science.gov (United States)

    Diegert, Carl F.; Sackos, John T.; Nellums, Robert O.

    1997-08-01

We define two simple metrics for the accuracy of models built from range imaging information. We apply these metrics to a model built from a recent range image taken at the Laser Radar Development and Evaluation Facility, Eglin AFB, using a scannerless range imager (SRI) from Sandia National Laboratories. We also present graphical displays of the residual information produced as a byproduct of this measurement, and discuss mechanisms that these data suggest for further improvement in the performance of this already impressive SRI.

  12. Collaborative Image Coding and Transmission over Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Min Wu

    2007-01-01

Full Text Available Imaging sensors are able to provide intuitive visual information for quick recognition and decision-making. However, imaging sensors usually generate vast amounts of data. Therefore, processing and coding of image data collected in a sensor network for the purpose of energy-efficient transmission poses a significant technical challenge. In particular, multiple sensors may be collecting similar visual information simultaneously. We propose in this paper a novel collaborative image coding and transmission scheme to minimize the energy for data transmission. First, we apply a shape matching method to coarsely register images to find their maximal overlap and exploit the spatial correlation between images acquired from neighboring sensors. For a given image sequence, we transmit the background image only once. A lightweight and efficient background subtraction method is employed to detect targets, and only the target regions and their spatial locations are transmitted to the monitoring center. The whole image can then be reconstructed by fusing the background and the target images at their spatial locations. Experimental results show that the energy for image transmission can indeed be greatly reduced with collaborative image coding and transmission.
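
    The transmit-background-once scheme can be sketched with a simple thresholded background subtraction; the threshold, frame sizes, and function names are illustrative assumptions, not the paper's method in detail:

    ```python
    import numpy as np

    # Lightweight background subtraction: transmit the background frame once,
    # then send only pixels (and their bounding box) that differ from it.
    def extract_target(frame, background, threshold=25):
        mask = np.abs(frame.astype(int) - background.astype(int)) > threshold
        ys, xs = np.nonzero(mask)
        if ys.size == 0:
            return None                        # nothing moved: transmit nothing
        y0, y1 = ys.min(), ys.max() + 1
        x0, x1 = xs.min(), xs.max() + 1
        return (y0, x0), frame[y0:y1, x0:x1]   # location + target region

    def reconstruct(background, location, region):
        out = background.copy()
        y0, x0 = location
        out[y0:y0 + region.shape[0], x0:x0 + region.shape[1]] = region
        return out

    # Toy 8-bit frames: a flat background and a frame with a bright target.
    bg = np.zeros((64, 64), dtype=np.uint8)
    frame = bg.copy()
    frame[10:20, 30:40] = 200
    loc, region = extract_target(frame, bg)
    assert np.array_equal(reconstruct(bg, loc, region), frame)
    print(loc, region.shape)    # only 100 of 4096 pixels need transmitting
    ```

    Sending the 10x10 target patch instead of the full 64x64 frame is the kind of payload reduction the scheme relies on.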

  13. Wearable Wide-Range Strain Sensors Based on Ionic Liquids and Monitoring of Human Activities

    Directory of Open Access Journals (Sweden)

    Shao-Hui Zhang

    2017-11-01

Full Text Available Wearable sensors for the detection of human activities have encouraged the development of highly elastic sensors. In particular, to capture both subtle and large-scale body motion, stretchable and wide-range strain sensors are highly desired, but still a challenge. Herein, a highly stretchable and transparent strain sensor based on ionic liquids and an elastic polymer has been developed. The as-obtained sensor exhibits impressive stretchability over a wide strain range (from 0.1% to 400%), good bending properties and high sensitivity, with a gauge factor that can reach 7.9. Importantly, the sensors show excellent biological compatibility and succeed in monitoring diverse human activities, ranging from complex large-scale multidimensional motions to subtle signals, including wrist, finger and elbow joint bending, finger touch, breath, speech, swallowing behavior and pulse wave.
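
    The reported sensitivity relates resistance change to strain through the usual gauge-factor definition GF = (ΔR/R₀)/ε; a minimal sketch with illustrative numbers, not measurements from the paper:

    ```python
    # Gauge factor of a resistive/ionic strain sensor: GF = (dR/R0) / strain.
    def gauge_factor(r0: float, r_strained: float, strain: float) -> float:
        return ((r_strained - r0) / r0) / strain

    # Illustrative: a sensor whose resistance rises from 100 kOhm to 500 kOhm
    # at 50% strain has GF = 8, close to the reported peak of 7.9.
    print(gauge_factor(100e3, 500e3, 0.5))   # 8.0
    ```

    A higher gauge factor means a larger, easier-to-read resistance change for the same deformation, which is what lets the sensor resolve subtle signals such as pulse waves.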

  14. Software defined multi-spectral imaging for Arctic sensor networks

    Science.gov (United States)

    Siewert, Sam; Angoth, Vivek; Krishnamurthy, Ramnarayan; Mani, Karthikeyan; Mock, Kenrick; Singh, Surjith B.; Srivistava, Saurav; Wagner, Chris; Claus, Ryan; Vis, Matthew Demi

    2016-05-01

    Availability of off-the-shelf infrared sensors combined with high definition visible cameras has made possible the construction of a Software Defined Multi-Spectral Imager (SDMSI) combining long-wave, near-infrared and visible imaging. The SDMSI requires a real-time embedded processor to fuse images and to create real-time depth maps for opportunistic uplink in sensor networks. Researchers at Embry Riddle Aeronautical University working with University of Alaska Anchorage at the Arctic Domain Awareness Center and the University of Colorado Boulder have built several versions of a low-cost drop-in-place SDMSI to test alternatives for power efficient image fusion. The SDMSI is intended for use in field applications including marine security, search and rescue operations and environmental surveys in the Arctic region. Based on Arctic marine sensor network mission goals, the team has designed the SDMSI to include features to rank images based on saliency and to provide on camera fusion and depth mapping. A major challenge has been the design of the camera computing system to operate within a 10 to 20 Watt power budget. This paper presents a power analysis of three options: 1) multi-core, 2) field programmable gate array with multi-core, and 3) graphics processing units with multi-core. For each test, power consumed for common fusion workloads has been measured at a range of frame rates and resolutions. Detailed analyses from our power efficiency comparison for workloads specific to stereo depth mapping and sensor fusion are summarized. Preliminary mission feasibility results from testing with off-the-shelf long-wave infrared and visible cameras in Alaska and Arizona are also summarized to demonstrate the value of the SDMSI for applications such as ice tracking, ocean color, soil moisture, animal and marine vessel detection and tracking. The goal is to select the most power efficient solution for the SDMSI for use on UAVs (Unoccupied Aerial Vehicles) and other drop

  15. Hydrogen peroxide sensor: Uniformly decorated silver nanoparticles on polypyrrole for wide detection range

    Energy Technology Data Exchange (ETDEWEB)

    Nia, Pooria Moozarm, E-mail: pooriamn@yahoo.com; Meng, Woi Pei, E-mail: pmwoi@um.edu.my; Alias, Y., E-mail: yatimah70@um.edu.my

    2015-12-01

    Graphical abstract: - Highlights: • Electrochemical method was used for depositing silver nanoparticles and polypyrrole. • Silver nanoparticles (25 nm) were uniformly decorated on electrodeposited polypyrrole. • (Ag(NH{sub 3}){sub 2}OH) precursor showed better electrochemical performance than (AgNO{sub 3}). • The sensor showed superior performance toward H{sub 2}O{sub 2}. - Abstract: Electrochemically synthesized polypyrrole (PPy) decorated with silver nanoparticles (AgNPs) was prepared and used as a nonenzymatic sensor for hydrogen peroxide (H{sub 2}O{sub 2}) detection. Polypyrrole was fabricated through electrodeposition, while silver nanoparticles were deposited on the polypyrrole by the same technique. Field emission scanning electron microscopy (FESEM) images showed that the electrodeposited AgNPs were aligned uniformly along the PPy, with a mean particle size of around 25 nm. The electrocatalytic activity of AgNPs-PPy-GCE toward H{sub 2}O{sub 2} was studied using chronoamperometry and cyclic voltammetry. The first linear section covered the range of 0.1–5 mM with a detection limit of 0.115 μmol l{sup −1}, and the second linear section extended up to 120 mM with a detection limit of 0.256 μmol l{sup −1} (S/N = 3). Moreover, the sensor presented excellent stability, selectivity, repeatability and reproducibility. These excellent performances make AgNPs-PPy/GCE an ideal nonenzymatic H{sub 2}O{sub 2} sensor.
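
A two-segment amperometric calibration with an S/N = 3 detection limit, as reported above, can be sketched as follows; the calibration points and noise level here are fabricated for illustration and are not the paper's data:

```python
import numpy as np

def fit_linear_segment(conc_mM, current_uA):
    """Least-squares line current = slope*conc + intercept for one linear segment."""
    slope, intercept = np.polyfit(conc_mM, current_uA, 1)
    return slope, intercept

def detection_limit(noise_sd, slope, snr=3):
    """Classic 3-sigma-style LOD: smallest concentration giving signal snr * noise_sd."""
    return snr * noise_sd / abs(slope)

# Fabricated low-range calibration points, for illustration only.
conc = np.array([0.1, 0.5, 1.0, 2.0, 5.0])      # mM
curr = np.array([0.8, 4.1, 8.2, 16.0, 40.5])    # uA
slope, intercept = fit_linear_segment(conc, curr)
print(round(slope, 2), detection_limit(noise_sd=0.01, slope=slope))
```

The same fit would be repeated on the second set of points (5–120 mM) to characterize the upper linear section.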

  16. Smart image sensors: an emerging key technology for advanced optical measurement and microsystems

    Science.gov (United States)

    Seitz, Peter

    1996-08-01

    Optical microsystems typically include photosensitive devices, analog preprocessing circuitry and digital signal processing electronics. The advances in semiconductor technology have made it possible today to integrate all photosensitive and electronic devices on one 'smart image sensor' or photo-ASIC (application-specific integrated circuits containing photosensitive elements). It is even possible to provide each 'smart pixel' with additional photoelectronic functionality, without compromising the fill factor substantially. This technological capability is the basis for advanced cameras and optical microsystems showing novel on-chip functionality: Single-chip cameras with on-chip analog-to-digital converters for less than $10 are advertised; image sensors have been developed including novel functionality such as real-time selectable pixel size and shape, the capability of performing arbitrary convolutions simultaneously with the exposure, as well as variable, programmable offset and sensitivity of the pixels leading to image sensors with a dynamic range exceeding 150 dB. Smart image sensors have been demonstrated offering synchronous detection and demodulation capabilities in each pixel (lock-in CCD), and conventional image sensors are combined with an on-chip digital processor for complete, single-chip image acquisition and processing systems. Technological problems of the monolithic integration of smart image sensors include offset non-uniformities, temperature variations of electronic properties, imperfect matching of circuit parameters, etc. These problems can often be overcome either by designing additional compensation circuitry or by providing digital correction routines. Where necessary for technological or economic reasons, smart image sensors can also be combined with or realized as hybrids, making use of commercially available electronic components.
It is concluded that the possibilities offered by custom smart image sensors will influence the design

  17. Photoacoustic imaging with planoconcave optical microresonator sensors: feasibility studies based on phantom imaging

    Science.gov (United States)

    Guggenheim, James A.; Zhang, Edward Z.; Beard, Paul C.

    2017-03-01

    The planar Fabry-Pérot (FP) sensor provides high quality photoacoustic (PA) images but beam walk-off limits sensitivity and thus penetration depth to ≈1 cm. Planoconcave microresonator sensors eliminate beam walk-off, enabling sensitivity to be increased by an order of magnitude whilst retaining the highly favourable frequency response and directional characteristics of the FP sensor. The first tomographic PA images obtained in a tissue-realistic phantom using the new sensors are described. These show that the microresonator sensors provide near-identical image quality to the planar FP sensor but with significantly greater penetration depth (e.g. 2-3 cm) due to their higher sensitivity. This offers the prospect of whole-body small animal imaging and clinical imaging at depths previously unattainable using the FP planar sensor.

  18. Snapshot Spectral and Color Imaging Using a Regular Digital Camera with a Monochromatic Image Sensor

    Science.gov (United States)

    Hauser, J.; Zheludev, V. A.; Golub, M. A.; Averbuch, A.; Nathan, M.; Inbar, O.; Neittaanmäki, P.; Pölönen, I.

    2017-10-01

    Spectral imaging (SI) refers to the acquisition of the three-dimensional (3D) spectral cube of spatial and spectral data of a source object at a limited number of wavelengths in a given wavelength range. Snapshot spectral imaging (SSI) refers to the instantaneous acquisition (in a single shot) of the spectral cube, a process suitable for fast changing objects. Known SSI devices exhibit large total track length (TTL), weight and production costs and relatively low optical throughput. We present a simple SSI camera based on a regular digital camera with (i) an added diffusing and dispersing phase-only static optical element at the entrance pupil (diffuser) and (ii) tailored compressed sensing (CS) methods for digital processing of the diffused and dispersed (DD) image recorded on the image sensor. The diffuser is designed to mix the spectral cube data spectrally and spatially and thus to enable convergence in its reconstruction by CS-based algorithms. In addition to performing SSI, this SSI camera is capable of performing color imaging using a monochromatic or gray-scale image sensor without color filter arrays.

  19. SNAPSHOT SPECTRAL AND COLOR IMAGING USING A REGULAR DIGITAL CAMERA WITH A MONOCHROMATIC IMAGE SENSOR

    Directory of Open Access Journals (Sweden)

    J. Hauser

    2017-10-01

    Full Text Available Spectral imaging (SI) refers to the acquisition of the three-dimensional (3D) spectral cube of spatial and spectral data of a source object at a limited number of wavelengths in a given wavelength range. Snapshot spectral imaging (SSI) refers to the instantaneous acquisition (in a single shot) of the spectral cube, a process suitable for fast changing objects. Known SSI devices exhibit large total track length (TTL), weight and production costs and relatively low optical throughput. We present a simple SSI camera based on a regular digital camera with (i) an added diffusing and dispersing phase-only static optical element at the entrance pupil (diffuser) and (ii) tailored compressed sensing (CS) methods for digital processing of the diffused and dispersed (DD) image recorded on the image sensor. The diffuser is designed to mix the spectral cube data spectrally and spatially and thus to enable convergence in its reconstruction by CS-based algorithms. In addition to performing SSI, this SSI camera is capable of performing color imaging using a monochromatic or gray-scale image sensor without color filter arrays.

  20. Multi-sensor image fusion and its applications

    CERN Document Server

    Blum, Rick S

    2005-01-01

    Taking another lesson from nature, the latest advances in image processing technology seek to combine image data from several diverse types of sensors in order to obtain a more accurate view of the scene: very much the same as we rely on our five senses. Multi-Sensor Image Fusion and Its Applications is the first text dedicated to the theory and practice of the registration and fusion of image data, covering such approaches as statistical methods, color-related techniques, model-based methods, and visual information display strategies. After a review of state-of-the-art image fusion techniques,

  1. Target Image Matching Algorithm Based on Binocular CCD Ranging

    Directory of Open Access Journals (Sweden)

    Dongming Li

    2014-01-01

    Full Text Available This paper proposes a subpixel-level target image matching algorithm for binocular CCD ranging, based on the principle of binocular CCD ranging. Firstly, we introduce the ranging principle of the binocular ranging system and derive the binocular parallax formula. Secondly, we derive an improved cross-correlation matching algorithm combined with cubic surface fitting for target image matching, which achieves subpixel-level matching of binocular CCD ranging images. Lastly, we experimentally analyze and verify actual CCD ranging images, analyze the errors of the experimental results, and correct the formula for calculating system errors. Experimental results show that the actual measurement accuracy for a target within 3 km was better than 0.52%, which meets the accuracy requirements of high-precision binocular ranging.
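
The binocular parallax relation underlying such systems (depth Z = f·B/d for focal length f in pixels, baseline B, and disparity d) plus a common 3-point parabolic subpixel refinement of a correlation peak can be sketched as below; the paper's actual method uses an improved cross-correlation with cubic surface fitting, so this 1-D parabolic fit is only an illustrative analogue:

```python
import numpy as np

def subpixel_peak(corr):
    """Parabolic (3-point) interpolation around the integer correlation peak."""
    i = int(np.argmax(corr))
    if 0 < i < len(corr) - 1:
        y0, y1, y2 = corr[i - 1], corr[i], corr[i + 1]
        i += 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)   # vertex of fitted parabola
    return i

def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Binocular parallax relation: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

# Sample correlation values versus integer shift; the true peak lies off-grid.
corr = np.array([0.1, 0.4, 0.9, 1.0, 0.6])
d = subpixel_peak(corr)
print(round(d, 2))                                       # 2.7
print(depth_from_disparity(3.0, focal_px=1500, baseline_m=0.4))  # 200.0 (metres)
```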

  2. Sensor Correction of a 6-Band Multispectral Imaging Sensor for UAV Remote Sensing

    Directory of Open Access Journals (Sweden)

    Arko Lucieer

    2012-05-01

    Full Text Available Unmanned aerial vehicles (UAVs represent a quickly evolving technology, broadening the availability of remote sensing tools to small-scale research groups across a variety of scientific fields. Development of UAV platforms requires broad technical skills covering platform development, data post-processing, and image analysis. UAV development is constrained by a need to balance technological accessibility, flexibility in application and quality in image data. In this study, the quality of UAV imagery acquired by a miniature 6-band multispectral imaging sensor was improved through the application of practical image-based sensor correction techniques. Three major components of sensor correction were focused upon: noise reduction, sensor-based modification of incoming radiance, and lens distortion. Sensor noise was reduced through the use of dark offset imagery. Sensor modifications through the effects of filter transmission rates, the relative monochromatic efficiency of the sensor and the effects of vignetting were removed through a combination of spatially/spectrally dependent correction factors. Lens distortion was reduced through the implementation of the Brown–Conrady model. Data post-processing serves dual roles in data quality improvement, and the identification of platform limitations and sensor idiosyncrasies. The proposed corrections improve the quality of the raw multispectral imagery, facilitating subsequent quantitative image analysis.
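
The Brown–Conrady model mentioned for lens-distortion handling applies radial and tangential polynomial terms to normalized image coordinates; a minimal forward-model sketch (the coefficient values are illustrative, not from the study):

```python
import numpy as np

def brown_conrady_distort(xy, k1, k2, p1, p2):
    """Brown-Conrady model: radial (k1, k2) plus tangential (p1, p2) distortion
    applied to normalized image coordinates of shape (..., 2)."""
    x, y = xy[..., 0], xy[..., 1]
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 * r2
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return np.stack([x_d, y_d], axis=-1)

pts = np.array([[0.0, 0.0], [0.5, 0.5]])
# With negative k1 (barrel distortion) the off-centre point is pulled inward.
print(brown_conrady_distort(pts, k1=-0.2, k2=0.05, p1=0.0, p2=0.0))
```

Correction (undistortion) inverts this mapping, typically by iterative refinement, since the polynomial has no closed-form inverse.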

  3. Real-time extended dynamic range imaging in shearography.

    Science.gov (United States)

    Groves, Roger M; Pedrini, Giancarlo; Osten, Wolfgang

    2008-10-20

    Extended dynamic range (EDR) imaging is a postprocessing technique commonly associated with photography. Multiple images of a scene are recorded by the camera using different shutter settings and are merged into a single higher dynamic range image. Speckle interferometry and holography techniques require a well-modulated intensity signal to extract the phase information, and of these techniques shearography is most sensitive to different object surface reflectivities as it uses self-referencing from a sheared image. In this paper the authors demonstrate real-time EDR imaging in shearography and present experimental results from a difficult surface reflectivity sample: a wooden panel painting containing gold and dark earth color paint.
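
The multi-exposure merge that EDR imaging performs can be sketched as follows: each frame is divided by its exposure time to estimate scene radiance, and a weight de-emphasizes under- and over-exposed pixels. The hat-shaped weighting and the exposure times are illustrative assumptions, not the authors' exact merge:

```python
import numpy as np

def merge_exposures(images, exposure_times):
    """Merge differently exposed frames (values in [0, 1]) into one
    extended-dynamic-range radiance estimate via weighted averaging."""
    acc = np.zeros_like(images[0], dtype=float)
    wsum = np.zeros_like(acc)
    for img, t in zip(images, exposure_times):
        w = 1.0 - np.abs(2.0 * img - 1.0) + 1e-6   # hat weight, peak at mid-gray
        acc += w * img / t                          # radiance estimate = value / time
        wsum += w
    return acc / wsum

short = np.array([[0.1, 0.5]])
long_ = np.array([[0.4, 0.99]])   # second pixel nearly saturated, so downweighted
print(merge_exposures([short, long_], [0.01, 0.04]))
```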

  4. Color digital holography using a single monochromatic imaging sensor.

    Science.gov (United States)

    Kiire, Tomohiro; Barada, Daisuke; Sugisaka, Jun-ichiro; Hayasaki, Yoshio; Yatagai, Toyohiko

    2012-08-01

    Color digital holography utilizing the Doppler effect is proposed. The time variation of holograms produced by superposing images at three wavelengths is recorded using a high-speed monochromatic imaging sensor. The complex amplitude at each wavelength can be extracted from frequency information contained in the Fourier transforms of the recorded holograms. An image of the object is reconstructed by the angular spectrum method. Reconstructed monochromatic images at the three wavelengths are combined to produce a color image for display.

  5. Vision communications based on LED array and imaging sensor

    Science.gov (United States)

    Yoo, Jong-Ho; Jung, Sung-Yoon

    2012-11-01

    In this paper, we propose a brand new communication concept, called "vision communication", based on an LED array and an image sensor. This system consists of an LED array as a transmitter and a digital device that includes an image sensor, such as a CCD or CMOS sensor, as a receiver. In order to transmit data, the proposed communication scheme simultaneously uses digital image processing and optical wireless communication techniques. Therefore, a cognitive communication scheme is possible with the help of recognition techniques used in vision systems. To increase the data rate, our scheme can use an LED array consisting of several multi-spectral LEDs. Because each arranged LED can emit a multi-spectral optical signal such as visible, infrared or ultraviolet light, an increase of the data rate is possible, similar to the WDM and MIMO techniques used in traditional optical and wireless communications. In addition, this multi-spectral capability also makes it possible to avoid optical noise in the communication environment. In our vision communication scheme, the data packet is composed of sync data and information data. Sync data is used to detect the transmitter area and calibrate the distorted image snapshots obtained by the image sensor. By making the optical rate of the LED array equal to the frame rate (frames per second) of the image sensor, we can decode the information data included in each image snapshot based on image processing and optical wireless communication techniques. Through experiments on a practical test bed system, we confirm the feasibility of the proposed vision communication based on an LED array and an image sensor.

  6. Range detection for AGV using a rotating sonar sensor

    Science.gov (United States)

    Chiang, Wen-chuan; Ramamurthy, Dhyana Chandra; Mundhenk, Terrell N.; Hall, Ernest L.

    1998-10-01

    A single rotating sonar element is used with a restricted angle of sweep to obtain readings to develop a range map for the unobstructed path of an autonomous guided vehicle (AGV). A Polaroid ultrasound transducer element is mounted on a micromotor with an encoder feedback. The motion of this motor is controlled using a Galil DMC 1000 motion control board. The encoder is interfaced with the DMC 1000 board using an intermediate IMC 1100 break-out board. By adjusting the parameters of the Polaroid element, it is possible to obtain range readings at known angles with respect to the center of the robot. The readings are mapped to obtain a range map of the unobstructed path in front of the robot. The idea can be extended to a 360 degree mapping by changing the assembly level programming on the Galil Motion control board. Such a system would be compact and reliable over a range of environments and AGV applications.
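
Mapping the sonar readings (bearing, range) taken about the robot centre into Cartesian obstacle points, as described above, reduces to a polar-to-Cartesian conversion; a minimal sketch with an assumed ±45° frontal sweep (the readings are invented for illustration):

```python
import math

def polar_to_cartesian(readings_deg_m, robot_xy=(0.0, 0.0)):
    """Convert (bearing in degrees, range in metres) sonar readings taken
    about the robot centre into obstacle coordinates in the robot frame."""
    rx, ry = robot_xy
    pts = []
    for bearing_deg, rng in readings_deg_m:
        a = math.radians(bearing_deg)
        pts.append((rx + rng * math.cos(a), ry + rng * math.sin(a)))
    return pts

# One restricted-angle sweep, -45 deg to +45 deg in 15 deg steps.
sweep = [(-45, 2.0), (-30, 2.2), (-15, 2.5), (0, 3.0), (15, 2.5), (30, 2.2), (45, 2.0)]
for p in polar_to_cartesian(sweep):
    print(f"({p[0]:.2f}, {p[1]:.2f})")
```

Extending the sweep to 360° only changes the list of bearings, matching the paper's remark about reprogramming the motion controller.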

  7. LIGHTNING IMAGING SENSOR (LIS) SCIENCE DATA V4

    Data.gov (United States)

    National Aeronautics and Space Administration — The Lightning Imaging Sensor (LIS) is an instrument on the Tropical Rainfall Measurement Mission satellite (TRMM) used to detect the distribution and variability of...

  8. Low-Mass Planar Photonic Imaging Sensor Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose a revolutionary electro-optical (EO) imaging sensor concept that provides a low-mass, low-volume alternative to the traditional bulky optical telescope...

  9. Joint focus stacking and high dynamic range imaging

    Science.gov (United States)

    Qian, Qinchun; Gunturk, Bahadir K.; Batur, Aziz U.

    2013-01-01

    Focus stacking and high dynamic range (HDR) imaging are two paradigms of computational photography. Focus stacking aims to produce an image with greater depth of field (DOF) from a set of images taken with different focus distances, whereas HDR imaging aims to produce an image with higher dynamic range from a set of images taken with different exposure settings. In this paper, we present an algorithm which combines focus stacking and HDR imaging in order to produce an image with both higher dynamic range and greater DOF than any of the input images. The proposed algorithm includes two main parts: (i) joint photometric and geometric registration and (ii) joint focus stacking and HDR image creation. In the first part, images are first photometrically registered using an algorithm that is insensitive to small geometric variations, and then geometrically registered using an optical flow algorithm. In the second part, images are merged through weighted averaging, where the weights depend on both local sharpness and exposure information. We provide experimental results with real data to illustrate the algorithm. The algorithm is also implemented on a smartphone with Android operating system.
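
The merge in part (ii), a weighted average whose weights combine local sharpness and exposure information, can be sketched as below; the Laplacian sharpness measure and hat-shaped exposure weight are common choices assumed here, not necessarily the authors' exact functions:

```python
import numpy as np

def local_sharpness(img):
    """Absolute Laplacian response (wrap-around edges) as a per-pixel sharpness cue."""
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
           np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
    return np.abs(lap) + 1e-6

def exposure_weight(img):
    """Hat weight favouring mid-range (well-exposed) intensities in [0, 1]."""
    return 1.0 - np.abs(2.0 * img - 1.0) + 1e-6

def fuse(images):
    """Weighted average where each pixel's weight combines sharpness and exposedness."""
    w = np.stack([local_sharpness(i) * exposure_weight(i) for i in images])
    return (w * np.stack(images)).sum(0) / w.sum(0)

a = np.tile([[0.2, 0.8]], (4, 2))   # toy frame with strong local contrast
b = np.tile([[0.5, 0.5]], (4, 2))   # toy defocused/flat frame
print(fuse([a, b]).shape)           # (4, 4)
```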

  10. A Biologically Inspired CMOS Image Sensor

    NARCIS (Netherlands)

    Sarkar, M.

    2011-01-01

    Biological systems are a source of inspiration in the development of small autonomous sensor nodes. The two major types of optical vision systems found in nature are the single aperture human eye and the compound eye of insects. The latter are among the most compact and smallest vision sensors. The

  11. Fusion: ultra-high-speed and IR image sensors

    Science.gov (United States)

    Etoh, T. Goji; Dao, V. T. S.; Nguyen, Quang A.; Kimata, M.

    2015-08-01

    Most targets of ultra-high-speed video cameras operating at more than 1 Mfps, such as combustion, crack propagation, collision, plasma, spark discharge, an air bag in a car accident and a tire under sudden braking, generate sudden heat. Researchers in these fields require tools to measure the high-speed motion and heat simultaneously. Ultra-high frame rate imaging is achieved by an in-situ storage image sensor. Each pixel of the sensor is equipped with multiple memory elements to record a series of image signals simultaneously at all pixels. Image signals stored in each pixel are read out after an image capturing operation. In 2002, we developed an in-situ storage image sensor operating at 1 Mfps [1]. However, the fill factor of the sensor was only 15% due to a light shield covering the wide in-situ storage area. Therefore, in 2011, we developed a backside illuminated (BSI) in-situ storage image sensor to increase the sensitivity, with a 100% fill factor and a very high quantum efficiency [2]. The sensor also achieved a much higher frame rate, 16.7 Mfps, thanks to the wiring on the front side with more freedom [3]. The BSI structure has another advantage: it poses fewer difficulties in attaching an additional layer on the backside, such as scintillators. This paper proposes the development of an ultra-high-speed IR image sensor combining advanced nano-technologies for IR imaging with the in-situ storage technology for ultra-high-speed imaging, with discussion of issues in the integration.

  12. Expanding the dynamic measurement range for polymeric nanoparticle pH sensors

    DEFF Research Database (Denmark)

    Sun, Honghao; Almdal, Kristoffer; Andresen, Thomas Lars

    2011-01-01

    Conventional optical nanoparticle pH sensors that are designed for ratiometric measurements in cells have been based on utilizing one sensor fluorophore and one reference fluorophore in each nanoparticle, which results in a relatively narrow dynamic measurement range. This results in substantial...

  13. Extended Special Sensor Microwave Imager (SSM/I) Sensor Data Record (SDR) in netCDF

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Special Sensor Microwave Imager (SSM/I) is a seven-channel linearly polarized passive microwave radiometer that operates at frequencies of 19.36 (vertically and...

  14. SEGMENTATION AND QUALITY ANALYSIS OF LONG RANGE CAPTURED IRIS IMAGE

    Directory of Open Access Journals (Sweden)

    Anand Deshpande

    2016-05-01

    Full Text Available Iris segmentation plays a major role in increasing the performance of an iris recognition system. This paper proposes a novel method for the segmentation of iris images, to extract the iris part of a long-range captured eye image, and an approach to select the best iris frame from iris polar image sequences by analyzing the quality of the iris polar images. The quality of an iris image is determined by the frequency components present in the iris polar images. The experiments are carried out on CASIA long-range captured iris image sequences. The proposed segmentation method is compared with Hough-transform-based segmentation, and it has been determined that the proposed method gives higher accuracy for segmentation than the Hough transform.

  15. Fuzzy image processing in sun sensor

    Science.gov (United States)

    Mobasser, S.; Liebe, C. C.; Howard, A.

    2003-01-01

    This paper will describe how the fuzzy image processing is implemented in the instrument. Comparison of the fuzzy image processing and a more conventional image processing algorithm is provided and shows that the fuzzy image processing yields better accuracy than conventional image processing.

  16. Images from Bits: Non-Iterative Image Reconstruction for Quanta Image Sensors

    Directory of Open Access Journals (Sweden)

    Stanley H. Chan

    2016-11-01

    Full Text Available A quanta image sensor (QIS) is a class of single-photon imaging devices that measure light intensity using oversampled binary observations. Because of the stochastic nature of the photon arrivals, data acquired by QIS is a massive stream of random binary bits. The goal of image reconstruction is to recover the underlying image from these bits. In this paper, we present a non-iterative image reconstruction algorithm for QIS. Unlike existing reconstruction methods that formulate the problem from an optimization perspective, the new algorithm directly recovers the images through a pair of nonlinear transformations and an off-the-shelf image denoising algorithm. By skipping the usual optimization procedure, we achieve orders of magnitude improvement in speed and even better image reconstruction quality. We validate the new algorithm on synthetic datasets, as well as real videos collected by one-bit single-photon avalanche diode (SPAD) cameras.
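
The statistical model behind QIS admits a simple closed-form inversion: for one-bit pixels with Poisson arrivals, P(bit = 1) = 1 − exp(−λ), so the per-pixel maximum-likelihood intensity is λ̂ = −ln(1 − K/T) for K ones in T frames. The sketch below shows only this inversion step, not the paper's transform-plus-denoiser pipeline:

```python
import numpy as np

def qis_mle(bits):
    """ML intensity estimate from T oversampled one-bit QIS frames per pixel:
    P(bit = 1) = 1 - exp(-lam)  =>  lam = -ln(1 - K/T) for K ones."""
    T = bits.shape[0]
    K = bits.sum(axis=0)
    K = np.clip(K, 0, T - 1)          # avoid log(0) for fully saturated pixels
    return -np.log(1.0 - K / T)

# Simulate 1000 binary frames of a flat 4x4 scene with true intensity 0.5.
rng = np.random.default_rng(1)
lam_true = 0.5
bits = (rng.random((1000, 4, 4)) < 1 - np.exp(-lam_true)).astype(np.uint8)
print(qis_mle(bits).mean())   # close to 0.5
```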

  17. SAW Passive Wireless Sensor-RFID Tags with Enhanced Range Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal describes the development of passive wireless surface acoustic wave (SAW) RFID sensor-tags with enhanced range for remote monitoring of large groups of...

  18. Range image registration using a photometric metric under unknown lighting.

    Science.gov (United States)

    Thomas, Diego; Sugimoto, Akihiro

    2013-09-01

    Based on the spherical harmonics representation of image formation, we derive a new photometric metric for evaluating the correctness of a given rigid transformation aligning two overlapping range images captured under unknown, distant, and general illumination. We estimate the surrounding illumination and albedo values of points of the two range images from the point correspondences induced by the input transformation. We then synthesize the color of both range images using albedo values transferred using the point correspondences to compute the photometric reprojection error. This way allows us to accurately register two range images by finding the transformation that minimizes the photometric reprojection error. We also propose a practical method using the proposed photometric metric to register pairs of range images devoid of salient geometric features, captured under unknown lighting. Our method uses a hypothesize-and-test strategy to search for the transformation that minimizes our photometric metric. Transformation candidates are efficiently generated by employing the spherical representation of each range image. Experimental results using both synthetic and real data demonstrate the usefulness of the proposed metric.

  19. Quality Factor Effect on the Wireless Range of Microstrip Patch Antenna Strain Sensors

    Directory of Open Access Journals (Sweden)

    Ali Daliri

    2014-01-01

    Full Text Available Recently introduced passive wireless strain sensors based on microstrip patch antennas have shown great potential for reliable health and usage monitoring in the aerospace and civil industries. However, the wireless interrogation range of these sensors is limited to a few centimeters, which restricts their practical application. This paper presents an investigation of the effect of circular microstrip patch antenna (CMPA) design on the quality factor and the maximum practical wireless reading range of the sensor. The results reveal that by using appropriate substrate materials the interrogation distance of the CMPA sensor can be increased four-fold, from the previously reported 5 to 20 cm, thus considerably improving the viability of this type of wireless sensor for strain measurement and damage detection.

  20. Range image segmentation for tree detection in forest scans

    Directory of Open Access Journals (Sweden)

    A. Bienert

    2013-10-01

    Full Text Available To make a tree-wise analysis inside a forest stand possible, the trees have to be identified. Interactive segmentation is often labour-intensive and time-consuming; therefore, an automatic detection process using a range image is pursued. This paper presents a method for the segmentation of range images extracted from terrestrial laser scanner point clouds of forest stands. After range image generation, the segmentation is carried out with a connectivity analysis using the differences of the range values as the homogeneity criterion. Subsequently, the tree detection is performed iteratively by analysing one horizontal image line. When this line passes an object of a specific width, the object indicates a potential tree. By using the edge points of a segmented pixel group, the tree position and diameter are calculated. Results from one test site are presented to show the performance of the method.

  1. Unsynchronized scanning with a low-cost laser range finder for real-time range imaging

    Science.gov (United States)

    Hatipoglu, Isa; Nakhmani, Arie

    2017-06-01

    Range imaging plays an essential role in many fields: 3D modeling, robotics, heritage, agriculture, forestry, and reverse engineering. One of the most popular range-measuring technologies is the laser scanner, due to its several advantages: long range, high precision, real-time measurement capabilities, and no dependence on lighting conditions. However, laser scanners are very costly, which prevents their widespread use. Due to the latest developments in technology, low-cost, reliable, fast, and light-weight 1D laser range finders (LRFs) are now available. A low-cost 1D LRF with a scanning mechanism that steers the laser beam across additional dimensions makes it possible to capture a depth map. In this work, we present unsynchronized scanning with a low-cost LRF to decrease the scanning period and reduce the vibrations caused by stop-scan in synchronized scanning. Moreover, we developed an algorithm for the alignment of unsynchronized raw data and proposed a range image post-processing framework. The proposed technique enables a range imaging system for a fraction of the price of its counterparts. The results prove that the proposed method can fulfill the need for low-cost laser scanning for range imaging of static environments, because the most significant limitation of the method is the scanning period, which is about 2 minutes for 55,000 range points (a 250x220 image). In contrast, scanning the same image takes around 4 minutes in synchronized scanning. Once faster, longer-range, and narrower-beam LRFs are available, the methods proposed in this work can produce better results.

  2. Zinc oxide nanoparticle based optical fiber humidity sensor having linear response throughout a large dynamic range.

    Science.gov (United States)

    Aneesh, R; Khijwania, Sunil K

    2011-09-20

    The main objective of the present work is to develop an optical fiber relative humidity (RH) sensor having a linear response throughout the widest possible dynamic range. We report an optical fiber RH sensor based on evanescent wave absorption spectroscopy that fulfills this objective. The fiber sensor employs a specific nanoparticle (zinc oxide) doped sol-gel nanostructured sensing film of optimum thickness, synthesized over a short length of a centrally decladded, straight and uniform optical fiber. A detailed experimental investigation is carried out to analyze the sensor response/characteristics. The fiber sensor response is observed to be linear throughout a dynamic range as wide as 4% to 96% RH. The observed linear sensitivity of the fiber sensor is 0.0012 RH(-1). The average response time of the reported sensor is observed to be as short as 0.06 s during humidification. In addition, the sensor exhibited a very good degree of reversibility and extremely high reliability as well as repeatability.

  3. Detection of sudden death syndrome using a multispectral imaging sensor

    Science.gov (United States)

    Sudden death syndrome (SDS), caused by the fungus Fusarium solani f. sp. glycines, is a widespread mid- to late-season disease with distinctive foliar symptoms. This paper reported the development of an image analysis based method to detect SDS using a multispectral image sensor. A hue, saturation a...

  4. Color Sensitivity Multiple Exposure Fusion using High Dynamic Range Image

    Directory of Open Access Journals (Sweden)

    Varsha Borole

    2014-02-01

    Full Text Available In this paper, we present a high dynamic range imaging (HDRI) method that simulates normal-, over- and under-exposure captures from a camera image. We make three different images from a single input image using local histogram stretching. Because the proposed method generates the three histogram-stretched images from a single input image, ghost artifacts, which result from relative motion between the camera and objects during exposure time, are inherently removed. Therefore, the proposed method can be applied in a consumer compact camera to provide ghost-artifact-free HDRI. Experiments with several sets of test images with different exposures show that the proposed method gives better performance than existing methods in terms of visual results and computation time.
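
The "three exposures from one frame" idea can be mimicked with percentile-based histogram stretching; the specific percentile choices below are illustrative assumptions, not the paper's parameters:

```python
import numpy as np

def stretch(img, lo_pct, hi_pct):
    """Linearly stretch intensities between two percentiles into [0, 1]."""
    lo, hi = np.percentile(img, [lo_pct, hi_pct])
    return np.clip((img - lo) / max(hi - lo, 1e-6), 0.0, 1.0)

def pseudo_exposures(img):
    """Make under/normal/over 'exposures' from one frame by histogram stretching,
    so fusion needs no multi-shot capture and hence cannot ghost."""
    under = stretch(img, 50, 100)   # darkens: only bright content survives
    normal = stretch(img, 0, 100)
    over = stretch(img, 0, 50)      # brightens: shadows are lifted
    return under, normal, over

img = np.linspace(0.0, 1.0, 101)    # toy 1-D "image" ramp
under, normal, over = pseudo_exposures(img)
print(under.mean() < normal.mean() < over.mean())   # True
```

The three stretched images can then be fed to any standard exposure-fusion step.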

  5. Toroidal sensor arrays for real-time photoacoustic imaging

    Science.gov (United States)

    Bychkov, Anton S.; Cherepetskaya, Elena B.; Karabutov, Alexander A.; Makarov, Vladimir A.

    2017-07-01

    This article addresses theoretical and numerical investigation of image formation in photoacoustic (PA) imaging with complex-shaped concave sensor arrays. The spatial resolution and the size of sensitivity region of PA and laser ultrasonic (LU) imaging systems are assessed using sensitivity maps and spatial resolution maps in the image plane. This paper also discusses the relationship between the size of high-sensitivity regions and the spatial resolution of real-time imaging systems utilizing toroidal arrays. It is shown that the use of arrays with toroidal geometry significantly improves the diagnostic capabilities of PA and LU imaging to investigate biological objects, rocks, and composite materials.

  7. Optical Tomography System: Charge-coupled Device Linear Image Sensors

    Directory of Open Access Journals (Sweden)

    M. Idroas

    2010-09-01

    Full Text Available This paper discusses an optical tomography system based on charge-coupled device (CCD) linear image sensors. The developed system consists of a lighting system, a measurement section and a data acquisition system. Four CCD linear image sensors are configured around a flow pipe with an octagonal-shaped measurement section, forming a four-projection system. The four CCD linear image sensors, each with 2048 pixels of size 14 μm × 14 μm, are used to produce a high-resolution system. A simple optical model is mapped into the system’s sensitivity matrix to relate the optical attenuation to variations of optical density within the measurement section. A reconstructed tomographic image is produced from the model using MATLAB software. The designed instrumentation system is calibrated and tested through different particle size measurements from different projections.

  8. Wide-Range Temperature Sensors with High-Level Pulse Train Output

    Science.gov (United States)

    Hammoud, Ahmad; Patterson, Richard L.

    2009-01-01

    Two types of temperature sensors have been developed for wide-range temperature applications. The two sensors measure temperature in the range of -190 to +200 C and utilize a thin-film platinum RTD (resistance temperature detector) as the temperature-sensing element. Other parts used in the fabrication of these sensors include NPO (negative-positive- zero) type ceramic capacitors for timing, thermally-stable film or wirewound resistors, and high-temperature circuit boards and solder. The first type of temperature sensor is a relaxation oscillator circuit using an SOI (silicon-on-insulator) operational amplifier as a comparator. The output is a pulse train with a period that is roughly proportional to the temperature being measured. The voltage level of the pulse train is high-level, for example 10 V. The high-level output makes the sensor less sensitive to noise or electromagnetic interference. The output can be read by a frequency or period meter and then converted into a temperature reading. The second type of temperature sensor is made up of various types of multivibrator circuits using an SOI type 555 timer and the passive components mentioned above. Three configurations have been developed that were based on the technique of charging and discharging a capacitor through a resistive element to create a train of pulses governed by the capacitor-resistor time constant. Both types of sensors, which operated successfully over the wide temperature range, have potential use in extreme temperature environments including jet engines and space exploration missions.
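
    The second sensor type above, a 555-style astable multivibrator timed by an RTD, can be sketched as follows. The component values (R2, C, and the RTD's R0 and alpha) are illustrative assumptions, not the values of the actual NASA circuit.

```python
import math

# Hedged sketch: a platinum RTD as the timing resistor of a standard
# 555 astable circuit, so the output pulse period tracks temperature.
# All component values are made up for illustration.

def rtd_resistance(temp_c, r0=1000.0, alpha=3.85e-3):
    """Linearized platinum RTD: R(T) = R0 * (1 + alpha * T)."""
    return r0 * (1.0 + alpha * temp_c)

def astable_period(r1, r2, c):
    """Charge/discharge period of a 555 astable: ln(2) * (R1 + 2*R2) * C."""
    return math.log(2) * (r1 + 2.0 * r2) * c

def period_at(temp_c, r2=10e3, c=100e-9):
    """Pulse-train period with the RTD in the R1 position."""
    return astable_period(rtd_resistance(temp_c), r2, c)
```

    The period is an affine function of temperature, which matches the abstract's "roughly proportional" description; a frequency or period meter then converts the reading back to temperature.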

  9. Ranging Consistency Based on Ranging-Compensated Temperature-Sensing Sensor for Inter-Satellite Link of Navigation Constellation.

    Science.gov (United States)

    Meng, Zhijun; Yang, Jun; Guo, Xiye; Zhou, Yongbin

    2017-06-13

    Global Navigation Satellite System performance can be significantly enhanced by introducing inter-satellite links (ISLs) in navigation constellation. The improvement in position, velocity, and time accuracy as well as the realization of autonomous functions requires ISL distance measurement data as the original input. To build a high-performance ISL, the ranging consistency among navigation satellites is an urgent problem to be solved. In this study, we focus on the variation in the ranging delay caused by the sensitivity of the ISL payload equipment to the ambient temperature in space and propose a simple and low-power temperature-sensing ranging compensation sensor suitable for onboard equipment. The experimental results show that, after the temperature-sensing ranging compensation of the ISL payload equipment, the ranging consistency becomes less than 0.2 ns when the temperature change is 90 °C.

  10. Aerial Triangulation Close-range Images with Dual Quaternion

    Directory of Open Access Journals (Sweden)

    SHENG Qinghong

    2015-05-01

    Full Text Available A new method for the aerial triangulation of close-range images based on dual quaternions is presented. A dual quaternion represents the screw motion of a beam in space: the real part represents the angular elements of all the beams in the close-range network, while the real and dual parts jointly represent the line elements. Finally, an aerial triangulation adjustment model based on dual quaternions is established, and the elements of interior and exterior orientation and the object coordinates of the ground points are calculated. Real images and simulated images with large attitude angles are selected for aerial triangulation experiments. The experimental results show that the new method for the aerial triangulation of close-range images based on dual quaternions can obtain higher accuracy.

  11. A novel track imaging system as a range counter

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Z. [National Institute of Radiological Sciences (Japan); Matsufuji, N. [National Institute of Radiological Sciences (Japan); Tokyo Institute of Technology (Japan); Kanayama, S. [Chiba University (Japan); Ishida, A. [National Institute of Radiological Sciences (Japan); Tokyo Institute of Technology (Japan); Kohno, T. [Tokyo Institute of Technology (Japan); Koba, Y.; Sekiguchi, M.; Kitagawa, A.; Murakami, T. [National Institute of Radiological Sciences (Japan)

    2016-05-01

    An image-intensified, camera-based track imaging system has been developed to measure the tracks of ions in a scintillator block. To study the performance of the detector unit in the system, two types of scintillators, a dosimetrically tissue-equivalent plastic scintillator EJ-240 and a CsI(Tl) scintillator, were separately irradiated with carbon ion ({sup 12}C) beams of therapeutic energy from HIMAC at NIRS. The images of individual ion tracks in the scintillators were acquired by the newly developed track imaging system. The ranges reconstructed from the images are reported here. The range resolution of the measurements is 1.8 mm for 290 MeV/u carbon ions, which is considered a significant improvement on the energy resolution of the conventional ΔE/E method. The detector is compact and easy to handle, and it can fit inside treatment rooms for in-situ studies, as well as satisfy clinical quality assurance purposes.

  12. Ultra-High-Speed Image Signal Accumulation Sensor

    Directory of Open Access Journals (Sweden)

    Takeharu Goji Etoh

    2010-04-01

    Full Text Available Averaging of accumulated data is a standard technique applied to processing data with low signal-to-noise ratios (SNR), such as image signals captured in ultra-high-speed imaging. The authors propose an architecture layout of an ultra-high-speed image sensor capable of on-chip signal accumulation. The very high frame rate is enabled by employing an image sensor structure with a multi-folded CCD in each pixel, which serves as an in situ image signal storage. The signal accumulation function is achieved by direct connection of the first and the last storage elements of the in situ storage CCD. It has been thought that the multi-folding is achievable only by driving electrodes with complicated and impractical layouts. Simple configurations of the driving electrodes to overcome the difficulty are presented for two-phase and four-phase transfer CCD systems. The in situ storage image sensor with the signal accumulation function is named the Image Signal Accumulation Sensor (ISAS).

  13. Research-grade CMOS image sensors for demanding space applications

    Science.gov (United States)

    Saint-Pé, Olivier; Tulet, Michel; Davancens, Robert; Larnaudie, Franck; Magnan, Pierre; Corbière, Franck; Martin-Gonthier, Philippe; Belliot, Pierre

    2017-11-01

    Imaging detectors are key elements of optical instruments and sensors on board space missions dedicated to Earth observation (high resolution imaging, atmosphere spectroscopy...), Solar System exploration (micro cameras, guidance for autonomous vehicles...) and Universe observation (space telescope focal planes, guiding sensors...). This market was long dominated by CCD technology. Since the mid-90s, CMOS Image Sensors (CIS) have been competing with CCDs in more and more consumer domains (webcams, cell phones, digital cameras...). Featuring significant advantages over CCD sensors for space applications (lower power consumption, smaller system size, better radiation behaviour...), CMOS technology is also expanding in this field, justifying specific R&D and development programs funded by national and European space agencies (mainly CNES, DGA, and ESA). Throughout the 90s, and thanks to their steadily improving performances, CIS started to be used successfully for more and more demanding applications, from vision and control functions requiring low-level performances to guidance applications requiring medium-level performances. Recent technology improvements have made possible the manufacturing of research-grade CIS that are able to compete with CCDs in the high-performance arena. After an introduction outlining the growing interest of optical instrument designers in CMOS image sensors, this talk will present the existing and foreseen ways to reach high-level electro-optical performances with CIS. The development of CIS prototypes built using an imaging CMOS process and of devices based on improved designs will be presented.

  14. Image and Sensor Data Processing for Target Acquisition and Recognition.

    Science.gov (United States)

    1980-11-01

    ...a representative set of training images for which it knows the ground truth. For each of the targets in these images, the computer will calculate the n parameters... the object, with sliding limited to its width. From the results obtained so far, we have not observed significant sliding... ADVISORY GROUP FOR AEROSPACE RESEARCH AND DEVELOPMENT (NORTH ATLANTIC TREATY ORGANISATION) AGARD Conference Proceedings No. 290: IMAGE AND SENSOR DATA PROCESSING FOR TARGET

  15. Whisk Broom Imaging Sensor LandSat-7

    OpenAIRE

    2007-01-01

    Simulation, Presentation, Animation, Interactive Media Element. This presentation uses animations to demonstrate how a whisk-broom imaging sensor operates. It shows: the optical path through the primary and secondary mirrors to the Scan Line Correction (SLC) assembly; how the satellite captures images of the ground using the Scan Mirror assembly; and the change in the scanned image when the SLC is turned off. SS3020 Introduction to Measurement and Signatur...

  16. A mobile ferromagnetic shape detection sensor using a Hall sensor array and magnetic imaging.

    Science.gov (United States)

    Misron, Norhisam; Shin, Ng Wei; Shafie, Suhaidi; Marhaban, Mohd Hamiruce; Mailah, Nashiren Farzilah

    2011-01-01

    This paper presents a mobile Hall sensor array system for the shape detection of ferromagnetic materials that are embedded in walls or floors. The operation of the mobile Hall sensor array system is based on the principle of magnetic flux leakage to describe the shape of the ferromagnetic material. Two permanent magnets are used to generate the magnetic flux flow. The distribution of magnetic flux is perturbed as the ferromagnetic material is brought near the permanent magnets, and the changes in magnetic flux distribution are detected by the 1-D Hall sensor array setup. The magnetic imaging of the flux distribution is performed by a signal processing unit, which displays real-time images on a netbook. A signal processing application is developed for 1-D Hall sensor array signal acquisition and processing to construct a 2-D array matrix. The processed 1-D Hall sensor array signals are then used to construct the magnetic image of the ferromagnetic material based on the voltage signal and the magnetic flux distribution. The experimental results illustrate how the shapes of specimens such as squares, circles and triangles are determined through magnetic images based on the voltage signal and magnetic flux distribution of the specimen. In addition, magnetic images of actual ferromagnetic objects are also presented to prove the functionality of the mobile Hall sensor array system for actual shape detection. The results prove that the mobile Hall sensor array system is able to perform magnetic imaging for identifying various ferromagnetic materials.

  17. Long-range measurement system using ultrasonic range sensor with high-power transmitter array in air.

    Science.gov (United States)

    Kumar, Sahdev; Furuhashi, Hideo

    2017-02-01

    A long-range measurement system comprising an ultrasonic range sensor with a high-power ultrasonic transmitter array in air was investigated. The system is simple in construction and can be used under adverse conditions such as fog, rain, darkness, and smoke. However, because ultrasonic waves are strongly absorbed by air molecules, the measurable range is limited to a few meters. Therefore, we developed a high-power ultrasonic transmitter array consisting of 144 transmitting elements, arranged in a 12×12 array pattern. The sound pressure level at 5 m from the transmitter array was >30 dB higher than that of a single element. A measuring range of over 25 m was achieved using this transmitter array in conjunction with a receiver array having 32 receiving elements. The characteristics of the transmitter array and range sensor system are discussed by comparing simulation and experimental results. Copyright © 2016 Elsevier B.V. All rights reserved.
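
    The physics behind the figures above can be sketched briefly: pulse-echo ranging halves the round-trip time of flight, and an N-element array driven coherently gains up to 20·log10(N) dB of on-axis pressure over a single element. These are textbook approximations, not the paper's calibration.

```python
import math

# Hedged sketch of ultrasonic pulse-echo ranging in air.  The speed-of-
# sound formula and the ideal array-gain formula are standard textbook
# approximations, not values measured in the cited work.

def speed_of_sound(temp_c):
    """Approximate speed of sound in air, m/s."""
    return 331.3 + 0.606 * temp_c

def echo_range(round_trip_s, temp_c=20.0):
    """Target distance: sound travels out and back, so halve the path."""
    return speed_of_sound(temp_c) * round_trip_s / 2.0

def ideal_array_gain_db(n_elements):
    """On-axis pressure gain of n coherently driven identical elements."""
    return 20.0 * math.log10(n_elements)
```

    For the 144-element array the ideal coherent gain is about 43 dB over a single element, consistent with the reported >30 dB measured at 5 m once air absorption and element tolerances are accounted for.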

  18. DISOCCLUSION OF 3D LIDAR POINT CLOUDS USING RANGE IMAGES

    Directory of Open Access Journals (Sweden)

    P. Biasutti

    2017-05-01

    Full Text Available This paper proposes a novel framework for the disocclusion of mobile objects in 3D LiDAR scenes acquired via street-based Mobile Mapping Systems (MMS). Most of the existing lines of research tackle this problem directly in the 3D space. This work promotes an alternative approach by using a 2D range image representation of the 3D point cloud, taking advantage of the fact that the problem of disocclusion has been intensively studied in the 2D image processing community over the past decade. First, the point cloud is turned into a 2D range image by exploiting the sensor’s topology. Using the range image, a semi-automatic segmentation procedure based on depth histograms is performed in order to select the occluding object to be removed. A variational image inpainting technique is then used to reconstruct the area occluded by that object. Finally, the range image is unprojected as a 3D point cloud. Experiments on real data prove the effectiveness of this procedure both in terms of accuracy and speed.
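
    The first step of the pipeline, turning a point cloud into a 2D range image, can be sketched with a generic spherical projection. The angular resolution and image size below are illustrative assumptions; the paper instead exploits the scanner's own topology.

```python
import math

# Hedged sketch: project (x, y, z) points into an azimuth/elevation grid
# of ranges.  Grid resolution is an illustrative assumption.

def to_range_image(points, h_res_deg=1.0, v_res_deg=1.0, width=360, height=180):
    """Return a height x width grid holding the nearest range per cell."""
    image = [[0.0] * width for _ in range(height)]
    for x, y, z in points:
        r = math.sqrt(x*x + y*y + z*z)
        if r == 0.0:
            continue
        az = math.degrees(math.atan2(y, x))      # azimuth in [-180, 180)
        el = math.degrees(math.asin(z / r))      # elevation in [-90, 90]
        col = int((az + 180.0) / h_res_deg) % width
        row = min(height - 1, int((el + 90.0) / v_res_deg))
        # Keep the nearest return per cell: the closest occluder wins.
        if image[row][col] == 0.0 or r < image[row][col]:
            image[row][col] = r
    return image
```

    Keeping the nearest return per cell is exactly what makes occluders visible in the range image, so they can be segmented and inpainted before unprojecting back to 3D.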

  19. Novel Absolute Displacement Sensor with Wide Range Based on Malus Law

    Directory of Open Access Journals (Sweden)

    Yonggang Lin

    2009-12-01

    Full Text Available The paper presents a novel wide-range absolute displacement sensor based on the polarized-light detection principle. The sensor comprises two sets of polarized-light detecting systems coupled by pulleys. The inherent disadvantage of optical systems, light-source intensity drift, is overcome, and absolute measurement over a wide range is achieved. A prototype and the relevant test bed have been built. The test results are in good agreement with expectations. The measurement range is 540 mm, and the linearity is better than 0.05%.
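
    How a Malus-law readout can cancel source-intensity drift is worth a short sketch. Using two analyzer channels offset by 90 degrees, the ratio of the detected intensities depends only on the polarization angle, not on the source intensity I0. The two-channel geometry here is an illustrative assumption about the paper's coupled detecting systems.

```python
import math

# Hedged sketch of drift-free angle readout from Malus's law
# (I = I0 * cos^2(theta)) with two analyzers offset by 90 degrees.

def detector_pair(i0, theta_rad):
    """Intensities behind two analyzers offset by 90 degrees."""
    i1 = i0 * math.cos(theta_rad) ** 2
    i2 = i0 * math.cos(theta_rad - math.pi / 2) ** 2   # equals I0 * sin^2
    return i1, i2

def angle_from_pair(i1, i2):
    """Recover theta in [0, pi/2]; I0 cancels out of the ratio."""
    return math.atan2(math.sqrt(i2), math.sqrt(i1))
```

    Halving or doubling I0 leaves the recovered angle unchanged, which is the mechanism by which intensity drift is removed.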

  20. Application of time-hopping UWB range-bit rate performance in the UWB sensor networks

    NARCIS (Netherlands)

    Nascimento, J.R.V. do; Nikookar, H.

    2008-01-01

    In this paper, the achievable range-bit rate performance is evaluated for Time-Hopping (TH) UWB networks complying with the FCC outdoor emission limits in the presence of Multiple Access Interference (MAI). Application of TH-UWB range-bit rate performance is presented for UWB sensor networks.

  1. Theoretical and Experimental Study on Wide Range Optical Fiber Turbine Flow Sensor

    Directory of Open Access Journals (Sweden)

    Yuhuan Du

    2016-07-01

    Full Text Available In this paper, a novel fiber turbine flow sensor is proposed and demonstrated for liquid measurement with optical fiber, using light intensity modulation to measure the turbine rotational speed and convert it to flow rate. The double-circle-coaxial (DCC) fiber probe is introduced for frequency measurement for the first time. Through the ratio of the light intensities of the two rings, interference in the acquired light signals can be eliminated. To predict the relationship between output frequency and flow in the nonlinear range, a model of the turbine flow sensor was built. By analyzing the characteristics of the turbine flow sensor, piecewise-linear equations were derived to expand the flow measurement range. Furthermore, experimental verification was performed. The results showed that the flow range ratio of the DN20 turbine flow sensor improved 2.9 times after applying the piecewise-linear fit in the nonlinear range. Therefore, combining the DCC fiber sensor and the piecewise-linear method, the device can be developed into a wide-range fiber turbine flowmeter with strong immunity to electromagnetic interference (anti-EMI).
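
    The piecewise-linear frequency-to-flow conversion can be sketched as simple segment-wise interpolation over a calibration table. The breakpoints below are made-up calibration data, not the DN20 meter's actual curve.

```python
# Hedged sketch of piecewise-linear turbine calibration: different
# linear segments cover the nonlinear low-flow region and the linear
# region.  The table values are illustrative, not measured data.

CALIBRATION = [  # (frequency Hz, flow rate m^3/h), ascending
    (5.0, 0.12),
    (20.0, 0.40),
    (100.0, 2.00),
    (400.0, 8.00),
]

def flow_from_frequency(freq_hz):
    """Interpolate (or extrapolate at the ends) over the table."""
    pts = CALIBRATION
    if freq_hz <= pts[0][0]:
        lo, hi = pts[0], pts[1]
    elif freq_hz >= pts[-1][0]:
        lo, hi = pts[-2], pts[-1]
    else:
        lo, hi = next((a, b) for a, b in zip(pts, pts[1:])
                      if a[0] <= freq_hz <= b[0])
    slope = (hi[1] - lo[1]) / (hi[0] - lo[0])
    return lo[1] + slope * (freq_hz - lo[0])
```

    Adding segments in the low-frequency region is what extends the usable range below the meter's single-line linear span.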

  2. Analysis of imaging for laser triangulation sensors under Scheimpflug rule.

    Science.gov (United States)

    Miks, Antonin; Novak, Jiri; Novak, Pavel

    2013-07-29

    In this work, a detailed analysis of the problem of imaging objects lying in a plane tilted with respect to the optical axis of a rotationally symmetric optical system is performed by means of geometrical optics theory. It is shown that fulfillment of the so-called Scheimpflug condition (Scheimpflug rule) does not guarantee a sharp image of the object, as is usually claimed, because the dependence of the aberrations of real optical systems on object distance blurs the image. The f-number of a given optical system also varies with object distance. The influence of these effects on the measurement accuracy of laser triangulation sensors is shown. A detailed analysis of laser triangulation sensors, based on geometrical optics theory, is performed, and relations for the calculation of measurement errors and construction parameters of laser triangulation sensors are derived.

  3. The Theoretical Highest Frame Rate of Silicon Image Sensors

    Directory of Open Access Journals (Sweden)

    Takeharu Goji Etoh

    2017-02-01

    Full Text Available The frame rate of the digital high-speed video camera was 2000 frames per second (fps) in 1989, and has been increasing exponentially since. A simulation study showed that a silicon image sensor made with a 130 nm process technology can achieve about 10^10 fps. The frame rate thus seems to be approaching its upper bound. Rayleigh proposed an expression for the theoretical spatial resolution limit when the resolution of lenses approached that limit. In this paper, the temporal resolution limit of silicon image sensors is theoretically analyzed. It is revealed that the limit is mainly governed by the mixing of charges with different travel times, caused by the distribution of the penetration depth of light. The derived expression of the limit is extremely simple, yet accurate. For example, the limit for green light of 550 nm incident on silicon image sensors at 300 K is 11.1 picoseconds. Therefore, the theoretical highest frame rate is 90.1 Gfps (about 10^11 fps).
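
    The closing figures are easy to verify: a temporal resolution limit of 11.1 ps per frame is simply inverted to obtain the frame rate.

```python
# Sanity check of the abstract's closing numbers: an 11.1 ps temporal
# resolution limit corresponds to roughly 90.1 Gfps.
limit_s = 11.1e-12                     # temporal resolution limit, seconds
frames_per_second = 1.0 / limit_s      # highest achievable frame rate
gfps = frames_per_second / 1e9         # about 90.1
```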

  4. A Wirelessly Powered Smart Contact Lens with Reconfigurable Wide Range and Tunable Sensitivity Sensor Readout Circuitry.

    Science.gov (United States)

    Chiou, Jin-Chern; Hsu, Shun-Hsi; Huang, Yu-Chieh; Yeh, Guan-Ting; Liou, Wei-Ting; Kuei, Cheng-Kai

    2017-01-07

    This study presents a wireless smart contact lens system composed of reconfigurable capacitive sensor interface circuitry and a wirelessly powered radio-frequency identification (RFID) addressable system for sensor control and data communication. In order to improve compliance and reduce user discomfort, a capacitive sensor was embedded on a soft contact lens of 200 μm thickness using commercially available bio-compatible lens material and a standard manufacturing process. The results indicated that the reconfigurable sensor interface achieved sensitivity and baseline tuning up to 120 pF while consuming only 110 μW of power. The range and sensitivity tuning of the readout circuitry ensured reliable operation with respect to sensor fabrication variations and independent calibration of the sensor baseline for individuals. The on-chip voltage scaling allowed further extension of the detection range and avoided the implementation of large on-chip elements. The on-lens system enabled the detection of capacitive variation caused by pressure changes in the range of 2.25 to 30 mmHg and of hydration level variation from a distance of 1 cm using incident power from an RFID reader at 26.5 dBm.

  5. Fixed Pattern Noise pixel-wise linear correction for crime scene imaging CMOS sensor

    Science.gov (United States)

    Yang, Jie; Messinger, David W.; Dube, Roger R.; Ientilucci, Emmett J.

    2017-05-01

    Filtered multispectral imaging might be a potential method for crime scene documentation and evidence detection due to its abundant spectral information as well as its non-contact and non-destructive nature. A low-cost, portable multispectral crime scene imaging device would be highly useful and efficient. The second-generation crime scene imaging system uses a CMOS imaging sensor to capture the spatial scene and bandpass Interference Filters (IFs) to capture spectral information. Unfortunately, CMOS sensors suffer from severe spatial non-uniformity compared to CCD sensors, and the major cause is Fixed Pattern Noise (FPN). IFs suffer from a "blue shift" effect and introduce spatially and spectrally correlated errors. Therefore, FPN correction is critical to enhance crime scene image quality and is also helpful for spatial-spectral noise de-correlation. In this paper, a pixel-wise linear radiance to Digital Count (DC) conversion model is constructed for the crime scene imaging CMOS sensor. The pixel-wise conversion gain Gi,j and Dark Signal Non-Uniformity (DSNU) Zi,j are calculated. The conversion gain is divided into four components: an FPN row component, an FPN column component, a defects component and the effective photo response signal component. The conversion gain is then corrected by averaging out the FPN column and row components and the defects component so that the sensor conversion gain is uniform. Based on the corrected conversion gain and the image incident radiance estimated by inverting the pixel-wise linear radiance-to-DC model, the spatial uniformity of the corrected image can be enhanced to 7 times that of the raw image, and the larger the image DC value within its dynamic range, the better the enhancement.
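
    The pixel-wise linear model can be sketched in a few lines: each pixel's digital count is modeled as DC = G·L + Z, and correction inverts the model per pixel. The gain and offset tables below are made-up calibration values, not data from the crime-scene sensor.

```python
# Hedged sketch of pixel-wise linear FPN correction:
#   DC[i][j] = G[i][j] * L[i][j] + Z[i][j]
# Calibration tables are illustrative, not measured values.

GAIN = [[1.00, 1.08, 0.95], [1.02, 1.10, 0.97]]   # conversion gain G_ij
DSNU = [[3.0, -2.0, 1.5], [0.5, 2.5, -1.0]]       # dark-signal offset Z_ij

def simulate_dc(radiance):
    """Apply the per-pixel linear model to an incident radiance map."""
    return [[GAIN[i][j] * radiance[i][j] + DSNU[i][j]
             for j in range(len(radiance[0]))] for i in range(len(radiance))]

def correct(dc):
    """Invert the per-pixel model to estimate incident radiance."""
    return [[(dc[i][j] - DSNU[i][j]) / GAIN[i][j]
             for j in range(len(dc[0]))] for i in range(len(dc))]

flat = [[50.0] * 3 for _ in range(2)]   # uniform flat-field scene
raw = simulate_dc(flat)                 # raw output shows fixed-pattern structure
fixed = correct(raw)                    # FPN removed, image uniform again
```

    In the paper's scheme the gain is further decomposed into row, column and defect components before flattening; the inversion step shown here is the same.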

  6. Advanced pixel architectures for scientific image sensors

    CERN Document Server

    Coath, R; Godbeer, A; Wilson, M; Turchetta, R

    2009-01-01

    We present recent developments from two projects targeting advanced pixel architectures for scientific applications. Results are reported from FORTIS, a sensor demonstrating variants on a 4T pixel architecture. The variants include differences in pixel and diode size, the in-pixel source follower transistor size and the capacitance of the readout node, optimised for low noise and sensitivity to small amounts of charge. Results are also reported from TPAC, a complex pixel architecture with ~160 transistors per pixel. Both sensors were manufactured in the 0.18 μm INMAPS process, which includes a special deep p-well layer and fabrication on a high-resistivity epitaxial layer for improved charge collection efficiency.

  7. Simulating The Performance Of Imaging Sensors For Use In Realistic Tactical Environments

    Science.gov (United States)

    Matise, Brian K.; Rogne, Timothy J.; Gerhart, Grant R.; Graziano, James M.

    1985-10-01

    An imaging sensor simulation model is described which allows a modeled or measured scene radiance map to be displayed on a video monitor as it would be seen if viewed through a simulated sensor under simulated environmental conditions. The model includes atmospheric effects (transmittance, path radiance, and single-scattered solar radiance) by incorporating a modified version of the LOWTRAN 6 code. Obscuration and scattered radiance introduced into the scene by battlefield induced contaminants are represented by a battlefield effects module. This module treats smoke clouds as a series of Gaussian puffs whose transport and diffusion are modeled in a semi-random fashion to simulate atmospheric turbulence. The imaging sensor is modeled by rigorous application of appropriate optical transfer functions with appropriate insertion of random system noise. The simulation includes atmospheric turbulence transfer functions according to the method of Fried. Of particular use to sensor designers, the various effects may be applied individually or in sequence to observe which effects are responsible for image distortion. Sensor parameters may be modified interactively, or recalled from a sensor library. The range of the sensor from a measured scene may be varied in the simulation, and background and target radiance maps may be combined into a single image. The computer model itself is written in FORTRAN IV so that it may be transported between a wide variety of computer installations. Currently, versions of the model are running on a VAX 11/750 and an Amdahl 5860. The model is menu driven allowing for convenient operation. The model has been designed to output processed images to a COMTAL image processing system for observer interpretation. Preliminary validation of the simulation using unbiased observer interpretation of minimum resolvable temperature (MRT)-type bar patterns is presented.

  8. Short-Range Sensor for Underwater Robot Navigation using Line-lasers and Vision

    DEFF Research Database (Denmark)

    Hansen, Peter Nicholas; Nielsen, Mikkel Cornelius; Christensen, David Johan

    2015-01-01

    for distance estimation, the sensor offers three dimensional interpretation of the environment. This is obtained by triangulation of points extracted from the image using the Hough Transform. We evaluate the system in simulation and by physical proof-of-concept experiments on an OpenROV platform...

  9. Oil exploration oriented multi-sensor image fusion algorithm

    Directory of Open Access Journals (Sweden)

    Xiaobing Zhang

    2017-04-01

    Full Text Available In order to accurately forecast fractures and the fracture dominance direction in oil exploration, in this paper we propose a novel multi-sensor image fusion algorithm. The main innovations of this paper are that we introduce the Dual-tree complex wavelet transform (DTCWT) in data fusion and divide an image into several regions before image fusion. DTCWT refers to a new type of wavelet transform, designed to solve the problem of signal decomposition and reconstruction based on two parallel transforms of real wavelets. We utilize DTCWT to segment the features of the input images and generate a region map, and then exploit the normalized Shannon entropy of a region to design the priority function. To test the effectiveness of our proposed multi-sensor image fusion algorithm, four standard pairs of images are used to construct the dataset. Experimental results demonstrate that the proposed algorithm can achieve high accuracy in multi-sensor image fusion, especially for images of oil exploration.
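
    The entropy-based priority function can be sketched without the wavelet machinery: per region, the source image whose region has the higher normalized Shannon entropy wins. The region map and 8-bit histogramming below are illustrative assumptions; the paper derives regions from DTCWT features.

```python
import math

# Hedged sketch of region-priority fusion via normalized Shannon
# entropy.  Region maps here are hand-picked index lists, standing in
# for the paper's DTCWT-derived segmentation.

def region_entropy(pixels):
    """Normalized Shannon entropy (0..1) of an 8-bit pixel region."""
    hist = {}
    for p in pixels:
        hist[p] = hist.get(p, 0) + 1
    n = len(pixels)
    h = -sum(c / n * math.log2(c / n) for c in hist.values())
    return h / 8.0   # maximum entropy of an 8-bit source is 8 bits

def fuse_regions(img_a, img_b, regions):
    """Per region, copy pixels from the source with higher entropy."""
    out = list(img_a)
    for idx in regions:                  # each region is a list of indices
        a = [img_a[i] for i in idx]
        b = [img_b[i] for i in idx]
        if region_entropy(b) > region_entropy(a):
            for i in idx:
                out[i] = img_b[i]
    return out
```

    High entropy acts as a proxy for local detail, so each region of the fused image keeps the more informative sensor's data.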

  11. Image-based environmental monitoring sensor application using an embedded wireless sensor network.

    Science.gov (United States)

    Paek, Jeongyeup; Hicks, John; Coe, Sharon; Govindan, Ramesh

    2014-08-28

    This article discusses the experiences from the development and deployment of two image-based environmental monitoring sensor applications using an embedded wireless sensor network. Our system uses low-power image sensors and the Tenet general-purpose sensing system for tiered embedded wireless sensor networks. It leverages Tenet's built-in support for reliable delivery of high-rate sensing data, its scalability and its flexible scripting language, which enables mote-side image compression and ease of deployment. Our first deployment, of a pitfall trap monitoring application at the James San Jacinto Mountains Reserve, provided us with insights and lessons learned regarding the deployment of, and compression schemes for, these embedded wireless imaging systems. Our three-month-long deployment of a bird nest monitoring application resulted in over 100,000 images collected from a 19-camera-node network deployed over an area of 0.05 square miles, despite highly variable environmental conditions. Our biologists found the on-line, near-real-time access to images useful for obtaining data to answer their biological questions.

  12. CSRQ: Communication-Efficient Secure Range Queries in Two-Tiered Sensor Networks

    Directory of Open Access Journals (Sweden)

    Hua Dai

    2016-02-01

    Full Text Available In recent years, we have seen many applications of secure query in two-tiered wireless sensor networks. Storage nodes are responsible for storing data from nearby sensor nodes and answering queries from Sink. It is critical to protect data security from a compromised storage node. In this paper, the Communication-efficient Secure Range Query (CSRQ—a privacy and integrity preserving range query protocol—is proposed to prevent attackers from gaining information of both data collected by sensor nodes and queries issued by Sink. To preserve privacy and integrity, in addition to employing the encoding mechanisms, a novel data structure called encrypted constraint chain is proposed, which embeds the information of integrity verification. Sink can use this encrypted constraint chain to verify the query result. The performance evaluation shows that CSRQ has lower communication cost than the current range query protocols.

  13. Self-Similarity Superresolution for Resource-Constrained Image Sensor Node in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Yuehai Wang

    2014-01-01

    Full Text Available Wireless sensor networks, in combination with image sensors, open up a broad field of sensing applications. It is a challenging problem to recover a high-resolution (HR) image from its low-resolution (LR) counterpart, especially for low-cost, resource-constrained image sensors with limited resolution. Sparse representation-based techniques have increasingly been developed to solve this ill-posed inverse problem. Most of these solutions are based on an external dictionary learned from a huge image gallery, and consequently need many iterations and a long time to match. In this paper, we explore the self-similarity inside the image itself, and propose a new combined self-similarity superresolution (SR) solution with low computation cost and high recovery performance. In the self-similarity image super-resolution model (SSIR), a small sparse dictionary is learned from the image itself by methods such as K-SVD. The most similar patch is searched for and specially combined during the sparse regularization iteration. Detailed information, such as edge sharpness, is preserved more faithfully and clearly. Experimental results confirm the effectiveness and efficiency of this double self-learning method in image super-resolution.
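
    The self-similarity search at the heart of such methods can be sketched as a local block-matching step: for a reference patch, scan a window inside the same image for the most similar same-size patch. The patch size, search radius, and SSD metric below are generic illustrative choices, not parameters from the paper.

```python
import numpy as np

def best_self_similar_patch(img, y, x, size=5, search=10):
    """Find the patch most similar to the one at (y, x), searching a
    local window inside the image itself (sum-of-squared-differences)."""
    h, w = img.shape
    ref = img[y:y+size, x:x+size]
    best, best_dist = None, np.inf
    for py in range(max(0, y - search), min(h - size, y + search) + 1):
        for px in range(max(0, x - search), min(w - size, x + search) + 1):
            if (py, px) == (y, x):
                continue  # skip the reference patch itself
            cand = img[py:py+size, px:px+size]
            d = np.sum((cand - ref) ** 2)
            if d < best_dist:
                best, best_dist = (py, px), d
    return best, best_dist
```

    On an image containing a repeated structure, the search returns the exact repeat with zero distance, which is what the SR iteration then combines with the sparse-coding estimate.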

  14. Ladar range image denoising by a nonlocal probability statistics algorithm

    Science.gov (United States)

    Xia, Zhi-Wei; Li, Qi; Xiong, Zhi-Peng; Wang, Qi

    2013-01-01

    According to the characteristics of range images of coherent ladar and on the basis of nonlocal means (NLM), a nonlocal probability statistics (NLPS) algorithm is proposed in this paper. The difference is that NLM performs denoising using the mean of the conditional probability distribution function (PDF) while NLPS uses the maximum of the marginal PDF. In the algorithm, similar blocks are found by block matching and grouped. Pixels in the group are analyzed statistically, and the gray value with maximum probability is used as the estimate of the current pixel. Simulated range images of coherent ladar with different carrier-to-noise ratios and a real range image of coherent ladar with 8 gray scales are denoised by this algorithm, and the results are compared with those of the median filter, multitemplate order mean filter, NLM, median nonlocal mean filter and its incorporation of anatomical side information, and the unsupervised information-theoretic adaptive filter. The range-abnormality noise and Gaussian noise in range images of coherent ladar are effectively suppressed by NLPS.
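
    The distinction between the NLM and NLPS estimators can be shown on a single grouped set of pixels: NLM-style averaging is dragged by a range-anomaly outlier, while taking the gray value with maximum probability (the mode of the marginal histogram) is not. This is a minimal sketch of the estimator only, not of the full block-matching pipeline.

```python
import numpy as np

def nlps_estimate(group):
    """NLPS-style pixel estimate: the gray value with maximum
    probability over a group of pixels from similar blocks
    (the mode of the marginal histogram)."""
    values, counts = np.unique(np.asarray(group), return_counts=True)
    return values[np.argmax(counts)]
```

    For a group [10, 10, 10, 10, 200] containing one range anomaly, the mode is 10, whereas the mean is 48, visibly corrupted by the outlier.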

  15. Polymer-free optode nanosensors for dynamic, reversible, and ratiometric sodium imaging in the physiological range.

    Science.gov (United States)

    Ruckh, Timothy T; Mehta, Ankeeta A; Dubach, J Matthew; Clark, Heather A

    2013-11-28

    This work introduces a polymer-free optode nanosensor for ratiometric sodium imaging. Transmembrane ion dynamics are often captured by electrophysiology and calcium imaging, but sodium dyes suffer from short excitation wavelengths and poor selectivity. Optodes, optical sensors composed of a polymer matrix with embedded sensing chemistry, have been translated into nanosensors that selectively image ion concentrations. Polymer-free nanosensors were fabricated by emulsification and were stable in diameter and sensitivity for at least one week. Ratiometric fluorescent measurements demonstrated that the nanosensors are selective for sodium over potassium by ~1.4 orders of magnitude, have a dynamic range centered at 20 mM, and are fully reversible. The ratiometric signal changes by 70% between 10 and 100 mM sodium, showing that they are sensitive to changes in sodium concentration. These nanosensors will provide a new tool for sensitive and quantitative ion imaging.

  16. CMOS Image Sensors for High Speed Applications

    OpenAIRE

    Jamal Deen, M.; Qiyin Fang; Louis Liu; Frances Tse; David Armstrong; Munir El-Desouki

    2009-01-01

    Recent advances in deep submicron CMOS technologies and improved pixel designs have enabled CMOS-based imagers to surpass charge-coupled devices (CCD) imaging technology for mainstream applications. The parallel outputs that CMOS imagers can offer, in addition to complete camera-on-a-chip solutions due to being fabricated in standard CMOS technologies, result in compelling advantages in speed and system throughput. Since there is a practical limit on the minimum pixel size (4~5 μm) due to ...

  17. Autonomous vision networking: miniature wireless sensor networks with imaging technology

    Science.gov (United States)

    Messinger, Gioia; Goldberg, Giora

    2006-09-01

    The recent emergence of integrated PicoRadio technology, the rise of low-power, low-cost, System-On-Chip (SOC) CMOS imagers, coupled with the fast evolution of networking protocols and digital signal processing (DSP), created a unique opportunity to achieve the goal of deploying large-scale, low-cost, intelligent, ultra-low-power distributed wireless sensor networks for the visualization of the environment. Of all sensors, vision is the most desired, but its applications in distributed sensor networks have been elusive so far. Not any more. The practicality and viability of ultra-low-power vision networking has been proven and its applications are countless, from security and chemical analysis to industrial monitoring, asset tracking and visual recognition. Vision networking represents a truly disruptive technology applicable to many industries. The presentation discusses some of the critical components and technologies necessary to make these networks and products affordable and ubiquitous - specifically PicoRadios, CMOS imagers, imaging DSP, networking and overall wireless sensor network (WSN) system concepts. The paradigm shift, from large, centralized and expensive sensor platforms to small, low-cost, distributed sensor networks, is possible due to the emergence and convergence of a few innovative technologies. Avaak has developed a vision network that is aided by other sensors such as motion, acoustic and magnetic, and plans to deploy it for use in military and commercial applications. In comparison to other sensors, imagers produce large data files that require pre-processing and a certain level of compression before these are transmitted to a network server, in order to minimize the load on the network. Some of the most innovative chemical detectors currently in development are based on sensors that change color or pattern in the presence of the desired analytes. These changes are easily recorded and analyzed by a CMOS imager and an on-board DSP processor.

  18. Blue fluorescent cGMP sensor for multiparameter fluorescence imaging.

    Directory of Open Access Journals (Sweden)

    Yusuke Niino

    Full Text Available Cyclic GMP (cGMP) regulates many physiological processes by cooperating with other signaling molecules such as cyclic AMP (cAMP) and Ca(2+). Genetically encoded sensors for cGMP have been developed based on fluorescence resonance energy transfer (FRET) between fluorescent proteins. However, to analyze the dynamic relationship among these second messengers, combined use of existing sensors in a single cell is inadequate because of the significant spectral overlaps. A single-wavelength indicator is an effective alternative to avoid this problem, but color variants of a single fluorescent protein-based biosensor are limited. In this study, to construct a new color fluorescent sensor, we converted the FRET-based sensor into a single-wavelength indicator using a dark FRET acceptor. We developed a blue fluorescent cGMP biosensor, which is spectrally compatible with a FRET-based cAMP sensor using cyan and yellow fluorescent proteins (CFP/YFP). We cotransfected them and loaded a red fluorescent probe for Ca(2+) into cells, and accomplished triple-parameter fluorescence imaging of these cyclic nucleotides and Ca(2+), confirming the applicability of this combination to individually monitor their dynamics in a single cell. This blue fluorescent sensor and the approach using this FRET pair would be useful for multiparameter fluorescence imaging to understand complex signal transduction networks.

  19. Retina-like sensor image coordinates transformation and display

    Science.gov (United States)

    Cao, Fengmei; Cao, Nan; Bai, Tingzhu; Song, Shengyu

    2015-03-01

    For a new kind of retina-like sensor camera, image acquisition, coordinate transformation and interpolation need to be realized. Both the coordinate transformation and the interpolation are computed in polar coordinates due to the sensor's particular pixel distribution. The image interpolation is based on sub-pixel interpolation, and its relative weights are obtained in polar coordinates. The hardware platform is composed of the retina-like sensor camera, an image grabber and a PC. Combining the MIL and OpenCV libraries, the software is written in VC++ on Visual Studio 2010. Experimental results show that the system realizes real-time image acquisition, coordinate transformation and interpolation.
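
    The polar-to-Cartesian display step with sub-pixel interpolation can be sketched as follows: each output pixel maps back to a fractional (r, theta) position, and the value is bilinearly interpolated between the four surrounding polar samples. The ring/spoke layout below is a generic stand-in for the sensor's actual pixel distribution.

```python
import numpy as np

def polar_to_cartesian(polar_img, out_size):
    """Resample an image stored as (ring, spoke) samples in polar
    coordinates onto a Cartesian grid, using bilinear (sub-pixel)
    interpolation of the fractional (r, theta) position."""
    n_r, n_t = polar_img.shape
    cy = cx = (out_size - 1) / 2.0
    r_max = min(cy, cx)
    out = np.zeros((out_size, out_size))
    for y in range(out_size):
        for x in range(out_size):
            r = np.hypot(y - cy, x - cx) / r_max * (n_r - 1)
            t = (np.arctan2(y - cy, x - cx) % (2 * np.pi)) / (2 * np.pi) * n_t
            if r > n_r - 1:
                continue  # outside the sensor's circular field of view
            r0, t0 = int(r), int(t) % n_t
            fr, ft = r - int(r), t - int(t)
            r1, t1 = min(r0 + 1, n_r - 1), (t0 + 1) % n_t
            # bilinear weights in the (r, theta) plane
            out[y, x] = ((1 - fr) * (1 - ft) * polar_img[r0, t0]
                         + fr * (1 - ft) * polar_img[r1, t0]
                         + (1 - fr) * ft * polar_img[r0, t1]
                         + fr * ft * polar_img[r1, t1])
    return out
```

    A constant polar image should map to a constant disk in Cartesian coordinates, with zeros outside the circular field, which gives a quick sanity check of the weights.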

  20. Biologically based sensor fusion for medical imaging

    Science.gov (United States)

    Aguilar, Mario; Garrett, Aaron L.

    2001-03-01

    We present an architecture for the fusion of multiple medical image modalities that enhances the original imagery and combines the complementary information of the various modalities. The design principles follow the organization of the color vision system in humans and primates. Mainly, the design of within-modality enhancement and between-modality combination for fusion is based on the neural connectivity of retina and visual cortex. The architecture is based on a system developed for night vision applications while the first author was at MIT Lincoln Laboratory. Results of fusing various modalities are presented, including: a) fusion of T1-weighted and T2-weighted MRI images, b) fusion of PD, T1-weighted, and T2-weighted MRI images, and c) fusion of SPECT and MRI/CT. The results demonstrate the ability to fuse such disparate imaging modalities with regard to information content and complementarity. They show how both brightness and color contrast are used in the resulting color-fused images to convey information to the user. In addition, we demonstrate the ability to preserve the high spatial resolution of modalities such as MRI even when combined with poor-resolution images such as SPECT scans. We conclude by motivating the use of the fusion method to derive more powerful image features to be used in segmentation and pattern recognition.

  1. Passive millimeter-wave imaging at short and medium range

    Science.gov (United States)

    Essen, H.; Fuchs, H.-H.; Nötel, D.; Klöppel, F.; Pergande, P.; Stanko, S.

    2005-11-01

    In recent years, research on radiometric signatures was conducted by the mmW/submmW group at FGAN-FHR: non-imaging measurements of the exhaust jets of missiles, and imaging of small vehicles in critical background scenarios. The equipment used for these investigations was of low technological status, using simple single-channel radiometers on a scanning pedestal. Meanwhile, components of improved performance are available on a cooperative basis with the Institute for Applied Solid State Physics (Fraunhofer-IAF). Using such components, considerable progress concerning temperature resolution and image generation time could be achieved. Emphasis has been put on the development of a demonstrator for CWD applications and on an imaging system for medium-range applications, up to 200 m. The short-range demonstrator is a scanning system operating alternatively at 35 GHz or 94 GHz to detect hidden materials such as explosives, guns and knives beneath clothing. The demonstrator uses a focal plane array approach with 4 channels in azimuth, while mechanical scanning is used for elevation. The medium-range demonstrator currently employs a single-channel radiometer on a pedestal for elevation-over-azimuth scanning. To improve the image quality, methods have been implemented using a Lorentzian algorithm with Wiener filtering.
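
    The Wiener filtering step mentioned above can be sketched as standard frequency-domain deconvolution. The box PSF and noise-to-signal ratio below are illustrative placeholders; the paper's Lorentzian beam model is not reproduced here.

```python
import numpy as np

def wiener_deconvolve(blurred, psf, nsr=0.01):
    """Frequency-domain Wiener filter W = H* / (|H|^2 + NSR), where H
    is the transfer function of the PSF and NSR is the assumed
    noise-to-signal power ratio."""
    H = np.fft.fft2(psf, s=blurred.shape)
    G = np.fft.fft2(blurred)
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifft2(G * W))
```

    Blurring an impulse with a 3x3 box PSF and deconvolving with a small NSR recovers the impulse at its original position, which is the behavior the radiometric image-sharpening step relies on.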

  2. Low data rate architecture for smart image sensor

    Science.gov (United States)

    Darwish, Amani; Sicard, Gilles; Fesquet, Laurent

    2014-03-01

    An innovative smart image sensor architecture based on event-driven asynchronous functioning is presented in this paper. The proposed architecture has been designed to control the sensor data flow by extracting only the relevant information from the image sensor and suppressing spatial and temporal redundancies in video streaming. We believe that this data flow reduction leads to a reduction in system power consumption, which is essential in mobile devices. In this first proposition, we present our new pixel behaviour as well as our new asynchronous read-out architecture. Simulations using both Matlab and VHDL were performed to validate the proposed pixel behaviour and the reading protocol. These simulation results have met our expectations and confirmed the suggested ideas.

  3. Shadow correction in high dynamic range images for generating orthophotos

    Science.gov (United States)

    Suzuki, Hideo; Chikatsu, Hirofumi

    2011-07-01

    High dynamic range imagery is widely used in remote sensing. With the widespread use of aerial digital cameras such as the DMC, ADS40, RMK-D, and UltraCamD, high dynamic range imaging is generally expected for generating minuteness orthophotos in digital aerial photogrammetry. However, high dynamic range images (12-bit, 4,096 gray levels) are generally compressed into 8-bit depth digital images (256 gray levels) owing to the huge amount of data and the interface with peripherals such as monitors and printers. This means that a great deal of image data is eliminated from the original image, and this introduces a new shadow problem. In particular, the influence of shadows in urban areas causes serious problems when generating minuteness orthophotos and performing house detection. Therefore, shadow problems can be addressed through the image compression step. There is a large body of literature on image compression techniques such as logarithmic compression and tone mapping algorithms. However, logarithmic compression tends to cause loss of detail in dark and/or light areas. Furthermore, the logarithmic method operates on the full scene, which means that high-resolution luminance information cannot be obtained. Even though tone mapping algorithms have the ability to operate over both the full scene and local scenes, background knowledge is required. To resolve the shadow problem in digital aerial photogrammetry, shadow areas should be recognized and corrected automatically without loss of luminance information. To this end, a practical shadow correction method using 12-bit real data acquired by the DMC is investigated in this paper.
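
    The 12-bit to 8-bit compression trade-off described above can be made concrete with a simple global logarithmic mapping versus naive linear scaling. This is a generic sketch, not the DMC processing chain: log compression keeps adjacent shadow levels distinct but merges adjacent highlight levels, while linear scaling does the opposite.

```python
import numpy as np

def log_compress(img12):
    """Global logarithmic 12-bit (0..4095) -> 8-bit (0..255) mapping."""
    img12 = np.asarray(img12, dtype=float)
    return np.round(255.0 * np.log1p(img12) / np.log1p(4095.0)).astype(np.uint8)

def linear_compress(img12):
    """Naive linear 12-bit -> 8-bit scaling, for comparison."""
    img12 = np.asarray(img12, dtype=float)
    return np.round(img12 / 4095.0 * 255.0).astype(np.uint8)
```

    Comparing the two mappings on levels {1, 2} (deep shadow) and {4090, 4095} (highlight) shows where each scheme discards gray-level information; it is exactly the shadow end of this trade-off that motivates the paper's 12-bit shadow correction.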

  4. A GRAPH READER USING A CCD IMAGE SENSOR

    African Journals Online (AJOL)

    2008-01-18

    Jan 18, 2008 ... 3. Data Processing. The microcontroller, the CCD sensor, the stepper motor and the rest of the system are interfaced to the PC, where data processing and overall control are done. A software program in QUICKBASIC is used to process the pixels. First the 1024 pixels of an image line are received from the.

  5. DNA as Sensors and Imaging Agents for Metal Ions

    Science.gov (United States)

    Xiang, Yu

    2014-01-01

    Increasing interest in detecting metal ions in many chemical and biomedical fields has created demand for developing sensors and imaging agents for metal ions with high sensitivity and selectivity. This review covers recent progress in DNA-based sensors and imaging agents for metal ions. Through both combinatorial selection and rational design, a number of metal ion-dependent DNAzymes and metal ion-binding DNA structures that can selectively recognize specific metal ions have been obtained. By attaching these DNA molecules to signal reporters such as fluorophores, chromophores, electrochemical tags, and Raman tags, a number of DNA-based sensors for both diamagnetic and paramagnetic metal ions have been developed for fluorescent, colorimetric, electrochemical, and surface Raman detection. These sensors are highly sensitive (with detection limit down to 11 ppt) and selective (with selectivity up to millions-fold) toward specific metal ions. In addition, through further development to simplify the operation, such as the use of “dipstick tests”, portable fluorometers, computer-readable discs, and widely available glucose meters, these sensors have been applied for on-site and real-time environmental monitoring and point-of-care medical diagnostics. The use of these sensors for in situ cellular imaging has also been reported. The generality of the combinatorial selection to obtain DNAzymes for almost any metal ion in any oxidation state, and the ease of modification of the DNA with different signal reporters, make DNA an emerging and promising class of molecules for metal ion sensing and imaging in many fields of applications. PMID:24359450

  6. Short-Range Noncontact Sensors for Healthcare and Other Emerging Applications: A Review

    Directory of Open Access Journals (Sweden)

    Changzhan Gu

    2016-07-01

    Full Text Available Short-range noncontact sensors are capable of remotely detecting the precise movements of subjects or wirelessly estimating the distance from the sensor to the subject. They find wide applications in our daily lives, such as noncontact vital sign detection of heartbeat and respiration, sleep monitoring, occupancy sensing, and gesture sensing. In recent years, short-range noncontact sensors have been attracting more and more effort from both academia and industry due to their vast applications. Compared to other radar architectures such as pulse radar and frequency-modulated continuous-wave (FMCW) radar, Doppler radar is gaining more popularity in terms of system integration and low-power operation. This paper reviews recent technical advances in Doppler radars for healthcare applications, including system hardware improvement, digital signal processing, and chip integration. This paper also discusses hybrid FMCW-interferometry radars and the emerging applications and future trends.
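
    A core signal-processing step in Doppler vital-sign radar is arctangent demodulation of the quadrature baseband pair: the target displacement is recovered from the unwrapped phase as x(t) = lambda * phase / (4*pi). The sketch below uses an illustrative 12.5 mm wavelength (roughly a 24 GHz carrier) and a synthetic respiration-like motion; it is not taken from any specific system in the review.

```python
import numpy as np

def arctan_demodulate(i, q, wavelength):
    """Recover relative target displacement from quadrature Doppler
    radar baseband signals: x(t) = lambda * unwrap(atan2(Q, I)) / (4*pi)."""
    phase = np.unwrap(np.arctan2(q, i))
    return wavelength * phase / (4 * np.pi)
```

    Simulating a 5 mm sinusoidal chest motion, modulating it onto I/Q channels, and demodulating recovers the displacement waveform up to a constant offset set by the unknown initial phase.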

  7. A Full Parallel Event Driven Readout Technique for Area Array SPAD FLIM Image Sensors.

    Science.gov (United States)

    Nie, Kaiming; Wang, Xinlei; Qiao, Jun; Xu, Jiangtao

    2016-01-27

    This paper presents a full parallel event driven readout method which is implemented in an area array single-photon avalanche diode (SPAD) image sensor for high-speed fluorescence lifetime imaging microscopy (FLIM). The sensor only records and reads out effective time and position information by adopting full parallel event driven readout method, aiming at reducing the amount of data. The image sensor includes four 8 × 8 pixel arrays. In each array, four time-to-digital converters (TDCs) are used to quantize the time of photons' arrival, and two address record modules are used to record the column and row information. In this work, Monte Carlo simulations were performed in Matlab in terms of the pile-up effect induced by the readout method. The sensor's resolution is 16 × 16. The time resolution of TDCs is 97.6 ps and the quantization range is 100 ns. The readout frame rate is 10 Mfps, and the maximum imaging frame rate is 100 fps. The chip's output bandwidth is 720 MHz with an average power of 15 mW. The lifetime resolvability range is 5-20 ns, and the average error of estimated fluorescence lifetimes is below 1% by employing CMM to estimate lifetimes.
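
    Assuming CMM here refers to the center-of-mass method commonly used in FLIM, lifetime estimation reduces to the mean photon arrival time: for a single-exponential decay observed over a window much longer than the lifetime, the mean arrival time approximates tau. The sketch below simulates TDC timestamps using the quantization figures quoted in the abstract (100 ns window, 97.6 ps resolution); the photon count and true lifetime are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def cmm_lifetime(timestamps):
    """Center-of-mass lifetime estimate: mean photon arrival time,
    valid when the observation window is much longer than tau."""
    return np.mean(timestamps)

# simulate TDC timestamps for a single-exponential decay
tau_true = 5.0                         # ns, inside the 5-20 ns resolvability range
t = rng.exponential(tau_true, size=200_000)
t = t[t < 100.0]                       # keep photons inside the 100 ns window
t = np.round(t / 0.0976) * 0.0976      # quantize to the 97.6 ps TDC resolution
tau_est = cmm_lifetime(t)
```

    With 100 ns >> 5 ns the truncation bias is negligible, so the estimate lands within the ~1% error figure quoted in the abstract for this regime.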

  8. CMOS Image Sensor with On-Chip Image Compression: A Review and Performance Analysis

    Directory of Open Access Journals (Sweden)

    Milin Zhang

    2010-01-01

    Full Text Available Demand for high-resolution, low-power sensing devices with integrated image processing capabilities, especially compression capability, is increasing. CMOS technology enables the integration of image sensing and image processing, making it possible to improve the overall system performance. This paper reviews the current state of the art in CMOS image sensors featuring on-chip image compression. Firstly, typical sensing systems consisting of separate image-capturing unit and image-compression processing unit are reviewed, followed by systems that integrate focal-plane compression. The paper also provides a thorough review of a new design paradigm, in which image compression is performed during the image-capture phase prior to storage, referred to as compressive acquisition. High-performance sensor systems reported in recent years are also introduced. Performance analysis and comparison of the reported designs using different design paradigm are presented at the end.
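
    The compressive-acquisition paradigm described above can be sketched as recording m << N random linear projections of the pixel vector at capture time, with reconstruction deferred to an off-sensor sparse-recovery solver. The Gaussian measurement matrix below is purely illustrative; real focal-plane designs form such sums in the analog domain with hardware-friendly matrices.

```python
import numpy as np

rng = np.random.default_rng(1)

def compressive_acquire(image, m):
    """Compressive acquisition sketch: instead of storing all N pixels,
    record m random projections y = Phi @ x during capture."""
    x = image.ravel().astype(float)
    phi = rng.standard_normal((m, x.size)) / np.sqrt(m)  # measurement matrix
    return phi @ x, phi

# a 64-pixel image summarized by 16 measurements; a sparse-recovery
# solver (run off-sensor) would later reconstruct x from (y, phi)
img = np.arange(64).reshape(8, 8)
y, phi = compressive_acquire(img, 16)
```

    The on-chip payload shrinks from 64 pixel values to 16 measurements, which is the bandwidth saving these sensor designs target.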

  9. Image upconversion - a low noise infrared sensor?

    DEFF Research Database (Denmark)

    Dam, Jeppe Seidelin; Tidemand-Lichtenberg, Peter; Pedersen, Christian

    Low noise upconversion of IR images by three-wave mixing can be performed with high efficiency when mixing the object with a powerful laser field inside a highly non-linear crystal such as periodically poled Lithium Niobate. This feature effectively allows the use of silicon based cameras for detection of infrared images. Silicon cameras have much smaller intrinsic noise than their IR counterparts; some models even offer near single photon detection capability. We demonstrate that an ordinary CCD camera combined with a low noise upconversion has superior noise characteristics when compared to even state-of-the-art IR cameras.

  10. New amorphous-silicon image sensor for x-ray diagnostic medical imaging applications

    Science.gov (United States)

    Weisfield, Richard L.; Hartney, Mark A.; Street, Robert A.; Apte, Raj B.

    1998-07-01

    This paper introduces new high-resolution amorphous silicon (a-Si) image sensors specifically configured for demonstrating film-quality medical x-ray imaging capabilities. The device utilizes an x-ray phosphor screen coupled to an array of a-Si photodiodes for detecting visible light, and a-Si thin-film transistors (TFTs) for connecting the photodiodes to external readout electronics. We have developed imagers based on a pixel size of 127 micrometer x 127 micrometer with an approximately page-size imaging area of 244 mm x 195 mm, and an array size of 1,536 data lines by 1,920 gate lines, for a total of 2.95 million pixels. More recently, we have developed a much larger imager based on the same pixel pattern, which covers an area of approximately 406 mm x 293 mm, with 2,304 data lines by 3,200 gate lines, for a total of nearly 7.4 million pixels. This is very likely the largest image sensor array and highest pixel count detector fabricated on a single substrate. Both imagers connect to a standard PC and are capable of taking an image in a few seconds. Through design rule optimization we have achieved a light-sensitive area of 57% and optimized quantum efficiency for x-ray phosphor output in the green part of the spectrum, yielding an average quantum efficiency between 500 and 600 nm of approximately 70%. At the same time, we have managed to reduce extraneous leakage currents on these devices to a few fA per pixel, which allows a very high dynamic range to be achieved. We have characterized leakage currents as a function of photodiode bias, time and temperature to demonstrate high stability over these large arrays. At the electronics level, we have adopted a new generation of low-noise, charge-sensitive amplifiers coupled to 12-bit A/D converters. Considerable attention was given to reducing electronic noise in order to demonstrate a large dynamic range (over 4,000:1) for medical imaging applications. Through a combination of low data line capacitance

  11. Minimal form factor digital-image sensor for endoscopic applications

    Science.gov (United States)

    Wäny, Martin; Voltz, Stephan; Gaspar, Fabio; Chen, Lei

    2009-02-01

    This paper presents a digital image sensor SOC featuring a total chip area (including dicing tolerances) of 0.34 mm2 for endoscopic applications. Due to this extremely small form factor the sensor enables integration in endoscopes, guide wires and locator devices of less than 1 mm outer diameter. The sensor embeds a pixel matrix of 10,000 pixels with a pitch of 3 um x 3 um covered with RGB filters in a Bayer pattern. The sensor operates fully autonomously, controlled by an on-chip ring oscillator and readout state machine, which controls integration, AD conversion and data transmission; thus the sensor only requires 4 pins for power supply and data communication. The sensor provides a frame rate of 40 frames per second over an LVDS serial data link. The endoscopic application requires that the sensor work without any local power decoupling capacitors at the end of up to 2 m of cabling, and be able to sustain data communication over the same wire length without deteriorating image quality. This has been achieved by implementation of a current-mode successive approximation ADC and current-steering LVDS data transmission. A band gap circuit with -40 dB PSRR at the data frequency was implemented as the on-chip reference to improve robustness against power supply ringing due to the high series inductance of the long cables. The B&W version of the sensor provides a conversion gain of 30 DN/nJ/cm2 at 550 nm with a read noise in dark of 1.2 DN when operated with a 2 m cable. Using the photon transfer method according to the EMVA1288 standard, the full well capacity was determined to be 18 ke-. To our knowledge the presented work is currently the world's smallest fully digital image sensor. The chip was designed along with an aspheric single-surface lens to assemble on the chip without increasing the form factor. The extremely small form factor of the resulting camera permits visualization with much higher than state-of-the-art spatial resolution in sub-1 mm endoscopic
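
    The photon transfer method cited above (EMVA 1288) estimates the overall system gain from the shot-noise relation between temporal variance and mean signal: in the shot-noise-limited region, variance grows linearly with the mean and the slope is the gain K in DN per electron. The sketch below uses idealized synthetic data with a hypothetical gain, not the sensor's measured values.

```python
import numpy as np

def photon_transfer_gain(means, variances):
    """EMVA1288-style photon transfer: fit variance vs. mean in the
    shot-noise-limited region; the slope is the system gain K (DN/e-)."""
    slope, _ = np.polyfit(means, variances, 1)
    return slope

# synthetic shot-noise-limited data: mean = K*n_e, variance = K^2*n_e
K_true = 0.05                              # DN per electron (hypothetical)
n_e = np.linspace(1_000, 15_000, 20)       # collected electrons per pixel
means = K_true * n_e
variances = K_true**2 * n_e
K_est = photon_transfer_gain(means, variances)
full_well = means.max() / K_est            # electrons at the brightest point
```

    Once K is known, signals measured in DN convert to electrons, which is how a full well capacity such as the 18 ke- figure in the abstract is derived from DN-domain measurements.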

  12. Image upconversion, a low noise infrared sensor?

    DEFF Research Database (Denmark)

    Low noise upconversion of IR images by three-wave mixing can be performed with high efficiency when mixing the object with a powerful laser field inside a highly non-linear crystal such as periodically poled Lithium Niobate. This feature effectively allows the use of silicon based cameras for detection of infrared images. Silicon cameras have much smaller intrinsic noise than their IR counterparts; some models even offer near single photon detection capability. We demonstrate that an ordinary CCD camera combined with a low noise upconversion has superior noise characteristics when compared to even state-of-the-art IR cameras.

  13. Design and Performance Analysis of an Intrinsically Safe Ultrasonic Ranging Sensor

    Directory of Open Access Journals (Sweden)

    Hongjuan Zhang

    2016-06-01

    Full Text Available In flammable or explosive environments, an ultrasonic sensor for distance measurement poses an important engineering safety challenge, because the driving circuit uses an intermediate frequency transformer as an impedance transformation element, and the heat or sparks it produces can cause ignition. In this paper, an intrinsically safe ultrasonic ranging sensor is designed and implemented. A waterproof piezoelectric transducer with an integrated transceiver is chosen as the energy transducing element. A novel transducer driving circuit is designed, based on an impedance matching method that considers safe spark parameters, to replace the intermediate frequency transformer. Then, an energy limiting circuit is developed to achieve dual levels of over-voltage and over-current protection. Detailed calculations and evaluations are carried out, and the electrical characteristics are analyzed to verify the intrinsic safety of the driving circuit. Finally, an experimental platform of the ultrasonic ranging sensor system, which includes short-circuit protection, is constructed. Experimental results show that the proposed ultrasonic ranging sensor is excellent in both ranging performance and intrinsic safety.
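
    The ranging principle itself is plain time-of-flight: distance = c * t / 2, where t is the round-trip echo delay and c is the speed of sound. The temperature correction below uses the standard linear approximation for air; it is a generic sketch, not the paper's calibration.

```python
def ultrasonic_range(echo_delay_s, temp_c=20.0):
    """Time-of-flight ranging: distance = c * t / 2, with the speed of
    sound in air corrected linearly for temperature."""
    c = 331.3 + 0.606 * temp_c  # m/s, linear approximation for air
    return c * echo_delay_s / 2.0
```

    A 10 ms round-trip delay at 20 degrees C corresponds to roughly 1.72 m, which is in the typical operating range of such sensors.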

  14. Design and Performance Analysis of an Intrinsically Safe Ultrasonic Ranging Sensor.

    Science.gov (United States)

    Zhang, Hongjuan; Wang, Yu; Zhang, Xu; Wang, Dong; Jin, Baoquan

    2016-06-13

    In flammable or explosive environments, an ultrasonic sensor for distance measurement poses an important engineering safety challenge, because the driving circuit uses an intermediate frequency transformer as an impedance transformation element, and the heat or sparks it produces can cause ignition. In this paper, an intrinsically safe ultrasonic ranging sensor is designed and implemented. A waterproof piezoelectric transducer with an integrated transceiver is chosen as the energy transducing element. A novel transducer driving circuit is designed, based on an impedance matching method that considers safe spark parameters, to replace the intermediate frequency transformer. Then, an energy limiting circuit is developed to achieve dual levels of over-voltage and over-current protection. Detailed calculations and evaluations are carried out, and the electrical characteristics are analyzed to verify the intrinsic safety of the driving circuit. Finally, an experimental platform of the ultrasonic ranging sensor system, which includes short-circuit protection, is constructed. Experimental results show that the proposed ultrasonic ranging sensor is excellent in both ranging performance and intrinsic safety.

  15. Single fluorescent protein-based Ca2+ sensors with increased dynamic range

    Directory of Open Access Journals (Sweden)

    Labas Yulii A

    2007-06-01

    Full Text Available Abstract Background Genetically encoded sensors developed on the basis of green fluorescent protein (GFP)-like proteins are becoming more and more popular instruments for monitoring cellular analytes and enzyme activities in living cells and transgenic organisms. In particular, a number of Ca2+ sensors have been developed, either based on FRET (Fluorescence Resonance Energy Transfer) changes between two GFP mutants or on the change in fluorescence intensity of a single circularly permuted fluorescent protein (cpFP). Results Here we report significant progress on the development of the latter type of Ca2+ sensors. Building on previously reported cpFP-based sensors, we generated a set of cpFP-based indicators with different spectral properties and fluorescent responses to changes in Ca2+ concentration. Two variants, named Case12 and Case16, were characterized by particularly high brightness and superior dynamic range, up to 12-fold and 16.5-fold increases in green fluorescence between the Ca2+-free and Ca2+-saturated forms. We demonstrated the high potential of these sensors in various examples, including monitoring of the Ca2+ response to a prolonged glutamate treatment in cortical neurons. Conclusion We believe that the expanded dynamic range, high brightness and relatively high pH-stability should make Case12 and Case16 popular research tools both in scientific studies and high throughput screening assays.

  16. Optimal Magnetic Sensor Vests for Cardiac Source Imaging

    Directory of Open Access Journals (Sweden)

    Stephan Lau

    2016-05-01

    Full Text Available Magnetocardiography (MCG non-invasively provides functional information about the heart. New room-temperature magnetic field sensors, specifically magnetoresistive and optically pumped magnetometers, have reached sensitivities in the ultra-low range of cardiac fields while allowing for free placement around the human torso. Our aim is to optimize positions and orientations of such magnetic sensors in a vest-like arrangement for robust reconstruction of the electric current distributions in the heart. We optimized a set of 32 sensors on the surface of a torso model with respect to a 13-dipole cardiac source model under noise-free conditions. The reconstruction robustness was estimated by the condition of the lead field matrix. Optimization improved the condition of the lead field matrix by approximately two orders of magnitude compared to a regular array at the front of the torso. Optimized setups exhibited distributions of sensors over the whole torso with denser sampling above the heart at the front and back of the torso. Sensors close to the heart were arranged predominantly tangential to the body surface. The optimized sensor setup could facilitate the definition of a standard for sensor placement in MCG and the development of a wearable MCG vest for clinical diagnostics.
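
    The robustness criterion used in the optimization above, the condition of the lead field matrix, is straightforward to compute from its singular values: a lower condition number means the inverse source reconstruction amplifies noise less. The toy matrices below are random stand-ins (not physical lead fields) that contrast well-spread sensors with nearly redundant ones.

```python
import numpy as np

rng = np.random.default_rng(2)

def leadfield_condition(L):
    """Condition number of a lead field matrix (sensors x sources);
    lower values indicate a more robust source reconstruction."""
    s = np.linalg.svd(L, compute_uv=False)
    return s[0] / s[-1]

# toy 32-sensor / 13-dipole lead fields: well-spread sensors give nearly
# independent rows; clustered sensors give nearly identical rows and a
# far worse condition number
L_spread = rng.standard_normal((32, 13))
L_redundant = np.tile(rng.standard_normal((1, 13)), (32, 1)) + \
              0.01 * rng.standard_normal((32, 13))
```

    The roughly two-orders-of-magnitude improvement the paper reports corresponds to driving exactly this quantity down by repositioning and reorienting the 32 sensors.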

  17. 77 FR 74513 - Certain CMOS Image Sensors and Products Containing Same; Investigations: Terminations...

    Science.gov (United States)

    2012-12-14

    ... From the Federal Register Online via the Government Publishing Office INTERNATIONAL TRADE COMMISSION Certain CMOS Image Sensors and Products Containing Same; Investigations: Terminations... importation, and the sale within the United States after importation of certain CMOS image sensors and...

  18. Range-gated imaging for near-field target identification

    Energy Technology Data Exchange (ETDEWEB)

    Yates, G.J.; Gallegos, R.A.; McDonald, T.E. [and others

    1996-12-01

    The combination of two complementary technologies developed independently at Los Alamos National Laboratory (LANL) and Sandia National Laboratory (SNL) has demonstrated the feasibility of target detection and image capture in a highly light-scattering medium. The technique uses a compact SNL-developed Photoconductive Semiconductor Switch/Laser Diode Array (PCSS/LDA) for short-range (distances of 8 to 10 m), large Field-Of-View (FOV) target illumination. Generation of a time-correlated echo signal is accomplished using a photodiode. The return image signal is recorded with a high-speed shuttered Micro-Channel-Plate Image Intensifier (MCPII), developed by LANL and manufactured by Philips Photonics. The MCPII is gated using a high-frequency impedance-matched microstrip design to produce 150 to 200 ps duration optical exposures. The ultra-fast shuttering provides depth resolution of a few inches along the optic axis between the MCPII and the target, producing enhanced target images effectively deconvolved from noise components from the scattering medium in the FOV. The images from the MCPII are recorded with an RS-170 Charge-Coupled-Device camera and a Big Sky Beam Code PC-based digitizer/frame grabber and analysis package. Laser pulse data were obtained, but jitter problems and spectral mismatches between the diode emission wavelength and the MCPII photocathode spectral sensitivity prevented the capture of fast gated imaging with this demonstration system. Continued development of the system is underway.

  19. A video precipitation sensor for imaging and velocimetry of hydrometeors

    Science.gov (United States)

    Liu, X. C.; Gao, T. C.; Liu, L.

    2014-07-01

    A new method to determine the shape and fall velocity of hydrometeors using a single CCD camera is proposed in this paper, and a prototype of a video precipitation sensor (VPS) is developed. The instrument consists of an optical unit (collimated light source with a multi-mode fibre cluster), an imaging unit (planar array CCD sensor), an acquisition and control unit, and a data processing unit. The cylindrical space between the optical unit and the imaging unit is the sampling volume (300 mm × 40 mm × 30 mm). As precipitation particles fall through the sampling volume, the CCD camera is exposed twice within a single frame, so that a double exposure of the particle images is obtained. The size and shape can be obtained from the particle images; the fall velocity can be calculated from the particle displacement in the double-exposure image and the interval time; the drop size distribution and velocity distribution, precipitation intensity, and accumulated precipitation amount can be calculated by time integration. The innovation of the VPS is that the shape, size, and velocity of precipitation particles can be measured by a single planar array CCD sensor, which addresses the disadvantages of linear-scan CCD disdrometers and impact disdrometers. Field measurements of rainfall demonstrate the VPS's capability to measure the micro-physical properties of single particles and the integral parameters of precipitation.
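    The velocity and size computations described above reduce to simple geometry once the object-plane pixel scale is calibrated. A minimal sketch, where the pixel scale and exposure interval are assumed calibration constants, not values from the paper:

```python
import math

def fall_velocity(displacement_px, pixel_scale_mm, dt_s):
    """Fall velocity (m/s) from the centroid displacement between the two
    exposures of one frame. pixel_scale_mm is the assumed object-plane
    scale (mm/pixel); dt_s is the interval between the two exposures (s)."""
    return displacement_px * pixel_scale_mm * 1e-3 / dt_s

def equivalent_diameter_mm(area_px, pixel_scale_mm):
    """Equivalent-circle diameter (mm) of a particle silhouette,
    from its pixel area in a single exposure."""
    return 2.0 * math.sqrt(area_px / math.pi) * pixel_scale_mm
```

    For example, a 20-pixel displacement at 0.15 mm/pixel over a 0.5 ms interval corresponds to a 6 m/s fall velocity; integrating such per-particle measurements over time yields the size and velocity distributions.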

  20. Broadband image sensor array based on graphene-CMOS integration

    Science.gov (United States)

    Goossens, Stijn; Navickaite, Gabriele; Monasterio, Carles; Gupta, Shuchi; Piqueras, Juan José; Pérez, Raúl; Burwell, Gregory; Nikitskiy, Ivan; Lasanta, Tania; Galán, Teresa; Puma, Eric; Centeno, Alba; Pesquera, Amaia; Zurutuza, Amaia; Konstantatos, Gerasimos; Koppens, Frank

    2017-06-01

    Integrated circuits based on complementary metal-oxide-semiconductors (CMOS) are at the heart of the technological revolution of the past 40 years, enabling compact and low-cost microelectronic circuits and imaging systems. However, the diversification of this platform into applications other than microcircuits and visible-light cameras has been impeded by the difficulty to combine semiconductors other than silicon with CMOS. Here, we report the monolithic integration of a CMOS integrated circuit with graphene, operating as a high-mobility phototransistor. We demonstrate a high-resolution, broadband image sensor and operate it as a digital camera that is sensitive to ultraviolet, visible and infrared light (300-2,000 nm). The demonstrated graphene-CMOS integration is pivotal for incorporating 2D materials into the next-generation microelectronics, sensor arrays, low-power integrated photonics and CMOS imaging systems covering visible, infrared and terahertz frequencies.

  1. Image sensor for testing refractive error of eyes

    Science.gov (United States)

    Li, Xiangning; Chen, Jiabi; Xu, Longyun

    2000-05-01

    It is difficult to detect ametropia and anisometropia in children. An image sensor for testing the refractive error of eyes does not require the cooperation of children and can be used for general surveys of ametropia and anisometropia in children. In our study, photographs are recorded by a CCD element in digital form that can be directly processed by a computer. In order to process the image accurately by digital techniques, a formula accounting for the effect of an extended light source and the size of the lens aperture has been deduced, which is more reliable in practice. Computer simulation of the image sensing was performed to verify the validity of the results.

  2. Noise Reduction for CFA Image Sensors Exploiting HVS Behaviour

    Directory of Open Access Journals (Sweden)

    Angelo Bosco

    2009-03-01

    Full Text Available This paper presents a spatial noise reduction technique designed to work on CFA (Color Filter Array) data acquired by CCD/CMOS image sensors. The overall processing preserves image details using some heuristics related to the HVS (Human Visual System); estimates of the local texture degree and noise levels are computed to regulate the filter's smoothing capability. Experimental results confirm the effectiveness of the proposed technique. The method is also suitable for implementation in low-power mobile devices with imaging capabilities such as camera phones and PDAs.

  3. BIOME: An Ecosystem Remote Sensor Based on Imaging Interferometry

    Science.gov (United States)

    Peterson, David L.; Hammer, Philip; Smith, William H.; Lawless, James G. (Technical Monitor)

    1994-01-01

    Until recently, optical remote sensing of ecosystem properties from space has been limited to broadband multispectral scanners such as Landsat and AVHRR. While these sensor data can be used to derive important information about ecosystem parameters, they are very limited for measuring key biogeochemical cycling parameters such as the chemical content of plant canopies. Such parameters, for example the lignin and nitrogen contents, are potentially amenable to measurement by very high spectral resolution instruments using a spectroscopic approach. Airborne sensors based on grating imaging spectrometers gave the first promise of such potential, but the recent decision not to deploy the space version has left the community without many alternatives. In the past few years, advancements in high-performance deep-well digital sensor arrays, coupled with a patented design for a two-beam interferometer, have produced an entirely new design for acquiring imaging spectroscopic data at the signal-to-noise levels necessary for quantitatively estimating chemical composition (1000:1 at 2 microns). This design has been assembled as a laboratory instrument and the principles demonstrated for acquiring remote scenes. An airborne instrument is in production and spaceborne sensors are being proposed. The instrument is extremely promising because of its low cost, low power requirements, very low weight, simplicity (no moving parts), and high performance. For these reasons, we have called it the first instrument optimized for ecosystem studies as part of a Biological Imaging and Observation Mission to Earth (BIOME).

  4. Biologically motivated composite image sensor for deep-field target tracking

    Science.gov (United States)

    Melnyk, Pavlo B.; Messner, Richard A.

    2007-01-01

    The present work addresses the design of an image acquisition front-end for target detection and tracking within a wide range of distances. Inspired by the vision of raptor birds, a novel design for a visual sensor is proposed. The sensor consists of two parts, each originating from studies of the biological vision systems of different species. The front end is comprised of a set of video cameras imitating a falconiform eye, in particular its optics and retina [1]. The back end is a software remapper that uses a log-polar model of retino-cortical projection in primates, popular in machine vision [2], [3], [4]. The output of this sensor is a composite log-polar image incorporating both near and far visual fields into a single homogeneous image space. In such a space it is easier to perform target detection and tracking for applications that deal with targets moving along the camera axis. The target object preserves its shape and size while being handed off seamlessly between cameras, regardless of distance to the composite sensor. A prototype of the proposed composite sensor has been created and is used as the front-end in an experimental mobile vehicle detection and tracking system. It has been tested inside a driving simulator and results are presented.

  5. Low-power high-accuracy micro-digital sun sensor by means of a CMOS image sensor

    NARCIS (Netherlands)

    Xie, N.; Theuwissen, A.J.P.

    2013-01-01

    A micro-digital sun sensor (μDSS) is a sun detector which senses a satellite's instant attitude angle with respect to the sun. The core of this sensor is a system-on-chip imaging chip which is referred to as APS+. The APS+ integrates a CMOS active pixel sensor (APS) array of 368×368 pixels, a

  6. Ultra-sensitive wide dynamic range temperature sensor based on in-fiber Lyot interferometer

    Science.gov (United States)

    Nikbakht, Hamed; Poorghdiri Isfahani, Mohamad Hosein; Latifi, Hamid

    2017-04-01

    An in-fiber Lyot interferometer for temperature measurement is presented. The sensor utilizes the high temperature dependence of the birefringence in Panda polarization-maintaining fibers to achieve high resolution in temperature measurements. Temperature variation modulates the phase difference between the polarization modes propagating in the Panda fiber. The Lyot interferometer produces a spectrum which varies with the phase difference; therefore, by monitoring this spectrum a high resolution of 0.003°C was achieved. A fiber Bragg grating is added to the setup to expand its dynamic range. This sensor does not require a complicated fabrication process and can be implemented in many applications.

  7. Stereo Vision-Based High Dynamic Range Imaging Using Differently-Exposed Image Pair.

    Science.gov (United States)

    Park, Won-Jae; Ji, Seo-Won; Kang, Seok-Jae; Jung, Seung-Won; Ko, Sung-Jea

    2017-06-22

    In this paper, a high dynamic range (HDR) imaging method based on the stereo vision system is presented. The proposed method uses differently exposed low dynamic range (LDR) images captured from a stereo camera. The stereo LDR images are first converted to initial stereo HDR images using the inverse camera response function estimated from the LDR images. However, due to the limited dynamic range of the stereo LDR camera, the radiance values in under/over-exposed regions of the initial main-view (MV) HDR image can be lost. To restore these radiance values, the proposed stereo matching and hole-filling algorithms are applied to the stereo HDR images. Specifically, the auxiliary-view (AV) HDR image is warped by using the estimated disparity between the initial stereo HDR images, and then effective hole-filling is applied to the warped AV HDR image. To reconstruct the final MV HDR image, the warped and hole-filled AV HDR image is fused with the initial MV HDR image using a weight map. The experimental results demonstrate objectively and subjectively that the proposed stereo HDR imaging method provides better performance compared to the conventional method.
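    The inverse-response conversion and weight-map fusion steps of such a pipeline can be sketched in a few lines. A fixed gamma curve stands in for the estimated camera response, and the threshold-based weight map is a simplification of the paper's method; all parameter values below are assumptions:

```python
import numpy as np

def inverse_crf(ldr, exposure, gamma=2.2):
    """Map an 8-bit LDR image to relative radiance. The paper estimates
    the response function from the image pair; a fixed gamma curve is an
    assumed stand-in."""
    return (ldr.astype(float) / 255.0) ** gamma / exposure

def exposure_weight(ldr, lo=10, hi=245):
    """1.0 where the main view is well exposed, a small weight in
    under/over-exposed regions (simplified binary weight map)."""
    return np.where((ldr > lo) & (ldr < hi), 1.0, 0.05)

def fuse(main_hdr, aux_hdr, weight):
    """Weighted blend of the warped, hole-filled auxiliary-view HDR
    into the main-view HDR."""
    return weight * main_hdr + (1.0 - weight) * aux_hdr
```

    In the badly exposed main-view pixels the weight drops, so the fused radiance comes almost entirely from the warped auxiliary view, which is exactly how the lost under/over-exposed regions are restored.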

  8. Stereo Vision-Based High Dynamic Range Imaging Using Differently-Exposed Image Pair

    Directory of Open Access Journals (Sweden)

    Won-Jae Park

    2017-06-01

    Full Text Available In this paper, a high dynamic range (HDR) imaging method based on the stereo vision system is presented. The proposed method uses differently exposed low dynamic range (LDR) images captured from a stereo camera. The stereo LDR images are first converted to initial stereo HDR images using the inverse camera response function estimated from the LDR images. However, due to the limited dynamic range of the stereo LDR camera, the radiance values in under/over-exposed regions of the initial main-view (MV) HDR image can be lost. To restore these radiance values, the proposed stereo matching and hole-filling algorithms are applied to the stereo HDR images. Specifically, the auxiliary-view (AV) HDR image is warped by using the estimated disparity between the initial stereo HDR images, and then effective hole-filling is applied to the warped AV HDR image. To reconstruct the final MV HDR image, the warped and hole-filled AV HDR image is fused with the initial MV HDR image using a weight map. The experimental results demonstrate objectively and subjectively that the proposed stereo HDR imaging method provides better performance compared to the conventional method.

  9. Close-range imaging and research priorities in Europe

    Directory of Open Access Journals (Sweden)

    P. Patias

    2014-06-01

    Full Text Available Since 1984, the European Union's Framework Programme for Research and Innovation has been the main instrument for funding research. Specific priorities, objectives and types of funded activities vary between funding periods. Horizon 2020 is the biggest EU Research and Innovation programme ever, with nearly €80 billion of funding available over 7 years (2014–2020). H2020 is based on three pillars: (i) Excellent science, (ii) Industrial leadership, (iii) Societal challenges. The current economic crisis in Europe and elsewhere has led to an extended shortage of research budgets at the national level, which in turn leads researchers to seek funds in highly competitive transnational research instruments such as H2020. This paper: - draws the overall picture of Horizon 2020 - investigates the position of close-range imaging technologies, applications and research areas - presents the research challenges in H2020 that offer funding opportunities in close-range imaging

  10. The coronagraphic Modal Wavefront Sensor: a hybrid focal-plane sensor for the high-contrast imaging of circumstellar environments

    Science.gov (United States)

    Wilby, M. J.; Keller, C. U.; Snik, F.; Korkiakoski, V.; Pietrow, A. G. M.

    2017-01-01

    The raw coronagraphic performance of current high-contrast imaging instruments is limited by the presence of a quasi-static speckle (QSS) background, resulting from instrumental Non-Common Path Errors (NCPEs). Rapid development of efficient speckle subtraction techniques in data reduction has enabled final contrasts of up to 10^-6 to be obtained; however, it remains preferable to eliminate the underlying NCPEs at the source. In this work we introduce the coronagraphic Modal Wavefront Sensor (cMWS), a new wavefront sensor suitable for real-time NCPE correction. This combines the Apodizing Phase Plate (APP) coronagraph with a holographic modal wavefront sensor to provide simultaneous coronagraphic imaging and focal-plane wavefront sensing with the science point-spread function. We first characterise the baseline performance of the cMWS via idealised closed-loop simulations, showing that the sensor is able to successfully recover diffraction-limited coronagraph performance over an effective dynamic range of ±2.5 radians root-mean-square (rms) wavefront error within 2-10 iterations, with performance independent of the specific choice of mode basis. We then present the results of initial on-sky testing at the William Herschel Telescope, which demonstrate that the sensor is capable of NCPE sensing under realistic seeing conditions via the recovery of known static aberrations to an accuracy of 10 nm (0.1 radians) rms error in the presence of a dominant atmospheric speckle foreground. We also find that the sensor is capable of real-time measurement of broadband atmospheric wavefront variance (50% bandwidth, 158 nm rms wavefront error) at a cadence of 50 Hz over an uncorrected telescope sub-aperture. When combined with a suitable closed-loop adaptive optics system, the cMWS holds the potential to deliver an improvement of up to two orders of magnitude over the uncorrected QSS floor. Such a sensor would be eminently suitable for the direct imaging and spectroscopy of

  11. 77 FR 26787 - Certain CMOS Image Sensors and Products Containing Same; Notice of Receipt of Complaint...

    Science.gov (United States)

    2012-05-07

    ... COMMISSION Certain CMOS Image Sensors and Products Containing Same; Notice of Receipt of Complaint... complaint entitled Certain CMOS Image Sensors and Products Containing Same, DN 2895; the Commission is... importation of certain CMOS image sensors and products containing same. The complaint names as respondents...

  12. 77 FR 33488 - Certain CMOS Image Sensors and Products Containing Same; Institution of Investigation Pursuant to...

    Science.gov (United States)

    2012-06-06

    ... COMMISSION Certain CMOS Image Sensors and Products Containing Same; Institution of Investigation Pursuant to... States after importation of certain CMOS image sensors and products containing same by reason of... image sensors and products containing same that infringe one or more of claims 1 and 2 of the `126...

  13. Time-reversed lasing in the terahertz range and its preliminary study in sensor applications

    Energy Technology Data Exchange (ETDEWEB)

    Shen, Yun, E-mail: shenyunoptics@gmail.com [Department of Physics, Nanchang University, Nanchang 330031 (China); Liu, Huaqing [Department of Physics, Nanchang University, Nanchang 330031 (China); Deng, Xiaohua [Institute of Space Science and Technology, Nanchang University, Nanchang 330031 (China); Wang, Guoping [Key Laboratory of Artificial Micro- and Nano-Structures of Ministry of Education and School of Physics and Technology, Wuhan University, Wuhan 430072 (China)

    2017-02-05

    Time-reversed lasing in a uniform slab and in a grating structure is investigated in the terahertz range. The results show that both the uniform slab and the grating can support terahertz time-reversed lasing. Moreover, due to its tunable effective refractive index, the grating structure not only exhibits time-reversed lasing more effectively and flexibly than a uniform slab, but can also realize significant absorption over a broader operating frequency range. Furthermore, applications of terahertz time-reversed lasing to novel concentration/thickness sensors are preliminarily studied in a single-channel coherent perfect absorber system. - Highlights: • Time-reversed lasing is investigated in the terahertz range. • The grating structure exhibits time-reversed lasing more effectively and flexibly than a uniform slab. • THz time-reversed lasing for novel concentration/thickness sensors is studied.

  14. Comparison and experimental validation of two potential resonant viscosity sensors in the kilohertz range

    Science.gov (United States)

    Lemaire, Etienne; Heinisch, Martin; Caillard, Benjamin; Jakoby, Bernhard; Dufour, Isabelle

    2013-08-01

    Oscillating microstructures are well established and find application in many fields. These include force sensors, e.g. AFM micro-cantilevers or accelerometers based on resonant suspended plates. This contribution presents two vibrating mechanical structures acting as force sensors in liquid media in order to measure hydrodynamic interactions. Rectangular cross section microcantilevers as well as circular cross section wires are investigated. Each structure features specific benefits, which are discussed in detail. Furthermore, their mechanical parameters and their deflection in liquids are characterized. Finally, an inverse analytical model is applied to calculate the complex viscosity near the resonant frequency for both types of structures. With this approach it is possible to determine rheological parameters in the kilohertz range in situ within a few seconds. The monitoring of the complex viscosity of yogurt during the fermentation process is used as a proof of concept to qualify at least one of the two sensors in opaque mixtures.

  15. A wide range and highly sensitive optical fiber pH sensor using polyacrylamide hydrogel

    Science.gov (United States)

    Pathak, Akhilesh Kumar; Singh, Vinod Kumar

    2017-12-01

    In the present study we report the fabrication and characterization of a no-core fiber sensor (NCFS) using a smart hydrogel coating for pH measurement. The no-core fiber (NCF) is stubbed between two single-mode fibers with SMA connectors before immobilization of the smart hydrogel. The wavelength interrogation technique is used to calculate the sensitivity of the proposed sensor. The result shows a high sensitivity of 1.94 nm/pH over a wide range of pH values varying from 3 to 10, with a good linear response. In addition to high sensitivity, the fabricated sensor provides a fast response time with good stability, repeatability and reproducibility.

  16. A Multilayer Improved RBM Network Based Image Compression Method in Wireless Sensor Networks

    National Research Council Canada - National Science Library

    Cheng, Chunling; Wang, Shu; Chen, Xingguo; Yang, Yanying

    2016-01-01

    The processing capacity and power of nodes in a Wireless Sensor Network (WSN) are limited. And most image compression algorithms in WSN are subject to random image content changes or have low image qualities after the images are decoded...

  17. Design, Manufacture and Testing of Capacitive Pressure Sensors for Low-Pressure Measurement Ranges

    Directory of Open Access Journals (Sweden)

    Vasileios Mitrakos

    2017-02-01

    Full Text Available This article presents the design, manufacture and testing of a capacitive pressure sensor with high, tunable sensitivity to low compressive loads (<10 kPa) and a resolution of better than 0.5 kPa. Such performance is required for monitoring the treatment efficacy of compression garments used to treat or prevent medical conditions such as deep vein thrombosis, leg ulcers, varicose veins or hypertrophic scars. Current commercial sensors used in such medical applications have been found to be either impractical, costly or of insufficient resolution. A microstructured elastomer film of a polydimethylsiloxane (PDMS) blend with a tunable Young's modulus was used as the force-sensing dielectric medium. The resulting 18 mm × 18 mm parallel-plate capacitive pressure sensor was characterised in the range of 0.8 to 6.5 kPa. The microstructuring of the surface morphology of the elastomer film, combined with the tuning of the Young's modulus of the PDMS blend, is demonstrated to enhance the sensor performance, achieving a 0.25 kPa pressure resolution and a 10 pF capacitive change under a 6.5 kPa compressive load. The resulting sensor holds good potential for the targeted medical application.
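    The sensing principle of such a parallel-plate device follows directly from C = εᵣε₀A/d: compressing the dielectric film reduces the gap d and raises the capacitance. A quick sanity-check sketch, where the film thickness and relative permittivity are illustrative assumptions rather than values from the article:

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def parallel_plate_capacitance(side_m, gap_m, eps_r):
    """Baseline capacitance C = eps_r * eps0 * A / d of a square
    parallel-plate sensor (fringing fields neglected)."""
    return eps_r * EPS0 * side_m ** 2 / gap_m

# 18 mm plates with an assumed ~100 um PDMS-blend film (eps_r ~ 2.7)
c0 = parallel_plate_capacitance(0.018, 100e-6, 2.7)
```

    With these assumed numbers the baseline comes out on the order of tens of picofarads, which is at least consistent in magnitude with the reported 10 pF change under a 6.5 kPa load.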

  18. Nanocomposite-Based Microstructured Piezoresistive Pressure Sensors for Low-Pressure Measurement Range

    Directory of Open Access Journals (Sweden)

    Vasileios Mitrakos

    2018-01-01

    Full Text Available Piezoresistive pressure sensors capable of detecting ranges of low compressive stresses have been successfully fabricated and characterised. The 5.5 × 5 × 1.6 mm³ sensors consist of a planar aluminium top electrode and a microstructured bottom electrode containing a two-by-two array of truncated pyramids, with a piezoresistive composite layer sandwiched in-between. The responses of two different piezocomposite materials, a Multiwalled Carbon Nanotube (MWCNT)-elastomer composite and a Quantum Tunneling Composite (QTC), have been characterised as a function of applied pressure and effective contact area. The MWCNT piezoresistive composite-based sensor was able to detect pressures as low as 200 kPa. The QTC-based sensor was capable of detecting pressures as low as 50 kPa, depending on the contact area of the bottom electrode. Such sensors could find useful applications requiring the detection of small compressive loads, such as those encountered in haptic sensing or robotics.

  19. A directly converting high-resolution intra-oral X-ray imaging sensor

    CERN Document Server

    Spartiotis, K; Schulman, T; Puhakka, K; Muukkonen, K

    2003-01-01

    A digital intra-oral X-ray imaging sensor with an active area of 3.6×2.9 cm² and consisting of six charge-integrating CMOS signal readout circuits bump-bonded to one high-resistivity silicon pixel detector has been developed and tested. The pixel size is 35 μm. The X-rays entering the sensor window are converted directly to electrical charge in the depleted detector material, yielding minimum lateral signal spread and maximum image sharpness. The signal charge is collected on the gates of the input field effect transistors of the CMOS signal readout circuits. The analog signal readout is performed by multiplexing in the current mode, independent of the signal charge collection, enabling multiple readout cycles with negligible dead time and thus imaging with wide dynamic range. Since no intermediate material converting X-rays to visible light is needed, the sensor structure is very compact. The analog image signals are guided from the sensor output through a thin cable to signal processing, AD conversio...

  20. Imaging using long range dipolar field effects Nuclear magnetic resonance

    CERN Document Server

    Gutteridge, S

    2002-01-01

    The work in this thesis has been undertaken by the author, except where indicated by reference, within the Magnetic Resonance Centre at the University of Nottingham during the period from October 1998 to March 2001. This thesis details the different characteristics of the long range dipolar field and its application to magnetic resonance imaging. The long range dipolar field is usually neglected in nuclear magnetic resonance experiments, as molecular tumbling decouples its effect at short distances. However, in highly polarised samples residual long range components have a significant effect on the evolution of the magnetisation, giving rise to multiple spin echoes and unexpected quantum coherences. Three applications utilising these dipolar field effects are documented in this thesis. The first demonstrates the spatial sensitivity of the signal generated via dipolar field effects in structured liquid-state samples. The second utilises the signal produced by the dipolar field to create proton spin density maps. Thes...

  1. Scene correction (precision techniques) of ERTS sensor data using digital image processing techniques

    Science.gov (United States)

    Bernstein, R.

    1974-01-01

    Techniques have been developed, implemented, and evaluated to process ERTS Return Beam Vidicon (RBV) and Multispectral Scanner (MSS) sensor data using digital image processing techniques. The RBV radiometry has been corrected to remove shading effects, and the MSS geometry and radiometry have been corrected to remove internal and external radiometric and geometric errors. The results achieved show that geometric mapping accuracy of about one picture element RMS and two picture elements (maximum) can be achieved by the use of nine ground control points. Radiometric correction of MSS and RBV sensor data has been performed to eliminate striping and shading effects to about one count accuracy. Image processing times on general purpose computers of the IBM 370/145 to 168 class are in the range of 29 to 3.2 minutes per MSS scene (4 bands). Photographic images of the fully corrected and annotated scenes have been generated from the processed data and have demonstrated excellent quality and information extraction potential.
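    The geometric mapping step described above amounts to fitting a coordinate transformation to the ground control points by least squares. A minimal sketch using an affine model, which is a simplification of the full ERTS precision-correction geometry (the nine-GCP setup and all coordinates below are illustrative assumptions):

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine map from sensor (src) to map (dst)
    coordinates using ground control point pairs."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    A = np.hstack([src, np.ones((len(src), 1))])     # rows [x, y, 1]
    coef, *_ = np.linalg.lstsq(A, dst, rcond=None)   # 3x2 parameter matrix
    return coef

def apply_affine(coef, pts):
    """Resample: map arbitrary sensor pixels into map coordinates."""
    pts = np.asarray(pts, float)
    return np.hstack([pts, np.ones((len(pts), 1))]) @ coef
```

    The residuals of the fit at the control points play the role of the RMS mapping-accuracy figure quoted in the abstract; in practice a higher-order polynomial would be fitted the same way with more columns in `A`.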

  2. Experimental Demonstration of Long-Range Underwater Acoustic Communication Using a Vertical Sensor Array.

    Science.gov (United States)

    Zhao, Anbang; Zeng, Caigao; Hui, Juan; Ma, Lin; Bi, Xuejie

    2017-06-27

    This paper proposes a composite channel virtual time reversal mirror (CCVTRM) for vertical sensor array (VSA) processing and applies it to long-range underwater acoustic (UWA) communication in shallow water. Because of the weak signal-to-noise ratio (SNR), the channel impulse response of each sensor of the VSA cannot be accurately estimated, so the traditional passive time reversal mirror (PTRM) performs poorly in long-range UWA communication in shallow water. CCVTRM, however, only needs to estimate the composite channel of the VSA to accomplish the time reversal mirror (TRM) operation, which effectively mitigates inter-symbol interference (ISI) and reduces the bit error rate (BER). In addition, the computation of CCVTRM is simpler than that of traditional PTRM. A UWA communication experiment using a VSA of 12 sensors was conducted in the South China Sea. The experiment achieved very low-BER communication at a rate of 66.7 bit/s over an 80 km range. The results of the sea trial demonstrate that CCVTRM is feasible and can be applied to long-range UWA communication in shallow water.
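    At its core, passive time reversal is a matched-filter operation: the received probe carries the channel impulse response, and correlating the data with its time reverse compresses multipath energy back into a single peak. A single-channel sketch of that idea (the paper's CCVTRM applies it to one composite channel estimated for the whole array; the signals below are toy values):

```python
import numpy as np

def time_reversal_mirror(received, probe_received):
    """Correlate the received data with the time-reversed received probe.

    Because probe_received is (approximately) the channel impulse
    response, this convolution realizes h(t) * h(-t): multipath arrivals
    are refocused into one dominant peak, mitigating ISI."""
    return np.convolve(received, probe_received[::-1], mode="full")
```

    For a two-path channel such as `[1, 0, 0.5]`, the output peaks at the matched-filter delay with sidelobes that are small relative to the main lobe, which is the ISI-mitigation effect exploited for symbol detection.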

  3. Optical Imaging Sensors and Systems for Homeland Security Applications

    CERN Document Server

    Javidi, Bahram

    2006-01-01

    Optical and photonic systems and devices have significant potential for homeland security. Optical Imaging Sensors and Systems for Homeland Security Applications presents original and significant technical contributions from leaders of industry, government, and academia in the field of optical and photonic sensors, systems and devices for detection, identification, prevention, sensing, security, verification and anti-counterfeiting. The chapters have recent and technically significant results, ample illustrations, figures, and key references. This book is intended for engineers and scientists in the relevant fields, graduate students, industry managers, university professors, government managers, and policy makers. Advanced Sciences and Technologies for Security Applications focuses on research monographs in the areas of -Recognition and identification (including optical imaging, biometrics, authentication, verification, and smart surveillance systems) -Biological and chemical threat detection (including bios...

  4. Differentiating a diverse range of volatile organic compounds with polyfluorophore sensors built on a DNA scaffold.

    Science.gov (United States)

    Samain, Florent; Dai, Nan; Kool, Eric T

    2011-01-03

    Oligodeoxyfluorosides (ODFs) are short DNA-like oligomers in which DNA bases are replaced with fluorophores. A preliminary study reported that some sequences of ODFs were able to respond to a few organic small molecules in the vapor phase, giving a change in fluorescence. Here, we follow up on this finding by investigating a larger range of volatile organic analytes, and a considerably larger set of sensors. A library of tetramer ODFs of 2401 different sequences was prepared by using combinatorial methods, and was screened in air for fluorescence responses to a set of ten different volatile organics, including multiple aromatic and aliphatic compounds, acids and bases, varied functional groups, and closely related structures. Nineteen responding sensors were selected and characterized. These sensors were cross-screened against all ten analytes, and responses were measured qualitatively (by changes in color and intensity) and quantitatively (by measuring ΔR, ΔG, and ΔB values averaged over five to six sensor beads; R=red, G=green, B=blue). The results show that sensor responses were diverse, with a single sensor responding differently to as many as eight of the ten analytes; multiple classes of responses were seen, including quenching, lighting-up, and varied shifts in wavelength. Responses were strong, with raw ΔR, ΔG, and ΔB values of as high as >200 on a 256-unit scale and unamplified changes in many cases apparent to the naked eye. Sensors were identified that could distinguish clearly between even very closely related compounds such as acrolein and acrylonitrile. Statistical methods were applied to select a small set of four sensors that, as a pattern response, could distinguish between all ten analytes with high confidence. Sequence analysis of the full set of sensors suggested that sequence/order of the monomer components, and not merely composition, was highly important in the responses. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Numerical Demultiplexing of Color Image Sensor Measurements via Non-linear Random Forest Modeling

    OpenAIRE

    Jason Deglint; Farnoud Kazemzadeh; Daniel Cho; David A. Clausi; Alexander Wong

    2016-01-01

    The simultaneous capture of imaging data at multiple wavelengths across the electromagnetic spectrum is highly challenging, requiring complex and costly multispectral image devices. In this study, we investigate the feasibility of simultaneous multispectral imaging using conventional image sensors with color filter arrays via a novel comprehensive framework for numerical demultiplexing of the color image sensor measurements. A numerical forward model characterizing the formation of sensor mea...

  6. Dynamic range compression and detail enhancement algorithm for infrared image.

    Science.gov (United States)

    Sun, Gang; Liu, Songlin; Wang, Weihua; Chen, Zengping

    2014-09-10

    For infrared imaging systems whose high sampling width must be matched to traditional display devices or real-time processing systems with 8-bit data width, this paper presents a new high dynamic range compression and detail enhancement (DRCDDE) algorithm for infrared images. First, a bilateral filter separates the original image into two parts: a base component that contains large-scale signal variations, and a detail component that contains high-frequency information. Then, an operator model for DRC with local-contrast preservation is established, along with a newly proposed nonlinear intensity transfer function (ITF), to implement adaptive DRC of the base component. For the detail component, suitable intensity-level extension criteria, chosen according to local statistical characteristics, enhance low-contrast details and suppress noise. Finally, the results of the two components are recombined with a weighted coefficient. Experimental results on real infrared data, and quantitative comparisons with other well-established methods, show the better performance of the proposed algorithm. Furthermore, the technique effectively brings out dim targets while suppressing noise, which benefits image display and target detection.
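    The base/detail decomposition at the heart of such a pipeline can be sketched numerically. This is a minimal stand-in, not the paper's algorithm: a naive bilateral filter provides the base layer, a simple log curve stands in for the paper's nonlinear ITF, and a fixed gain replaces its locally adaptive detail criteria.

```python
import numpy as np

def bilateral(img, radius=2, sigma_s=2.0, sigma_r=30.0):
    """Naive bilateral filter: edge-preserving smoothing to get the base layer."""
    pad = np.pad(img.astype(float), radius, mode='edge')
    out = np.empty_like(img, dtype=float)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(ys**2 + xs**2) / (2 * sigma_s**2))   # spatial kernel
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            rng_w = np.exp(-(patch - img[i, j])**2 / (2 * sigma_r**2))  # range kernel
            w = spatial * rng_w
            out[i, j] = (w * patch).sum() / w.sum()
    return out

def compress(img, out_max=255.0, detail_gain=2.0):
    """Split into base + detail, log-compress the base, re-amplify the detail."""
    base = bilateral(img)
    detail = img - base                                    # high-frequency residue
    base_c = np.log1p(base) / np.log1p(base.max()) * out_max
    return np.clip(base_c + detail_gain * detail, 0.0, out_max)
```

Feeding a wide-range (e.g. 14-bit) frame through `compress` yields an 8-bit-range image in which large-scale dynamics are compressed while small-scale detail is boosted.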

  7. Digital image processing of earth observation sensor data

    Science.gov (United States)

    Bernstein, R.

    1976-01-01

    This paper describes digital image processing techniques that were developed to precisely correct Landsat multispectral earth observation data and gives illustrations of the results achieved, e.g., geometric corrections with an error of less than one picture element, a relative error of one-fourth picture element, and no radiometric error effect. Techniques for enhancing the sensor data, digitally mosaicking multiple scenes, and extracting information are also illustrated.

  8. A Full Parallel Event Driven Readout Technique for Area Array SPAD FLIM Image Sensors

    Directory of Open Access Journals (Sweden)

    Kaiming Nie

    2016-01-01

    Full Text Available This paper presents a full-parallel event-driven readout method implemented in an area-array single-photon avalanche diode (SPAD) image sensor for high-speed fluorescence lifetime imaging microscopy (FLIM). The sensor records and reads out only effective time and position information by adopting the full-parallel event-driven readout method, with the aim of reducing the amount of data. The image sensor includes four 8 × 8 pixel arrays. In each array, four time-to-digital converters (TDCs) quantize the arrival times of photons, and two address-record modules record the column and row information. In this work, Monte Carlo simulations were performed in Matlab to assess the pile-up effect induced by the readout method. The sensor's resolution is 16 × 16. The time resolution of the TDCs is 97.6 ps and the quantization range is 100 ns. The readout frame rate is 10 Mfps, and the maximum imaging frame rate is 100 fps. The chip's output bandwidth is 720 MHz with an average power of 15 mW. The lifetime resolvability range is 5–20 ns, and the average error of the estimated fluorescence lifetimes is below 1% when the center-of-mass method (CMM) is employed to estimate lifetimes.
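    The center-of-mass method (CMM) used here for lifetime estimation can be illustrated with a toy simulation: when the measurement window is much longer than the lifetime, the mean photon arrival time approximates the lifetime. The 97.6 ps TDC bin and 100 ns window follow the abstract; the 10 ns lifetime and the photon count are arbitrary choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def cmm_lifetime(arrival_times):
    """Center-of-mass lifetime estimate: for a window >> tau, tau ~ mean(t)."""
    return np.mean(arrival_times)

# Simulate photons from a tau = 10 ns exponential decay inside the 100 ns
# window, quantized to the 97.6 ps TDC bin quoted in the abstract.
tau, window, bin_ns = 10.0, 100.0, 0.0976
t = rng.exponential(tau, 100_000)
t = t[t < window]                   # photons outside the window are lost
t = np.round(t / bin_ns) * bin_ns   # TDC quantization
tau_hat = cmm_lifetime(t)
```

With ~10^5 photons the estimate lands within a few hundredths of a nanosecond of the true lifetime, consistent with the sub-1% average error the abstract reports for CMM.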

  9. Differentiating a Diverse Range of Volatile Organic Compounds with Polyfluorophore Sensors Built on a DNA Scaffold

    OpenAIRE

    Samain, Florent; Dai, Nan; Kool, Eric T.

    2010-01-01

    Oligodeoxyfluorosides (ODFs) are short DNA-like oligomers in which DNA bases are replaced with fluorophores. A preliminary study reported that some sequences of ODFs were able to respond to a few organic small molecules in the vapor phase, giving a change in fluorescence. Here we follow up on this finding by investigating a larger range of volatile organic analytes, and a considerably larger set of sensors. A library of tetramer ODFs of 2401 different sequences was prepared using combinatoria...

  10. Polymer Optical Fibre Sensors for Endoscopic Opto-Acoustic Imaging

    DEFF Research Database (Denmark)

    Broadway, Christian; Gallego, Daniel; Woyessa, Getinet

    2015-01-01

    Opto-acoustic imaging (OAI) shows particular promise for in-vivo biomedical diagnostics. Its applications include cardiovascular, gastrointestinal and urogenital systems imaging. Opto-acoustic endoscopy (OAE) allows the imaging of body parts through cavities permitting entry. The critical parameter...... is the physical size of the device, allowing compatibility with current technology, while governing flexibility of the distal end of the endoscope based on the needs of the sensor. Polymer optical fibre (POF) presents a novel approach for endoscopic applications and has been positively discussed and compared...... in existing publications. A great advantage can be obtained for endoscopy due to a small size and array potential to provide discrete imaging speed improvements. Optical fibre exhibits numerous advantages over conventional piezo-electric transducers, such as immunity from electromagnetic interference...

  11. MIST Final Report: Multi-sensor Imaging Science and Technology

    Energy Technology Data Exchange (ETDEWEB)

    Lind, Michael A.; Medvick, Patricia A.; Foley, Michael G.; Foote, Harlan P.; Heasler, Patrick G.; Thompson, Sandra E.; Nuffer, Lisa L.; Mackey, Patrick S.; Barr, Jonathan L.; Renholds, Andrea S.

    2008-03-15

    The Multi-sensor Imaging Science and Technology (MIST) program was undertaken to advance exploitation tools for Long Wavelength Infra Red (LWIR) hyper-spectral imaging (HSI) analysis as applied to the discovery and quantification of nuclear proliferation signatures. The program focused on mitigating LWIR image background clutter to ease the analyst burden and enable a) faster, more accurate analysis of large volumes of high-clutter data, b) greater detection sensitivity to nuclear proliferation signatures (primarily released gases), and c) quantified confidence estimates for the signature materials detected. To this end the program investigated fundamental limits and logical modifications of the more traditional statistical discovery and analysis tools applied to hyperspectral imaging and other disciplines, developed and tested new software incorporating advanced mathematical tools and physics-based analysis, and demonstrated the strengths and weaknesses of the new codes on relevant hyperspectral data sets from various campaigns. This final report describes the content of the program and outlines the significant results.

  12. Smart image sensor with adaptive correction of brightness

    Science.gov (United States)

    Paindavoine, Michel; Ngoua, Auguste; Brousse, Olivier; Clerc, Cédric

    2012-03-01

    Today, intelligent image sensors require the integration, in or near the focal plane, of complex image-processing algorithms. Such devices must meet constraints on the quality of acquired images, the speed and performance of embedded processing, and low power consumption. To achieve these objectives, analog pre-processing is essential: on the one hand, to improve the quality of the images, making them usable whatever the light conditions; on the other, to detect regions of interest (ROIs) so as to limit the number of pixels transmitted to a digital processor performing high-level processing such as feature extraction for pattern recognition. To show that analog pre-processing can be implemented in the focal plane, we designed and implemented, in 130 nm CMOS technology, a test circuit with groups of 4, 16 and 144 pixels, each incorporating analog average calculations.
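    The per-group analog averaging described above is easy to emulate numerically. A small sketch (illustrative only) that averages non-overlapping pixel groups of 4, 16, and 144 pixels on a toy 12 × 12 frame:

```python
import numpy as np

def block_average(img, k):
    """Average non-overlapping k x k pixel groups (k*k pixels per group)."""
    H, W = img.shape
    assert H % k == 0 and W % k == 0, "frame must tile exactly into k x k groups"
    return img.reshape(H // k, k, W // k, k).mean(axis=(1, 3))

frame = np.arange(144.0).reshape(12, 12)   # toy 12 x 12 frame
avg4   = block_average(frame, 2)           # groups of 4 pixels  -> 6 x 6 map
avg16  = block_average(frame, 4)           # groups of 16 pixels -> 3 x 3 map
avg144 = block_average(frame, 12)          # one group of 144 pixels
```

Each output map is the digital analogue of what the in-pixel analog averaging circuit would deliver for that group size.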

  13. Pile volume measurement by range imaging camera in indoor environment

    Directory of Open Access Journals (Sweden)

    C. Altuntas

    2014-06-01

    Full Text Available The range imaging (RIM) camera is a recent technology for 3D location measurement, and new study areas in measurement and data processing have emerged together with it. It offers a low-cost, fast measurement technique compared with current techniques; however, its measurement accuracy varies with effects arising from the device and the environment. Direct sunlight degrades the measurement accuracy of the camera, so the RIM camera should be used for indoor measurement. In this study, gravel pile volume was measured with a SwissRanger SR4000 camera. The measured volume differed by 8.13% from the known value.
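    Pile volume from a range camera reduces to summing per-pixel height differences between a flat-floor reference and the pile scan, scaled by the footprint of one pixel on the floor. A toy sketch under idealized assumptions (nadir view, uniform ground sampling; all numbers are invented):

```python
import numpy as np

def pile_volume(depth_flat, depth_pile, cell_area):
    """Volume = sum of per-pixel height differences times one pixel's footprint."""
    height = depth_flat - depth_pile    # range decreases where the pile rises
    height = np.clip(height, 0, None)   # ignore pixels below the floor (noise)
    return height.sum() * cell_area

# Synthetic example: a 1 m high, 2 x 2 m box-shaped pile on a 10 cm ground grid,
# seen by a camera 3 m above a flat floor.
floor = np.full((40, 40), 3.0)
scene = floor.copy()
scene[10:30, 10:30] = 2.0               # the pile top is 1 m closer to the camera
vol = pile_volume(floor, scene, cell_area=0.1 * 0.1)
```

For the synthetic box the sum recovers the exact 4 m³ volume; with real SR4000 data, range noise and the sunlight effects noted above would perturb the per-pixel heights.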

  14. Zinc oxide nanoparticle-doped nanoporous solgel fiber as a humidity sensor with enhanced sensitivity and large linear dynamic range.

    Science.gov (United States)

    Aneesh, R; Khijwania, Sunil K

    2013-08-01

    An all-optical humidity sensor based on direct and exhaustive guided-mode attenuation in an in-house developed zinc oxide (ZnO) nanoparticle-immobilized bare solgel fiber is reported. The main objective of the present work is to enhance the sensitivity considerably while realizing a linear response throughout a wide dynamic range. The developed sensor is characterized, and its performance is compared with an optical fiber humidity sensor employing an evanescent wave absorption scheme in a straight and uniform probe, with a ZnO nanoparticle-immobilized solgel film as the humidity-sensing cladding. The sensor response is observed to be linear over a wide dynamic range of 5%-95% relative humidity (RH). The observed linear sensitivity is 0.0103/% RH, which is ~9 times higher than that of the sensor employing the evanescent wave absorption scheme. In addition, the sensor response is observed to be very fast, highly reversible, and repeatable.
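    A linear sensor like this one is read out through a straight-line calibration. The sketch below fits and inverts such a calibration on synthetic data whose slope mirrors the quoted 0.0103/%RH sensitivity; the intercept and the calibration points are invented.

```python
import numpy as np

# Hypothetical calibration pairs (relative humidity in %, normalized attenuation);
# the 0.0103 per %RH slope mirrors the sensitivity quoted in the abstract.
rh = np.array([5, 20, 35, 50, 65, 80, 95], dtype=float)
attn = 0.0103 * rh + 0.05           # synthetic, perfectly linear response

slope, intercept = np.polyfit(rh, attn, 1)   # least-squares line fit

def to_rh(attenuation):
    """Invert the linear calibration to read out %RH from a measured attenuation."""
    return (attenuation - intercept) / slope
```

In practice the calibration pairs would come from a humidity chamber sweep over the 5%-95% RH range; the inversion step is the same.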

  15. A Wide Area Bipolar Cascade Resonant Cavity Light Emitting Diode for a Hybrid Range-Intensity Sensor

    Science.gov (United States)

    2008-06-19

    coplanar sensors, an autonomous platform can capitalize on stereopsis, which is the process in visual perception that leads to the perception of...stereoscopic depth. This depth emerges from the fusion of two slightly different projections of a scene recorded by two sensors. Stereopsis can be used to...points, and renders a 3-D image using stereopsis. There are drawbacks to producing real-time 3-D imagery using sensor fusion as a processing technique

  16. Self-Configuring Indoor Localization Based on Low-Cost Ultrasonic Range Sensors

    Directory of Open Access Journals (Sweden)

    Can Basaran

    2014-10-01

    Full Text Available In smart environments, target tracking is an essential service used by numerous applications, from activity recognition to personalized infotainment. Target tracking relies on sensors with known locations to estimate and keep track of the path taken by the target, and hence it is crucial to have an accurate map of such sensors. However, the need to manually enter their locations after deployment, and to expect them to remain fixed, significantly limits the usability of target tracking. To remedy this drawback, we present a self-configuring and device-free localization protocol based on genetic algorithms that autonomously identifies the geographic topology of a network of ultrasonic range sensors, automatically detects any change in the established network structure in less than a minute, and generates a new map within seconds. The proposed protocol significantly reduces hardware and deployment costs thanks to the use of low-cost off-the-shelf sensors with no manual configuration. Experiments on two real testbeds of different sizes show that the proposed protocol achieves an error of 7.16~17.53 cm in topology mapping, while also tracking a mobile target with an average error of 11.71~18.43 cm and detecting displacements of 1.41~3.16 m in approximately 30 s.

  17. Self-configuring indoor localization based on low-cost ultrasonic range sensors.

    Science.gov (United States)

    Basaran, Can; Yoon, Jong-Wan; Son, Sang Hyuk; Park, Taejoon

    2014-10-10

    In smart environments, target tracking is an essential service used by numerous applications, from activity recognition to personalized infotainment. Target tracking relies on sensors with known locations to estimate and keep track of the path taken by the target, and hence it is crucial to have an accurate map of such sensors. However, the need to manually enter their locations after deployment, and to expect them to remain fixed, significantly limits the usability of target tracking. To remedy this drawback, we present a self-configuring and device-free localization protocol based on genetic algorithms that autonomously identifies the geographic topology of a network of ultrasonic range sensors, automatically detects any change in the established network structure in less than a minute, and generates a new map within seconds. The proposed protocol significantly reduces hardware and deployment costs thanks to the use of low-cost off-the-shelf sensors with no manual configuration. Experiments on two real testbeds of different sizes show that the proposed protocol achieves an error of 7.16~17.53 cm in topology mapping, while also tracking a mobile target with an average error of 11.71~18.43 cm and detecting displacements of 1.41~3.16 m in approximately 30 s.
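    The paper recovers the sensor topology with a genetic algorithm; as a compact deterministic stand-in, classical multidimensional scaling (MDS) recovers a 2D sensor map, up to rotation and reflection, from ideal pairwise ultrasonic ranges. The sensor positions below are invented for illustration.

```python
import numpy as np

true_pos = np.array([[0.0, 0.0], [3.0, 0.0], [3.0, 2.0], [0.5, 2.5]])  # invented
n = len(true_pos)

def pairwise(p):
    """Euclidean distance matrix between all sensor positions."""
    diff = p[:, None, :] - p[None, :, :]
    return np.sqrt((diff**2).sum(-1))

D = pairwise(true_pos)                    # ideal ultrasonic range measurements

# Classical MDS: double-center the squared-distance matrix and factor it.
J = np.eye(n) - np.ones((n, n)) / n       # centering matrix
B = -0.5 * J @ (D**2) @ J                 # Gram matrix of centered positions
evals, evecs = np.linalg.eigh(B)
top = evals.argsort()[::-1][:2]           # two largest eigenvalues -> 2D map
X = evecs[:, top] * np.sqrt(np.clip(evals[top], 0, None))
```

`X` reproduces the pairwise geometry exactly for noise-free ranges; the paper's GA additionally copes with noisy ranges and with detecting topology changes online.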

  18. Laser doppler blood flow imaging using a CMOS imaging sensor with on-chip signal processing.

    Science.gov (United States)

    He, Diwei; Nguyen, Hoang C; Hayes-Gill, Barrie R; Zhu, Yiqun; Crowe, John A; Gill, Cally; Clough, Geraldine F; Morgan, Stephen P

    2013-09-18

    The first fully integrated 2D CMOS imaging sensor with on-chip signal processing for applications in laser Doppler blood flow (LDBF) imaging has been designed and tested. To obtain a space efficient design over 64 × 64 pixels means that standard processing electronics used off-chip cannot be implemented. Therefore the analog signal processing at each pixel is a tailored design for LDBF signals with balanced optimization for signal-to-noise ratio and silicon area. This custom made sensor offers key advantages over conventional sensors, viz. the analog signal processing at the pixel level carries out signal normalization; the AC amplification in combination with an anti-aliasing filter allows analog-to-digital conversion with a low number of bits; low resource implementation of the digital processor enables on-chip processing and the data bottleneck that exists between the detector and processing electronics has been overcome. The sensor demonstrates good agreement with simulation at each design stage. The measured optical performance of the sensor is demonstrated using modulated light signals and in vivo blood flow experiments. Images showing blood flow changes with arterial occlusion and an inflammatory response to a histamine skin-prick demonstrate that the sensor array is capable of detecting blood flow signals from tissue.

  19. Laser Doppler Blood Flow Imaging Using a CMOS Imaging Sensor with On-Chip Signal Processing

    Directory of Open Access Journals (Sweden)

    Cally Gill

    2013-09-01

    Full Text Available The first fully integrated 2D CMOS imaging sensor with on-chip signal processing for applications in laser Doppler blood flow (LDBF imaging has been designed and tested. To obtain a space efficient design over 64 × 64 pixels means that standard processing electronics used off-chip cannot be implemented. Therefore the analog signal processing at each pixel is a tailored design for LDBF signals with balanced optimization for signal-to-noise ratio and silicon area. This custom made sensor offers key advantages over conventional sensors, viz. the analog signal processing at the pixel level carries out signal normalization; the AC amplification in combination with an anti-aliasing filter allows analog-to-digital conversion with a low number of bits; low resource implementation of the digital processor enables on-chip processing and the data bottleneck that exists between the detector and processing electronics has been overcome. The sensor demonstrates good agreement with simulation at each design stage. The measured optical performance of the sensor is demonstrated using modulated light signals and in vivo blood flow experiments. Images showing blood flow changes with arterial occlusion and an inflammatory response to a histamine skin-prick demonstrate that the sensor array is capable of detecting blood flow signals from tissue.

  20. Imaging intracellular pH in live cells with a genetically encoded red fluorescent protein sensor.

    Science.gov (United States)

    Tantama, Mathew; Hung, Yin Pun; Yellen, Gary

    2011-07-06

    Intracellular pH affects protein structure and function, and proton gradients underlie the function of organelles such as lysosomes and mitochondria. We engineered a genetically encoded pH sensor by mutagenesis of the red fluorescent protein mKeima, providing a new tool to image intracellular pH in live cells. This sensor, named pHRed, is the first ratiometric, single-protein red fluorescent sensor of pH. Fluorescence emission of pHRed peaks at 610 nm while exhibiting dual excitation peaks at 440 and 585 nm that can be used for ratiometric imaging. The intensity ratio responds with an apparent pKa of 6.6 and a >10-fold dynamic range. Furthermore, pHRed has a pH-responsive fluorescence lifetime that changes by ~0.4 ns over physiological pH values and can be monitored with single-wavelength two-photon excitation. After characterizing the sensor, we tested pHRed's ability to monitor intracellular pH by imaging energy-dependent changes in cytosolic and mitochondrial pH.
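    Ratiometric read-out with an apparent pKa of 6.6 can be modeled with a Hill-type (n = 1) titration curve and inverted to recover pH. The pKa comes from the abstract; the ratio limits and the direction of the response are illustrative assumptions, not measured sensor parameters.

```python
import numpy as np

PKA, RMIN, RMAX = 6.6, 1.0, 10.0   # pKa from the abstract; ratio limits are
                                   # illustrative (>10-fold dynamic range)

def ratio_from_ph(ph):
    """Hill-type (n = 1) titration curve for the excitation-ratio read-out."""
    f = 10**(ph - PKA) / (1 + 10**(ph - PKA))   # protonation fraction
    return RMIN + (RMAX - RMIN) * f

def ph_from_ratio(r):
    """Invert the titration curve to recover pH from a measured ratio."""
    return PKA + np.log10((r - RMIN) / (RMAX - r))
```

The inversion is exact within the model; a real calibration would additionally fit RMIN, RMAX, and the apparent pKa from a titration series in situ.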

  1. Indoor and Outdoor Depth Imaging of Leaves With Time-of-Flight and Stereo Vision Sensors

    DEFF Research Database (Denmark)

    Kazmi, Wajahat; Foix, Sergi; Alenya, Guilliem

    2014-01-01

    In this article we analyze the response of Time-of-Flight (ToF) cameras (active sensors) for close range imaging under three different illumination conditions and compare the results with stereo vision (passive) sensors. ToF cameras are sensitive to ambient light and have low resolution but deliver...... poorly under sunlight. Stereo vision is comparatively more robust to ambient illumination and provides high resolution depth data but is constrained by texture of the object along with computational efficiency. Graph cut based stereo correspondence algorithm can better retrieve the shape of the leaves...... of the sensors. Performance of three different ToF cameras (PMD CamBoard, PMD CamCube and SwissRanger SR4000) is compared against selected stereo correspondence algorithms (local correlation and graph cuts). PMD CamCube has better cancelation of sunlight, followed by CamBoard, while SwissRanger SR4000 performs...

  2. Miniature infrared hyperspectral imaging sensor for airborne applications

    Science.gov (United States)

    Hinnrichs, Michele; Hinnrichs, Bradford; McCutchen, Earl

    2017-05-01

    Pacific Advanced Technology (PAT) has developed an infrared hyperspectral camera, in both the MWIR and LWIR, small enough to serve as a payload on miniature unmanned aerial vehicles. The optical system has been integrated into the cold shield of the sensor, enabling the small size and weight of the sensor. This new and innovative approach to the infrared hyperspectral imaging spectrometer uses micro-optics, which are explained in this paper. The micro-optics comprise an area array of diffractive optical elements in which each element is tuned to image a different spectral region onto a common focal plane array. The lenslet array is embedded in the cold shield of the sensor and actuated with a miniature piezo-electric motor. This approach enables rapid infrared spectral imaging, with multiple spectral images collected and processed simultaneously in each frame of the camera. This paper presents our opto-mechanical design approach, which results in an infrared hyperspectral imaging system small enough for a payload on a mini-UAV or commercial quadcopter. The diffractive optical elements used in the lenslet array are blazed gratings, where each lenslet is tuned for a different spectral bandpass. The lenslets are configured in an area array placed a few millimeters above the focal plane and embedded in the cold shield to reduce the background signal normally associated with the optics. We have developed various systems using different numbers of lenslets in the area array. The size of the focal plane and the diameter of the lenslet array determine the spatial resolution. A 2 x 2 lenslet array images four different spectral images of the scene each frame and, when coupled with a 512 x 512 focal plane array, gives a spatial resolution of 256 x 256 pixels for each spectral image. Another system that we developed uses a 4 x 4 lenslet array on a 1024 x 1024 pixel focal plane array, which gives 16 spectral images of 256 x 256 pixel resolution each
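    The resolution bookkeeping in this design is simple arithmetic: the lenslet count per axis divides the focal plane into that many spectral sub-images per axis. A sketch reproducing the two configurations quoted above:

```python
def spectral_image_size(fpa_pixels, lenslets_per_axis):
    """Return (number of spectral bands, per-band resolution along one axis)
    for a square lenslet array over a square focal plane array."""
    bands = lenslets_per_axis ** 2
    return bands, fpa_pixels // lenslets_per_axis

bands4, res4 = spectral_image_size(512, 2)     # 2 x 2 lenslets on 512 x 512
bands16, res16 = spectral_image_size(1024, 4)  # 4 x 4 lenslets on 1024 x 1024
```

Both configurations trade spectral bands against per-band spatial resolution on the same focal plane.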

  3. New Endoscopic Imaging Technology Based on MEMS Sensors and Actuators

    Directory of Open Access Journals (Sweden)

    Zhen Qiu

    2017-07-01

    Full Text Available Over the last decade, optical fiber-based forms of microscopy and endoscopy have extended the realm of applicability for many imaging modalities. Optical fiber-based imaging modalities permit the use of remote illumination sources and enable flexible forms supporting the creation of portable and hand-held imaging instrumentations to interrogate within hollow tissue cavities. A common challenge in the development of such devices is the design and integration of miniaturized optical and mechanical components. In recent years, microelectromechanical systems (MEMS) sensors and actuators have played a key role in shaping the miniaturization of these components, thanks to the precision mechanics of MEMS, microfabrication techniques, and optical functionality enabling a wide variety of movable and tunable mirrors, lenses, filters, and other optical structures. Many promising results from MEMS-based optical fiber endoscopy have demonstrated great potential for clinical translation. In this article, MEMS sensors and actuators for various fiber-optic endoscopy modalities, such as fluorescence, optical coherence tomography, confocal, photo-acoustic, and two-photon imaging, will be reviewed. This advanced MEMS-based optical fiber endoscopy can provide cellular and molecular features with deep tissue penetration, enabling guided resections and early cancer assessment for better treatment outcomes.

  4. Subinteger Range-Bin Alignment Method for ISAR Imaging of Noncooperative Targets

    Directory of Open Access Journals (Sweden)

    F. Pérez-Martínez

    2010-01-01

    Full Text Available Inverse Synthetic Aperture Radar (ISAR) is a coherent radar technique capable of generating images of noncooperative targets. ISAR may have better performance in adverse meteorological conditions than traditional imaging sensors. Unfortunately, ISAR images are usually blurred because of the relative motion between radar and target. To improve the quality of ISAR products, motion compensation is necessary. In this context, range-bin alignment is the first step for translational motion compensation. In this paper, we propose a subinteger range-bin alignment method based on envelope correlation and reference profiles. The technique, which makes use of a carefully designed optimization stage, is robust against noise, clutter, target scintillation, and error accumulation. It provides us with very fine translational motion compensation. Comparisons with state-of-the-art range-bin alignment methods are included and advantages of the proposal are highlighted. Simulated and live data from a high-resolution linear-frequency-modulated continuous-wave radar are included to perform the pertinent comparisons.
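    Range-bin alignment starts from envelope correlation between a reference profile and each received profile. The sketch below performs only the integer-bin step via FFT-based circular cross-correlation; the paper's method adds subinteger refinement and an optimization stage. The profile is synthetic.

```python
import numpy as np

def align_profile(reference, profile):
    """Integer range-bin alignment by circular envelope cross-correlation;
    subinteger alignment (as in the paper) would interpolate near the peak."""
    corr = np.fft.ifft(np.fft.fft(reference) * np.conj(np.fft.fft(profile))).real
    shift = int(np.argmax(corr))          # circular shift that maximizes overlap
    return np.roll(profile, shift), shift

ref = np.zeros(64)
ref[20:24] = [1.0, 3.0, 2.0, 0.5]         # synthetic range-profile envelope
moved = np.roll(ref, 7)                   # target range walk of 7 bins
aligned, est = align_profile(ref, moved)
```

Applying this to every profile in a coherent processing interval removes the integer part of the translational range walk before phase-based autofocus.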

  5. High speed global shutter image sensors for professional applications

    Science.gov (United States)

    Wu, Xu; Meynants, Guy

    2015-04-01

    Global shutter imagers eliminate the motion artifacts of rolling shutter imagers and thus extend use to miscellaneous applications such as machine vision, 3D imaging, medical imaging, and space. A low-noise global shutter pixel requires more than one non-light-sensitive memory to reduce the read noise, but a larger memory area reduces the fill factor of the pixels. Modern micro-lens technology can compensate for this fill-factor loss. Backside illumination (BSI) is another popular technique to improve the pixel fill factor, but some pixel architectures may not reach sufficient shutter efficiency with backside illumination; non-light-sensitive memory elements make fabrication with BSI possible. Machine vision applications such as fast inspection systems, and medical imaging applications such as 3D medical or scientific imaging, demand high-frame-rate global shutter image sensors. Thanks to CMOS technology, fast analog-to-digital converters (ADCs) can be integrated on chip. Dual correlated double sampling (CDS) with on-chip ADCs and a high-rate digital interface reduces the read noise and allows more on-chip operation control. As a result, a global shutter imager with a digital interface is a very popular solution for applications with high performance and high frame-rate requirements. In this paper we review the global shutter architectures developed at CMOSIS, discuss their optimization process, and compare their performance after fabrication.

  6. EUROCMOSHF: demonstration of a fully European supply chain for space image sensors

    Science.gov (United States)

    De Moor, P.; De Munck, K.; Haspeslagh, L.; Guerrieri, S.; Van Olmen, J.; Meynants, G.; Beeckman, G.; Vanwichelen, K.; Van Esbroeck, K.; Ghiglione, Alexandre; Gilbert, Teva; Demiguel, Stéphane

    2017-09-01

    Europe currently has no full supply chain for CMOS image sensors (CIS) for space use, certainly not in terms of image-sensor manufacturing. Although a few commercial foundries in Europe manufacture CMOS image sensors for consumer and automotive applications, they are typically not interested in adapting their process flow to meet high-end performance specifications, mainly because the expected manufacturing volume for space imagers is extremely low.

  7. Lead salt TE-cooled imaging sensor development

    Science.gov (United States)

    Green, Kenton; Yoo, Sung-Shik; Kauffman, Christopher

    2014-06-01

    Progress on the development of lead-salt thermoelectrically-cooled (TE-cooled) imaging sensors will be presented. The imaging sensor architecture has been integrated into field-ruggedized hardware and supports the use of lead-salt-based detector materials, including lead selenide and lead sulfide. Images and video are shown from a lead selenide focal plane array on a silicon ROIC, at temperatures approaching room temperature and at high frame rates. Lead-salt imagers uniquely possess three traits: (1) sensitive operation at high temperatures, above the typical `cooled' sensor maximum; (2) photonic response, which enables frame rates faster than the bolometric thermal response time permits; and (3) the capability to reliably fabricate 2D arrays by solution deposition directly, i.e. monolithically, on silicon. These lead-salt imagers are less expensive to produce and operate compared to other IR imagers based on II-VI HgCdTe and III-V InGaAsSb, because they do not require UHV epitaxial growth or hybrid assembly, and no cryo-engine is needed to maintain low thermal noise. Historically, there have been challenges with lead-salt detector-to-detector non-uniformities and detector noise. Staring arrays of lead-salt imagers are promising today because of advances in ROIC technology and fabrication improvements. Non-uniformities have been addressed by on-FPA non-uniformity correction, and 1/f noise has been mitigated with adjustable noise filtering without mechanical chopping. Finally, improved deposition-process and measurement controls have enabled reliable fabrication of high-performance, lead-salt, large-format staring arrays on the surface of large silicon ROIC wafers. The imaging array performance has achieved a Noise Equivalent Temperature Difference (NETD) of 30 mK at a 2.5 millisecond integration time with an f/1 lens in the 3-5 μm wavelength band, using a two-stage TE cooler to operate the FPA at 230 K. Operability of 99.6% is reproducible on 240 × 320 format arrays.

  8. A mTurquoise-based cAMP sensor for both FLIM and ratiometric read-out has improved dynamic range.

    Science.gov (United States)

    Klarenbeek, Jeffrey B; Goedhart, Joachim; Hink, Mark A; Gadella, Theodorus W J; Jalink, Kees

    2011-04-29

    FRET-based sensors for cyclic Adenosine Mono Phosphate (cAMP) have revolutionized the way in which this important intracellular messenger is studied. The currently prevailing sensors consist of the cAMP-binding protein Epac1, sandwiched between suitable donor- and acceptor fluorescent proteins (FPs). Through a conformational change in Epac1, alterations in cellular cAMP levels lead to a change in FRET that is most commonly detected by either Fluorescence Lifetime Imaging (FLIM) or by Sensitized Emission (SE), e.g., by simple ratio-imaging. We recently reported a range of different Epac-based cAMP sensors with high dynamic range and signal-to-noise ratio. We showed that constructs with cyan FP as donor are optimal for readout by SE, whereas other constructs with green FP donors appeared much more suited for FLIM detection. In this study, we present a new cAMP sensor, termed (T)Epac(VV), which employs mTurquoise as donor. Spectrally very similar to CFP, mTurquoise has about double the quantum efficiency and, unlike CFP, its fluorescence decay is strictly single-exponential. We show that (T)Epac(VV) appears optimal for detection both by FLIM and SE, that it has outstanding FRET span and signal-to-noise ratio, and improved photostability. Hence, (T)Epac(VV) should become the cAMP sensor of choice for new experiments, both for FLIM and ratiometric detection.

  9. A mTurquoise-based cAMP sensor for both FLIM and ratiometric read-out has improved dynamic range.

    Directory of Open Access Journals (Sweden)

    Jeffrey B Klarenbeek

    Full Text Available FRET-based sensors for cyclic Adenosine Mono Phosphate (cAMP) have revolutionized the way in which this important intracellular messenger is studied. The currently prevailing sensors consist of the cAMP-binding protein Epac1, sandwiched between suitable donor- and acceptor fluorescent proteins (FPs). Through a conformational change in Epac1, alterations in cellular cAMP levels lead to a change in FRET that is most commonly detected by either Fluorescence Lifetime Imaging (FLIM) or by Sensitized Emission (SE), e.g., by simple ratio-imaging. We recently reported a range of different Epac-based cAMP sensors with high dynamic range and signal-to-noise ratio. We showed that constructs with cyan FP as donor are optimal for readout by SE, whereas other constructs with green FP donors appeared much more suited for FLIM detection. In this study, we present a new cAMP sensor, termed (T)Epac(VV), which employs mTurquoise as donor. Spectrally very similar to CFP, mTurquoise has about double the quantum efficiency and, unlike CFP, its fluorescence decay is strictly single-exponential. We show that (T)Epac(VV) appears optimal for detection both by FLIM and SE, that it has outstanding FRET span and signal-to-noise ratio, and improved photostability. Hence, (T)Epac(VV) should become the cAMP sensor of choice for new experiments, both for FLIM and ratiometric detection.

  10. Column-parallel correlated multiple sampling circuits for CMOS image sensors and their noise reduction effects.

    Science.gov (United States)

    Suh, Sungho; Itoh, Shinya; Aoyama, Satoshi; Kawahito, Shoji

    2010-01-01

    For low-noise complementary metal-oxide-semiconductor (CMOS) image sensors, the reduction of pixel source follower noise is becoming very important. Column-parallel high-gain readout circuits are useful for low-noise CMOS image sensors. This paper presents column-parallel high-gain signal readout circuits, correlated multiple sampling (CMS) circuits, and their noise reduction effects. In the CMS, the gain of the noise cancelling is controlled by the number of samplings. It has an effect similar to that of an amplified CDS for thermal noise, but is a little more effective for 1/f and RTS noises. Two types of CMS, with simple integration and with folding integration, are proposed. In the folding integration, the output signal swing is suppressed by negative feedback using a comparator and a one-bit D-to-A converter. The CMS circuit using the folding integration technique makes it possible to attain a very low noise level while maintaining a wide dynamic range. The noise reduction effects of these circuits have been investigated with a noise analysis and an implementation of a 1 Mpixel pinned-photodiode CMOS image sensor. Using 16 samplings, a dynamic range of 59.4 dB with a noise level of 1.9 e(-) is obtained for the simple integration CMS, and 75 dB with 2.2 e(-) for the folding integration CMS.
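The way averaging over multiple samplings suppresses thermal read noise can be sketched with a toy readout model (all numbers are illustrative, not the paper's circuit parameters):

```python
import numpy as np

rng = np.random.default_rng(0)

def cms_readout(signal, thermal_sigma, m):
    """Correlated multiple sampling, illustrative model: average m samples
    of the reset level and m samples of the signal level, then take the
    difference, as in CDS but with m-fold sampling."""
    reset = rng.normal(0.0, thermal_sigma, m).mean()
    sampled = rng.normal(signal, thermal_sigma, m).mean()
    return sampled - reset

# Thermal read noise falls roughly as 1/sqrt(m) with the number of samplings.
for m in (1, 4, 16):
    reads = [cms_readout(100.0, 2.0, m) for _ in range(20000)]
    print(m, float(np.std(reads)))
```

Note this toy model only captures the thermal-noise behavior; the abstract's point that CMS is slightly more effective than amplified CDS against 1/f and RTS noise comes from the correlation structure of those noise sources, which is not modeled here.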

  11. Airborne measurements in the longwave infrared using an imaging hyperspectral sensor

    Science.gov (United States)

    Allard, Jean-Pierre; Chamberland, Martin; Farley, Vincent; Marcotte, Frédérick; Rolland, Matthias; Vallières, Alexandre; Villemaire, André

    2008-08-01

    Emerging applications in Defense and Security require sensors with state-of-the-art sensitivity and capabilities. Among these sensors, the imaging spectrometer is an instrument yielding a large amount of rich information about the measured scene. Standoff detection, identification and quantification of chemicals in the gaseous state is one important application. Analysis of the surface emissivity as a means to classify ground properties and usage is another. Imaging spectrometers have unmatched capabilities to meet the requirements of these applications. Telops has developed the FIRST, a LWIR hyperspectral imager. The FIRST is based on Fourier Transform technology, yielding high spectral resolution and enabling high-accuracy radiometric calibration. The FIRST, a man-portable sensor, provides datacubes of up to 320x256 pixels at 0.35 mrad spatial resolution over the 8-12 μm spectral range at spectral resolutions of up to 0.25 cm-1. The FIRST has been used in several field campaigns, including the demonstration of standoff chemical agent detection [http://dx.doi.org/10.1117/12.795119.1]. More recently, an airborne system integrating the FIRST has been developed to provide airborne hyperspectral measurement capabilities. The airborne system and its capabilities are presented in this paper. The FIRST sensor's modularity enables operation in various configurations such as tripod-mounted and airborne. In the airborne configuration, the FIRST can be operated in push-broom mode, or in staring mode with image motion compensation. This paper focuses on the airborne operation of the FIRST sensor.

  12. Nanoposition sensors with superior linear response to position and unlimited travel ranges

    Science.gov (United States)

    Lee, Sheng-Chiang; Peters, Randall D.

    2009-04-01

    With the advancement of nanotechnology, the ability to position and measure at the subnanometer scale has become one of the most critical issues for the nanofabrication industry and for researchers using scanning probe microscopy. Commercial nanopositioners have achieved direct measurements at the scale of 0.01 nm with capacitive sensing metrology. However, the commercial sensors have small dynamic ranges (up to only a few hundred micrometers) and are relatively large (centimeters in the directions transverse to the motion), which is necessary for healthy signal detection but makes them difficult to use on smaller devices. This limits applications in which large materials (on the scale of centimeters or greater) are handled with subnanometer resolution requirements. The approach taken in the past has been to combine fine and coarse translation stages with different dynamic ranges to simultaneously achieve long travel range and high spatial resolution. In this paper, we present a novel capacitive position sensing metrology with an ultrawide dynamic range, from subnanometers to literally any practically desired length for a translation stage. This sensor will greatly simplify the task and enhance the performance of direct metrology in a hybrid translation stage covering translation tasks from subnanometers to centimeters.

  13. Range imaging results from polar mesosphere summer echoes

    Science.gov (United States)

    Zecha, Marius; Hoffmann, Peter; Rapp, Markus; Chen, Jenn-Shyong

    The range resolution of pulsed radars is usually limited by the transmitted pulse length and the sampling time. So-called range imaging (RIM) has been developed to reduce these limitations. To apply this method, the radar operates alternately over a set of distinct frequencies. The phase differences of the received signals can then be used in optimization methods to generate high-resolution maps of reflections as a function of range within the pulse length. The technique has been implemented on the ALWIN VHF radar in Andenes (69N) and the OSWIN VHF radar in Kühlungsborn (54N). Here we present results of the RIM method from measurements in polar mesosphere summer echoes (PMSE). These strong radar echoes are linked to ice particle clouds in the mesopause region. The dynamics of PMSE are captured very well by RIM. The movement of PMSE and the edges of their extension can be tracked with high altitude resolution. Comparisons between simultaneous measurements by RIM and by standard radar techniques demonstrate the advantages of RIM. Wave structures can be identified with RIM, whereas they are not detectable at the lower resolution of the standard measurements. Gravity wave parameters associated with these variations are estimated using the simultaneously measured velocity field.

  14. Survey on Ranging Sensors and Cooperative Techniques for Relative Positioning of Vehicles

    Directory of Open Access Journals (Sweden)

    Fabian de Ponte Müller

    2017-01-01

    Full Text Available Future driver assistance systems will rely on accurate, reliable and continuous knowledge of the position of other road participants, including pedestrians, bicycles and other vehicles. The usual approach to meeting this requirement is to use on-board ranging sensors inside the vehicle. Radar, laser scanners or vision-based systems are able to detect objects in their line-of-sight. In contrast to these non-cooperative ranging sensors, cooperative approaches follow a strategy in which other road participants actively support the estimation of the relative position. The limitations of on-board ranging sensors regarding their detection range, angle of view and susceptibility to blockage can be addressed by using a cooperative approach based on vehicle-to-vehicle communication. The fusion of both cooperative and non-cooperative strategies seems to offer the largest benefits regarding accuracy, availability and robustness. This survey offers the reader a comprehensive review of different techniques for vehicle relative positioning. The reader will learn the important performance indicators when it comes to relative positioning of vehicles, the different technologies that are both commercially available and currently under research, their expected performance and their intrinsic limitations. Moreover, the latest research in the area of vision-based systems for vehicle detection, as well as the latest work on GNSS-based vehicle localization and vehicular communication for relative positioning of vehicles, is reviewed. The survey also includes research work on the fusion of cooperative and non-cooperative approaches to increase reliability and availability.

  15. Survey on Ranging Sensors and Cooperative Techniques for Relative Positioning of Vehicles.

    Science.gov (United States)

    de Ponte Müller, Fabian

    2017-01-31

    Future driver assistance systems will rely on accurate, reliable and continuous knowledge of the position of other road participants, including pedestrians, bicycles and other vehicles. The usual approach to meeting this requirement is to use on-board ranging sensors inside the vehicle. Radar, laser scanners or vision-based systems are able to detect objects in their line-of-sight. In contrast to these non-cooperative ranging sensors, cooperative approaches follow a strategy in which other road participants actively support the estimation of the relative position. The limitations of on-board ranging sensors regarding their detection range, angle of view and susceptibility to blockage can be addressed by using a cooperative approach based on vehicle-to-vehicle communication. The fusion of both cooperative and non-cooperative strategies seems to offer the largest benefits regarding accuracy, availability and robustness. This survey offers the reader a comprehensive review of different techniques for vehicle relative positioning. The reader will learn the important performance indicators when it comes to relative positioning of vehicles, the different technologies that are both commercially available and currently under research, their expected performance and their intrinsic limitations. Moreover, the latest research in the area of vision-based systems for vehicle detection, as well as the latest work on GNSS-based vehicle localization and vehicular communication for relative positioning of vehicles, is reviewed. The survey also includes research work on the fusion of cooperative and non-cooperative approaches to increase reliability and availability.

  16. Detection in urban scenario using combined airborne imaging sensors

    Science.gov (United States)

    Renhorn, Ingmar; Axelsson, Maria; Benoist, Koen; Bourghys, Dirk; Boucher, Yannick; Briottet, Xavier; De Ceglie, Sergio; Dekker, Rob; Dimmeler, Alwin; Dost, Remco; Friman, Ola; Kåsen, Ingebjørg; Maerker, Jochen; van Persie, Mark; Resta, Salvatore; Schwering, Piet; Shimoni, Michal; Haavardsholm, Trym Vegard

    2012-06-01

    The EDA project "Detection in Urban scenario using Combined Airborne imaging Sensors" (DUCAS) is in progress. The aim of the project is to investigate the potential benefit of combined high spatial and spectral resolution airborne imagery for several defense applications in the urban area. The project is taking advantage of the combined resources from 7 contributing nations within the EDA framework. An extensive field trial has been carried out in the city of Zeebrugge at the Belgian coast in June 2011. The Belgian armed forces contributed with platforms, weapons, personnel (soldiers) and logistics for the trial. Ground truth measurements with respect to geometrical characteristics, optical material properties and weather conditions were obtained in addition to hyperspectral, multispectral and high resolution spatial imagery. High spectral/spatial resolution sensor data are used for detection, classification, identification and tracking.

  17. Optical palpation: optical coherence tomography-based tactile imaging using a compliant sensor.

    Science.gov (United States)

    Kennedy, Kelsey M; Es'haghian, Shaghayegh; Chin, Lixin; McLaughlin, Robert A; Sampson, David D; Kennedy, Brendan F

    2014-05-15

    We present optical palpation, a tactile imaging technique for mapping micrometer- to millimeter-scale mechanical variations in soft tissue. In optical palpation, a stress sensor consisting of translucent, compliant silicone with known stress-strain behavior is placed on the tissue surface and a compressive load is applied. Optical coherence tomography (OCT) is used to measure the local strain in the sensor, from which the local stress at the sample surface is calculated and mapped onto an image. We present results in tissue-mimicking phantoms, demonstrating the detection of a feature embedded 4.7 mm below the sample surface, well beyond the depth range of OCT. We demonstrate the use of optical palpation to delineate the boundary of a region of tumor in freshly excised human breast tissue, validated against histopathology.
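The core computation in optical palpation, converting an OCT-measured strain map in the compliant layer into a surface stress map via the layer's known stress-strain behavior, can be sketched as follows (the modulus and strain values are assumed for illustration, not taken from the paper):

```python
import numpy as np

# Assumed linear stress-strain curve for the compliant silicone layer.
E = 20e3   # Pa, hypothetical modulus

# Hypothetical OCT-measured local axial strain in the sensor layer, one
# value per lateral position. The layer compresses more over a stiff
# inclusion, so strain (and hence inferred stress) is higher there.
strain = np.array([0.04, 0.04, 0.10, 0.10, 0.04])

# Local stress at the sample surface, mapped onto an image line.
stress_map = E * strain   # Pa
print(stress_map)
```

In the paper the silicone's measured (generally non-linear) stress-strain curve would replace the single modulus `E`, but the per-pixel strain-to-stress lookup is the same idea.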

  18. Self-mixing imaging sensor using a monolithic VCSEL array with parallel readout.

    Science.gov (United States)

    Lim, Yah Leng; Nikolic, Milan; Bertling, Karl; Kliese, Russell; Rakić, Aleksandar D

    2009-03-30

    The advent of two-dimensional arrays of Vertical-Cavity Surface-Emitting Lasers (VCSELs) has opened a range of potential sensing applications in nanotechnology and the life sciences. With each laser independently addressable, there is scope for the development of high-resolution full-field imaging systems with electronic scanning. We report on the first implementation of a self-mixing imaging system with parallel readout based on a monolithic VCSEL array. A self-mixing Doppler signal was acquired from the variation in VCSEL junction voltage rather than from a conventional variation in laser power, thus markedly reducing the system complexity. The sensor was validated by imaging the velocity distribution on the surface of a rotating disc. The results obtained demonstrate that monolithic arrays of vertical-cavity lasers present a powerful tool for the advancement of self-mixing sensors into parallel imaging paradigms and provide a stepping stone to the implementation of full-field self-mixing sensor systems.

  19. Long-range non-contact imaging photoplethysmography: cardiac pulse wave sensing at a distance

    Science.gov (United States)

    Blackford, Ethan B.; Estepp, Justin R.; Piasecki, Alyssa M.; Bowers, Margaret A.; Klosterman, Samantha L.

    2016-03-01

    Non-contact, imaging photoplethysmography uses photo-optical sensors to measure variations in light absorption, caused by blood volume pulsations, to assess cardiopulmonary parameters including pulse rate, pulse rate variability, and respiration rate. Recently, researchers have studied the applications and methodology of imaging photoplethysmography. Basic research has examined some of the variables affecting data quality and accuracy of imaging photoplethysmography, including signal processing, imager parameters (e.g., frame rate and resolution), lighting conditions, subject motion, and subject skin tone. This technology may be beneficial for long-term or continuous monitoring where contact measurements may be harmful (e.g., skin sensitivities) or where imperceptible or unobtrusive measurements are desirable. Using previously validated signal processing methods, we examined the effects of imager-to-subject distance on one-minute, windowed estimates of pulse rate. High-resolution video of 22 stationary participants was collected using an enthusiast-grade, mirrorless, digital camera equipped with a fully manual, super-telephoto lens at distances of 25, 50, and 100 meters, with simultaneous contact measurements of electrocardiography and fingertip photoplethysmography. By comparison, previous studies have usually been conducted with imager-to-subject distances of up to only a few meters. Mean absolute error for one-minute, windowed pulse rate estimates (compared to those derived from gold-standard electrocardiography) was 2.0, 4.1, and 10.9 beats per minute at distances of 25, 50, and 100 meters, respectively. Long-range imaging presents several unique challenges, including decreased observed light reflectance and smaller regions of interest. Nevertheless, these results demonstrate that accurate pulse rate measurements can be obtained over long imager-to-subject distances given these constraints.
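A one-minute windowed pulse-rate estimate of the kind evaluated above can be sketched by taking the dominant spectral peak of the photoplethysmographic waveform within the cardiac band (the frame rate, noise level and 72 bpm ground truth below are assumed, and this is a generic estimator, not the paper's validated pipeline):

```python
import numpy as np

fs = 30.0                       # camera frame rate (Hz), assumed
t = np.arange(0, 60, 1 / fs)    # one-minute analysis window
rng = np.random.default_rng(0)

# Synthetic PPG waveform: 72 bpm pulsation plus broadband noise.
hr_hz = 72 / 60.0
ppg = np.sin(2 * np.pi * hr_hz * t) + 0.5 * rng.standard_normal(t.size)

# Pulse rate = strongest spectral peak within a plausible cardiac band.
spec = np.abs(np.fft.rfft(ppg - ppg.mean()))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
band = (freqs > 0.7) & (freqs < 4.0)       # ~42-240 bpm
bpm = 60 * freqs[band][np.argmax(spec[band])]
print(round(float(bpm), 1))
```

With a 60 s window the FFT bin spacing is 1/60 Hz, i.e. 1 bpm, which is one reason one-minute windows are a natural choice for this kind of estimate.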

  20. Numerical Demultiplexing of Color Image Sensor Measurements via Non-linear Random Forest Modeling.

    Science.gov (United States)

    Deglint, Jason; Kazemzadeh, Farnoud; Cho, Daniel; Clausi, David A; Wong, Alexander

    2016-06-27

    The simultaneous capture of imaging data at multiple wavelengths across the electromagnetic spectrum is highly challenging, requiring complex and costly multispectral image devices. In this study, we investigate the feasibility of simultaneous multispectral imaging using conventional image sensors with color filter arrays via a novel comprehensive framework for numerical demultiplexing of the color image sensor measurements. A numerical forward model characterizing the formation of sensor measurements from light spectra hitting the sensor is constructed based on a comprehensive spectral characterization of the sensor. A numerical demultiplexer is then learned via non-linear random forest modeling based on the forward model. Given the learned numerical demultiplexer, one can then demultiplex simultaneously-acquired measurements made by the color image sensor into reflectance intensities at discrete selectable wavelengths, resulting in a higher resolution reflectance spectrum. Experimental results demonstrate the feasibility of such a method for the purpose of simultaneous multispectral imaging.
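The forward-model-then-learned-inverse idea described above can be sketched numerically. The sensitivities, the low-dimensional spectral prior, and the use of a linear least-squares inverse as a stand-in for the paper's non-linear random forest are all assumptions made for this illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-channel (RGB) spectral sensitivities sampled at 8
# wavelengths; rows are channels.
M = np.abs(rng.normal(size=(3, 8)))

def measure(spectrum):
    """Forward model: sensor measurement from the incoming light spectrum."""
    return M @ spectrum

# Training spectra drawn from a low-dimensional "plausible spectra" prior.
basis = np.abs(rng.normal(size=(3, 8)))
spectra = np.abs(rng.normal(size=(5000, 3))) @ basis

# "Learn" a demultiplexer from (measurement, spectrum) pairs; the paper
# fits a random forest, a linear least-squares inverse stands in here.
W, *_ = np.linalg.lstsq(spectra @ M.T, spectra, rcond=None)

true = np.abs(rng.normal(size=3)) @ basis    # unseen test spectrum
est = measure(true) @ W                      # demultiplexed estimate
print(float(np.abs(est - true).max()))       # tiny: exact within the prior
```

The toy recovery is exact only because the test spectrum lies in the training prior's subspace; the paper's random forest plays the analogous role of encoding spectral regularities so that 3 measurements can be mapped to many wavelengths.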

  1. Numerical Demultiplexing of Color Image Sensor Measurements via Non-linear Random Forest Modeling

    Science.gov (United States)

    Deglint, Jason; Kazemzadeh, Farnoud; Cho, Daniel; Clausi, David A.; Wong, Alexander

    2016-06-01

    The simultaneous capture of imaging data at multiple wavelengths across the electromagnetic spectrum is highly challenging, requiring complex and costly multispectral image devices. In this study, we investigate the feasibility of simultaneous multispectral imaging using conventional image sensors with color filter arrays via a novel comprehensive framework for numerical demultiplexing of the color image sensor measurements. A numerical forward model characterizing the formation of sensor measurements from light spectra hitting the sensor is constructed based on a comprehensive spectral characterization of the sensor. A numerical demultiplexer is then learned via non-linear random forest modeling based on the forward model. Given the learned numerical demultiplexer, one can then demultiplex simultaneously-acquired measurements made by the color image sensor into reflectance intensities at discrete selectable wavelengths, resulting in a higher resolution reflectance spectrum. Experimental results demonstrate the feasibility of such a method for the purpose of simultaneous multispectral imaging.

  2. Improved laser-based triangulation sensor with enhanced range and resolution through adaptive optics-based active beam control.

    Science.gov (United States)

    Reza, Syed Azer; Khwaja, Tariq Shamim; Mazhar, Mohsin Ali; Niazi, Haris Khan; Nawab, Rahma

    2017-07-20

    Various existing target ranging techniques are limited in terms of the dynamic range of operation and measurement resolution. These limitations arise as a result of a particular measurement methodology, the finite processing capability of the hardware components deployed within the sensor module, and the medium through which the target is viewed. Generally, improving the sensor range adversely affects its resolution and vice versa. Often, a distance sensor is designed for an optimal range/resolution setting depending on its intended application. Optical triangulation is broadly classified as a spatial-signal-processing-based ranging technique and measures target distance from the location of the reflected spot on a position sensitive detector (PSD). In most triangulation sensors that use lasers as a light source, beam divergence, which severely affects sensor measurement range, is often ignored in calculations. In this paper, we first discuss in detail the limitations to ranging imposed by beam divergence, which, in effect, sets the sensor dynamic range. Next, we show how the resolution of laser-based triangulation sensors is limited by the interpixel pitch of a finite-sized PSD. Through the use of tunable focus lenses (TFLs), we then propose a novel design of a triangulation-based optical rangefinder that improves both the sensor resolution and its dynamic range through adaptive electronic control of beam propagation parameters. We present the theory and operation of the proposed sensor and clearly demonstrate a range and resolution improvement with the use of TFLs. Experimental results in support of our claims are shown to be in strong agreement with theory.
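The triangulation relation underlying the abstract, and the way finite pixel pitch limits resolution, can be sketched with a simple similar-triangles model (the focal length, baseline and pitch below are assumed values, not the paper's):

```python
# Simple laser-triangulation geometry: a spot at range z images onto the
# PSD at x = f * b / z, so z = f * b / x (pinhole receiver assumed).
f = 0.05      # receiver lens focal length (m), assumed
b = 0.10      # laser-to-lens baseline (m), assumed

def psd_spot(z):
    """Spot position on the PSD (m) for a target at range z."""
    return f * b / z

def range_from_spot(x):
    """Invert the similar-triangle relation to recover range."""
    return f * b / x

z_true = 2.3
print(range_from_spot(psd_spot(z_true)))   # recovers 2.3 exactly

# Finite interpixel pitch quantizes the spot position, and because
# dz/dx ~ z**2 the resulting range error grows quadratically with range.
pitch = 10e-6  # m, assumed
x_q = round(psd_spot(z_true) / pitch) * pitch
print(abs(range_from_spot(x_q) - z_true))  # quantization-limited error
```

This quadratic growth of the range error with distance is the trade-off the paper attacks by steering the beam parameters with tunable focus lenses.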

  3. Single Photon Counting Performance and Noise Analysis of CMOS SPAD-Based Image Sensors

    Directory of Open Access Journals (Sweden)

    Neale A. W. Dutton

    2016-07-01

    Full Text Available SPAD-based solid state CMOS image sensors utilising analogue integrators have attained deep sub-electron read noise (DSERN) permitting single photon counting (SPC) imaging. A new method is proposed to determine the read noise in DSERN image sensors by evaluating the peak separation and width (PSW) of single photon peaks in a photon counting histogram (PCH). The technique is used to identify and analyse cumulative noise in analogue integrating SPC SPAD-based pixels. The DSERN of our SPAD image sensor is exploited to confirm recent multi-photon threshold quanta image sensor (QIS) theory. Finally, various single and multiple photon spatio-temporal oversampling techniques are reviewed.

  4. Single Photon Counting Performance and Noise Analysis of CMOS SPAD-Based Image Sensors.

    Science.gov (United States)

    Dutton, Neale A W; Gyongy, Istvan; Parmesan, Luca; Henderson, Robert K

    2016-07-20

    SPAD-based solid state CMOS image sensors utilising analogue integrators have attained deep sub-electron read noise (DSERN) permitting single photon counting (SPC) imaging. A new method is proposed to determine the read noise in DSERN image sensors by evaluating the peak separation and width (PSW) of single photon peaks in a photon counting histogram (PCH). The technique is used to identify and analyse cumulative noise in analogue integrating SPC SPAD-based pixels. The DSERN of our SPAD image sensor is exploited to confirm recent multi-photon threshold quanta image sensor (QIS) theory. Finally, various single and multiple photon spatio-temporal oversampling techniques are reviewed.
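The idea of reading noise off a photon counting histogram can be illustrated with a toy pixel model; the Poisson mean, read-noise value, and the crude peak-width estimator below are all assumptions for this sketch, not the paper's PSW method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy DSERN pixel: analogue output = Poisson photon count plus Gaussian
# read noise (0.25 e- rms assumed, i.e. deep sub-electron).
read_noise = 0.25
photons = rng.poisson(2.0, 100000)
samples = photons + rng.normal(0.0, read_noise, photons.size)

# Photon counting histogram: distinct peaks at integer electron counts
# remain resolved because the read noise is well below 0.5 e-.
hist, edges = np.histogram(samples, bins=200, range=(-1, 8))

# Crude single-peak width estimate: the spread of samples near the k = 2
# peak approximates the read noise (biased slightly by truncation and by
# the tails of the neighbouring peaks).
near_peak = samples[np.abs(samples - 2.0) < 0.5]
print(float(near_peak.std()))   # in the neighbourhood of 0.25
```

With read noise approaching 0.5 e- the peaks merge and this kind of estimate degrades, which is why peak separation and width together characterise whether a pixel truly supports photon counting.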

  5. Strategies for registering range images from unknown camera positions

    Science.gov (United States)

    Bernardini, Fausto; Rushmeier, Holly E.

    2000-03-01

    We describe a project to construct a 3D numerical model of Michelangelo's Florentine Pieta to be used in a study of the sculpture. Here we focus on the registration of the range images used to construct the model. The major challenge was the range of length scales involved. A resolution of 1 mm or less was required for the 2.25 m tall piece. To achieve this resolution, we could only acquire an area of 20 by 20 cm per scan. A total of approximately 700 images were required. Ideally, a tracker would be attached to the scanner to record position and pose. The use of a tracker was not possible in the field. Instead, we used a coarse-to-fine approach to registering the meshes to one another. The coarsest level consisted of pairwise manual registration, aided by texture maps containing laser dots that were projected onto the sculpture. This crude alignment was refined by an automatic registration of laser dot centers. In this phase, we found that consistency constraints on dot matches were essential to obtaining accurate results. The laser dot alignment was further refined using a variation of the ICP algorithm developed by Besl and McKay. In the application of ICP to global registration, we developed a method to avoid one class of local minima by finding a set of points, rather than a single point, that matches each candidate point.
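The Besl-McKay ICP step referenced above can be sketched in 2D: alternately match nearest neighbours and solve the optimal rigid transform by SVD. This is a minimal single-point-match version, not the authors' point-set variant or their multi-scan global registration:

```python
import numpy as np

rng = np.random.default_rng(0)

def icp(src, dst, iters=25):
    """Minimal point-to-point ICP in the spirit of Besl-McKay: match each
    point to its nearest neighbour, fit the best rigid transform, repeat."""
    cur = src.copy()
    for _ in range(iters):
        d = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        match = dst[d.argmin(axis=1)]           # nearest-neighbour pairs
        mu_c, mu_m = cur.mean(0), match.mean(0)
        H = (cur - mu_c).T @ (match - mu_m)     # cross-covariance
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:                # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_m - R @ mu_c
        cur = cur @ R.T + t
    return cur

dst = rng.normal(size=(60, 2))                  # "reference scan"
a = 0.15                                        # small misalignment (rad)
R0 = np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])
src = dst @ R0.T + np.array([0.10, -0.05])      # rotated + shifted copy
aligned = icp(src, dst)
print(float(np.mean((aligned - dst) ** 2)))     # far below the initial error
```

ICP of this form only converges to a nearby local minimum, which is exactly the failure mode the authors' set-of-points matching strategy is designed to avoid.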

  6. BMRC: A Bitmap-Based Maximum Range Counting Approach for Temporal Data in Sensor Monitoring Networks

    Directory of Open Access Journals (Sweden)

    Bin Cao

    2017-09-01

    Full Text Available Due to the rapid development of the Internet of Things (IoT), many feasible deployments of sensor monitoring networks have been made to capture events in the physical world, such as human diseases, weather disasters and traffic accidents, which generate large-scale temporal data. Generally, the time interval that contains the highest incidence of a severe event is of significance for society. For example, there exists an interval that covers the maximum number of people who have the same unusual symptoms, and knowing this interval can help doctors to locate the reason behind this phenomenon. As far as we know, no approach is available for solving this problem efficiently. In this paper, we propose the Bitmap-based Maximum Range Counting (BMRC) approach for temporal data generated in sensor monitoring networks. Since sensor nodes can update their temporal data at high frequency, we present a scalable strategy to support real-time insert and delete operations. The experimental results show that BMRC outperforms the baseline algorithm in terms of efficiency.
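The underlying query, finding the point in time covered by the largest number of intervals, can be answered with a plain sweep over interval endpoints; the paper's bitmap indexing is a storage and update optimization on top of this basic idea, so the sketch below is only the naive baseline:

```python
def max_coverage(intervals):
    """Return (time, count) at which the most intervals overlap,
    using a sweep line over sorted endpoint events."""
    events = []
    for start, end in intervals:
        events.append((start, 1))    # interval opens
        events.append((end, -1))     # interval closes
    events.sort()                    # ties close before they open (half-open)
    best, cur = (None, 0), 0
    for t, delta in events:
        cur += delta
        if cur > best[1]:
            best = (t, cur)
    return best

# e.g. hypothetical symptom reports as (onset, recovery) times
print(max_coverage([(1, 4), (2, 6), (3, 5), (7, 8)]))  # (3, 3)
```

The sort makes each query O(n log n); supporting high-frequency inserts and deletes without re-sorting is the part the bitmap structure addresses.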

  7. Detection of wavelengths in the visible range using fiber optic sensors

    Science.gov (United States)

    Díaz, Leonardo; Morales, Yailteh; Mattos, Lorenzo; Torres, Cesar O.

    2013-11-01

    This paper shows the design and implementation of a fiber optic sensor for detecting and identifying wavelengths in the visible range. The system consists of a diffuse optical fiber, a conventional 650 nm laser diode of 2.5 mW power, an LX1972 ambient light sensor, a PIC 18F2550 and an LCD screen for viewing. The principle used in detecting the wavelength is based on specular reflection and absorption. The optoelectronic device designed and built exploits the absorption and reflection properties of the material under study, having as the active optical medium a bifurcated optical fiber, which is optically coupled to an ambient light sensor that converts the light signals to electrical signals; the signal is then acquired and processed by a microcontroller. To verify correct operation of the assembly, color cards of sewing thread and nail polish were used as samples for analysis. This optoelectronic device can be used in many applications such as quality control of industrial processes, classification of corks or bottle caps, color quality of textiles, sugar solutions, polymers and food, among others.

  8. Development of a Low cost Ultra tiny Line Laser Range Sensor

    Science.gov (United States)

    2016-12-01

    voltage reading of the inner capacitors. Through the amplifier buffers, the output signals are collected by the ADCs and transferred to the memory by the 32-bit MCU, which is integrated with powerful peripheral circuits, including 12-bit Analog-Digital Converters (ADCs) with 5 Mega Samples per Second (MSPS). The outputs are linear with respect to Q1 and Q2, the measurements of the range sensor, for all 256 pixels.

  9. Fast Image Restoration for Spatially Varying Defocus Blur of Imaging Sensor

    Directory of Open Access Journals (Sweden)

    Hejin Cheong

    2015-01-01

    Full Text Available This paper presents a fast adaptive image restoration method for removing the spatially varying out-of-focus blur of a general imaging sensor. After estimating the parameters of the space-variant point-spread function (PSF) using the derivative in each uniformly blurred region, the proposed method performs spatially adaptive image restoration by selecting the optimal restoration filter according to the estimated blur parameters. Each restoration filter is implemented in the form of a combination of multiple FIR filters, which guarantees fast image restoration without the need for iterative or recursive processing. Experimental results show that the proposed method outperforms existing space-invariant restoration methods in terms of both objective and subjective performance measures. The proposed algorithm can be employed in a wide range of image restoration applications, such as mobile imaging devices, robot vision, and satellite image processing.
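The estimate-then-select structure described above can be sketched in one dimension. The derivative-based blur measure and the two unsharp-mask kernels below are assumptions standing in for the paper's PSF estimator and its optimal FIR restoration filters:

```python
import numpy as np

# Precomputed FIR restoration kernels, one per blur level. These are
# simple DC-preserving sharpening kernels (each sums to 1), not the
# paper's optimal filters.
kernels = {
    "mild":   np.array([0.0, -0.5, 2.0, -0.5, 0.0]),
    "strong": np.array([-0.25, -0.75, 3.0, -0.75, -0.25]),
}

def restore_region(row):
    """Pick a restoration filter from the local edge (derivative) strength:
    weak edges suggest heavier blur, so apply the stronger kernel."""
    edge = np.abs(np.diff(row)).max()
    k = kernels["mild"] if edge > 0.5 else kernels["strong"]
    return np.convolve(row, k, mode="same")

blurry = np.array([0.0, 0.1, 0.3, 0.5, 0.7, 0.9, 1.0, 1.0])
print(restore_region(blurry))   # edges steepened by the selected FIR kernel
```

Because each region is handled by a fixed FIR convolution, the whole restoration stays a single non-iterative pass, which is the source of the speed claim in the abstract.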

  10. Versatile, Compact, Low-Cost, MEMS-Based Image Stabilization for Imaging Sensor Performance Enhancement Project

    Data.gov (United States)

    National Aeronautics and Space Administration — LW Microsystems proposes to develop a compact, low-cost image stabilization system suitable for use with a wide range of focal-plane imaging systems in remote...

  11. Laser Doppler perfusion imaging with a complementary metal oxide semiconductor image sensor

    NARCIS (Netherlands)

    Serov, Alexander; Steenbergen, Wiendelt; de Mul, F.F.M.

    2002-01-01

    We utilized a complementary metal oxide semiconductor video camera for fast flow imaging with the laser Doppler technique. A single sensor is used for both observation of the area of interest and measurement of the interference signal caused by dynamic light scattering from moving particles inside

  12. Integrated arrays of air-dielectric graphene transistors as transparent active-matrix pressure sensors for wide pressure ranges

    Science.gov (United States)

    Shin, Sung-Ho; Ji, Sangyoon; Choi, Seiho; Pyo, Kyoung-Hee; Wan An, Byeong; Park, Jihun; Kim, Joohee; Kim, Ju-Young; Lee, Ki-Suk; Kwon, Soon-Yong; Heo, Jaeyeong; Park, Byong-Guk; Park, Jang-Ung

    2017-03-01

    Integrated electronic circuitries with pressure sensors have been extensively researched as a key component for emerging electronics applications such as electronic skins and health-monitoring devices. Although existing pressure sensors display high sensitivities, they can only be used for specific purposes due to the narrow range of detectable pressure (under tens of kPa) and the difficulty of forming highly integrated arrays. However, it is essential to develop tactile pressure sensors with a wide pressure range in order to use them for diverse application areas including medical diagnosis, robotics and automotive electronics. Here we report an unconventional approach for fabricating fully integrated active-matrix arrays of pressure-sensitive graphene transistors with air-dielectric layers simply formed by folding two opposing panels. This approach realizes a wide tactile pressure sensing range from 250 Pa to ~3 MPa. Additionally, the fabrication of pressure sensor arrays and transparent pressure sensors is demonstrated, suggesting their substantial promise as next-generation electronics.

  13. Sensor for real-time determining the polarization state distribution in the object images

    Science.gov (United States)

    Kilosanidze, Barbara; Kakauridze, George; Kvernadze, Teimuraz; Kurkhuli, Georgi

    2015-10-01

    An innovative real-time polarimetric method is presented based on the integral polarization-holographic diffraction element developed by us. This element is suggested for use in real-time analysis of the polarization state of light, to help highlight military equipment in a scene. In the process of diffraction, the element decomposes incoming light onto orthogonal circular and linear bases. The simultaneous measurement of the intensities of four diffracted beams by means of photodetectors and the appropriate software enables the polarization state of the analyzed light (all four Stokes parameters) and its change to be obtained in real time. The element with photodetectors and software constitutes a sensor of the polarization state. Such a sensor allows the point-by-point distribution of the polarization state in the images of objects to be determined. The spectral working range of the element is 530 - 1600 nm. This sensor is compact, lightweight and relatively cheap, and it can be easily installed on any space or airborne platform. It has no mechanically moving or electronically controlled elements. The speed of its operation is limited only by computer processing. Such a sensor is proposed for use in determining the characteristics of the surface of objects in optical remote sensing, by means of the distribution of the polarization state of light in the image of a recognizable object and the dispersion of this distribution, which provides additional information for identifying an object. The detection of a useful signal of predetermined polarization against a background of statistically random noise from an underlying surface is also possible. The application of the sensor is also considered for the nondestructive determination of the distribution of stress state in different constructions, based on the determination of the distribution of the polarization state of light reflected from the object under
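Recovering all four Stokes parameters from four simultaneously measured intensities can be sketched with a standard analyzer model; the specific choice of channels (right/left circular plus 0° and 45° linear) is an assumed arrangement for illustration, not necessarily the element's actual decomposition:

```python
import numpy as np

def stokes_from_four(i_rcp, i_lcp, i_0, i_45):
    """Stokes vector from four analyzer intensities, using the standard
    relation I_analyzer = (S0 + a . (S1, S2, S3)) / 2 for each channel."""
    s0 = i_rcp + i_lcp      # total intensity (circular pair sums to S0)
    s3 = i_rcp - i_lcp      # circular polarization component
    s1 = 2 * i_0 - s0       # 0/90 degree linear component
    s2 = 2 * i_45 - s0      # +/-45 degree linear component
    return np.array([s0, s1, s2, s3])

# Horizontally polarized unit-intensity light: I_RCP = I_LCP = 0.5,
# I_0 = 1, I_45 = 0.5, giving S = (1, 1, 0, 0).
print(stokes_from_four(0.5, 0.5, 1.0, 0.5))
```

Applied per pixel to the four diffracted-beam images, the same arithmetic yields the point-by-point polarization-state distribution the abstract describes.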

  14. A Wafer scale active pixel CMOS image sensor for generic x-ray radiology

    Science.gov (United States)

    Scheffer, Danny

    2007-03-01

    This paper describes a CMOS Active Pixel Image Sensor developed for generic X-ray imaging systems, using standard CMOS technology and an active pixel architecture featuring low noise and high sensitivity. The image sensor has been manufactured in a standard 0.35 μm technology on 8" wafers. The resolution of the sensor is 3360 × 3348 pixels of 40 × 40 μm² each; the diagonal of the sensor measures a little over 190 mm. The paper discusses the floor planning, stitching diagram, and electro-optical performance of the developed sensor.

  15. The Performance Evaluation of Multi-Image 3D Reconstruction Software with Different Sensors

    Science.gov (United States)

    Mousavi, V.; Khosravi, M.; Ahmadi, M.; Noori, N.; Naveh, A. Hosseini; Varshosaz, M.

    2015-12-01

    Today, multi-image 3D reconstruction is an active research field, and generating three-dimensional models of objects is one of the most discussed issues in Photogrammetry and Computer Vision; it can be accomplished using range-based or image-based methods. The very accurate and dense point clouds generated by range-based methods such as structured-light systems and laser scanners have established them as reliable tools in industry. Image-based 3D digitization methodologies offer the option of reconstructing an object from a set of unordered images that depict it from different viewpoints. As their hardware requirements are narrowed down to a digital camera and a computer system, they compose an attractive 3D digitization approach: although range-based methods are generally more accurate, image-based methods are low-cost and can be easily used by non-professional users. One of the factors affecting the accuracy of the obtained model in image-based methods is the software and algorithm used to generate the three-dimensional model. These algorithms are provided in the form of commercial software, open-source software, and web-based services. Another important factor in the accuracy of the obtained model is the type of sensor used. Given the availability of mobile sensors to the public, the popularity of professional sensors, and the advent of stereo sensors, a comparison of these three sensor types plays an effective role in evaluating and finding the optimized method to generate three-dimensional models. Much research has been carried out to identify suitable software and algorithms to achieve an accurate and complete model; however, little attention has been paid to the type of sensor used and its effect on the quality of the final model. The purpose of this paper is the deliberation and introduction of an appropriate combination of sensor and software to provide a complete model with the highest accuracy. To do this, different software packages used in previous studies were compared and

  16. Quantum dots in imaging, drug delivery and sensor applications.

    Science.gov (United States)

    Matea, Cristian T; Mocan, Teodora; Tabaran, Flaviu; Pop, Teodora; Mosteanu, Ofelia; Puia, Cosmin; Iancu, Cornel; Mocan, Lucian

    2017-01-01

    Quantum dots (QDs), also known as nanoscale semiconductor crystals, are nanoparticles with unique optical and electronic properties such as bright and intensive fluorescence. Since most conventional organic label dyes do not offer the near-infrared (>650 nm) emission possibility, QDs, with their tunable optical properties, have gained a lot of interest. They possess characteristics such as good chemical and photo-stability, high quantum yield and size-tunable light emission. Different types of QDs can be excited with the same light wavelength, and their narrow emission bands can be detected simultaneously for multiple assays. There is an increasing interest in the development of nano-theranostics platforms for simultaneous sensing, imaging and therapy. QDs have great potential for such applications, with notable results already published in the fields of sensors, drug delivery and biomedical imaging. This review summarizes the latest developments available in literature regarding the use of QDs for medical applications.

  17. Covariance Estimation in Terms of Stokes Parameters with Application to Vector Sensor Imaging

    Science.gov (United States)

    2016-12-15

    Covariance estimation in terms of Stokes parameters with application to vector sensor imaging. Ryan Volz, Mary Knapp, Frank D. Lind, Frank C. Robey, MIT Lincoln Laboratory, Lexington, MA. Abstract: Vector sensor imaging presents a challenging problem in covariance estimation when allowing arbitrarily ... estimating the magnitude, polarization, and direction of plane wave sources from a sample covariance matrix of vector measurements.

  18. Infrared Range Sensor Array for 3D Sensing in Robotic Applications

    Directory of Open Access Journals (Sweden)

    Yongtae Do

    2013-04-01

    Full Text Available This paper presents the design and testing of multiple infrared range detectors arranged in a two-dimensional (2D) array. The proposed system can collect sparse three-dimensional (3D) data of objects and surroundings for robotics applications. Three kinds of tasks are considered for the system: detecting obstacles that lie ahead of a mobile robot, sensing the ground profile for the safe navigation of a mobile robot, and sensing the shape and position of an object on a conveyor belt for pickup by a robot manipulator. The developed system is potentially a simple alternative to high-resolution (and expensive) 3D sensing systems, such as stereo cameras or laser scanners. In addition, the system can provide shape information about target objects and surroundings that cannot be obtained using simple ultrasonic sensors. Laboratory prototypes of the system were built with nine infrared range sensors arranged in a 3×3 array, and test results confirmed the validity of the system.
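A sparse 3D snapshot from such a grid of single-point range sensors amounts to converting each reading into a point along that sensor's viewing direction. The sketch below assumes a 3×3 array whose sensors are angularly offset by a fixed pitch about a common origin; the geometry and the 10° pitch are illustrative assumptions, not the paper's configuration.

```python
import math

def array_to_points(ranges, pitch_deg=10.0):
    """Convert a 3x3 grid of range readings (meters) into sparse 3D points.

    Assumes (hypothetically) that the sensors are angularly spaced by
    pitch_deg in both axes and point roughly forward from one origin."""
    pts = []
    for r, row in enumerate(ranges):
        for c, d in enumerate(row):
            az = math.radians((c - 1) * pitch_deg)   # left/right angle
            el = math.radians((1 - r) * pitch_deg)   # up/down angle
            x = d * math.cos(el) * math.sin(az)
            y = d * math.sin(el)
            z = d * math.cos(el) * math.cos(az)
            pts.append((x, y, z))
    return pts

# A flat wall 1 m ahead: all nine sensors read 1.0 m.
pts = array_to_points([[1.0] * 3] * 3)
```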

  19. Long-range vibration sensor based on correlation analysis of optical frequency-domain reflectometry signals.

    Science.gov (United States)

    Ding, Zhenyang; Yao, X Steve; Liu, Tiegen; Du, Yang; Liu, Kun; Han, Qun; Meng, Zhuo; Chen, Hongxin

    2012-12-17

    We present a novel method to achieve a space-resolved, long-range vibration detection system based on correlation analysis of optical frequency-domain reflectometry (OFDR) signals. By performing two separate measurements, of the vibrated and non-vibrated states of a test fiber, the vibration frequency and position of a vibration event can be obtained by analyzing the cross-correlation between the beat signals of the two states in the spatial domain, where the beat signals are generated from interference between local Rayleigh backscattering from the test fiber and the local light oscillator. Using the proposed technique, we constructed a standard single-mode-fiber vibration sensor with a dynamic range of 12 km and a measurable vibration frequency of up to 2 kHz with a spatial resolution of 5 m. Moreover, preliminary results for two vibration events located at different positions along the test fiber are also reported.
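The core of the method, comparing local beat-signal segments between the vibrated and non-vibrated states, can be sketched as a windowed correlation scan: where the two traces stop correlating marks the disturbance. This is a simplified illustration of the idea, not the authors' implementation; the 0.5 threshold and the synthetic signals are assumptions.

```python
import math

def corr_coeff(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(va * vb)

def locate_vibration(ref_beats, vib_beats, window):
    """Slide a window along the two OFDR beat traces; the first segment
    whose correlation collapses marks the vibration position (in samples)."""
    for start in range(0, len(ref_beats) - window + 1, window):
        c = corr_coeff(ref_beats[start:start + window],
                       vib_beats[start:start + window])
        if c < 0.5:        # threshold is an assumed value
            return start
    return None

# Synthetic beat trace, phase-disturbed from sample 40 onward.
ref = [math.sin(0.3 * i) for i in range(80)]
vib = ref[:40] + [math.sin(0.3 * i + 1.5) for i in range(40, 80)]
pos = locate_vibration(ref, vib, window=20)
```

In the real system the sample index maps to a position along the fiber through the OFDR spatial calibration.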

  20. Simulation of Meteosat Third Generation-Lightning Imager through tropical rainfall measuring mission: Lightning Imaging Sensor data

    Science.gov (United States)

    Biron, Daniele; De Leonibus, Luigi; Laquale, Paolo; Labate, Demetrio; Zauli, Francesco; Melfi, Davide

    2008-08-01

    The Centro Nazionale di Meteorologia e Climatologia Aeronautica recently hosted a fellowship sponsored by Galileo Avionica, with the intent to study and simulate the behavior of the Meteosat Third Generation - Lightning Imager (MTG-LI) sensor using Tropical Rainfall Measuring Mission - Lightning Imaging Sensor (TRMM-LIS) data. For the next generation of earth-observation geostationary satellites, the major operating agencies are planning to include an optical imaging mission that continuously observes lightning pulses in the atmosphere; EUMETSAT decided in recent years that one of the three candidate missions to be flown on MTG is LI, a Lightning Imager. The MTG-LI mission has no Meteosat Second Generation heritage, but users need to evaluate the possible real-time data output of the instrument before agreeing to include it in the MTG payload. The authors took the expected LI design from the MTG Mission Requirement Document and reprocessed a real lightning dataset, acquired from space by the TRMM-LIS instrument, to produce a simulated MTG-LI lightning dataset. The simulation was performed in several runs, varying the Minimum Detectable Energy and taking into account processing steps from event detection to the final lightning information. A definition of the specific meteorological requirements is given from the potential use of lightning information in meteorology for convection estimation and numerical cloud modeling. The study results show the range of instrument requirement relaxations that leads to minimal reduction in the final lightning information.

  1. Novel near-to-mid IR imaging sensors without cooling Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Boston Applied Technologies, Inc (BATi), together with Kent State University (KSU), proposes to develop a high sensitivity infrared (IR) imaging sensor without...

  2. Improve the Robustness of Range-Free Localization Methods on Wireless Sensor Networks using Recursive Position Estimation Algorithms

    Directory of Open Access Journals (Sweden)

    Gamantyo Hendrantoro

    2011-12-01

    Full Text Available The position of a sensor node in a wireless sensor network determines the accuracy of the received sensing data. With knowledge of sensor positioning, the location of a sensed target can be estimated. Localization techniques find the position of a sensor node by considering its distances from nearby reference nodes. The Centroid Algorithm is a robust, simple, and low-cost localization technique with no special hardware requirements. We propose a Recursive Position Estimation Algorithm to obtain more accurate node positioning with this range-free localization technique. The simulation results show that the algorithm can increase position accuracy by up to 50%. The trade-off is that the smaller the number of reference nodes, the higher the computational time required. A new method exploiting the availability of sensor power control is proposed to optimize the estimated position.

  3. Improve the Robustness of Range-Free Localization Methods on Wireless Sensor Networks using Recursive Position Estimation Algorithm

    Directory of Open Access Journals (Sweden)

    Prima Kristalina

    2013-09-01

    Full Text Available The position of a sensor node in a wireless sensor network determines the accuracy of the received sensing data. With knowledge of sensor positioning, the location of a sensed target can be estimated. Localization techniques find the position of a sensor node by considering its distances from nearby reference nodes. The Centroid Algorithm is a robust, simple, and low-cost localization technique with no special hardware requirements. We propose a Recursive Position Estimation Algorithm to obtain more accurate node positioning with this range-free localization technique. The simulation results show that the algorithm can increase position accuracy by up to 50%. The trade-off is that the smaller the number of reference nodes, the higher the computational time required. A new method exploiting the availability of sensor power control is proposed to optimize the estimated position.
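The centroid step and the recursive promotion of newly localized nodes can be sketched as follows. The connectivity lists, the three-reference minimum, and the coordinates are illustrative assumptions, not the paper's simulation setup.

```python
def centroid(points):
    """Range-free centroid estimate: mean of the reference positions heard."""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

def recursive_localize(anchors, hears, rounds=3):
    """Recursive position estimation (sketch). anchors maps id -> (x, y);
    hears maps an unknown node id to the ids it can hear. Each round, an
    unlocalized node hearing at least three localized references takes
    their centroid, and its estimate becomes a reference for later rounds."""
    refs = dict(anchors)
    for _ in range(rounds):
        for nid, nbrs in hears.items():
            if nid in refs:
                continue
            known = [refs[m] for m in nbrs if m in refs]
            if len(known) >= 3:
                refs[nid] = centroid(known)
    return {n: refs[n] for n in hears if n in refs}

anchors = {"a1": (0.0, 0.0), "a2": (2.0, 0.0), "a3": (1.0, 3.0)}
# n2 hears only two anchors, so it needs n1's estimate to localize.
hears = {"n1": ["a1", "a2", "a3"], "n2": ["a1", "a2", "n1"]}
est = recursive_localize(anchors, hears)
```

Because promoted estimates carry their own error, accuracy gains come at the cost of extra rounds, consistent with the trade-off the abstract reports.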

  4. Phase Compensation Sensor for Ranging Consistency in Inter-Satellite Links of Navigation Constellation.

    Science.gov (United States)

    Meng, Zhijun; Yang, Jun; Guo, Xiye; Hu, Mei

    2017-02-24

    The performance of the global navigation satellite system (GNSS) can be enhanced significantly by introducing inter-satellite links (ISL) into a navigation constellation. In particular, improving position, velocity, and time accuracy, and realizing autonomous functions, require the ISL distance measurement data as the original input. For building a high-performance ISL, the ranging consistency between navigation satellites becomes a crucial problem to be addressed. Considering the frequency aging drift and the relativistic effect of a navigation satellite, frequency and phase adjustment (FPA) instructions for the 10.23 MHz reference must be injected from the ground station to ensure time synchronization of the navigation constellation. Moreover, the uncertainty of the initial phase each time the onboard clock equipment boots also results in a pseudo-range offset. In this paper, we focus on the influence of the frequency and phase characteristics of the onboard clock equipment on the ranging consistency of the ISL and propose a phase compensation sensor design method for the phase offset. The simulation and experimental results show that the proposed method not only realizes phase compensation for the pseudo-range jitter but also, when the 1 PPS (1 pulse per second) falls in the 10.23 MHz skip area, overcomes the problem of compensating the ambiguous phase by directly tracking the 10.23 MHz to ensure ranging consistency.

  5. Automatic face segmentation and facial landmark detection in range images.

    Science.gov (United States)

    Pamplona Segundo, Maurício; Silva, Luciano; Bellon, Olga Regina Pereira; Queirolo, Chauã C

    2010-10-01

    We present a methodology for face segmentation and facial landmark detection in range images. Our goal was to develop an automatic process to be embedded in a face recognition system using only depth information as input. To this end, our segmentation approach combines edge detection, region clustering, and shape analysis to extract the face region, and our landmark detection approach combines surface curvature information and depth relief curves to find the nose and eye landmarks. The experiments were performed using the two available versions of the Face Recognition Grand Challenge database and the BU-3DFE database, in order to validate our proposed methodology and its advantages for 3-D face recognition purposes. We present an analysis of the accuracy of our segmentation and landmark detection approaches. Our results compare favorably with state-of-the-art works published in the literature. We also evaluated the influence of the segmentation process on our 3-D face recognition system and analyzed the improvements obtained when applying landmark-based techniques to deal with facial expressions.

  6. Measurements of pulse rate using long-range imaging photoplethysmography and sunlight illumination outdoors

    Science.gov (United States)

    Blackford, Ethan B.; Estepp, Justin R.

    2017-02-01

    Imaging photoplethysmography, a method using imagers to record absorption variations caused by microvascular blood volume pulsations, shows promise as a non-contact cardiovascular sensing technology. The first long-range imaging photoplethysmography measurements, at distances of 25, 50, and 100 meters from the participant, were recently demonstrated. Degraded signal quality was observed with increasing imager-to-subject distance and was hypothesized to be largely attributable to inadequate light return to the image sensor with increasing lens focal length. To test this hypothesis, a follow-up evaluation with 27 participants was conducted outdoors under natural sunlight, providing 5-33 times the illumination intensity. Video was recorded from cameras equipped with ultra-telephoto lenses and positioned at distances of 25, 50, 100, and 150 meters. The brighter illumination allowed high-definition video recordings at increased frame rates of 60 fps, shorter exposure times, and lower ISO settings, leading to higher-quality image formation than in the previous indoor evaluation. Results were compared to simultaneous reference measurements from electrocardiography. Compared to the previous indoor study, we observed lower overall error in pulse rate measurement, with the same pattern of signal-quality degradation with respect to increasing distance. This effect was corroborated by the signal-to-noise ratio of the blood volume pulse signal, which also decreased with increasing distance. Finally, a popular chrominance-based method was compared to a blind source separation approach; while comparable in signal-to-noise ratio, the chrominance method showed higher overall error in pulse rate measurement on this data.

  7. Ultrahigh sensitivity endoscopic camera using a new CMOS image sensor: providing clear images under low illumination in addition to fluorescent images.

    Science.gov (United States)

    Aoki, Hisae; Yamashita, Hiromasa; Mori, Toshiyuki; Fukuyo, Tsuneo; Chiba, Toshio

    2014-11-01

    We developed a new ultrahigh-sensitivity CMOS camera using a specific sensor that has a wide range of spectral sensitivity characteristics. The objective of this study is to present our updated endoscopic technology, which successfully integrates two innovative functions: ultrasensitive imaging and advanced fluorescent viewing. Two different experiments were conducted. One evaluated the function of the ultrahigh-sensitivity camera; the other tested the availability of the newly developed sensor and its performance as a fluorescence endoscope. In both studies, the distance from the endoscopic tip to the target was varied, and the endoscopic images at each setting were taken for comparison. In the first experiment, the 3-CCD camera failed to display clear images under low illumination, and the target was hardly visible. In contrast, the CMOS camera was able to display the targets regardless of the camera-target distance under low illumination. Under high illumination, the imaging quality of the two cameras was very similar. In the second experiment, as a fluorescence endoscope, the CMOS camera was capable of clearly showing the fluorescent-activated organs. The ultrahigh-sensitivity CMOS HD endoscopic camera is expected to provide clear images under low illumination, in addition to fluorescent images under high illumination, in the field of laparoscopic surgery.

  8. High Time Resolution Photon Counting 3D Imaging Sensors

    Science.gov (United States)

    Siegmund, O.; Ertley, C.; Vallerga, J.

    2016-09-01

    Novel sealed-tube microchannel plate (MCP) detectors using next-generation cross strip (XS) anode readouts and high-performance electronics have been developed to provide photon counting imaging sensors for astronomy and high-time-resolution 3D remote sensing. 18 mm aperture sealed tubes with MCPs and high-efficiency Super-GenII or GaAs photocathodes have been implemented to access the visible/NIR regimes for ground-based research, astronomical, and space sensing applications. The cross strip anode readouts, in combination with PXS-II high-speed event processing electronics, can process single-photon counting event rates above 5 MHz (80 ns dead-time per event) and time-stamp events to better than 25 ps. Furthermore, we are developing a high-speed ASIC version of the electronics for low-power/low-mass spaceflight applications. For a GaAs tube the peak quantum efficiency has degraded from 30% (at 560 - 850 nm) to 25% over 4 years, but for Super-GenII tubes the peak quantum efficiency of 17% (peak at 550 nm) has remained unchanged for over 7 years, and the tubes have a uniform spatial resolution. MCP-gain photon counting operation also permits longer overall sensor lifetimes and high local counting rates. Using the high timing resolution, we have demonstrated 3D object imaging with laser pulse (630 nm, 45 ps jitter Pilas laser) reflections in single-photon counting mode, with spatial and depth sensitivity of the order of a few millimeters. A 50 mm Planacon sealed tube was also constructed, using atomic-layer-deposited microchannel plates, which potentially offer better overall sealed-tube lifetime, quantum efficiency, and gain stability. This tube achieves standard bialkali quantum efficiency levels, is stable, and has been coupled to the PXS-II electronics and used to detect and image fast laser pulse signals.
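The depth sensitivity quoted in this record follows directly from the event-timing precision: for round-trip time-of-flight, depth resolution is c·Δt/2. A minimal check (the 25 ps figure is the time-stamping precision from the abstract):

```python
def tof_depth_resolution(timing_jitter_s, c=3.0e8):
    """Round-trip time-of-flight: a timing uncertainty dt maps to a
    depth uncertainty of c * dt / 2 (the pulse travels out and back)."""
    return c * timing_jitter_s / 2.0

# 25 ps event time-stamping corresponds to about 3.75 mm of depth,
# consistent with the "few millimeters" sensitivity reported.
res_m = tof_depth_resolution(25e-12)
```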

  9. Depth maps and high-dynamic range image generation from alternating exposure multiview images

    Science.gov (United States)

    Heo, Yong Seok; Lee, Kyoung Mu; Lee, Sang Uk

    2017-07-01

    For stereo matching, it is hard to find accurate correspondences for saturated regions, such as overly dark or overly bright areas, because there is rarely reliable information to match. In this situation, conventional high-dynamic-range (HDR) imaging techniques that combine multiple exposures for each viewpoint can be adopted to generate well-exposed stereo images. This approach is, however, time-consuming and needs much memory to store multiple exposures for each viewpoint. We propose an efficient method to generate HDR multiview images as well as corresponding accurate depth maps. First, we take a single exposure for each viewpoint, alternating the exposure setting (such as short and long exposure) as the viewpoint changes. Then, we compute an initial depth map for each view using only neighboring images that have the same exposure. To reduce the error of the initial depth maps in the saturated regions, we adopt the fusion move algorithm, fusing neighboring depth maps that have different error regions. Finally, using the enhanced depth maps, we generate artifact-free and sharp HDR images using joint bilateral filtering and a detail-transfer technique. Experimental results show that our method produces both consistent HDR images and accurate depth maps for various indoor and outdoor multiview images.

  10. Exploiting sparsity in time-of-flight range acquisition using a single time-resolved sensor.

    Science.gov (United States)

    Kirmani, Ahmed; Colaço, Andrea; Wong, Franco N C; Goyal, Vivek K

    2011-10-24

    Range acquisition systems such as light detection and ranging (LIDAR) and time-of-flight (TOF) cameras operate by measuring the time difference of arrival between a transmitted pulse and the scene reflection. We introduce the design of a range acquisition system for acquiring depth maps of piecewise-planar scenes with high spatial resolution using a single, omnidirectional, time-resolved photodetector and no scanning components. In our experiment, we reconstructed 64 × 64-pixel depth maps of scenes comprising two to four planar shapes using only 205 spatially-patterned, femtosecond illuminations of the scene. The reconstruction uses parametric signal modeling to recover a set of depths present in the scene. Then, a convex optimization that exploits sparsity of the Laplacian of the depth map of a typical scene determines correspondences between spatial positions and depths. In contrast with 2D laser scanning used in LIDAR systems and low-resolution 2D sensor arrays used in TOF cameras, our experiment demonstrates that it is possible to build a non-scanning range acquisition system with high spatial resolution using only a standard, low-cost photodetector and a spatial light modulator. © 2011 Optical Society of America

  11. Low-Voltage 96 dB Snapshot CMOS Image Sensor with 4.5 nW Power Dissipation per Pixel

    Directory of Open Access Journals (Sweden)

    Orly Yadid-Pecht

    2012-07-01

    Full Text Available Modern “smart” CMOS sensors have penetrated various applications, such as surveillance systems, bio-medical applications, digital cameras, cellular phones, and many others. Reducing the power of these sensors continuously challenges designers. In this paper, a low-power global shutter CMOS image sensor with Wide Dynamic Range (WDR) ability is presented. This sensor features several power reduction techniques, including a dual voltage supply, selective power-down, transistors with different threshold voltages, non-ratioed logic, and a low-voltage static memory. The combination of all these approaches has enabled the design of a low-voltage “smart” image sensor that is capable of reaching a remarkable dynamic range while consuming very low power. The proposed power-saving solutions have allowed the standard architecture of the sensor to be maintained, reducing both the time and the cost of the design. In order to maintain image quality, the relation between sensor performance and power has been analyzed, and a mathematical model describing the sensor Signal to Noise Ratio (SNR) and Dynamic Range (DR) as a function of the power supplies is proposed. The described sensor was implemented in a 0.18 μm CMOS process and successfully tested in the laboratory. An SNR of 48 dB and a DR of 96 dB were achieved with a power dissipation of 4.5 nW per pixel.
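As a reminder of what a 96 dB figure means, the textbook definition of sensor dynamic range is the ratio of the largest non-saturating signal to the noise floor, expressed in dB. The numbers below are assumed for illustration and are not taken from the paper's model:

```python
import math

def dynamic_range_db(full_well_e, noise_floor_e):
    """Dynamic range in dB: 20*log10 of the ratio between the largest
    non-saturating signal (full-well charge) and the noise floor,
    both expressed in electrons."""
    return 20.0 * math.log10(full_well_e / noise_floor_e)

# Assumed illustrative numbers: a ~63000:1 signal-to-floor ratio
# corresponds to roughly 96 dB, the figure reported in the record.
dr_db = dynamic_range_db(63000.0, 1.0)
```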

  12. Wide-Range Highly-Efficient Wireless Power Receivers for Implantable Biomedical Sensors

    KAUST Repository

    Ouda, Mahmoud

    2016-11-01

    Wireless power transfer (WPT) is the key enabler for a myriad of applications, from low-power RFIDs and wireless sensors to wirelessly charged electric vehicles and even massive power transmission from space solar cells. One of the major challenges in designing implantable biomedical devices is the size and lifetime of the battery. Replacing the battery with a miniaturized wireless power receiver (WPRx) facilitates designing sustainable biomedical implants in smaller volumes for sentient medical applications. In the first part of this dissertation, we propose a miniaturized, fully integrated, wirelessly powered implantable sensor with an on-chip antenna, designed and implemented in a standard 0.18 μm CMOS process. As a batteryless device, it can be implanted once inside the body with no need for further invasive surgeries to replace batteries. The proposed single-chip solution is designed for intraocular pressure monitoring (IOPM) and can serve as a sustainable platform for implantable devices or IoT nodes. A custom setup was developed to test the chip in a saline solution with electrical properties similar to those of the aqueous humor of the eye. The chip, in this eye-like setup, is wirelessly charged to 1 V by a 5 W transmitter 3 cm away. In the second part, we propose a self-biased differential rectifier with enhanced efficiency over an extended range of input power. A prototype is designed for the medical implant communication service (MICS) band at 433 MHz. It demonstrates an improvement of more than 40% in rectifier power conversion efficiency (PCE) and a dynamic range extension of more than 50% relative to the conventional cross-coupled rectifier. A sensitivity of -15.2 dBm input power for 1 V output voltage and a peak PCE of 65% are achieved for a 50 kΩ load. In the third part, we propose a wide-range differential RF-to-DC power converter using an adaptive self-biasing technique. The proposed architecture doubles

  13. Full Waveform Analysis for Long-Range 3D Imaging Laser Radar

    Directory of Open Access Journals (Sweden)

    Wallace, Andrew M.

    2010-01-01

    Full Text Available The new generation of 3D imaging systems based on laser radar (ladar) offers significant advantages in defense and security applications. In particular, it is possible to retrieve 3D shape information directly from the scene and separate a target from background or foreground clutter by extracting a narrow depth range from the field of view by range gating, either in the sensor or by postprocessing. We discuss and demonstrate the applicability of full-waveform ladar to produce multilayer 3D imagery, in which each pixel produces a complex temporal response that describes the scene structure. Such complexity, caused by multiple and distributed reflections, arises in many relevant scenarios, for example in viewing partially occluded targets, through semitransparent materials (e.g., windows), and through distributed reflective media such as foliage. We demonstrate our methodology on 3D image data acquired by a scanning time-of-flight system, developed in our own laboratories, which uses the time-correlated single-photon counting technique.

  14. Recognition of flow in everyday life using sensor agent robot with laser range finder

    Science.gov (United States)

    Goshima, Misa; Mita, Akira

    2011-04-01

    In the present paper, we suggest an algorithm for a sensor agent robot with a laser range finder to recognize the flows of residents in living spaces, in order to achieve flow recognition, recognition of the number of people in a space, and classification of the flows. House reform is, or will be, demanded to prolong the lifetime of homes, and adaptation to individuals is needed in our rapidly aging society. Home autonomous mobile robots will become popular in the future to assist aged people in various situations, so we have to collect various types of information about people and living spaces. However, intrusion into personal privacy must be avoided. It is therefore essential to recognize flows in everyday life in order to support house reform and aging societies in terms of adaptation to individuals. With background subtraction, extra noise removal, and clustering based on the k-means method, we obtained an average accuracy of more than 90% for the behavior of 1 to 3 persons, and also confirmed the reliability of our system regardless of the position of the sensor. Our system can take advantage of autonomous mobile robots while protecting personal privacy. It points toward a generalization of flow recognition methods in living spaces.
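The clustering step can be illustrated with a plain k-means pass over the 2D scan points that survive background subtraction, with one cluster per person. This is a generic sketch (naive initialization, assumed sample points), not the authors' tuned pipeline.

```python
import math

def kmeans(points, k, iters=20):
    """Group 2D laser-scan points into k clusters (e.g., one per person)."""
    centers = points[:k]                      # naive initialization
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            # assign each point to its nearest current center
            j = min(range(k), key=lambda i: math.dist(p, centers[i]))
            groups[j].append(p)
        # recompute each center as the mean of its group
        centers = [
            (sum(p[0] for p in g) / len(g), sum(p[1] for p in g) / len(g))
            if g else centers[i]
            for i, g in enumerate(groups)
        ]
    return centers

# Two tight blobs of foreground points, roughly two people in the scan.
scan = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1),
        (5.0, 5.0), (5.1, 5.0), (5.0, 5.1)]
centers = sorted(kmeans(scan, 2))
```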

  15. Real time three-dimensional space video rate sensors for millimeter wave imaging based on very inexpensive plasma LED lamps

    Science.gov (United States)

    Levanon, Assaf; Yitzhaky, Yitzhak; Kopeika, Natan S.; Rozban, Daniel; Abramovich, Amir

    2014-10-01

    In recent years, much effort has been invested to develop inexpensive but sensitive Millimeter Wave (MMW) detectors that can be used in focal plane arrays (FPAs) in order to implement real-time MMW imaging. Real-time MMW imaging systems are required for many varied applications in fields such as homeland security, medicine, communications, military products, and space technology, mainly because this radiation has high penetration and good navigability through dust storms, fog, heavy rain, dielectric materials, biological tissue, and diverse materials. Moreover, the atmospheric attenuation in this range of the spectrum is relatively low, and the scattering is also low compared to NIR and VIS. The lack of inexpensive room-temperature imaging systems makes it difficult to provide a suitable MMW system for many of the above applications. In the last few years we have advanced the research and development of sensors using very inexpensive (30-50 cents) Glow Discharge Detector (GDD) plasma indicator lamps as MMW detectors. This paper presents three kinds of GDD-lamp-based focal plane arrays, differing in the number of detectors, scanning operation, and detection method. The 1st and 2nd generations are an 8 × 8 pixel array and an 18 × 2 mono-rail scanner array, respectively, both for direct detection and limited to fixed imaging. The last sensor designed is a multiplexed 16 × 16 GDD FPA with a video frame rate. It permits real-time video-rate imaging of 30 frames/sec and comprehensive 3D MMW imaging. The principle of detection in this sensor is a frequency-modulated continuous-wave (FMCW) scheme, with each of the 16 GDD pixel lines sampled simultaneously. Direct detection is also possible and can be done through a user-friendly interface. This FPA sensor is built from 256 commercial GDD lamps (3 mm diameter International Light, Inc., Peabody, MA model 527 Ne indicator lamps) as pixel detectors. All three sensors are fully supported
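For the FMCW mode mentioned in this record, range follows from the beat frequency between the transmitted chirp and the echo: the echo is delayed by 2R/c, so f_b = (B/T)·(2R/c) and therefore R = f_b·c·T/(2B). The sweep numbers below are illustrative assumptions, not the system's actual parameters.

```python
def fmcw_range(beat_hz, bandwidth_hz, sweep_time_s, c=3.0e8):
    """Invert the FMCW beat equation f_b = (B / T) * (2 R / c):
    R = f_b * c * T / (2 * B)."""
    return beat_hz * c * sweep_time_s / (2.0 * bandwidth_hz)

# Assumed example: a 10 GHz sweep over 1 ms with a 1 MHz measured beat
# places the reflector at about 15 m.
r_m = fmcw_range(1.0e6, 10.0e9, 1.0e-3)
```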

  16. IR sensitivity enhancement of CMOS Image Sensor with diffractive light trapping pixels.

    Science.gov (United States)

    Yokogawa, Sozo; Oshiyama, Itaru; Ikeda, Harumi; Ebiko, Yoshiki; Hirano, Tomoyuki; Saito, Suguru; Oinoue, Takashi; Hagimoto, Yoshiya; Iwamoto, Hayato

    2017-06-19

    We report on the IR sensitivity enhancement of a back-illuminated CMOS Image Sensor (BI-CIS) with a 2-dimensional diffractive inverted pyramid array structure (IPA) on crystalline silicon (c-Si) and deep trench isolation (DTI). FDTD simulations of semi-infinitely thick c-Si with 2D IPAs on its surface show that pitches over 400 nm yield more than 30% improvement of light absorption at λ = 850 nm, with a maximum enhancement of 43% confirmed at that wavelength for a 540 nm pitch. A prototype BI-CIS sample with a pixel size of 1.2 μm square containing 400 nm pitch IPAs shows 80% sensitivity enhancement at λ = 850 nm compared to a reference sample with a flat surface. This is due to diffraction by the IPA and total reflection at the pixel boundary. NIR images taken by a demo camera equipped with a C-mount lens show 75% sensitivity enhancement in the λ = 700-1200 nm wavelength range with negligible spatial resolution degradation. Light-trapping CIS pixel technology promises to improve NIR sensitivity and appears applicable to many image sensor applications, including security cameras, personal authentication, and range-finding Time-of-Flight cameras with IR illumination.

  17. Evaluation of the AN/SAY-1 Thermal Imaging Sensor System

    National Research Council Canada - National Science Library

    Smith, John G; Middlebrook, Christopher T

    2002-01-01

    The AN/SAY-1 Thermal Imaging Sensor System "TISS" was developed to provide surface ships with a day/night imaging capability to detect low radar reflective, small cross-sectional area targets such as floating mines...

  18. Special Sensor Microwave Imager/Sounder (SSMIS) Temperature Data Record (TDR) in netCDF

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Special Sensor Microwave Imager/Sounder (SSMIS) is a series of passive microwave conically scanning imagers and sounders onboard the DMSP satellites beginning...

  19. Human Posture Recognition Based on Images Captured by the Kinect Sensor

    National Research Council Canada - National Science Library

    Wang, Wen-June; Chang, Jun-Wei; Haung, Shih-Fu; Wang, Rong-Jyue

    2016-01-01

    In this paper we combine several image processing techniques with the depth images captured by a Kinect sensor to successfully recognize the five distinct human postures of sitting, standing, stooping...

  20. Processor for Real-Time Atmospheric Compensation in Long-Range Imaging Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Long-range imaging is a critical component to many NASA applications including range surveillance, launch tracking, and astronomical observation. However,...

  1. Visible Wavelength Color Filters Using Dielectric Subwavelength Gratings for Backside-Illuminated CMOS Image Sensor Technologies.

    Science.gov (United States)

    Horie, Yu; Han, Seunghoon; Lee, Jeong-Yub; Kim, Jaekwan; Kim, Yongsung; Arbabi, Amir; Shin, Changgyun; Shi, Lilong; Arbabi, Ehsan; Kamali, Seyedeh Mahsa; Lee, Hong-Seok; Hwang, Sungwoo; Faraon, Andrei

    2017-05-10

    We report transmissive color filters based on subwavelength dielectric gratings that can replace the conventional dye-based color filters used in backside-illuminated CMOS image sensor (BSI CIS) technologies. The filters are patterned in an 80 nm-thick polysilicon film on a 115 nm-thick SiO2 spacer layer. They are optimized to operate at the primary RGB colors, exhibit peak transmittance of 60-80%, and have an almost insensitive response over a ±20° angular range. This technology enables shrinking of pixel sizes down to near a micrometer.

  2. Nitrogen-rich functional groups carbon nanoparticles based fluorescent pH sensor with broad-range responding for environmental and live cells applications.

    Science.gov (United States)

    Shi, Bingfang; Su, Yubin; Zhang, Liangliang; Liu, Rongjun; Huang, Mengjiao; Zhao, Shulin

    2016-08-15

    A fluorescent pH sensor with broad-range response, based on carbon nanoparticles bearing nitrogen-rich functional groups (N-CNs), was prepared by one-pot hydrothermal treatment of melamine and triethanolamine. The as-prepared N-CNs exhibited excellent photoluminescence properties with an absolute quantum yield (QY) of 11.0%. Furthermore, the N-CNs possessed a broad-range pH response: the linear range was pH 3.0 to 12.0, much wider than that of previously reported fluorescent pH sensors. The likely mechanism for the pH-sensitive response of the N-CNs is photoinduced electron transfer (PET). Cell toxicity experiments showed that the as-prepared N-CNs exhibit low cytotoxicity and excellent biocompatibility, with cell viabilities of more than 87%. The proposed N-CN-based pH sensor was used for pH monitoring of environmental water samples and for pH fluorescence imaging of live T24 cells. The N-CNs are promising as a convenient and general fluorescent pH sensor for environmental monitoring and bioimaging applications. Copyright © 2016 Elsevier B.V. All rights reserved.
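The reported linear pH response over pH 3.0-12.0 implies a simple calibration workflow: fit a line to intensity-vs-pH standards, then invert it for unknown samples. A hypothetical sketch (the intensity values below are invented for illustration, not the paper's data):

```python
import numpy as np

# Hypothetical calibration standards: fluorescence intensity ratio vs. pH,
# assumed linear over the reported pH 3-12 response range.
ph = np.array([3.0, 5.0, 7.0, 9.0, 11.0, 12.0])
signal = np.array([0.95, 0.80, 0.65, 0.50, 0.35, 0.275])  # illustrative

# Fit signal = a*pH + b, then invert to read pH from a new measurement.
a, b = np.polyfit(ph, signal, 1)

def read_ph(s):
    """Estimate pH from a measured intensity ratio via the fitted line."""
    return (s - b) / a

print(round(read_ph(0.575), 2))  # → 8.0 (pH of an unknown sample)
```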

  3. Highly Specific and Wide Range NO2 Sensor with Color Readout.

    Science.gov (United States)

    Fàbrega, Cristian; Fernández, Luis; Monereo, Oriol; Pons-Balagué, Alba; Xuriguera, Elena; Casals, Olga; Waag, Andreas; Prades, Joan Daniel

    2017-10-25

    We present a simple and inexpensive method to implement a Griess-Saltzman-type reaction that combines the advantages of the liquid-phase method (high specificity and fast response time) with the benefits of a solid implementation (easy handling). We demonstrate that the measurements can be carried out using conventional RGB sensors, circumventing the limitations of measuring samples with spectrometers. We also present a method to optimize the measurement protocol and target a specific range of NO2 concentrations. We demonstrate that it is possible to measure NO2 concentrations from 50 ppb to 300 ppm with high specificity and without modifying the Griess-Saltzman reagent.

  4. Mathematical Model and Calibration Experiment of a Large Measurement Range Flexible Joints 6-UPUR Six-Axis Force Sensor

    Directory of Open Access Journals (Sweden)

    Yanzhi Zhao

    2016-08-01

    Full Text Available Nowadays, improving the accuracy and enlarging the measuring range of six-axis force sensors for wider applications in aircraft landing, rocket thrust, and spacecraft docking tests has become an urgent objective. However, it is still difficult to achieve high accuracy and a large measuring range with traditional parallel six-axis force sensors due to the influence of gap and friction in the joints. To overcome these limitations, this paper proposes a 6-Universal-Prismatic-Universal-Revolute (UPUR) joints parallel mechanism with flexible joints to develop a large-measurement-range six-axis force sensor. The structural characteristics of the sensor are analyzed in comparison with a traditional parallel sensor based on the Stewart platform. The force transfer relation of the sensor is deduced, and the force Jacobian matrix is obtained using screw theory for two cases: the ideal state, and the state in which the flexibility of each flexible joint is considered. The prototype and loading calibration system are designed and developed. The K-value method and the least squares method are used to process the experimental data, and class I and class II linearity errors are obtained. The experimental results show that the calibration error of the K-value method is more than 13.4%, while that of the least squares method is 2.67%. The experimental results prove the feasibility of the sensor and the correctness of the theoretical analysis, which are expected to be adopted in practical applications.
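The least squares calibration step can be sketched as solving for a 6 × 6 calibration matrix that maps raw gauge readings to applied wrenches; the synthetic data below stand in for the paper's loading experiments:

```python
import numpy as np

# Least-squares calibration sketch for a six-axis force sensor:
# applied wrenches W (n x 6) and raw gauge readings V (n x 6) are
# related by W ≈ V @ C; solve for the calibration matrix C.
rng = np.random.default_rng(0)
C_true = np.eye(6) + 0.05 * rng.standard_normal((6, 6))  # synthetic ground truth
V = rng.standard_normal((50, 6))                          # 50 loading cases
W = V @ C_true

C_est, *_ = np.linalg.lstsq(V, W, rcond=None)

# With noise-free synthetic data the estimate recovers C_true.
print(np.allclose(C_est, C_true))  # → True
```

With real, noisy loadings the residual of this fit is exactly the kind of linearity error the calibration experiment quantifies.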

  5. Radiometric Normalization of Large Airborne Image Data Sets Acquired by Different Sensor Types

    Science.gov (United States)

    Gehrke, S.; Beshah, B. T.

    2016-06-01

    Generating seamless mosaics of aerial images is a particularly challenging task when the mosaic comprises a large number of images, collected over longer periods of time and with different sensors under varying imaging conditions. Such large mosaics typically consist of very heterogeneous image data, both spatially (different terrain types and atmosphere) and temporally (unstable atmospheric properties and even changes in land coverage). We present a new radiometric normalization or, respectively, radiometric aerial triangulation approach that takes advantage of our knowledge about each sensor's properties. The current implementation supports medium and large format airborne imaging sensors of the Leica Geosystems family, namely the ADS line-scanner as well as DMC and RCD frame sensors. A hierarchical modelling - with parameters for the overall mosaic, the sensor type, different flight sessions, strips and individual images - allows for adaptation to each sensor's geometric and radiometric properties. Additional parameters at different hierarchy levels can compensate radiometric differences of various origins, making up for shortcomings of the preceding radiometric sensor calibration as well as BRDF and atmospheric corrections. The final, relative normalization is based on radiometric tie points in overlapping images, absolute radiometric control points and image statistics. It is computed in a global least squares adjustment for the entire mosaic by altering each image's histogram using a location-dependent mathematical model. This model involves contrast and brightness corrections at radiometric fix points, with bilinear interpolation for corrections in between. The distribution of the radiometric fix points is adaptive to each image and generally increases with image size, enabling optimal local adaptation even for very long image strips as typically captured by a line-scanner sensor. The normalization approach is implemented in HxMap software. It has been
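The global least squares adjustment over tie points and control points can be illustrated at toy scale: per-image brightness offsets only, three images, hypothetical digital-number (DN) readings (the full model also carries contrast terms and location-dependent corrections):

```python
import numpy as np

# Unknowns: brightness offsets o0, o1, o2 for three overlapping images.
# A tie point where image i reads r_i and image j reads r_j contributes
# the condition o_i - o_j = r_j - r_i; an absolute control point anchors
# the datum. All DN values below are hypothetical.
A = np.array([
    [1.0, -1.0, 0.0],   # tie point shared by images 0 and 1
    [0.0, 1.0, -1.0],   # tie point shared by images 1 and 2
    [1.0, 0.0, 0.0],    # absolute control: image 0 offset fixed to 0
])
b = np.array([
    110.0 - 100.0,      # image 1 reads 10 DN brighter than image 0
    115.0 - 110.0,      # image 2 reads 5 DN brighter than image 1
    0.0,
])

offsets, *_ = np.linalg.lstsq(A, b, rcond=None)
print(offsets.round(6))  # image 1 corrected by -10 DN, image 2 by -15 DN
```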

  6. Information theory analysis of sensor-array imaging systems for computer vision

    Science.gov (United States)

    Huck, F. O.; Fales, C. L.; Park, S. K.; Samms, R. W.; Self, M. O.

    1983-01-01

    Information theory is used to assess the performance of sensor-array imaging systems, with emphasis on the performance obtained with image-plane signal processing. By electronically controlling the spatial response of the imaging system, as suggested by the mechanism of human vision, it is possible to trade off edge enhancement against sensitivity, increase dynamic range, and reduce data transmission. Computational results show that: signal information density varies little with large variations in the statistical properties of random radiance fields; most information (generally about 85 to 95 percent) is contained in the signal intensity transitions rather than the levels; and performance is optimized when the OTF of the imaging system is nearly limited to the sampling passband, to minimize aliasing at the cost of blurring, and the SNR is very high, to permit the retrieval of small spatial detail from the extensively blurred signal. Shading the lens aperture transmittance to increase depth of field, and using a regular hexagonal sensor array instead of a square lattice to decrease sensitivity to edge orientation, also improve the signal information density by up to about 30 percent at high SNRs.

  7. Color Restoration of RGBN Multispectral Filter Array Sensor Images Based on Spectral Decomposition

    OpenAIRE

    Chulhee Park; Moon Gi Kang

    2016-01-01

    A multispectral filter array (MSFA) image sensor with red, green, blue and near-infrared (NIR) filters is useful for various imaging applications with the advantages that it obtains color information and NIR information simultaneously. Because the MSFA image sensor needs to acquire invisible band information, it is necessary to remove the IR cut-offfilter (IRCF). However, without the IRCF, the color of the image is desaturated by the interference of the additional NIR component of each RGB co...

  8. Long-Range Reconnaissance Imager on New Horizons

    Science.gov (United States)

    Cheng, A. F.; Weaver, H. A.; Conard, S. J.; Hayes, J. R.; Morgan, M. F.; Noble, M.; Taylor, H. W.; Barnouin, O.; Boldt, J. D.; Darlington, E. H.; Grey, M. P.; Magee, T.; Rossano, E.; Schlemm, C.; Kosakowski, K. E.; Sampath, D.

    2012-10-01

    LORRI is the highest resolution imager on the New Horizons (NH) mission to Pluto and the Kuiper belt. LORRI produced superb images of Jupiter and its satellites even though those bodies are ~35 times brighter than bodies in the Pluto system.

  9. A Multi-Sensor Fusion MAV State Estimation from Long-Range Stereo, IMU, GPS and Barometric Sensors.

    Science.gov (United States)

    Song, Yu; Nuske, Stephen; Scherer, Sebastian

    2016-12-22

    State estimation is the most critical capability for MAV (Micro-Aerial Vehicle) localization, autonomous obstacle avoidance, robust flight control and 3D environmental mapping. There are three main challenges for MAV state estimation: (1) it must handle aggressive 6 DOF (Degree Of Freedom) motion; (2) it should be robust to intermittent GPS (Global Positioning System) (even GPS-denied) situations; (3) it should work well for both low- and high-altitude flight. In this paper, we present a state estimation technique that fuses long-range stereo visual odometry, GPS, barometric and IMU (Inertial Measurement Unit) measurements. The new estimation system has two main parts. The first is a stochastic cloning EKF (Extended Kalman Filter) estimator that loosely fuses both absolute state measurements (GPS, barometer) and relative state measurements (IMU, visual odometry); it is derived and discussed in detail. The second is a long-range stereo visual odometry, proposed for high-altitude MAV odometry calculation, that uses both multi-view stereo triangulation and a multi-view stereo inverse depth filter. The odometry takes the EKF information (IMU integral) for robust camera pose tracking and image feature matching, and the stereo odometry output serves as the relative measurement for the update of the state estimate. Experimental results on a benchmark dataset and our real flight dataset show the effectiveness of the proposed state estimation system, especially for aggressive, intermittent-GPS and high-altitude MAV flight.
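The loose fusion of relative measurements (prediction) and absolute measurements (update) can be reduced to a minimal 1D Kalman filter in the spirit of the paper's architecture; all noise levels below are assumed values for the sketch, not the paper's tuning:

```python
import numpy as np

# State x = [position, velocity]. IMU acceleration drives the prediction
# step; absolute position fixes (GPS-like) correct drift in the update.
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
B = np.array([0.5 * dt**2, dt])         # acceleration input mapping
H = np.array([[1.0, 0.0]])              # position-only measurement
Q = 1e-3 * np.eye(2)                    # process noise (assumed)
R = np.array([[0.5]])                   # measurement noise (assumed)

x, P = np.zeros(2), np.eye(2)
for k in range(10):
    # Predict with the IMU acceleration input (constant 1 m/s^2 here).
    x = F @ x + B * 1.0
    P = F @ P @ F.T + Q
    # Update with a noise-free synthetic "GPS" position fix.
    z = 0.5 * (dt * (k + 1)) ** 2
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P

print(abs(x[0] - 0.5) < 1e-6)  # → True: position tracks 0.5 m after 1 s
```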

  10. Collusion-Aware Privacy-Preserving Range Query in Tiered Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Xiaoying Zhang

    2014-12-01

    Full Text Available Wireless sensor networks (WSNs) are indispensable building blocks for the Internet of Things (IoT). With the development of WSNs, privacy issues have drawn more attention. Existing work on the privacy-preserving range query mainly focuses on privacy preservation and integrity verification in two-tiered WSNs in the case of compromised master nodes, but neglects the damage of node collusion. In this paper, we propose a series of collusion-aware privacy-preserving range query protocols in two-tiered WSNs. To the best of our knowledge, this paper is the first to consider collusion attacks for a range query in tiered WSNs while fulfilling the preservation of privacy and integrity. To preserve the privacy of data and queries, we propose a novel encoding scheme to conceal sensitive information. To preserve the integrity of the results, we present a verification scheme using the correlation among data. In addition, two schemes are further presented to improve result accuracy and reduce communication cost. Finally, theoretical analysis and experimental results confirm the efficiency, accuracy and privacy of our proposals.

  11. Method for calculating self-noise spectra and operating ranges for seismographic inertial sensors and recorders

    Science.gov (United States)

    Evans, John R.; Followill, F.; Hutt, Charles R.; Kromer, R.P.; Nigbor, R.L.; Ringler, A.T.; Steim, J.M.; Wielandt, E.

    2010-01-01

    Understanding the performance of sensors and recorders is prerequisite to making appropriate use of them in seismology and earthquake engineering. This paper explores a critical aspect of instrument performance, the “self” noise level of the device and the amplitude range it can usefully record. Self noise limits the smallest signals, while instrument clipping level creates the upper limit (above which it either cannot produce signals or becomes unacceptably nonlinear). Where these levels fall, and the “operating range” between them, determines much of the instrument's viability and the applications for which it is appropriate. The representation of seismic-instrument self-noise levels and their effective operating ranges (cf., dynamic range) for seismological inertial sensors, recorders (data acquisition units, or DAUs), and integrated systems of sensors and recorders (data acquisition systems, or DASs) forces one to address an unnatural comparison between transient finite-bandwidth signals, such as earthquake records, and the instrument's self noise, an effectively stationary signal of infinite duration. In addition to being transient, earthquakes and other records of interest are characterized by a peak amplitude and generally a narrow, peaked spectral shape. Unfortunately, any power spectrum computed for such transient signals is ill defined, since the maximum of that spectrum depends strongly upon signal and record durations. In contrast, the noise floor of an instrument is approximately stationary and properly described by a power spectral density (PSD) or its root (rPSD). Put another way, earthquake records have units of amplitude (e.g., m/s2) while PSDs have units of amplitude-squared per hertz (e.g., (m/s2)2/Hz) and the rPSD has units of amplitude per root of hertz (e.g., (m/s2)/Hz1/2). Thus, this incompatibility is a conflict between earthquake (amplitude) and PSD (spectral density) units that requires one to make various assumptions before they
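The units mismatch the authors describe can be made concrete numerically: a stationary noise record has a well-defined PSD, and only after integrating that PSD over bandwidth does one obtain amplitude-squared units comparable to an earthquake record. A numpy-only sketch with Welch-style averaging on synthetic white noise:

```python
import numpy as np

# Unit-variance white "self noise" at fs = 100 Hz, in units of m/s^2.
fs = 100.0
rng = np.random.default_rng(1)
x = rng.standard_normal(196_608)

# Welch-style average of one-sided periodograms over 48 segments.
nseg, nfft = 48, 4096
segs = x.reshape(nseg, nfft)
spec = np.abs(np.fft.rfft(segs, axis=1)) ** 2
psd = 2.0 * spec.mean(axis=0) / (fs * nfft)   # units: (m/s^2)^2 / Hz
rpsd = np.sqrt(psd)                           # units: (m/s^2) / sqrt(Hz)

# Integrating the PSD over the full band recovers the signal variance,
# i.e. amplitude-squared — only then are the units comparable.
variance = psd[1:-1].mean() * fs / 2.0
print(abs(variance - 1.0) < 0.05)  # → True
```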

  12. PROCESSING OF UAV BASED RANGE IMAGING DATA TO GENERATE DETAILED ELEVATION MODELS OF COMPLEX NATURAL STRUCTURES

    Directory of Open Access Journals (Sweden)

    T. K. Kohoutek

    2012-07-01

    Full Text Available Unmanned Aerial Vehicles (UAVs) are more and more used in civil areas like geomatics. Autonomously navigated platforms have great flexibility in flying and manoeuvring in complex environments to collect remote sensing data. In contrast to standard technologies such as manned aerial platforms (airplanes and helicopters), UAVs are able to fly closer to the object and in small-scale areas of high-risk situations such as landslides, volcano and earthquake areas and floodplains. Thus, UAVs are sometimes the only practical alternative in areas where access is difficult, where no manned aircraft is available, or where no flight permission is given. Furthermore, compared to terrestrial platforms, UAVs are not limited to specific view directions and can overcome occlusions from trees, houses and terrain structures. Equipped with image sensors and/or laser scanners, they are able to provide elevation models, rectified images, textured 3D models and maps. In this paper we describe a UAV platform, which can carry a range imaging (RIM) camera including power supply and data storage, for the detailed mapping and monitoring of complex structures such as alpine riverbed areas. The UAV platform NEO from Swiss UAV was equipped with the RIM camera CamCube 2.0 by PMD Technologies GmbH to capture the surface structures. Its navigation system includes an autopilot. To validate the UAV trajectory, a 360° prism was installed and tracked by a total station. Within the paper a workflow for the processing of UAV-RIM data is proposed, which is based on the processing of differential GNSS data in combination with the acquired range images. Subsequently, the obtained results for the trajectory are compared and verified against a track of a UAV (Falcon 8, Ascending Technologies) carried out with a total station simultaneously with the GNSS data acquisition. The results showed that the UAV's position using differential GNSS could be determined in the centimetre to the decimetre

  13. The Effect of a Pre-Lens Aperture on the Temperature Range and Image Uniformity of Microbolometer Infrared Cameras

    Energy Technology Data Exchange (ETDEWEB)

    Dinwiddie, Ralph Barton [ORNL; Parris, Larkin S. [Wichita State University; Lindal, John M. [Oak Ridge National Laboratory (ORNL); Kunc, Vlastimil [ORNL

    2016-01-01

    This paper explores the temperature range extension of long-wavelength infrared (LWIR) cameras by placing an aperture in front of the lens. An aperture smaller than the lens will reduce the radiance reaching the sensor, allowing the camera to image targets much hotter than typically allowable. These higher temperatures were accurately determined after developing a correction factor that was applied to the built-in temperature calibration. The relationship between aperture diameter and temperature range is linear. The effect of pre-lens apertures on image uniformity is a form of anti-vignetting: the corners appear brighter (hotter) than the rest of the image. An example of using this technique to measure temperatures of high-melting-point polymers during 3D printing provides valuable information on the time required for the weld-line temperature to fall below the glass transition temperature.
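Since the flux reaching the sensor scales with aperture area, the first-order effect is purely geometric; the paper's empirically developed correction factor refines this. A sketch with hypothetical diameters:

```python
def radiance_scale(d_aperture_mm, d_lens_mm):
    """Fraction of the lens-limited flux passed by a smaller pre-lens aperture."""
    return (d_aperture_mm / d_lens_mm) ** 2

def max_radiance_gain(d_aperture_mm, d_lens_mm):
    """Factor by which the viewable target radiance increases."""
    return 1.0 / radiance_scale(d_aperture_mm, d_lens_mm)

# Halving the effective diameter passes a quarter of the flux, so targets
# of roughly four times the radiance stay within the sensor's range.
print(radiance_scale(12.5, 25.0), max_radiance_gain(12.5, 25.0))  # → 0.25 4.0
```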

  14. Few-photon color imaging using energy-dispersive superconducting transition-edge sensor spectrometry.

    Science.gov (United States)

    Niwa, Kazuki; Numata, Takayuki; Hattori, Kaori; Fukuda, Daiji

    2017-04-04

    Highly sensitive spectral imaging is increasingly being demanded in bioanalysis research and industry to obtain the maximum information possible from molecules of different colors. We introduce an application of the superconducting transition-edge sensor (TES) technique to highly sensitive spectral imaging. A TES is an energy-dispersive photodetector that can distinguish the wavelength of each incident photon. Its effective spectral range is from the visible to the infrared (IR), up to 2800 nm, which is beyond the capabilities of other photodetectors. TES was employed in this study in a fiber-coupled optical scanning microscopy system, and a test sample of a three-color ink pattern was observed. A red-green-blue (RGB) image and a near-IR image were successfully obtained in the few-incident-photon regime, whereas only a black and white image could be obtained using a photomultiplier tube. Spectral data were also obtained from a selected focal area out of the entire image. The results of this study show that TES is feasible for use as an energy-dispersive photon-counting detector in spectral imaging applications.

  15. An ultra-low power CMOS image sensor with on-chip energy harvesting and power management capability.

    Science.gov (United States)

    Cevik, Ismail; Huang, Xiwei; Yu, Hao; Yan, Mei; Ay, Suat U

    2015-03-06

    An ultra-low power CMOS image sensor with on-chip energy harvesting and power management capability is introduced in this paper. The photodiode pixel array can not only capture images but also harvest solar energy. As such, the CMOS image sensor chip is able to switch between imaging and harvesting modes towards self-power operation. Moreover, an on-chip maximum power point tracking (MPPT)-based power management system (PMS) is designed for the dual-mode image sensor to further improve the energy efficiency. A new isolated P-well energy harvesting and imaging (EHI) pixel with very high fill factor is introduced. Several ultra-low power design techniques such as reset and select boosting techniques have been utilized to maintain a wide pixel dynamic range. The chip was designed and fabricated in a 1.8 V, 1P6M 0.18 µm CMOS process. Total power consumption of the imager is 6.53 µW for a 96 × 96 pixel array with 1 V supply and 5 fps frame rate. Up to 30 μW of power could be generated by the new EHI pixels. The PMS is capable of providing 3× the power required during imaging mode with 50% efficiency allowing energy autonomous operation with a 72.5% duty cycle.

  16. An Ultra-Low Power CMOS Image Sensor with On-Chip Energy Harvesting and Power Management Capability

    Directory of Open Access Journals (Sweden)

    Ismail Cevik

    2015-03-01

    Full Text Available An ultra-low power CMOS image sensor with on-chip energy harvesting and power management capability is introduced in this paper. The photodiode pixel array can not only capture images but also harvest solar energy. As such, the CMOS image sensor chip is able to switch between imaging and harvesting modes towards self-power operation. Moreover, an on-chip maximum power point tracking (MPPT)-based power management system (PMS) is designed for the dual-mode image sensor to further improve the energy efficiency. A new isolated P-well energy harvesting and imaging (EHI) pixel with very high fill factor is introduced. Several ultra-low power design techniques such as reset and select boosting techniques have been utilized to maintain a wide pixel dynamic range. The chip was designed and fabricated in a 1.8 V, 1P6M 0.18 µm CMOS process. Total power consumption of the imager is 6.53 µW for a 96 × 96 pixel array with 1 V supply and 5 fps frame rate. Up to 30 μW of power could be generated by the new EHI pixels. The PMS is capable of providing 3× the power required during imaging mode with 50% efficiency allowing energy autonomous operation with a 72.5% duty cycle.
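The abstract's figures support a back-of-the-envelope energy balance for self-powered operation. The simplified accounting below is our sketch, not the paper's derivation, and it lands near (not exactly at) the reported 72.5% duty cycle:

```python
# Dual-mode energy balance: imaging spends energy, harvesting stores it.
p_imaging = 6.53e-6      # W, total imager consumption (from the abstract)
p_harvest = 30e-6        # W, generated by the EHI pixel array
efficiency = 0.5         # PMS conversion efficiency

# Duty cycle D at which harvested energy covers imaging energy:
#   D * p_imaging = (1 - D) * p_harvest * efficiency
usable = p_harvest * efficiency
duty = usable / (usable + p_imaging)
print(round(duty, 3))  # → 0.697, in the neighbourhood of the reported 72.5%
```

The remaining gap to 72.5% is plausibly due to details of the paper's power accounting that the abstract does not spell out.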

  17. A widefield fluorescence microscope with a linear image sensor for image cytometry of biospecimens: Considerations for image quality optimization.

    Science.gov (United States)

    Hutcheson, Joshua A; Majid, Aneeka A; Powless, Amy J; Muldoon, Timothy J

    2015-09-01

    Linear image sensors have been widely used in numerous research and industry applications to provide continuous imaging of moving objects. Here, we present a widefield fluorescence microscope with a linear image sensor used to image translating objects for image cytometry. First, a calibration curve was characterized for a custom microfluidic chamber over a span of volumetric pump rates. Image data were also acquired using 15 μm fluorescent polystyrene spheres on a slide with a motorized translation stage in order to match linear translation speed with line exposure periods to preserve the image aspect ratio. Aspect ratios were then calculated after imaging to ensure quality control of image data. Fluorescent beads were imaged in suspension flowing through the microfluidics chamber being pumped by a mechanical syringe pump at 16 μl min(-1) with a line exposure period of 150 μs. The line period was selected to acquire images of fluorescent beads with a 40 dB signal-to-background ratio. A motorized translation stage was then used to transport conventional glass slides of stained cellular biospecimens. Whole blood collected from healthy volunteers was stained with 0.02% (w/v) proflavine hemisulfate and imaged to highlight leukocyte morphology with a 1.56 mm × 1.28 mm field of view (1540 ms total acquisition time). Oral squamous cells were also collected from healthy volunteers and stained with 0.01% (w/v) proflavine hemisulfate to demonstrate quantifiable subcellular features and an average nuclear to cytoplasmic ratio of 0.03 (n = 75), with a resolution of 0.31 μm pixels(-1).

  18. A widefield fluorescence microscope with a linear image sensor for image cytometry of biospecimens: Considerations for image quality optimization

    Energy Technology Data Exchange (ETDEWEB)

    Hutcheson, Joshua A.; Majid, Aneeka A.; Powless, Amy J.; Muldoon, Timothy J., E-mail: tmuldoon@uark.edu [Department of Biomedical Engineering, University of Arkansas, 120 Engineering Hall, Fayetteville, Arkansas 72701 (United States)

    2015-09-15

    Linear image sensors have been widely used in numerous research and industry applications to provide continuous imaging of moving objects. Here, we present a widefield fluorescence microscope with a linear image sensor used to image translating objects for image cytometry. First, a calibration curve was characterized for a custom microfluidic chamber over a span of volumetric pump rates. Image data were also acquired using 15 μm fluorescent polystyrene spheres on a slide with a motorized translation stage in order to match linear translation speed with line exposure periods to preserve the image aspect ratio. Aspect ratios were then calculated after imaging to ensure quality control of image data. Fluorescent beads were imaged in suspension flowing through the microfluidics chamber being pumped by a mechanical syringe pump at 16 μl min(-1) with a line exposure period of 150 μs. The line period was selected to acquire images of fluorescent beads with a 40 dB signal-to-background ratio. A motorized translation stage was then used to transport conventional glass slides of stained cellular biospecimens. Whole blood collected from healthy volunteers was stained with 0.02% (w/v) proflavine hemisulfate and imaged to highlight leukocyte morphology with a 1.56 mm × 1.28 mm field of view (1540 ms total acquisition time). Oral squamous cells were also collected from healthy volunteers and stained with 0.01% (w/v) proflavine hemisulfate to demonstrate quantifiable subcellular features and an average nuclear to cytoplasmic ratio of 0.03 (n = 75), with a resolution of 0.31 μm pixels(-1).
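Matching translation speed to the line exposure period, as described above, reduces to v = pixel size / line period. Using the abstract's figures (0.31 μm per pixel, 150 μs line period):

```python
# Aspect-ratio matching for a line-scan imager: the specimen must advance
# exactly one pixel's worth of distance during each line exposure period.
pixel_size_um = 0.31      # μm per pixel (from the abstract)
line_period_s = 150e-6    # 150 μs line exposure period

v_um_per_s = pixel_size_um / line_period_s
print(round(v_um_per_s, 1))  # → 2066.7 μm/s, i.e. about 2.1 mm/s
```

Any mismatch between this speed and the actual stage or flow velocity stretches or compresses the image along the scan axis, which is why the authors verify aspect ratios after imaging.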

  19. Range estimation techniques in single-station thunderstorm warning sensors based upon gated, wideband, magnetic direction finder technology

    Science.gov (United States)

    Pifer, Alburt E.; Hiscox, William L.; Cummins, Kenneth L.; Neumann, William T.

    1991-01-01

    Gated, wideband, magnetic direction finders (DFs) were originally designed to measure the bearing of cloud-to-ground lightning relative to the sensor. A recent addition to this device uses proprietary waveform discrimination logic to select return stroke signatures and certain range-dependent features in the waveform to provide an estimate of the range of flashes within 50 km. The enhanced ranging techniques, designed and developed for use in a single-station thunderstorm warning sensor, are discussed. Included are the results of ongoing evaluations being conducted under a variety of meteorological and geographic conditions.

  20. Low-light color image enhancement via iterative noise reduction using RGB/NIR sensor

    Science.gov (United States)

    Yamashita, Hiroki; Sugimura, Daisuke; Hamamoto, Takayuki

    2017-07-01

    We propose a method to enhance the color image of a low-light scene using a single sensor that simultaneously captures red, green, blue (RGB), and near-infrared (NIR) information. Typical image enhancement methods require two sensors to simultaneously capture color and NIR images. In contrast, our proposed system utilizes a single sensor but achieves accurate color image restoration. We divide the captured multispectral data into RGB and NIR information based on the spectral sensitivity of our imaging system. Using the NIR information for guidance, we reconstruct the corresponding color image based on a joint demosaicking and denoising technique. Subsequently, we restore the estimated color image iteratively using the constructed guidance image. Our experiments demonstrate the effectiveness of our method using synthetic data, and real raw data captured by our imaging system.

  1. Development of a handheld widefield hyperspectral imaging (HSI) sensor for standoff detection of explosive, chemical, and narcotic residues

    Science.gov (United States)

    Nelson, Matthew P.; Basta, Andrew; Patil, Raju; Klueva, Oksana; Treado, Patrick J.

    2013-05-01

The utility of hyperspectral imaging (HSI) passive chemical detection employing wide-field, standoff imaging continues to be advanced in detection applications. With the drive for reduced SWaP (Size, Weight, and Power) and for increased detection speed and sensitivity, a robust, user-friendly handheld platform extends the detection capabilities of the end user. In addition, easy-to-use handheld detectors could improve the effectiveness of locating and identifying threats while reducing risks to the individual. ChemImage Sensor Systems (CISS) has developed the HSI Aperio™ sensor for real-time, wide-area surveillance and standoff detection of explosives, chemical threats, and narcotics for use in both government and commercial contexts. Employing liquid crystal tunable filter technology, the HSI system has an intuitive user interface that produces automated detections and real-time display of threats, with an end-user-created library of threat signatures that is easily updated to accommodate new hazardous materials. Unlike existing detection technologies that often require close proximity for sensing, and so endanger operators and costly equipment, the handheld sensor allows the individual operator to detect threats from a safe distance. Uses of the sensor include locating production facilities of illegal drugs or IEDs by identifying materials on surfaces such as walls, floors, and doors, deposits on production tools, and residue on individuals. In addition, the sensor can be used for longer-range standoff applications such as hasty checkpoint or vehicle inspection of residue materials on surfaces, or bulk material identification. The CISS Aperio™ sensor has faster data collection, faster image processing, and increased detection capability compared to previous sensors.

  2. In-Vivo High Dynamic Range Vector Flow Imaging

    DEFF Research Database (Denmark)

    Villagómez Hoyos, Carlos Armando; Stuart, Matthias Bo; Jensen, Jørgen Arendt

    2015-01-01

Current vector flow systems are limited in their detectable range of blood flow velocities. Previous work on phantoms has shown that the velocity range can be extended using synthetic aperture directional beamforming combined with an adaptive multi-lag approach. This paper presents a first in vivo...

  3. A new type of remote sensors which allow directly forming certain statistical estimates of images

    Science.gov (United States)

    Podlaskin, Boris; Guk, Elena; Karpenko, Andrey

    2010-10-01

A new approach to the problems of statistical and structural pattern recognition and to signal-processing and image-analysis techniques is considered. These problems are extremely important for tasks solved by airborne and spaceborne remote sensing systems. The development of new remote sensors for image and signal processing is inherently connected with the possibility of statistical processing of images. Fundamentally new optoelectronic "Multiscan" sensors are suggested in the present paper. Such sensors make it possible to directly form certain statistical estimates that describe the different types of images completely enough. The sensors under discussion perform Lebesgue-Stieltjes signal integration rather than Riemann integration, which permits the creation of integral functionals for determining statistical features of images. The use of these integral functionals for image processing provides good agreement between the obtained statistical estimates and the required image information features. The Multiscan remote sensor makes it possible to create a set of integral moments of an input image, right up to high-order integral moments; to form a quantile representation of an input image, which provides a count-number-limited texture; and to form a median, which provides localisation of a low-contrast horizon line in fog, localisation of a water-flow boundary, etc. This work presents both a description of the design concept of the new remote sensor and the mathematical apparatus for creating input-image statistical features and integral functionals.

  4. Development of Thermal Infrared Sensor to Supplement Operational Land Imager

    Science.gov (United States)

    Shu, Peter; Waczynski, Augustyn; Kan, Emily; Wen, Yiting; Rosenberry, Robert

    2012-01-01

The thermal infrared sensor (TIRS) is a quantum well infrared photodetector (QWIP)-based instrument intended to supplement the Operational Land Imager (OLI) for the Landsat Data Continuity Mission (LDCM). The TIRS instrument is a far-infrared imager operating in pushbroom mode with two IR channels: 10.8 and 12 µm. The focal plane contains three 640 × 512 QWIP arrays mounted onto a silicon substrate. The readout integrated circuit (ROIC) addresses each pixel on the QWIP arrays and reads out the pixel value (signal). The ROIC is controlled by the focal plane electronics (FPE) by means of clock signals and bias voltage values. How the FPE is designed to control and interact with the TIRS focal plane assembly (FPA) is the basis for this work. The FPE must command and control the FPA, extract analog signals from the FPA, convert those analog signals to digital format, and send them via a serial link (USB) to a computer. The FPE accomplishes these functions by converting electrical power from generic power supplies to the bias power required by the FPA. The FPE also generates digital clocking signals and shifts the typical transistor-to-transistor logic (TTL) levels to the ±5 V required by the FPA. The FPE also uses an application-specific integrated circuit (ASIC) named System Image, Digitizing, Enhancing, Controlling, And Retrieving (SIDECAR) from Teledyne Corp. to generate the clocking patterns commanded by the user. The uniqueness of the FPE for TIRS lies in the fact that the TIRS FPA has three QWIP detector arrays, and all three detector arrays must operate in synchronization to avoid data skew while observing Earth from orbit. The observing scenario may be customized by uploading new control software to the SIDECAR.

  5. Adaptive optics instrument for long-range imaging. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, T.M.

    1998-06-01

    The science and history of imaging through a turbulent atmosphere is reviewed in detail. Traditional methods for reducing the effects of turbulence are presented. A simplified method for turbulence reduction called the Sheared Coherent Interferometric Photography (SCIP) method is presented. Implementation of SCIP is discussed along with experimental results. Limitations in the use of this method are discussed along with recommendations for future improvements.

  6. Security SVGA image sensor with on-chip video data authentication and cryptographic circuit

    Science.gov (United States)

    Stifter, P.; Eberhardt, K.; Erni, A.; Hofmann, K.

    2005-10-01

Security applications of sensors in a networked environment place strong demands on sensor authentication and secure data transmission, owing to the possibility of man-in-the-middle and address-spoofing attacks. A secure sensor system should therefore fulfil the three standard requirements of cryptography, namely data integrity, authentication, and non-repudiation. This paper presents a unique sensor development by AIM, the so-called SecVGA, which is a high-performance, monochrome (B/W) CMOS active pixel image sensor. The device is capable of capturing still and motion images with a resolution of 800x600 active pixels and converting the image into a digital data stream. The distinguishing feature of this development, in comparison to standard imaging sensors, is the on-chip cryptographic engine, which provides sensor authentication based on a one-way challenge/response protocol. The implemented protocol results in the exchange of a session key which secures the subsequent video data transmission. This is achieved by calculating a cryptographic checksum derived from a stateful hash value of the complete image frame. Every sensor contains an EEPROM memory cell for the non-volatile storage of a unique identifier. The imager is programmable via a two-wire I2C-compatible interface which controls the integration time, the active window size of the pixel array, the frame rate, and various operating modes including the authentication procedure.
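The authentication flow described above (a one-way challenge/response that yields a session key, followed by a stateful per-frame checksum) can be sketched generically. The SecVGA's actual algorithms, key lengths, and key handling are not given in the record, so every primitive and name below is an assumption chosen for illustration.

```python
import hashlib
import hmac
import os

# Hypothetical device secret; the real sensor stores a unique
# identifier in on-chip EEPROM.
DEVICE_SECRET = b"factory-provisioned-unique-id"

def sensor_response(challenge: bytes) -> tuple[bytes, bytes]:
    """Sensor proves knowledge of its secret and derives a session key."""
    response = hmac.new(DEVICE_SECRET, challenge, hashlib.sha256).digest()
    session_key = hashlib.sha256(DEVICE_SECRET + challenge).digest()
    return response, session_key

def frame_checksum(session_key: bytes, frame: bytes, prev_state: bytes) -> bytes:
    """Stateful checksum over a complete image frame, chained to the
    previous frame's state so replayed frames are detectable."""
    return hmac.new(session_key, prev_state + frame, hashlib.sha256).digest()

# Host side: issue a random challenge and verify the sensor's response.
challenge = os.urandom(16)
response, key = sensor_response(challenge)
expected = hmac.new(DEVICE_SECRET, challenge, hashlib.sha256).digest()
assert hmac.compare_digest(response, expected)
```

The chaining of each checksum to the previous state is one simple way to realize the "stateful hash value" mentioned in the abstract.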

  7. Low-cost compact thermal imaging sensors for body temperature measurement

    Science.gov (United States)

    Han, Myung-Soo; Han, Seok Man; Kim, Hyo Jin; Shin, Jae Chul; Ahn, Mi Sook; Kim, Hyung Won; Han, Yong Hee

    2013-06-01

This paper presents a 32x32 microbolometer thermal imaging sensor for human body temperature measurement. Wafer-level vacuum packaging technology allows a low-cost, compact imaging sensor chip. The microbolometer uses a V-W-O film as the sensing material, and the ROIC was designed in a 0.35-um CMOS process at UMC. A thermal image of a human face and a hand, taken with an f/1 lens, demonstrates the sensor's potential for commercial body temperature measurement.

  8. Discrimination between Sedimentary Rocks from Close-Range Visible and Very-Near-Infrared Images.

    Directory of Open Access Journals (Sweden)

    Susana Del Pozo

Full Text Available Variation in the mineral composition of rocks results in a change of their spectral response capable of being studied by imaging spectroscopy. This paper proposes the use of a low-cost handy sensor, a calibrated visible-very near infrared (VIS-VNIR) multispectral camera for the recognition of different geological formations. The spectral data was recorded by a Tetracam Mini-MCA-6 camera mounted on a field-based platform covering six bands in the spectral range of 0.530-0.801 µm. Twelve sedimentary formations were selected in the Rhône-Alpes region (France) to analyse the discrimination potential of this camera for rock types and close-range mapping applications. After proper corrections and data processing, a supervised classification of the multispectral data was performed trying to distinguish four classes: limestones, marlstones, vegetation and shadows. After a maximum-likelihood classification, results confirmed that this camera can be efficiently exploited to map limestone-marlstone alternations in geological formations with this mineral composition.

  9. Discrimination between Sedimentary Rocks from Close-Range Visible and Very-Near-Infrared Images.

    Science.gov (United States)

    Del Pozo, Susana; Lindenbergh, Roderik; Rodríguez-Gonzálvez, Pablo; Kees Blom, Jan; González-Aguilera, Diego

    2015-01-01

    Variation in the mineral composition of rocks results in a change of their spectral response capable of being studied by imaging spectroscopy. This paper proposes the use of a low-cost handy sensor, a calibrated visible-very near infrared (VIS-VNIR) multispectral camera for the recognition of different geological formations. The spectral data was recorded by a Tetracam Mini-MCA-6 camera mounted on a field-based platform covering six bands in the spectral range of 0.530-0.801 µm. Twelve sedimentary formations were selected in the Rhône-Alpes region (France) to analyse the discrimination potential of this camera for rock types and close-range mapping applications. After proper corrections and data processing, a supervised classification of the multispectral data was performed trying to distinguish four classes: limestones, marlstones, vegetation and shadows. After a maximum-likelihood classification, results confirmed that this camera can be efficiently exploited to map limestone-marlstone alternations in geological formations with this mineral composition.

  10. Optimal Sensor Placement for Multiple Target Positioning with Range-Only Measurements in Two-Dimensional Scenarios

    Directory of Open Access Journals (Sweden)

    Joaquin Aranda

    2013-08-01

    Full Text Available The problem of determining the optimal geometric configuration of a sensor network that will maximize the range-related information available for multiple target positioning is of key importance in a multitude of application scenarios. In this paper, a set of sensors that measures the distances between the targets and each of the receivers is considered, assuming that the range measurements are corrupted by white Gaussian noise, in order to search for the formation that maximizes the accuracy of the target estimates. Using tools from estimation theory and convex optimization, the problem is converted into that of maximizing, by proper choice of the sensor positions, a convex combination of the logarithms of the determinants of the Fisher Information Matrices corresponding to each of the targets in order to determine the sensor configuration that yields the minimum possible covariance of any unbiased target estimator. Analytical and numerical solutions are well defined and it is shown that the optimal configuration of the sensors depends explicitly on the constraints imposed on the sensor configuration, the target positions, and the probabilistic distributions that define the prior uncertainty in each of the target positions. Simulation examples illustrate the key results derived.

  11. Highly Sensitive and Wide-Dynamic-Range Multichannel Optical-Fiber pH Sensor Based on PWM Technique.

    Science.gov (United States)

    Khan, Md Rajibur Rahaman; Kang, Shin-Won

    2016-11-09

    In this study, we propose a highly sensitive multichannel pH sensor that is based on an optical-fiber pulse width modulation (PWM) technique. According to the optical-fiber PWM method, the received sensing signal's pulse width changes when the optical-fiber pH sensing-element of the array comes into contact with pH buffer solutions. The proposed optical-fiber PWM pH-sensing system offers a linear sensing response over a wide range of pH values from 2 to 12, with a high pH-sensing ability. The sensitivity of the proposed pH sensor is 0.46 µs/pH, and the correlation coefficient R² is approximately 0.997. Additional advantages of the proposed optical-fiber PWM pH sensor include a short/fast response-time of about 8 s, good reproducibility properties with a relative standard deviation (RSD) of about 0.019, easy fabrication, low cost, small size, reusability of the optical-fiber sensing-element, and the capability of remote sensing. Finally, the performance of the proposed PWM pH sensor was compared with that of potentiometric, optical-fiber modal interferometer, and optical-fiber Fabry-Perot interferometer pH sensors with respect to dynamic range width, linearity as well as response and recovery times. We observed that the proposed sensing systems have better sensing abilities than the above-mentioned pH sensors.
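The reported linear response (about 0.46 µs/pH over pH 2-12) implies a simple straight-line calibration between pulse width and pH. The sketch below fits and inverts such a line; the intercept and the synthetic calibration data are assumptions for illustration, not values from the paper.

```python
import numpy as np

SENSITIVITY_US_PER_PH = 0.46   # reported sensitivity, microseconds per pH unit
INTERCEPT_US = 10.0            # hypothetical pulse width at pH 0

def ph_from_pulse_width(width_us: float) -> float:
    """Invert the calibration line: pH = (width - intercept) / slope."""
    return (width_us - INTERCEPT_US) / SENSITIVITY_US_PER_PH

# Simulated calibration run: pulse widths for pH 2..12 plus small noise.
rng = np.random.default_rng(0)
ph_true = np.arange(2, 13)
widths = INTERCEPT_US + SENSITIVITY_US_PER_PH * ph_true + rng.normal(0, 0.01, ph_true.size)

slope, intercept = np.polyfit(ph_true, widths, 1)
print(f"fitted sensitivity: {slope:.3f} us/pH")   # close to 0.46
```

A least-squares fit like this is also how the reported correlation coefficient (R² ≈ 0.997) would be obtained from measured calibration points.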

  12. Hard-X-Ray/Soft-Gamma-Ray Imaging Sensor Assembly for Astronomy

    Science.gov (United States)

    Myers, Richard A.

    2008-01-01

An improved sensor assembly has been developed for astronomical imaging at photon energies ranging from 1 to 100 keV. The assembly includes a thallium-doped cesium iodide scintillator divided into pixels and coupled to an array of high-gain avalanche photodiodes (APDs). Optionally, the array of APDs can be operated without the scintillator to detect photons at energies below 15 keV. The array of APDs is connected to compact electronic readout circuitry that includes, among other things, 64 independent channels for detection of photons in various energy ranges, up to a maximum energy of 100 keV, at a count rate up to 3 kHz. The readout signals are digitized and processed by imaging software that performs "on-the-fly" analysis. The sensor assembly has been integrated into an imaging spectrometer, along with a pair of coded apertures (Fresnel zone plates) that are used in conjunction with the pixel layout to implement a shadow-masking technique to obtain relatively high spatial resolution without having to use extremely small pixels. Angular resolutions of about 20 arc-seconds have been measured. Thus, for example, the imaging spectrometer can be used to (1) determine both the energy spectrum of a distant x-ray source and the angular deviation of the source from the nominal line of sight of an x-ray telescope in which the spectrometer is mounted or (2) study the spatial and temporal development of solar flares, repeating γ-ray bursters, and other phenomena that emit transient radiation in the hard-X-ray/soft-γ-ray region of the electromagnetic spectrum.

  13. Improving the Ability of Image Sensors to Detect Faint Stars and Moving Objects Using Image Deconvolution Techniques

    Directory of Open Access Journals (Sweden)

    Octavi Fors

    2010-03-01

Full Text Available In this paper we show how image deconvolution techniques can increase the ability of image sensors, such as CCD imagers, to detect faint stars or faint orbital objects (small satellites and space debris). In the case of faint stars, we show that this benefit is equivalent to doubling the quantum efficiency of the image sensor used, or to increasing the effective telescope aperture by more than 30%, without decreasing the astrometric precision or introducing artificial bias. In the case of orbital objects, the deconvolution technique can double the signal-to-noise ratio of the image, which helps to discover and monitor dangerous objects such as space debris or lost satellites. The benefits obtained using CCD detectors can be extrapolated to any kind of image sensor.

  14. Improving the ability of image sensors to detect faint stars and moving objects using image deconvolution techniques.

    Science.gov (United States)

    Fors, Octavi; Núñez, Jorge; Otazu, Xavier; Prades, Albert; Cardinal, Robert D

    2010-01-01

In this paper we show how image deconvolution techniques can increase the ability of image sensors, such as CCD imagers, to detect faint stars or faint orbital objects (small satellites and space debris). In the case of faint stars, we show that this benefit is equivalent to doubling the quantum efficiency of the image sensor used, or to increasing the effective telescope aperture by more than 30%, without decreasing the astrometric precision or introducing artificial bias. In the case of orbital objects, the deconvolution technique can double the signal-to-noise ratio of the image, which helps to discover and monitor dangerous objects such as space debris or lost satellites. The benefits obtained using CCD detectors can be extrapolated to any kind of image sensor.
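As a concrete example of the kind of image deconvolution discussed here, the classic Richardson-Lucy algorithm can recover a blurred point source (a "faint star") given the point spread function. This is a generic sketch; the paper's own deconvolution method may differ.

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(image, psf, iterations=50):
    """Iterative Richardson-Lucy deconvolution of `image` by `psf`."""
    est = np.full_like(image, image.mean(), dtype=float)
    psf_mirror = psf[::-1, ::-1]
    for _ in range(iterations):
        conv = fftconvolve(est, psf, mode="same")
        ratio = image / np.maximum(conv, 1e-12)   # avoid division by zero
        est *= fftconvolve(ratio, psf_mirror, mode="same")
    return est

# Toy example: blur a point source and recover it.
true = np.zeros((31, 31))
true[15, 15] = 100.0
psf = np.ones((5, 5)) / 25.0          # uniform 5x5 blur kernel
blurred = fftconvolve(true, psf, mode="same")
restored = richardson_lucy(blurred, psf)
```

After a few tens of iterations the flux spread by the PSF is re-concentrated at the source position, which is what raises the effective signal-to-noise ratio of faint point sources.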

  15. Adaptive sensing and optimal power allocation for wireless video sensors with sigma-delta imager.

    Science.gov (United States)

Marijan, Malisa; Demirkol, Ilker; Maricić, Danijel; Sharma, Gaurav; Ignjatović, Zeljko

    2010-10-01

    We consider optimal power allocation for wireless video sensors (WVSs), including the image sensor subsystem in the system analysis. By assigning a power-rate-distortion (P-R-D) characteristic for the image sensor, we build a comprehensive P-R-D optimization framework for WVSs. For a WVS node operating under a power budget, we propose power allocation among the image sensor, compression, and transmission modules, in order to minimize the distortion of the video reconstructed at the receiver. To demonstrate the proposed optimization method, we establish a P-R-D model for an image sensor based upon a pixel level sigma-delta (Σ∆) image sensor design that allows investigation of the tradeoff between the bit depth of the captured images and spatio-temporal characteristics of the video sequence under the power constraint. The optimization results obtained in this setting confirm that including the image sensor in the system optimization procedure can improve the overall video quality under power constraint and prolong the lifetime of the WVSs. In particular, when the available power budget for a WVS node falls below a threshold, adaptive sensing becomes necessary to ensure that the node communicates useful information about the video content while meeting its power budget.
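The allocation idea, splitting a fixed power budget among the sensing, compression, and transmission modules so as to minimize end-to-end distortion, can be illustrated with a toy convex distortion model and a grid search. The distortion function below is an invented placeholder with diminishing returns per module, not the paper's P-R-D model.

```python
import numpy as np

def distortion(p_sense, p_comp, p_tx):
    """Hypothetical convex model: each module lowers distortion with
    diminishing returns as it receives more power."""
    return 1.0 / (0.1 + p_sense) + 1.0 / (0.1 + p_comp) + 1.0 / (0.1 + p_tx)

budget = 3.0
grid = np.linspace(0.01, budget, 60)

# Exhaustive grid search over the sensing/compression split; the
# transmitter gets whatever power remains.
best = min(((ps, pc, budget - ps - pc)
            for ps in grid for pc in grid
            if ps + pc < budget),
           key=lambda alloc: distortion(*alloc))
```

With a symmetric convex model the optimum is an (approximately) equal split; with an asymmetric model reflecting real module characteristics, the optimizer would shift power toward the module with the steepest distortion reduction, which is the adaptive behaviour the paper exploits.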

  16. Indoor pedestrian navigation using foot-mounted IMU and portable ultrasound range sensors.

    Science.gov (United States)

    Girard, Gabriel; Côté, Stéphane; Zlatanova, Sisi; Barette, Yannick; St-Pierre, Johanne; van Oosterom, Peter

    2011-01-01

Many solutions have been proposed for indoor pedestrian navigation. Some rely on pre-installed sensor networks, which offer good accuracy but are limited to areas that have been prepared for that purpose, requiring an expensive and possibly time-consuming process; such methods are also inappropriate for navigation in emergency situations, since the power supply may be disrupted. Other types of solutions track the user without requiring a prepared environment, but they may have low accuracy. Offline tracking has been proposed to increase accuracy; however, this prevents users from knowing their position in real time. This paper describes a real-time indoor navigation system that does not require prepared building environments and provides tracking accuracy superior to previously described tracking methods. The system uses a combination of four techniques: a foot-mounted IMU (Inertial Motion Unit), ultrasonic ranging, particle filtering, and model-based navigation. The purpose of the project is to combine these four well-known techniques in a novel way to provide better indoor tracking results for pedestrians.
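The fusion of dead-reckoned IMU steps with ultrasonic range corrections can be sketched as a minimal particle filter. The motion model, beacon position, and noise levels below are illustrative assumptions, not the system described in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 2000
beacon = np.array([5.0, 0.0])                  # known ultrasonic reflector
particles = rng.normal([0.0, 0.0], 1.0, (N, 2))
weights = np.full(N, 1.0 / N)

def predict(particles, step, noise=0.05):
    """Propagate particles by the IMU-estimated step vector plus noise."""
    return particles + step + rng.normal(0, noise, particles.shape)

def update(particles, weights, measured_range, noise=0.1):
    """Reweight particles by the likelihood of the ultrasonic range."""
    predicted = np.linalg.norm(particles - beacon, axis=1)
    w = weights * np.exp(-0.5 * ((predicted - measured_range) / noise) ** 2)
    return w / w.sum()

def resample(particles, weights):
    idx = rng.choice(len(particles), len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

# Simulate a pedestrian walking in +x; fuse steps with range measurements.
true_pos = np.array([0.0, 0.0])
for _ in range(20):
    true_pos = true_pos + np.array([0.2, 0.0])
    particles = predict(particles, np.array([0.2, 0.0]))
    z = np.linalg.norm(true_pos - beacon) + rng.normal(0, 0.1)
    weights = update(particles, weights, z)
    particles, weights = resample(particles, weights)

estimate = particles.mean(axis=0)
```

The range updates bound the drift that pure dead reckoning would accumulate, which is the core benefit of combining the IMU with the ultrasonic sensors.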

  17. An ultrasensitive method of real time pH monitoring with complementary metal oxide semiconductor image sensor.

    Science.gov (United States)

    Devadhasan, Jasmine Pramila; Kim, Sanghyo

    2015-02-09

CMOS sensors are becoming a powerful tool in the biological and chemical fields. In this work, we introduce a new approach to quantifying various pH solutions with a CMOS image sensor. The CMOS image sensor based pH measurement produces high-accuracy analysis, making it a truly portable and user-friendly system. A pH-indicator-blended hydrogel matrix was fabricated as a thin film for accurate color development. A distinct red, green and blue (RGB) color change develops in the hydrogel film when various pH solutions (pH 1-14) are applied. A semi-quantitative pH estimate can be acquired by visual readout. Further, the CMOS image sensor captures the RGB color intensity of the film, and the hue value is converted into digital numbers with the aid of an analog-to-digital converter (ADC) to determine the pH range of a solution. A chromaticity diagram and Euclidean distance represent the RGB color space and the differentiation of pH ranges, respectively. This technique is applicable to sensing various toxic chemicals and chemical vapors in situ. Ultimately, the entire approach can be integrated into a smartphone and operated in a user-friendly manner. Copyright © 2014 Elsevier B.V. All rights reserved.
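The classification step, matching a measured RGB value to the nearest reference color by Euclidean distance, can be sketched as follows; the reference RGB values are invented placeholders, not the paper's calibration data.

```python
import numpy as np

# Hypothetical calibration table: reference RGB colour per pH range.
reference_rgb = {
    "acidic (pH 1-5)":    np.array([200.0,  60.0,  40.0]),
    "neutral (pH 6-8)":   np.array([ 90.0, 160.0,  70.0]),
    "alkaline (pH 9-14)": np.array([ 50.0,  80.0, 190.0]),
}

def classify_ph(rgb):
    """Return the pH range whose reference colour is nearest (Euclidean)."""
    return min(reference_rgb, key=lambda k: np.linalg.norm(rgb - reference_rgb[k]))

print(classify_ph(np.array([195.0, 70.0, 50.0])))   # nearest to the acidic reference
```

In the actual system the input to this step would be the sensor's digitized mean RGB/hue reading of the hydrogel film rather than a hard-coded vector.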

  18. Wavelength-Scanning SPR Imaging Sensors Based on an Acousto-Optic Tunable Filter and a White Light Laser

    Directory of Open Access Journals (Sweden)

    Youjun Zeng

    2017-01-01

Full Text Available A fast surface plasmon resonance (SPR) imaging biosensor system based on wavelength interrogation using an acousto-optic tunable filter (AOTF) and a white light laser is presented. The system combines the merits of the wide dynamic detection range and high sensitivity offered by the spectral approach with multiplexed high-throughput data collection and a two-dimensional (2D) biosensor array. The key feature is the use of the AOTF to realize a wavelength scan from a white laser source and thus achieve fast tracking of the SPR dip movement caused by target molecules binding to the sensor surface. Experimental results show that the system is capable of completing an SPR dip measurement within 0.35 s. To the best of our knowledge, this is the fastest time ever reported in the literature for imaging spectral interrogation. Based on a spectral window with a width of approximately 100 nm, a dynamic detection range of 4.63 × 10⁻² refractive index units (RIU) and a resolution of 1.27 × 10⁻⁶ RIU, achieved in a 2D-array sensor, are reported here. The spectral SPR imaging sensor scheme is capable of performing fast high-throughput detection of biomolecular interactions from 2D sensor arrays. The design has no mechanical moving parts, thus making the scheme completely solid-state.

  19. A wireless sensor network for vineyard monitoring that uses image processing.

    Science.gov (United States)

    Lloret, Jaime; Bosch, Ignacio; Sendra, Sandra; Serrano, Arturo

    2011-01-01

The first step in detecting when a vineyard has any type of deficiency, pest or disease is to observe its stems, its grapes and/or its leaves. Placing a sensor on each leaf of every vine is obviously not feasible in terms of cost and deployment. We should thus look for new methods to detect these symptoms precisely and economically. In this paper, we present a wireless sensor network where each sensor node takes images of the field and internally uses image processing techniques to detect any unusual status in the leaves. Such a symptom could be caused by a deficiency, pest, disease or other harmful agent. When it is detected, the sensor node sends a message to a sink node through the wireless sensor network in order to notify the farmer of the problem. The wireless sensor uses the IEEE 802.11 a/b/g/n standard, which allows connections over large distances in open air. This paper describes the wireless sensor network design, the wireless sensor deployment, how the node processes the images in order to monitor the vineyard, and the sensor network traffic obtained from a test bed performed in a flat vineyard in Spain. Although the system is not able to distinguish between deficiencies, pests, diseases or other harmful agents, a symptom image database and a neural network could be added in order to learn from experience and provide an accurate problem diagnosis.

  20. Miniature large range multi-axis force-torque sensor for biomechanical applications

    NARCIS (Netherlands)

    Brookhuis, Robert Anton; Sanders, Remco G.P.; Ma, Kechun; Lammerink, Theodorus S.J.; de Boer, Meint J.; Krijnen, Gijsbertus J.M.; Wiegerink, Remco J.

    2015-01-01

    A miniature force sensor for the measurement of forces and moments at a human fingertip is designed and realized. Thin silicon pillars inside the sensor provide in-plane guidance for shear force measurement and provide the spring constant in normal direction. A corrugated silicon ring around the

  1. Development of a 55 μm pitch 8 inch CMOS image sensor for the high resolution NDT application

    Science.gov (United States)

    Kim, M. S.; Kim, G.; Cho, G.; Kim, D.

    2016-11-01

A CMOS image sensor (CIS) with a large area for high-resolution X-ray imaging was designed. The sensor has an active area of 125 × 125 mm² comprising 2304 × 2304 pixels with a pixel size of 55 × 55 μm². First-batch samples were fabricated using an 8 inch silicon CMOS image sensor process with a stitching method. In order to evaluate the performance of the first-batch samples, an electro-optical test and an X-ray test after coupling with an image intensifier screen were performed. The primary results showed that the performance of the manufactured sensors was limited by a large stray capacitance from the long path between the analog multiplexer on the chip and the bank ADC on the data acquisition board. The measured speed and dynamic range were limited to 12 frames per second and 55 dB, respectively, but other parameters such as the MTF, NNPS and DQE showed good results, as designed. Based on this study, a new X-ray CIS with ~50 μm pitch and ~150 cm² active area will be designed for high-resolution X-ray NDT equipment for semiconductor and PCB inspections, etc.

  2. Optical and Electric Multifunctional CMOS Image Sensors for On-Chip Biosensing Applications

    Directory of Open Access Journals (Sweden)

    Kiyotaka Sasagawa

    2010-12-01

    Full Text Available In this review, the concept, design, performance, and a functional demonstration of multifunctional complementary metal-oxide-semiconductor (CMOS image sensors dedicated to on-chip biosensing applications are described. We developed a sensor architecture that allows flexible configuration of a sensing pixel array consisting of optical and electric sensing pixels, and designed multifunctional CMOS image sensors that can sense light intensity and electric potential or apply a voltage to an on-chip measurement target. We describe the sensors’ architecture on the basis of the type of electric measurement or imaging functionalities.

  3. The influence of sensor and flight parameters on texture in radar images

    Science.gov (United States)

    Frost, V. S.; Shanmugan, K. S.; Holtzman, J. C.

    1984-01-01

    Texture is known to be important in the analysis of radar images for geologic applications. It has previously been shown that texture features derived from the grey level co-occurrence matrix (GLCM) can be used to separate large scale texture in radar images. Here the influence of sensor parameters, specifically the spatial and radiometric resolution and flight parameters, i.e., the orientation of the surface structure relative to the sensor, on the ability to classify texture based on the GLCM features is investigated. It was found that changing these sensor and flight parameters greatly affects the usefulness of the GLCM for classifying texture on radar images.
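The GLCM features referred to here are computed from a matrix that counts how often pairs of grey levels co-occur at a fixed pixel offset. A hand-rolled sketch (with illustrative offset and quantisation choices) shows two classic features separating a smooth ramp from a high-contrast checkerboard:

```python
import numpy as np

def glcm(image, dx=1, dy=0, levels=8):
    """Normalised co-occurrence counts of grey levels at offset (dy, dx)."""
    m = np.zeros((levels, levels))
    h, w = image.shape
    for y in range(h - dy):
        for x in range(w - dx):
            m[image[y, x], image[y + dy, x + dx]] += 1
    return m / m.sum()

def contrast(p):
    """Haralick contrast: large when co-occurring levels differ strongly."""
    i, j = np.indices(p.shape)
    return np.sum(p * (i - j) ** 2)

def energy(p):
    """Angular second moment: large for uniform, repetitive textures."""
    return np.sum(p ** 2)

# A smooth grey ramp vs. a checkerboard, both quantised to 8 levels:
levels = 8
ramp = np.tile(np.linspace(0, levels - 1, 16).astype(int), (16, 1))
checker = np.indices((16, 16)).sum(axis=0) % 2 * (levels - 1)
assert contrast(glcm(checker)) > contrast(glcm(ramp))
```

In a classifier, several such features (contrast, energy, correlation, entropy) computed at multiple offsets form the texture feature vector; the abstract's point is that sensor resolution and look direction change these co-occurrence statistics and hence the separability of the classes.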

  4. Lookup Table Hough Transform for Real Time Range Image Segmentation and Featureless Co-Registration

    NARCIS (Netherlands)

    Gorte, B.G.H.; Sithole, G.

    2012-01-01

    The paper addresses range image segmentation, particularly of data recorded by range cameras, such as the Microsoft Kinect and the Mesa Swissranger SR4000. These devices record range images at video frame rates and allow for acqui-sition of 3-dimensional measurement sequences that can be used for 3D

  5. Discrimination between sedimentary rocks from close-range visible and very-near-infrared images

    NARCIS (Netherlands)

    Pozo, Susana Del; Lindenbergh, R.C.; Rodríguez-Gonzálvez, Pablo; Blom, J.C.; González-Aguilera, Diego

    2015-01-01

    Variation in the mineral composition of rocks results in a change of their spectral response capable of being studied by imaging spectroscopy. This paper proposes the use of a low-cost handy sensor, a calibrated visible-very near infrared (VIS-VNIR) multispectral camera for the recognition of

  6. A Monitoring System for Laying Hens That Uses a Detection Sensor Based on Infrared Technology and Image Pattern Recognition

    Science.gov (United States)

    Zaninelli, Mauro; Redaelli, Veronica; Luzi, Fabio; Bontempo, Valentino; Dell’Orto, Vittorio; Savoini, Giovanni

    2017-01-01

In Italy, organic egg production farms use free-range housing systems with a large outdoor area and a flock of no more than 500 hens. With additional devices and/or farming procedures, the whole flock could be forced to stay in the outdoor area for a limited time of the day. As a consequence, ozone treatments of housing areas could be performed in order to reduce the levels of atmospheric ammonia and bacterial load without risks, due to its toxicity, for either hens or workers. However, an automatic monitoring system, and a sensor able to detect the presence of animals, would be necessary. For this purpose, a first sensor was developed, but some limits related to the time necessary to detect a hen were observed. In this study, significant improvements to this sensor are proposed. They were achieved by an image pattern recognition technique applied to thermographic images acquired from the housing system. An experimental group of seven laying hens was selected for the tests, carried out over three weeks. The first week was used to set up the sensor. Different templates to be used for the pattern recognition were studied, and different floor temperature shifts were investigated. At the end of these evaluations, a template of elliptical shape, with sizes of 135 × 63 pixels, was chosen. Furthermore, a temperature shift of one degree was selected to calculate, for each image, a color background threshold to apply in the following field tests. The obtained results showed an improvement in the sensor's detection accuracy, which reached sensitivity and specificity values of 95.1% and 98.7%. In addition, the time necessary to detect a hen, or classify a case, was reduced to two seconds. This result could allow the sensor to cover a larger area of the housing system. Thus, the resulting monitoring system could make it possible to perform the sanitary treatments without risk to either animals or humans. PMID:28538654

  7. A Monitoring System for Laying Hens That Uses a Detection Sensor Based on Infrared Technology and Image Pattern Recognition.

    Science.gov (United States)

    Zaninelli, Mauro; Redaelli, Veronica; Luzi, Fabio; Bontempo, Valentino; Dell'Orto, Vittorio; Savoini, Giovanni

    2017-05-24

In Italy, organic egg production farms use free-range housing systems with a large outdoor area and a flock of no more than 500 hens. With additional devices and/or farming procedures, the whole flock could be forced to stay in the outdoor area for a limited time of the day. Ozone treatments of the housing areas could then be performed to reduce the levels of atmospheric ammonia and bacterial load without risk, given ozone's toxicity, to either hens or workers. However, an automatic monitoring system, and a sensor able to detect the presence of animals, would be necessary. A first sensor was developed for this purpose, but limits related to the time needed to detect a hen were observed. In this study, significant improvements to this sensor are proposed. They were achieved with an image pattern recognition technique applied to thermographic images acquired from the housing system. An experimental group of seven laying hens was selected for the tests, which were carried out over three weeks. The first week was used to set up the sensor: different templates for the pattern recognition were studied and different floor temperature shifts were investigated. Based on these evaluations, a template of elliptical shape, 135 × 63 pixels in size, was chosen, and a temperature shift of one degree was selected to calculate, for each image, a color background threshold to apply in the subsequent field tests. Results showed an improvement in the sensor's detection accuracy, which reached a sensitivity of 95.1% and a specificity of 98.7%. In addition, the time needed to detect a hen, or classify a case, was reduced to two seconds. This reduction could allow the sensor to monitor a larger area of the housing system, so that the resulting monitoring system could permit sanitary treatments to be performed without risk to either animals or humans.
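The detection pipeline this abstract describes (binarize each thermal frame against the floor temperature plus a one-degree shift, then match an elliptical template) can be sketched as below. This is a minimal illustration, not the authors' code: the frame size, the small 9 × 5 template, and the 0.6 match threshold are assumptions for the demo; the paper's template is 135 × 63 pixels.

```python
import numpy as np

def elliptical_template(h, w):
    """Binary ellipse inscribed in an h x w box (the paper uses 135 x 63)."""
    yy, xx = np.mgrid[0:h, 0:w]
    return ((((yy - h / 2) / (h / 2)) ** 2
             + ((xx - w / 2) / (w / 2)) ** 2) <= 1.0).astype(float)

def detect_hen(frame, floor_temp, template, shift=1.0, match_thresh=0.6):
    """Binarize the thermal frame at floor_temp + shift (the paper's
    one-degree background threshold), slide the template over the mask,
    and return (detected, best score, best top-left position)."""
    mask = (frame > floor_temp + shift).astype(float)
    t_norm = template / max(template.sum(), 1.0)
    th, tw = template.shape
    best_score, best_pos = 0.0, None
    for i in range(frame.shape[0] - th + 1):
        for j in range(frame.shape[1] - tw + 1):
            score = float((mask[i:i + th, j:j + tw] * t_norm).sum())
            if score > best_score:
                best_score, best_pos = score, (i, j)
    return best_score >= match_thresh, best_score, best_pos

# Synthetic demo: a hen-shaped warm blob on a 20 degree floor.
frame = np.full((40, 40), 20.0)
tpl = elliptical_template(9, 5)
frame[10:19, 5:10] += 15.0 * tpl
print(detect_hen(frame, 20.0, tpl)[::2])  # -> (True, (10, 5))
```

The brute-force sliding loop is for clarity only; a real implementation would use a vectorized cross-correlation.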

  8. Multi-wavelength laser sensor surface for high frame rate imaging refractometry (Conference Presentation)

    Science.gov (United States)

    Kristensen, Anders; Vannahme, Christoph; Sørensen, Kristian T.; Dufva, Martin

    2016-09-01

A highly sensitive distributed feedback (DFB) dye laser sensor for high frame rate imaging refractometry without moving parts is presented. The laser sensor surface comprises areas of different grating periods. Imaging in two dimensions of space is enabled by analyzing laser light from all areas in parallel with an imaging spectrometer. Refractive index imaging of a 2 mm by 2 mm surface is demonstrated with a spatial resolution of 10 μm, a detection limit of 8 × 10−6 RIU, and a frame rate of 12 Hz, limited by the CCD camera. Label-free imaging of dissolution dynamics is demonstrated.
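The sensor signal in such a refractometer is a lasing-wavelength shift, converted to a refractive index change through the grating's sensitivity. A minimal sketch of that conversion; the 20 nm/RIU sensitivity is an assumed, illustrative value, not a number from the paper:

```python
# Convert a measured DFB laser wavelength shift to a refractive index change.
# The sensitivity S (nm per RIU) is grating-specific; 20 nm/RIU is an
# assumed value for illustration only.
def index_change(delta_lambda_nm, sensitivity_nm_per_riu=20.0):
    return delta_lambda_nm / sensitivity_nm_per_riu

# At S = 20 nm/RIU, resolving a 0.16 pm wavelength shift would correspond
# to a detection limit of 8e-6 RIU, the figure quoted in the abstract.
print(index_change(0.16e-3))
```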

  9. CMOS Imaging of Pin-Printed Xerogel-Based Luminescent Sensor Microarrays.

    Science.gov (United States)

    Yao, Lei; Yung, Ka Yi; Khan, Rifat; Chodavarapu, Vamsy P; Bright, Frank V

    2010-12-01

We present the design and implementation of a luminescence-based miniaturized multisensor system using pin-printed xerogel materials which act as host media for chemical recognition elements. We developed a CMOS imager integrated circuit (IC) to image the luminescence response of the xerogel-based sensor array. The imager IC uses a 26 × 20 (520 elements) array of active pixel sensors, and each active pixel includes a high-gain phototransistor to convert the detected optical signals into electrical currents. The imager includes a correlated double sampling circuit and pixel address/digital control circuit; the image data are read out as a coded serial signal. The sensor system uses a light-emitting diode (LED) to excite the target analyte responsive luminophores doped within discrete xerogel-based sensor elements. As a prototype, we developed a 4 × 4 (16 elements) array of oxygen (O2) sensors. Each group of 4 sensor elements in the array (arranged in a row) is designed to provide a different and specific sensitivity to the target gaseous O2 concentration. This property of multiple sensitivities is achieved by using a strategic mix of two oxygen sensitive luminophores ([Ru(dpp)3]2+ and [Ru(bpy)3]2+) in each pin-printed xerogel sensor element. The CMOS imager consumes an average power of 8 mW operating at 1 kHz sampling frequency driven at 5 V. The developed prototype demonstrates a low-cost, miniaturized luminescence multisensor system.

  10. Design and implementation of range-gated underwater laser imaging system

    Science.gov (United States)

    Ge, Wei-long; Zhang, Xiao-hui

    2014-02-01

A range-gated underwater laser imaging system is designed and implemented in this article; it is made up of a laser illumination subsystem, a photoelectric imaging subsystem, and a control subsystem. An underwater target drone detection experiment was performed: a target 40 m from the range-gated underwater laser imaging system was imaged in a pool whose water attenuation coefficient was 0.159 m−1. Experimental results show that the range-gated underwater laser imaging system can detect underwater objects effectively.

  11. Technical guidance for the development of a solid state image sensor for human low vision image warping

    Science.gov (United States)

    Vanderspiegel, Jan

    1994-01-01

This report surveys different technologies and approaches to realize sensors for image warping. The goal is to study the feasibility, technical aspects, and limitations of making an electronic camera with special geometries which implements certain transformations for image warping. This work was inspired by the research done by Dr. Juday at NASA Johnson Space Center on image warping. The study has looked into different solid-state technologies to fabricate image sensors. It is found that among the available technologies, CMOS is preferred over CCD technology. CMOS provides more flexibility to design different functions into the sensor, is more widely available, and is a lower cost solution. By using an architecture with row and column decoders, one has the added flexibility of addressing pixels at random or reading out only part of the image.

  12. Low-Power Radio and Image-Sensor Package Project

    Data.gov (United States)

    National Aeronautics and Space Administration — One of the most effective sensor modalities for situational awareness is imagery. While typically high bandwidth and relegated to analog wireless communications,...

  13. Proximity gettering technology for advanced CMOS image sensors using carbon cluster ion-implantation technique. A review

    Energy Technology Data Exchange (ETDEWEB)

    Kurita, Kazunari; Kadono, Takeshi; Okuyama, Ryousuke; Shigemastu, Satoshi; Hirose, Ryo; Onaka-Masada, Ayumi; Koga, Yoshihiro; Okuda, Hidehiko [SUMCO Corporation, Saga (Japan)

    2017-07-15

A new technique is described for manufacturing advanced silicon wafers with the highest capability yet reported for gettering transition metal, oxygen, and hydrogen impurities in CMOS image sensor fabrication processes. Carbon and hydrogen elements are localized in the projection range of the silicon wafer by implantation of ion clusters from a hydrocarbon molecular gas source. Furthermore, these wafers can getter oxygen impurities out-diffused to device active regions from a Czochralski grown silicon wafer substrate to the carbon cluster ion projection range during heat treatment. Therefore, they can reduce the formation of transition metal- and oxygen-related defects in the device active regions and improve electrical performance characteristics, such as the dark current, white spot defects, pn-junction leakage current, and image lag characteristics. The new technique enables the formation of high-gettering-capability sinks for transition metal, oxygen, and hydrogen impurities under the device active regions of CMOS image sensors. The wafers formed by this technique have the potential to significantly improve the electrical performance characteristics of advanced CMOS image sensors. (copyright 2017 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  14. Film cameras or digital sensors? The challenge ahead for aerial imaging

    Science.gov (United States)

    Light, D.L.

    1996-01-01

Cartographic aerial cameras continue to play the key role in producing quality products for the aerial photography business, and specifically for the National Aerial Photography Program (NAPP). One NAPP photograph taken with cameras capable of 39 lp/mm system resolution can contain the equivalent of 432 million pixels at 11 μm spot size, and the cost is less than $75 per photograph to scan and output the pixels on a magnetic storage medium. On the digital side, solid state charge coupled device linear and area arrays can yield quality resolution (7 to 12 μm detector size) and a broader dynamic range. If linear arrays are to compete with film cameras, they will require precise attitude and positioning of the aircraft so that the lines of pixels can be unscrambled and put into a suitable homogeneous scene that is acceptable to an interpreter. Area arrays need to be much larger than currently available to image scenes competitive in size with film cameras. Analysis of the relative advantages and disadvantages of the two systems show that the analog approach is more economical at present. However, as arrays become larger, attitude sensors become more refined, global positioning system coordinate readouts become commonplace, and storage capacity becomes more affordable, the digital camera may emerge as the imaging system for the future. Several technical challenges must be overcome if digital sensors are to advance to where they can support mapping, charting, and geographic information system applications.
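The 432-million-pixel figure can be checked directly, assuming the standard 23 cm × 23 cm (9 in × 9 in) aerial film frame digitized at the stated 11 μm spot size:

```python
# Reproduce the abstract's 432-million-pixel figure: a standard 9 x 9 inch
# (approx. 228.6 mm square) aerial film frame scanned at an 11 micrometer spot.
frame_mm = 228.6
spot_um = 11.0
pixels_per_side = frame_mm * 1000 / spot_um   # ~20,782 pixels per side
total_pixels = pixels_per_side ** 2
print(round(total_pixels / 1e6))              # -> 432 (million pixels)
```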

  15. Thin-Film Quantum Dot Photodiode for Monolithic Infrared Image Sensors.

    Science.gov (United States)

    Malinowski, Pawel E; Georgitzikis, Epimitheas; Maes, Jorick; Vamvaka, Ioanna; Frazzica, Fortunato; Van Olmen, Jan; De Moor, Piet; Heremans, Paul; Hens, Zeger; Cheyns, David

    2017-12-10

Imaging in the infrared wavelength range has been fundamental in scientific, military and surveillance applications. Currently, it is a crucial enabler of new industries such as autonomous mobility (for obstacle detection), augmented reality (for eye tracking) and biometrics. Ubiquitous deployment of infrared cameras (on a scale similar to visible cameras) is however prevented by high manufacturing cost and low resolution related to the need of using image sensors based on flip-chip hybridization. One way to enable monolithic integration is by replacing expensive, small-scale III-V-based detector chips with narrow bandgap thin-films compatible with 8- and 12-inch full-wafer processing. This work describes a CMOS-compatible pixel stack based on lead sulfide quantum dots (PbS QD) with tunable absorption peak. Photodiode with a 150-nm thick absorber in an inverted architecture shows dark current of 10−6 A/cm² at −2 V reverse bias and EQE above 20% at 1440 nm wavelength. Optical modeling for top illumination architecture can improve the contact transparency to 70%. Additional cooling (193 K) can improve the sensitivity to 60 dB. This stack can be integrated on a CMOS ROIC, enabling order-of-magnitude cost reduction for infrared sensors.

  16. Thin-Film Quantum Dot Photodiode for Monolithic Infrared Image Sensors

    Science.gov (United States)

    Georgitzikis, Epimitheas; Vamvaka, Ioanna; Frazzica, Fortunato; Van Olmen, Jan; De Moor, Piet; Heremans, Paul; Hens, Zeger; Cheyns, David

    2017-01-01

    Imaging in the infrared wavelength range has been fundamental in scientific, military and surveillance applications. Currently, it is a crucial enabler of new industries such as autonomous mobility (for obstacle detection), augmented reality (for eye tracking) and biometrics. Ubiquitous deployment of infrared cameras (on a scale similar to visible cameras) is however prevented by high manufacturing cost and low resolution related to the need of using image sensors based on flip-chip hybridization. One way to enable monolithic integration is by replacing expensive, small-scale III–V-based detector chips with narrow bandgap thin-films compatible with 8- and 12-inch full-wafer processing. This work describes a CMOS-compatible pixel stack based on lead sulfide quantum dots (PbS QD) with tunable absorption peak. Photodiode with a 150-nm thick absorber in an inverted architecture shows dark current of 10−6 A/cm2 at −2 V reverse bias and EQE above 20% at 1440 nm wavelength. Optical modeling for top illumination architecture can improve the contact transparency to 70%. Additional cooling (193 K) can improve the sensitivity to 60 dB. This stack can be integrated on a CMOS ROIC, enabling order-of-magnitude cost reduction for infrared sensors. PMID:29232871

  17. Thin-Film Quantum Dot Photodiode for Monolithic Infrared Image Sensors

    Directory of Open Access Journals (Sweden)

    Pawel E. Malinowski

    2017-12-01

Full Text Available Imaging in the infrared wavelength range has been fundamental in scientific, military and surveillance applications. Currently, it is a crucial enabler of new industries such as autonomous mobility (for obstacle detection), augmented reality (for eye tracking) and biometrics. Ubiquitous deployment of infrared cameras (on a scale similar to visible cameras) is however prevented by high manufacturing cost and low resolution related to the need of using image sensors based on flip-chip hybridization. One way to enable monolithic integration is by replacing expensive, small-scale III–V-based detector chips with narrow bandgap thin-films compatible with 8- and 12-inch full-wafer processing. This work describes a CMOS-compatible pixel stack based on lead sulfide quantum dots (PbS QD) with tunable absorption peak. Photodiode with a 150-nm thick absorber in an inverted architecture shows dark current of 10−6 A/cm2 at −2 V reverse bias and EQE above 20% at 1440 nm wavelength. Optical modeling for top illumination architecture can improve the contact transparency to 70%. Additional cooling (193 K) can improve the sensitivity to 60 dB. This stack can be integrated on a CMOS ROIC, enabling order-of-magnitude cost reduction for infrared sensors.

  18. Nanoimprinted distributed feedback dye laser sensor for real-time imaging of small molecule diffusion

    DEFF Research Database (Denmark)

    Vannahme, Christoph; Dufva, Martin; Kristensen, Anders

    2014-01-01

    distributed feedback (DFB) dye laser sensor for real-time label-free imaging without any moving parts enabling a frame rate of 12 Hz is presented. The presence of molecules on the laser surface results in a wavelength shift which is used as sensor signal. The unique DFB laser structure comprises several areas...... molecules in water....

  19. Two-Level Evaluation on Sensor Interoperability of Features in Fingerprint Image Segmentation

    Directory of Open Access Journals (Sweden)

    Ya-Shuo Li

    2012-03-01

    Full Text Available Features used in fingerprint segmentation significantly affect the segmentation performance. Various features exhibit different discriminating abilities on fingerprint images derived from different sensors. One feature which has better discriminating ability on images derived from a certain sensor may not adapt to segment images derived from other sensors. This degrades the segmentation performance. This paper empirically analyzes the sensor interoperability problem of segmentation feature, which refers to the feature’s ability to adapt to the raw fingerprints captured by different sensors. To address this issue, this paper presents a two-level feature evaluation method, including the first level feature evaluation based on segmentation error rate and the second level feature evaluation based on decision tree. The proposed method is performed on a number of fingerprint databases which are obtained from various sensors. Experimental results show that the proposed method can effectively evaluate the sensor interoperability of features, and the features with good evaluation results acquire better segmentation accuracies of images originating from different sensors.

  20. Single-photon sampling architecture for solid-state imaging sensors.

    Science.gov (United States)

    van den Berg, Ewout; Candès, Emmanuel; Chinn, Garry; Levin, Craig; Olcott, Peter Demetri; Sing-Long, Carlos

    2013-07-23

    Advances in solid-state technology have enabled the development of silicon photomultiplier sensor arrays capable of sensing individual photons. Combined with high-frequency time-to-digital converters (TDCs), this technology opens up the prospect of sensors capable of recording with high accuracy both the time and location of each detected photon. Such a capability could lead to significant improvements in imaging accuracy, especially for applications operating with low photon fluxes such as light detection and ranging and positron-emission tomography. The demands placed on on-chip readout circuitry impose stringent trade-offs between fill factor and spatiotemporal resolution, causing many contemporary designs to severely underuse the technology's full potential. Concentrating on the low photon flux setting, this paper leverages results from group testing and proposes an architecture for a highly efficient readout of pixels using only a small number of TDCs. We provide optimized design instances for various sensor parameters and compute explicit upper and lower bounds on the number of TDCs required to uniquely decode a given maximum number of simultaneous photon arrivals. To illustrate the strength of the proposed architecture, we note a typical digitization of a 60 × 60 photodiode sensor using only 142 TDCs. The design guarantees registration and unique recovery of up to four simultaneous photon arrivals using a fast decoding algorithm. By contrast, a cross-strip design requires 120 TDCs and cannot uniquely decode any simultaneous photon arrivals. Among other realistic simulations of scintillation events in clinical positron-emission tomography, the above design is shown to recover the spatiotemporal location of 99.98% of all detected photons.
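A toy version of the group-testing idea in this record — wire each pixel to a small, unique subset of TDC lines, so the set of fired TDCs identifies the pixel — can be sketched as follows. The subset construction here is naive (lexicographic 3-subsets) and handles only one photon at a time; the paper's optimized codes additionally remain uniquely decodable for up to four simultaneous arrivals.

```python
from itertools import combinations

# Each pixel gets a unique 3-subset of TDC lines as its signature. A single
# photon at pixel p fires exactly the TDCs in p's signature, so with one
# photon the fired set identifies the pixel directly.
def make_codebook(n_pixels, n_tdcs, k=3):
    sigs = list(combinations(range(n_tdcs), k))[:n_pixels]
    assert len(sigs) == n_pixels, "not enough distinct TDC subsets"
    return {frozenset(s): p for p, s in enumerate(sigs)}

def decode_single(fired_tdcs, codebook):
    # Returns the pixel index, or None if the fired set matches no signature.
    return codebook.get(frozenset(fired_tdcs))

codebook = make_codebook(n_pixels=100, n_tdcs=12)   # C(12,3) = 220 >= 100
sig = next(s for s, p in codebook.items() if p == 42)
print(decode_single(sig, codebook))                  # -> 42
```

Note the economy even in this naive form: 100 pixels are served by 12 TDC lines instead of 100.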

  1. Design and Implementation of a Novel Compatible Encoding Scheme in the Time Domain for Image Sensor Communication

    Directory of Open Access Journals (Sweden)

    Trang Nguyen

    2016-05-01

Full Text Available This paper presents a modulation scheme in the time domain based on On-Off-Keying and proposes various compatible supports for different types of image sensors. The content of this article is a sub-proposal to the IEEE 802.15.7r1 Task Group (TG7r1) aimed at Optical Wireless Communication (OWC) using an image sensor as the receiver. The compatibility support is indispensable for Image Sensor Communications (ISC) because the rolling shutter image sensors currently available have different frame rates, shutter speeds, sampling rates, and resolutions. However, focusing on unidirectional communications (i.e., data broadcasting, beacons), an asynchronous communication prototype is also discussed in the paper. Due to the physical limitations associated with typical image sensors (including low and varying frame rates, long exposures, and low shutter speeds), the link speed performance is critically considered. Based on the practical measurement of camera response to modulated light, an operating frequency range is suggested along with the similar system architecture, decoding procedure, and algorithms. A significant feature of our novel data frame structure is that it can support both typical frame rate cameras (in the oversampling mode) as well as very low frame rate cameras (in the error detection mode, for a camera whose frame rate is lower than the transmission packet rate). A high frame rate camera, i.e., no less than 20 fps, is supported in an oversampling mode in which a majority voting scheme for decoding data is applied. A low frame rate camera, i.e., when the frame rate drops to less than 20 fps at some certain time, is supported by an error detection mode in which any missing data sub-packet is detected in decoding and later corrected by external code. Numerical results and valuable analysis are also included to indicate the capability of the proposed schemes.
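The majority voting used in the oversampling mode can be sketched in a few lines: each On-Off-Keying symbol is captured by several camera frames, and the bit decision is the majority of those samples. The function name and the 3x oversampling below are illustrative, not from the proposal itself.

```python
# Majority-vote decoding for the oversampling mode: a camera whose frame
# rate exceeds the symbol rate samples each On-Off-Keying symbol several
# times, and each bit is decided by the majority of its samples.
def majority_decode(samples, samples_per_symbol):
    bits = []
    for i in range(0, len(samples), samples_per_symbol):
        chunk = samples[i:i + samples_per_symbol]
        bits.append(1 if sum(chunk) * 2 > len(chunk) else 0)
    return bits

# Each symbol sampled 3 times, with one corrupted sample per symbol:
print(majority_decode([1, 1, 0, 0, 1, 0, 1, 1, 1], 3))  # -> [1, 0, 1]
```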

  2. BOREAS RSS-02 Level-1b ASAS Image Data: At-sensor Radiance in BSQ Format

    Data.gov (United States)

    National Aeronautics and Space Administration — The BOREAS RSS-02 team used the ASAS instrument, mounted on the NASA C-130 aircraft, to create at-sensor radiance images of various sites as a function of spectral...

  3. NRT Lightning Imaging Sensor (LIS) on International Space Station (ISS) Science Data Vb0

    Data.gov (United States)

    National Aeronautics and Space Administration — The NRT Lightning Imaging Sensor (LIS) on International Space Station (ISS) Science Data were collected by the LIS instrument on the ISS used to detect the...

  4. NRT Lightning Imaging Sensor (LIS) on International Space Station (ISS) Backgrounds Vb0

    Data.gov (United States)

    National Aeronautics and Space Administration — The NRT Lightning Imaging Sensor (LIS) on International Space Station (ISS) Backgrounds dataset was collected by the LIS instrument on the ISS used to detect the...

  5. Non-Quality Controlled Lightning Imaging Sensor (LIS) on International Space Station (ISS) Backgrounds Vb0

    Data.gov (United States)

    National Aeronautics and Space Administration — The Non-Quality Controlled Lightning Imaging Sensor (LIS) on International Space Station (ISS) Backgrounds dataset was collected by the LIS instrument on the ISS...

  6. A High-Speed CMOS Image Sensor with Global Electronic Shutter Pixels Using Pinned Diodes

    Science.gov (United States)

    Yasutomi, Keita; Tamura, Toshihiro; Furuta, Masanori; Itoh, Shinya; Kawahito, Shoji

This paper describes a high-speed CMOS image sensor with a new type of global electronic shutter pixel. A global electronic shutter is necessary for imaging fast-moving objects without motion blur or distortion. The proposed pixel has two potential wells with a pinned diode structure for two-stage charge transfer, which enables global electronic shuttering and reset noise canceling. A prototype high-speed image sensor fabricated in a 0.18 μm standard CMOS image sensor process consists of the proposed pixel array, 12-bit column-parallel cyclic ADC arrays and 192-channel digital outputs. The sensor achieves good linearity at low light intensity, demonstrating perfect charge transfer between the two pinned diodes. The input-referred noise of the proposed pixel is measured to be 6.3 e−.

  7. GPM GROUND VALIDATION SPECIAL SENSOR MICROWAVE IMAGER/SOUNDER (SSMI/S) LPVEX V1

    Data.gov (United States)

    National Aeronautics and Space Administration — The GPM Ground Validation Special Sensor Microwave Imager/Sounder (SSMI/S) LPVEx dataset contains brightness temperature data processed from the NOAA CLASS QC...

  8. Extended Special Sensor Microwave Imager (SSM/I) Temperature Data Record (TDR) in netCDF

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Special Sensor Microwave Imager (SSM/I) is a seven-channel linearly polarized passive microwave radiometer that operates at frequencies of 19.36 (vertically and...

  9. Process for the Development of Image Quality Metrics for Underwater Electro-Optic Sensors

    National Research Council Canada - National Science Library

    Taylor, Jr., James S; Cordes, Brett; Osofsky, Sam; Domnich, Ann

    2002-01-01

    .... These sensors produce two and three-dimensional images that will be used by operators to make the all-important decision regarding use of neutralization systems against sonar contacts classified as mine-like...

  10. Landsat 8 Operational Land Imager (OLI)_Thermal Infared Sensor (TIRS) V1

    Data.gov (United States)

    National Aeronautics and Space Administration — Abstract:The Operational Land Imager (OLI) and Thermal Infrared Sensor (TIRS) are instruments onboard the Landsat 8 satellite, which was launched in February of...

  11. Gimbal Integration to Small Format, Airborne, MWIR and LWIR Imaging Sensors Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed innovation is for enhanced sensor performance and high resolution imaging for Long Wave InfraRed (LWIR) and Medium Wave IR (MWIR) camera systems used in...

  12. Hyperspectral Imaging Sensor with Real-Time Processor Performing Principle Components Analyses for Gas Detection

    National Research Council Canada - National Science Library

    Hinnrichs, Michele

    2000-01-01

    .... With support from the US Air Force and Navy, Pacific Advanced Technology has developed a small man portable hyperspectral imaging sensor with an embedded DSP processor for real time processing...

  13. NOAA JPSS Visible Infrared Imaging Radiometer Suite (VIIRS) Sensor Data Record (SDR) from IDPS

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Sensor Data Records (SDRs), or Level 1b data, from the Visible Infrared Imaging Radiometer Suite (VIIRS) are the calibrated and geolocated radiance and reflectance...

  14. Researchers develop CCD image sensor with 20ns per row parallel readout time

    CERN Multimedia

    Bush, S

    2004-01-01

    "Scientists at the Rutherford Appleton Laboratory (RAL) in Oxfordshire have developed what they claim is the fastest CCD (charge-coupled device) image sensor, with a readout time which is 20ns per row" (1/2 page)

  15. Observations of Gas Emissions from Cascade Range Volcanoes (USA) using a Portable Real-Time Sensor Package and Evacuated Flasks

    Science.gov (United States)

    Kelly, P. J.; Werner, C. A.; Evans, W.; Ingebritsen, S.; Tucker, D.

    2012-12-01

Degassing from most Cascade Range volcanoes, USA, is characterized by low-temperature hydrothermal emissions. It is important to monitor these emissions as part of a comprehensive monitoring strategy, yet access is often difficult and most features are sampled by the USGS only once per year at best. In an effort to increase the sampling frequency of major gas species, and in preparation for building permanent, autonomous units, we built a portable sensor package capable of measuring H2O, CO2, SO2, and H2S in volcanic gas plumes. Here we compare results from the portable sensor package with gas analyses from direct samples obtained using a titanium tube and evacuated glass flasks collected at the same time. The sensor package is housed in a small, rugged case, weighs 5 kg, and includes sensors for measuring H2O (0-16 parts per thousand), CO2 (0-5000 ppmv), SO2 (0-100 ppm), and H2S (0-20 ppm) gases. Additional temperature and pressure sensors, a micro air pump, a datalogger, and an internal battery are also incorporated. H2O and CO2 are measured using an infrared spectrometer (Licor 840) and sulfur-containing gases are measured using electrochemical sensors equipped with filters to mitigate cross-sensitivities. Data are collected at a 1 Hz sampling rate and can be recorded and displayed in real time using a netbook computer or saved to the onboard datalogger. The data display includes time series of H2O, CO2, SO2, and H2S mixing ratios, the four-component bulk composition of the plume, and automated calculation of gas ratios commonly used in volcanic gas monitoring, such as H2O/CO2, CO2/SO2, and CO2/H2S. In the Cascade Range, the sensor package has been tested at Mt. Baker, Mt. St. Helens, Mt. Hood, and in Lassen Volcanic National Park. In each case, the instrument was placed 5 to 30 meters from the fumarole or fumarole field and emissions were sampled for 5 to 30 minutes. No SO2 was detected at any location. At Mt. Hood the sensor package yielded average CO2/H2S
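Gas ratios such as CO2/H2S are commonly derived from 1 Hz time series like these by regressing background-corrected CO2 against the sulfur species, the slope being the molar ratio. A sketch with synthetic data; the 400 ppm background and the ratio of 30 are invented for illustration, not measurements from this study:

```python
import numpy as np

# Estimate a plume CO2/H2S ratio from 1 Hz time series: regress
# background-corrected CO2 against H2S; the slope is the molar ratio.
def gas_ratio(co2_ppm, h2s_ppm, co2_background):
    excess_co2 = np.asarray(co2_ppm) - co2_background
    slope, _ = np.polyfit(h2s_ppm, excess_co2, 1)
    return slope

h2s = np.linspace(0.0, 5.0, 60)            # ppm, one minute in the plume
co2 = 400.0 + 30.0 * h2s                   # 400 ppm background, ratio 30
print(round(gas_ratio(co2, h2s, 400.0)))   # -> 30
```

Real plume data are noisy, so the regression (rather than a pointwise quotient) is what makes the ratio robust to dilution by ambient air.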

  16. High-content analysis of single cells directly assembled on CMOS sensor based on color imaging.

    Science.gov (United States)

    Tanaka, Tsuyoshi; Saeki, Tatsuya; Sunaga, Yoshihiko; Matsunaga, Tadashi

    2010-12-15

    A complementary metal oxide semiconductor (CMOS) image sensor was applied to high-content analysis of single cells which were assembled closely or directly onto the CMOS sensor surface. The direct assembling of cell groups on CMOS sensor surface allows large-field (6.66 mm×5.32 mm in entire active area of CMOS sensor) imaging within a second. Trypan blue-stained and non-stained cells in the same field area on the CMOS sensor were successfully distinguished as white- and blue-colored images under white LED light irradiation. Furthermore, the chemiluminescent signals of each cell were successfully visualized as blue-colored images on CMOS sensor only when HeLa cells were placed directly on the micro-lens array of the CMOS sensor. Our proposed approach will be a promising technique for real-time and high-content analysis of single cells in a large-field area based on color imaging. Copyright © 2010 Elsevier B.V. All rights reserved.

  17. Robust tracking of respiratory rate in high-dynamic range scenes using mobile thermal imaging

    Science.gov (United States)

    Cho, Youngjun; Julier, Simon J.; Marquardt, Nicolai; Bianchi-Berthouze, Nadia

    2017-01-01

The ability to monitor the respiratory rate, one of the vital signs, is extremely important for the medical treatment, healthcare and fitness sectors. In many situations, mobile methods, which allow users to undertake everyday activities, are required. However, current monitoring systems can be obtrusive, requiring users to wear respiration belts or nasal probes. Alternatively, contactless digital image sensor based remote-photoplethysmography (PPG) can be used. However, remote PPG requires an ambient source of light, and does not work properly in dark places or under varying lighting conditions. Recent advances in thermographic systems have shrunk their size, weight and cost, to the point where it is possible to create smart-phone based respiration rate monitoring devices that are not affected by lighting conditions. However, mobile thermal imaging is challenged in scenes with high thermal dynamic ranges (e.g. due to the different environmental temperature distributions indoors and outdoors). This challenge is further amplified by general problems such as motion artifacts and low spatial resolution, leading to unreliable breathing signals. In this paper, we propose a novel and robust approach for respiration tracking which compensates for the negative effects of variations in the ambient temperature and motion artifacts and can accurately extract breathing rates in highly dynamic thermal scenes. The approach is based on tracking the nostril of the user and using local temperature variations to infer inhalation and exhalation cycles. It has three main contributions. The first is a novel Optimal Quantization technique which adaptively constructs a color mapping of absolute temperature to improve segmentation, classification and tracking. The second is the Thermal Gradient Flow method that computes thermal gradient magnitude maps to enhance the accuracy of the nostril region tracking. Finally, we introduce the Thermal Voxel method to increase the reliability of the
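The final step in pipelines like this — turning the tracked nostril-region temperature trace into a breathing rate — can be illustrated with a simple spectral estimate: detrend the signal and take its dominant FFT peak. The camera rate, recording length, and synthetic 0.25 Hz breathing signal below are assumptions for the demo, not the paper's data.

```python
import numpy as np

def breathing_rate_bpm(temps, fs_hz):
    """Dominant spectral peak of the detrended nostril temperature trace,
    converted to breaths per minute."""
    x = np.asarray(temps) - np.mean(temps)
    spectrum = np.abs(np.fft.rfft(x))
    spectrum[0] = 0.0                       # drop any residual DC term
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs_hz)
    return 60.0 * freqs[int(np.argmax(spectrum))]

fs = 10.0                                   # assumed 10 Hz thermal camera
t = np.arange(0, 60, 1.0 / fs)              # one minute of samples
temps = 34.0 + 0.3 * np.sin(2 * np.pi * 0.25 * t)   # 0.25 Hz breathing
print(round(breathing_rate_bpm(temps, fs)))  # -> 15 breaths per minute
```

The paper's contribution is making the input trace reliable in high-dynamic-range scenes; the rate extraction itself can stay this simple once the nostril signal is clean.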

  18. Robust tracking of respiratory rate in high-dynamic range scenes using mobile thermal imaging.

    Science.gov (United States)

    Cho, Youngjun; Julier, Simon J; Marquardt, Nicolai; Bianchi-Berthouze, Nadia

    2017-10-01

    The ability to monitor the respiratory rate, one of the vital signs, is extremely important for the medical treatment, healthcare and fitness sectors. In many situations, mobile methods, which allow users to undertake everyday activities, are required. However, current monitoring systems can be obtrusive, requiring users to wear respiration belts or nasal probes. Alternatively, contactless digital image sensor based remote-photoplethysmography (PPG) can be used. However, remote PPG requires an ambient source of light, and does not work properly in dark places or under varying lighting conditions. Recent advances in thermographic systems have shrunk their size, weight and cost, to the point where it is possible to create smart-phone based respiration rate monitoring devices that are not affected by lighting conditions. However, mobile thermal imaging is challenged in scenes with high thermal dynamic ranges (e.g. due to the different environmental temperature distributions indoors and outdoors). This challenge is further amplified by general problems such as motion artifacts and low spatial resolution, leading to unreliable breathing signals. In this paper, we propose a novel and robust approach for respiration tracking which compensates for the negative effects of variations in the ambient temperature and motion artifacts and can accurately extract breathing rates in highly dynamic thermal scenes. The approach is based on tracking the nostril of the user and using local temperature variations to infer inhalation and exhalation cycles. It has three main contributions. The first is a novel Optimal Quantization technique which adaptively constructs a color mapping of absolute temperature to improve segmentation, classification and tracking. The second is the Thermal Gradient Flow method that computes thermal gradient magnitude maps to enhance the accuracy of the nostril region tracking. 
Finally, we introduce the Thermal Voxel method to increase the reliability of the
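Where the abstract describes inferring inhalation/exhalation cycles from local temperature variations, the final rate-extraction step can be sketched as follows (an illustrative spectral-peak estimator, not the authors' pipeline; the sampling rate and band limits are assumed values):

```python
import numpy as np

def breathing_rate_bpm(temps, fs):
    """Estimate breaths per minute from a nostril-region temperature series.

    Exhalation warms the nostril region and inhalation cools it, so the
    dominant spectral peak of the detrended signal gives the breathing rate.
    """
    x = np.asarray(temps, dtype=float)
    x = x - x.mean()                       # remove the DC offset
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    # restrict to a plausible human breathing band (0.1-0.85 Hz, 6-51 bpm)
    band = (freqs >= 0.1) & (freqs <= 0.85)
    return 60.0 * freqs[band][np.argmax(spectrum[band])]

# Synthetic 60 s recording: 0.25 Hz breathing (15 bpm) plus sensor noise
rng = np.random.default_rng(0)
t = np.arange(0, 60, 0.1)                  # 10 Hz sampling
sig = 34.0 + 0.3 * np.sin(2 * np.pi * 0.25 * t) + 0.02 * rng.standard_normal(t.size)
print(round(breathing_rate_bpm(sig, fs=10.0)))   # → 15
```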

  19. 3D environment mapping and self-position estimation by a small flying robot mounted with a movable ultrasonic range sensor

    Directory of Open Access Journals (Sweden)

    Kazuya Nakajima

    2017-09-01

    Full Text Available The light weight of ultrasonic sensors makes them useful for collecting environment information from mobile robots. Ultrasonic sensors are generally used in a circular formation in surface-moving robots, but this is not suitable for small flying robots, which require small size and light weight. Here we created a movable ultrasonic range sensor by combining a small, lightweight servomotor and a single ultrasonic range sensor. This sensor could perform 360° measurements of the distance between objects and the robot. We furthermore constructed a measurement system to perform 3D environment mapping and self-localization by equipping a small flying robot with this movable ultrasonic range sensor and a ground-facing ultrasonic range sensor for altitude measurements. We verified the system by means of a flight test and found that 3D environment mapping and self-localization were realized in real time.
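The mapping from a servo-swept range reading to a 3D point can be sketched as follows (a hypothetical geometry with the sensor sweeping in the horizontal plane; not the paper's exact kinematics):

```python
import math

def range_to_point(distance_m, servo_angle_deg, altitude_m):
    """Convert a horizontal ultrasonic range reading taken at a given servo
    azimuth, plus the ground-facing altitude reading, into an (x, y, z)
    point in the robot's local frame."""
    a = math.radians(servo_angle_deg)
    return (distance_m * math.cos(a), distance_m * math.sin(a), altitude_m)

# A full 360-degree sweep in 10-degree steps, 2 m range, 1.5 m altitude
sweep = [range_to_point(2.0, ang, 1.5) for ang in range(0, 360, 10)]
x, y, z = sweep[9]                  # the reading taken at 90 degrees
print(round(x, 6), y, z)            # → 0.0 2.0 1.5
```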

  20. A counting pixel chip and sensor system for X-ray imaging

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, P.; Hausmann, J.; Helmich, A.; Lindner, M.; Wermes, N. [Universitaet Bonn (Germany). Physikalisches Institut]; Blanquart, L. [CNRS, Marseille (France). Centre de Physique des Particules]

    1999-08-01

    Results obtained with a (photon) counting pixel imaging chip connected to a silicon pixel sensor using the bump and flip-chip technology are presented. The performance of the chip electronics is characterized by an average equivalent noise charge (ENC) below 135 e and a threshold spread of less than 35 e after individual threshold adjust, both measured with a sensor attached. First results on the imaging performance are also reported.

  1. Laser range scanning for image-guided neurosurgery: investigation of image-to-physical space registrations.

    Science.gov (United States)

    Cao, Aize; Thompson, R C; Dumpuri, P; Dawant, B M; Galloway, R L; Ding, S; Miga, M I

    2008-04-01

    In this article a comprehensive set of registration methods is utilized to provide image-to-physical space registration for image-guided neurosurgery in a clinical study. Central to all methods is the use of textured point clouds as provided by laser range scanning technology. The objective is to perform a systematic comparison of registration methods that include both extracranial (skin marker point-based registration (PBR), and face-based surface registration) and intracranial methods (feature PBR, cortical vessel-contour registration, a combined geometry/intensity surface registration method, and a constrained form of that method to improve robustness). The platform facilitates the selection of discrete soft-tissue landmarks that appear on the patient's intraoperative cortical surface and the preoperative gadolinium-enhanced magnetic resonance (MR) image volume, i.e., true corresponding novel targets. In an 11-patient study, data were taken to allow statistical comparison among registration methods within the context of registration error. The results indicate that intraoperative face-based surface registration is statistically equivalent to traditional skin marker registration. The four intracranial registration methods were investigated and the results demonstrated a target registration error of 1.6 ± 0.5 mm, 1.7 ± 0.5 mm, 3.9 ± 3.4 mm, and 2.0 ± 0.9 mm, for feature PBR, cortical vessel-contour registration, unconstrained geometric/intensity registration, and constrained geometric/intensity registration, respectively. When analyzing the results on a per case basis, the constrained geometric/intensity registration performed best, followed by feature PBR, and finally cortical vessel-contour registration. Interestingly, the best target registration errors are similar to targeting errors reported using bone-implanted markers within the context of rigid targets. The experience in this study as with others is that brain shift can compromise extracranial
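Point-based registration (PBR), common to several of the methods compared above, can be sketched with the standard Kabsch/Procrustes solution for the rigid transform between corresponding fiducials (an illustrative reimplementation, not the study's platform):

```python
import numpy as np

def rigid_register(src, dst):
    """Point-based rigid registration (Kabsch): find rotation R and
    translation t minimizing ||R @ src_i + t - dst_i|| over 3-D points."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    h = (src - src_c).T @ (dst - dst_c)          # cross-covariance matrix
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))       # guard against reflections
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = dst_c - r @ src_c
    return r, t

def target_registration_error(r, t, targets_src, targets_dst):
    """Mean distance between registered and true target positions."""
    mapped = targets_src @ r.T + t
    return np.linalg.norm(mapped - targets_dst, axis=1).mean()

rng = np.random.default_rng(0)
pts = rng.random((6, 3)) * 100                   # fiducials in image space (mm)
angle = np.radians(10)
r_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
moved = pts @ r_true.T + np.array([5.0, -3.0, 12.0])   # physical-space points
r, t = rigid_register(pts, moved)
print(round(target_registration_error(r, t, pts, moved), 6))   # → 0.0
```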

  2. Spectral and temporal multiplexing for multispectral fluorescence and reflectance imaging using two color sensors.

    Science.gov (United States)

    Dimitriadis, Nikolas; Grychtol, Bartłomiej; Theuring, Martin; Behr, Tobias; Sippel, Christian; Deliolanis, Nikolaos C

    2017-05-29

    Fluorescence imaging can reveal functional, anatomical or pathological features of high interest in medical interventions. We present a novel method to record and display, at video rate, multispectral color and fluorescence images over the visible and near infrared range. The fast acquisition in multiple channels is achieved through a combination of spectral and temporal multiplexing in a system with two standard color sensors. Accurate color reproduction and high fluorescence unmixing performance are experimentally demonstrated with a prototype system in a challenging imaging scenario. Through spectral simulation and optimization we show that the system is sensitive to all dyes emitting in the visible and near infrared region without changing filters, and that the SNR of multiple unmixed components can be kept high if parameters are chosen well. We propose a sensitive per-pixel metric of unmixing quality in a single image based on noise propagation and present a method to visualize the high-dimensional data in a 2D graph, where up to three fluorescent components can be distinguished and segmented.
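The fluorescence unmixing step such a system performs can be sketched as per-pixel linear least squares against known emission spectra (the mixing matrix below is synthetic, not the prototype's calibration):

```python
import numpy as np

# Columns: emission spectra of two fluorophores as seen by the acquisition
# channels (4 channels x 2 fluorophores, made-up values)
mixing = np.array([[0.9, 0.1],
                   [0.4, 0.5],
                   [0.1, 0.8],
                   [0.0, 0.3]])

true_abund = np.array([2.0, 5.0])      # per-pixel fluorophore abundances
measured = mixing @ true_abund         # noiseless channel readings

# Least-squares unmixing recovers the per-component abundances
est, *_ = np.linalg.lstsq(mixing, measured, rcond=None)
print(np.round(est, 6))   # → [2. 5.]
```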

  3. Functional tomographic fluorescence imaging of pH microenvironments in microbial biofilms by use of silica nanoparticle sensors.

    Science.gov (United States)

    Hidalgo, Gabriela; Burns, Andrew; Herz, Erik; Hay, Anthony G; Houston, Paul L; Wiesner, Ulrich; Lion, Leonard W

    2009-12-01

    Attached bacterial communities can generate three-dimensional (3D) physicochemical gradients that create microenvironments where local conditions are substantially different from those in the surrounding solution. Given their ubiquity in nature and their impacts on issues ranging from water quality to human health, better tools for understanding biofilms and the gradients they create are needed. Here we demonstrate the use of functional tomographic imaging via confocal fluorescence microscopy of ratiometric core-shell silica nanoparticle sensors (C dot sensors) to study the morphology and temporal evolution of pH microenvironments in axenic Escherichia coli PHL628 and mixed-culture wastewater biofilms. Testing of 70-, 30-, and 10-nm-diameter sensor particles reveals a critical size for homogeneous biofilm staining, with only the 10-nm-diameter particles capable of successfully generating high-resolution maps of biofilm pH and distinct local heterogeneities. Our measurements revealed pH values that ranged from 5 to >7, confirming the heterogeneity of the pH profiles within these biofilms. pH was also analyzed following glucose addition to both suspended and attached cultures. In both cases, the pH became more acidic, likely due to glucose metabolism causing the release of tricarboxylic acid cycle acids and CO₂. These studies demonstrate that the combination of 3D functional fluorescence imaging with well-designed nanoparticle sensors provides a powerful tool for in situ characterization of chemical microenvironments in complex biofilms.
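Ratiometric pH sensing of this kind can be sketched by inverting a sigmoidal calibration curve relating the sensing/reference intensity ratio to pH (all constants below are made-up calibration values, not C dot parameters):

```python
import math

def ph_from_ratio(ratio, pka=6.5, r_acid=0.2, r_base=1.8):
    """Invert a ratiometric calibration curve: the sensing/reference dye
    intensity ratio follows a sigmoid of pH around the dye's pKa,
        ratio(pH) = r_acid + (r_base - r_acid) / (1 + 10**(pka - pH)),
    so pH = pKa + log10(f / (1 - f)) with f the normalized ratio."""
    frac = (ratio - r_acid) / (r_base - r_acid)
    return pka + math.log10(frac / (1 - frac))

# Round-trip check at pH 6.0 using the forward calibration curve
r = 0.2 + (1.8 - 0.2) / (1 + 10 ** (6.5 - 6.0))
print(round(ph_from_ratio(r), 6))   # → 6.0
```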

  4. Design and fabrication of three-axis accelerometer sensor microsystem for wide temperature range applications using semi-custom process

    Science.gov (United States)

    Merdassi, A.; Wang, Y.; Xereas, G.; Chodavarapu, V. P.

    2014-03-01

    This paper describes an integrated CMOS-MEMS inertial sensor microsystem, consisting of a 3-axis accelerometer sensor device and its complementary readout circuit, which is designed to operate over a wide temperature range from -55°C to 175°C. The accelerometer device is based on capacitive transduction and is fabricated using PolyMUMPS, which is a commercial process available from MEMSCAP. The fabricated accelerometer device is then post-processed by depositing a layer of amorphous silicon carbide to form a composite sensor structure to improve its performance over an extended wide temperature range. We designed and fabricated a CMOS readout circuit in the IBM 0.13 μm process that interfaces with the accelerometer device to serve as a capacitance-to-voltage converter. The accelerometer device is designed to operate over a measurement range of +/-20 g. The described sensor system allows a low-power, low-cost and mass-producible implementation well suited for a variety of applications with harsh or wide temperature operating conditions.
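The capacitive transduction principle behind such an accelerometer can be sketched with an idealized parallel-plate model (all device parameters below are invented for illustration, not the PolyMUMPS design):

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def differential_capacitance(accel_g, area_m2=1e-6, gap_m=2e-6,
                             mass_kg=1e-9, k_n_per_m=10.0):
    """Differential sense capacitance (C1 - C2) of an idealized parallel-plate
    capacitive accelerometer: proof-mass displacement x = m*a/k narrows one
    sense gap and widens the other, so C1 - C2 is nearly linear in a."""
    x = mass_kg * accel_g * 9.81 / k_n_per_m     # proof-mass displacement, m
    c1 = EPS0 * area_m2 / (gap_m - x)
    c2 = EPS0 * area_m2 / (gap_m + x)
    return c1 - c2

# Near-linearity over the +/-20 g range: the 20 g response is almost exactly
# double the 10 g response because x stays tiny compared to the gap
r = differential_capacitance(20.0) / differential_capacitance(10.0)
print(abs(r - 2.0) < 0.01)   # → True
```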

  5. Median filters as a tool to determine dark noise thresholds in high resolution smartphone image sensors for scientific imaging

    Science.gov (United States)

    Igoe, Damien P.; Parisi, Alfio V.; Amar, Abdurazaq; Rummenie, Katherine J.

    2018-01-01

    An evaluation of the use of median filters in the reduction of dark noise in smartphone high resolution image sensors is presented. The Sony Xperia Z1 employed has a maximum image sensor resolution of 20.7 Mpixels, with each pixel having a side length of just over 1 μm. Due to the large number of photosites, this provides an image sensor with very high sensitivity but also makes them prone to noise effects such as hot-pixels. Similar to earlier research with older models of smartphone, no appreciable temperature effects were observed in the overall average pixel values for images taken in ambient temperatures between 5 °C and 25 °C. In this research, hot-pixels are defined as pixels with intensities above a specific threshold. The threshold is determined using the distribution of pixel values of a set of images with uniform statistical properties associated with the application of median-filters of increasing size. An image with uniform statistics was employed as a training set from 124 dark images, and the threshold was determined to be 9 digital numbers (DN). The threshold remained constant for multiple resolutions and did not appreciably change even after a year of extensive field use and exposure to solar ultraviolet radiation. Although the temperature effects' uniformity masked an increase in hot-pixel occurrences, the total number of occurrences represented less than 0.1% of the total image. Hot-pixels were removed by applying a median filter, with an optimum filter size of 7 × 7; similar trends were observed for four additional smartphone image sensors used for validation. Hot-pixels were also reduced by decreasing image resolution. The method outlined in this research provides a methodology to characterise the dark noise behavior of high resolution image sensors for use in scientific investigations, especially as pixel sizes decrease.
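The thresholding-plus-median-filter procedure can be sketched in NumPy as follows (the 9 DN threshold and 7 × 7 window come from the abstract; the code itself is an illustrative reimplementation, not the authors'):

```python
import numpy as np

def median_filter(img, size=7):
    """Brute-force 2-D median filter with edge replication (numpy-only)."""
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    windows = np.lib.stride_tricks.sliding_window_view(padded, (size, size))
    return np.median(windows, axis=(-2, -1))

def remove_hot_pixels(dark_frame, threshold_dn=9, size=7):
    """Replace pixels above the dark-noise threshold by the local median."""
    filtered = median_filter(dark_frame, size)
    hot = dark_frame > threshold_dn
    out = dark_frame.astype(float).copy()
    out[hot] = filtered[hot]
    return out, int(hot.sum())

rng = np.random.default_rng(1)
dark = rng.integers(0, 4, size=(64, 64)).astype(float)  # ordinary dark values
dark[10, 10] = dark[40, 25] = 255.0                     # two hot pixels
cleaned, n_hot = remove_hot_pixels(dark)
print(n_hot, cleaned.max() <= 9)   # → 2 True
```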

  6. In-flight radiometric calibration of the Advanced Land Imager and Hyperion sensors on the EO-1 platform and comparisons with other earth observing sensors

    Science.gov (United States)

    Biggar, Stuart F.; Thome, Kurtis J.; Wisniewski, Wit T.

    2002-09-01

    The radiometric calibration of the two optical sensors on the Earth Observing One satellite has been studied as a function of time since launch. The calibration has been determined by ground reference calibrations at well-characterized field sites, such as White Sands Missile Range and dry playas, and by reference to other sensors such as the Enhanced Thematic Mapper Plus (ETM+) on Landsat 7. The ground reference calibrations of the Advanced Land Imager (ALI) give results consistent with the on-board solar calibrator and show a significant shift since preflight calibration in the short wavelength bands. Similarly, the ground reference calibrations of Hyperion show a change since preflight calibration, however, for Hyperion the largest changes are in the short wave infrared region of the spectrum. Cross calibration of ALI with ETM+ is consistent with the ground reference calibrations in the visible and near infrared. Results showing the changes in radiometric calibration are presented.

  7. Optical Demonstration of a Medical Imaging System with an EMCCD-Sensor Array for Use in a High Resolution Dynamic X-ray Imager.

    Science.gov (United States)

    Qu, Bin; Huang, Ying; Wang, Weiyuan; Sharma, Prateek; Kuhls-Gilcrist, Andrew T; Cartwright, Alexander N; Titus, Albert H; Bednarek, Daniel R; Rudin, Stephen

    2010-10-30

    Use of an extensible array of Electron Multiplying CCDs (EMCCDs) in medical x-ray imager applications was demonstrated for the first time. The large variable electronic-gain (up to 2000) and small pixel size of EMCCDs provide effective suppression of readout noise compared to signal, as well as high resolution, enabling the development of an x-ray detector with far superior performance compared to conventional x-ray image intensifiers and flat panel detectors. We are developing arrays of EMCCDs to overcome their limited field of view (FOV). In this work we report on an array of two EMCCD sensors running simultaneously at a high frame rate and optically focused on a mammogram film showing calcified ducts. The work was conducted on an optical table with a pulsed LED bar used to provide a uniform diffuse light onto the film to simulate x-ray projection images. The system can be selected to run at up to 17.5 frames per second or even higher frame rate with binning. Integration time for the sensors can be adjusted from 1 ms to 1000 ms. Twelve-bit correlated double sampling AD converters were used to digitize the images, which were acquired by a National Instruments dual-channel Camera Link PC board in real time. A user-friendly interface was programmed using LabVIEW to save and display 2K × 1K pixel matrix digital images. The demonstration tiles a 2 × 1 array to acquire increased-FOV stationary images taken at different gains and fluoroscopic-like videos recorded by scanning the mammogram simultaneously with both sensors. The results show high resolution and high dynamic range images stitched together with minimal adjustments needed. The EMCCD array design allows for expansion to an M×N array for arbitrarily larger FOV, yet with high resolution and large dynamic range maintained.

  8. Particle detection and classification using commercial off the shelf CMOS image sensors

    Energy Technology Data Exchange (ETDEWEB)

    Pérez, Martín [Instituto Balseiro, Av. Bustillo 9500, Bariloche, 8400 (Argentina); Comisión Nacional de Energía Atómica (CNEA), Centro Atómico Bariloche, Av. Bustillo 9500, Bariloche 8400 (Argentina); Consejo Nacional de Investigaciones Científicas y Técnicas, Centro Atómico Bariloche, Av. Bustillo 9500, 8400 Bariloche (Argentina); Lipovetzky, Jose, E-mail: lipo@cab.cnea.gov.ar [Instituto Balseiro, Av. Bustillo 9500, Bariloche, 8400 (Argentina); Comisión Nacional de Energía Atómica (CNEA), Centro Atómico Bariloche, Av. Bustillo 9500, Bariloche 8400 (Argentina); Consejo Nacional de Investigaciones Científicas y Técnicas, Centro Atómico Bariloche, Av. Bustillo 9500, 8400 Bariloche (Argentina); Sofo Haro, Miguel; Sidelnik, Iván; Blostein, Juan Jerónimo; Alcalde Bessia, Fabricio; Berisso, Mariano Gómez [Instituto Balseiro, Av. Bustillo 9500, Bariloche, 8400 (Argentina); Consejo Nacional de Investigaciones Científicas y Técnicas, Centro Atómico Bariloche, Av. Bustillo 9500, 8400 Bariloche (Argentina)

    2016-08-11

    In this paper we analyse the response of two different commercial off-the-shelf (COTS) CMOS image sensors as particle detectors. The sensors were irradiated using X-ray photons, gamma photons, beta particles and alpha particles from diverse sources. The amount of charge produced by the different particles and the size of the spot registered on the sensor are compared, and analysed by an algorithm to classify them. For a known incident energy spectrum, the employed sensors provide a dose resolution finer than a microgray, showing their potential in radioprotection, area monitoring, and medical applications.
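The spot-based classification idea can be illustrated with a toy rule (the thresholds below are invented, not the paper's fitted values):

```python
def classify_event(total_charge_e, spot_pixels):
    """Crude particle classification from collected charge and the cluster
    size of a spot on a CMOS sensor. Thresholds are illustrative only:
    alphas deposit large charge in big compact spots, betas leave longer
    multi-pixel tracks, and X-ray photons leave small point-like spots."""
    if total_charge_e > 1e5 and spot_pixels > 30:
        return "alpha"
    if spot_pixels > 10:
        return "beta"
    return "photon"

print(classify_event(5e5, 60), classify_event(2e4, 20), classify_event(3e3, 2))
# → alpha beta photon
```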

  9. Nano-Position Sensors with Superior Linear Response to Position and Dynamic Range from Sub-nm to Centimeters

    Science.gov (United States)

    Lee, Sheng-Chiang; Peters, Randall

    2010-03-01

    Commercial nano-positioners have achieved direct position measurements at the scale of 0.01 nm with capacitive sensing metrology. However, the commercial sensors have small dynamic ranges (up to only a few hundred μm) and are relatively large in size (centimeters in the transverse directions), which is necessary for healthy signal detection but makes them difficult to use on smaller devices. The small dynamic range also limits applications in which large materials (on the scale of centimeters or greater) must be handled with sub-nm resolution. The past approach has been to combine fine and coarse position sensors with different dynamic ranges to cover the required dynamic range. In this paper, we present a novel capacitive position sensing metrology with an ultra-wide dynamic range, from sub-nm to literally any practically desired length, for a translation stage. This sensor will greatly simplify the task and enhance the performance of direct metrology in a hybrid translational stage covering translation tasks from sub-nm to centimeters.

  10. A Solar Position Sensor Based on Image Vision.

    Science.gov (United States)

    Ruelas, Adolfo; Velázquez, Nicolás; Villa-Angulo, Carlos; Acuña, Alexis; Rosales, Pedro; Suastegui, José

    2017-07-29

    Solar collector technologies perform best when the Sun beam direction is normal to the capturing surface; solar tracking systems are used to maintain this alignment despite the Sun's relative movement, and standards therefore specify the minimum accuracy such tracking systems must achieve to be used in solar collector evaluation. Because this accuracy is not easy to obtain, this paper presents the design, construction and characterization of a sensor based on a visual system that measures the relative azimuth and elevation errors with respect to the solar surface of interest. With these characteristics, the sensor can be used as a reference in control systems and in their evaluation. The proposed sensor is based on a microcontroller with a real-time clock, inertial measurement sensors, geolocation and a vision sensor, which obtains the angle of incidence of the sunrays' direction as well as the tilt and position of the sensor. The sensor's characterization showed that a focus error or Sun position can be measured with an accuracy of 0.0426° and an uncertainty of 0.986%, and that the design can be modified to reach an accuracy under 0.01°. The sensor was validated by measuring the focus error of one of the best commercial solar tracking systems, a Kipp & Zonen SOLYS 2. To conclude, the solar tracking sensor based on a vision system meets the Sun detection requirements and the accuracy conditions to be used in solar tracking systems and their evaluation, or as a tracking and orientation tool in photovoltaic installations and solar collectors.
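The core vision measurement of such a sensor can be sketched as a centroid computation that converts the Sun-spot position in the image into angular pointing errors (assumed linear pixel-to-angle mapping and field of view, not the paper's calibration):

```python
import numpy as np

def pointing_error_deg(img, fov_deg=10.0):
    """Estimate azimuth/elevation pointing error from the Sun-spot centroid
    in a camera image: zero error puts the spot at the image center.
    fov_deg is the full horizontal field of view (an assumed value)."""
    h, w = img.shape
    total = img.sum()
    ys, xs = np.indices(img.shape)
    cx = (xs * img).sum() / total          # intensity-weighted centroid
    cy = (ys * img).sum() / total
    deg_per_px = fov_deg / w               # assumed linear mapping
    az_err = (cx - (w - 1) / 2) * deg_per_px
    el_err = ((h - 1) / 2 - cy) * deg_per_px
    return az_err, el_err

img = np.zeros((101, 101))
img[50, 60] = 1.0                  # bright Sun spot 10 px right of center
az, el = pointing_error_deg(img)
print(round(az, 4), round(el, 4))   # → 0.9901 0.0
```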

  11. Multi-Sensor Fusion of Infrared and Electro-Optic Signals for High Resolution Night Images

    Directory of Open Access Journals (Sweden)

    Victor Lawrence

    2012-07-01

    Full Text Available Electro-optic (EO) image sensors exhibit the properties of high resolution and low noise level at daytime, but they do not work in dark environments. Infrared (IR) image sensors exhibit poor resolution and cannot separate objects with similar temperature. Therefore, we propose a novel framework of IR image enhancement based on information (e.g., edges) from EO images, which improves the resolution of IR images and helps us distinguish objects at night. Our framework improves resolution by superimposing/blending the edges of the EO image onto the corresponding transformed IR image. In this framework, we adopt the theoretical point spread function (PSF) proposed by Hardie et al. for the IR image, which has the modulation transfer function (MTF) of a uniform detector array and the incoherent optical transfer function (OTF) of diffraction-limited optics. In addition, we design an inverse filter for the proposed PSF and use it for the IR image transformation. The framework requires four main steps: (1) inverse filter-based IR image transformation; (2) EO image edge detection; (3) registration; and (4) blending/superimposing of the obtained image pair. Simulation results show both blended and superimposed IR images, and demonstrate that blended IR images have better quality than the superimposed images. Additionally, following the same steps, simulation results show a blended IR image of better quality when only the original IR image is available.
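The edge-detection and blending steps of such a framework can be sketched as follows (a simple gradient-magnitude edge map stands in for the full pipeline; the PSF-based inverse filtering is omitted, and the blend weight is an assumed value):

```python
import numpy as np

def edge_map(img):
    """Gradient-magnitude edge map via central differences (a stand-in for a
    full edge detector), normalized to [0, 1]."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    return mag / mag.max() if mag.max() > 0 else mag

def blend_edges(ir, eo, alpha=0.3):
    """Blend EO edge information into a registered IR image of equal size."""
    edges = edge_map(eo)
    out = (1 - alpha) * ir.astype(float) + alpha * 255.0 * edges
    return np.clip(out, 0, 255)

eo = np.zeros((32, 32)); eo[:, 16:] = 200.0   # vertical step edge in the EO image
ir = np.full((32, 32), 100.0)                 # featureless warm IR background
fused = blend_edges(ir, eo)
# edge columns brighten, while flat regions keep the IR value times (1 - alpha)
print(fused[5, 16] > fused[5, 0])   # → True
```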

  12. Long Range Weather Prediction III: Miniaturized Distributed Sensors for Global Atmospheric Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Teller, E; Leith, C; Canavan, G; Wood, L

    2001-11-13

    We continue consideration of ways-and-means for creating, in an evolutionary, ever-more-powerful manner, a continually-updated data-base of salient atmospheric properties sufficient for finite differenced integration-based, high-fidelity weather prediction over intervals of 2-3 weeks, leveraging the 10^14 FLOPS digital computing systems now coming into existence. A constellation comprised of 10^6-10^9 small atmospheric sampling systems--high-tech superpressure balloons carrying early 21st century semiconductor devices, drifting with the local winds over the meteorological spectrum of pressure-altitudes--that assays all portions of the troposphere and lower stratosphere remains the central feature of the proposed system. We suggest that these devices should be active-signaling, rather than passive-transponding, as we had previously proposed only for the ground- and aquatic-situated sensors of this system. Instead of periodic interrogation of the intra-atmospheric transponder population by a constellation of sophisticated small satellites in low Earth orbit, we now propose to retrieve information from the instrumented balloon constellation by existing satellite telephony systems, acting as cellular tower-nodes in a global cellular telephony system whose "user-set" is the atmospheric-sampling and surface-level monitoring constellations. We thereby leverage the huge investment in cellular (satellite) telephony and GPS technologies, with large technical and economic gains. This proposal minimizes sponsor forward commitment along its entire programmatic trajectory, and moreover may return data of weather-predictive value soon after field activities commence. We emphasize its high near-term value for making better mesoscale, relatively short-term weather predictions with computing-intensive means, and its great long-term utility in enhancing the meteorological basis for global change predictive studies. We again note that adverse

  13. An improved Ras sensor for highly sensitive and quantitative FRET-FLIM imaging.

    Directory of Open Access Journals (Sweden)

    Ana F Oliveira

    Full Text Available Ras is a signaling protein involved in a variety of cellular processes. Hence, studying Ras signaling with high spatiotemporal resolution is crucial to understanding the roles of Ras in many important cellular functions. Previously, fluorescence lifetime imaging (FLIM) of fluorescence resonance energy transfer (FRET)-based Ras activity sensors, FRas and FRas-F, has been demonstrated to be useful for measuring the spatiotemporal dynamics of Ras signaling in subcellular micro-compartments. However, the predominantly nuclear localization of the sensors' acceptor has limited their sensitivity. Here, we have overcome this limitation and developed two variants of the existing FRas sensor with different affinities: FRas2-F (Kd ∼1.7 µM) and FRas2-M (Kd ∼0.5 µM). We demonstrate that, under 2-photon fluorescence lifetime imaging microscopy, FRas2 sensors provide higher sensitivity compared to previous sensors in 293T cells and neurons.

  14. Improved Feature Detection in Fused Intensity-Range Images with Complex SIFT (ℂSIFT)

    Directory of Open Access Journals (Sweden)

    Boris Jutzi

    2011-09-01

    Full Text Available The real and imaginary parts are proposed as an alternative to the usual Polar representation of complex-valued images. It is proven that the transformation from Polar to Cartesian representation contributes to decreased mutual information, and hence to greater distinctiveness. The Complex Scale-Invariant Feature Transform (ℂSIFT) detects distinctive features in complex-valued images. An evaluation method for estimating the uniformity of feature distributions in complex-valued images derived from intensity-range images is proposed. In order to experimentally evaluate the proposed methodology on intensity-range images, three different kinds of active sensing systems were used: Range Imaging, Laser Scanning, and Structured Light Projection devices (PMD CamCube 2.0, Z+F IMAGER 5003, Microsoft Kinect).
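The representation change at the heart of ℂSIFT can be sketched as follows: an intensity image (magnitude) and a range image (mapped to phase) form a complex image whose real and imaginary parts replace the usual Polar pair (the range-to-phase mapping below is an illustrative choice, not the paper's):

```python
import numpy as np

def to_cartesian(intensity, range_img):
    """Fuse an intensity image (used as magnitude) and a range image (mapped
    to phase) into a complex image, returned as real/imaginary parts."""
    phase = np.pi * range_img / range_img.max()   # map range to [0, pi]
    z = intensity * np.exp(1j * phase)
    return z.real, z.imag

intensity = np.array([[1.0, 2.0], [3.0, 4.0]])
depth = np.array([[0.0, 1.0], [2.0, 4.0]])
re, im = to_cartesian(intensity, depth)
# The Cartesian pair carries the same information: magnitude is preserved
print(np.allclose(np.hypot(re, im), intensity))   # → True
```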

  15. A linear photodiode array employed in a short range laser triangulation obstacle avoidance sensor. M.S. Thesis; [Martian roving vehicle sensor]

    Science.gov (United States)

    Odenthal, J. P.

    1980-01-01

    An opto-electronic receiver incorporating a multi-element linear photodiode array as a component of a laser-triangulation rangefinder was developed as an obstacle avoidance sensor for a Martian roving vehicle. The detector can resolve the angle of laser return in 1.5 deg increments within a field of view of 30 deg and a range of five meters. A second receiver with 1024 elements over 60 deg and a 3 meter range is also documented. Design criteria, circuit operation, schematics, experimental results and calibration procedures are discussed.
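The triangulation geometry behind such a rangefinder can be sketched as follows (the baseline and angles below are hypothetical, not the thesis's calibration):

```python
import math

def triangulation_range(baseline_m, sight_angle_deg):
    """Simple triangulation model: the laser fires perpendicular to the
    baseline, and the receiver, offset by baseline_m, measures the angle
    between the baseline and its line of sight to the laser spot, so
    range = baseline * tan(angle)."""
    return baseline_m * math.tan(math.radians(sight_angle_deg))

# With a 0.3 m baseline, successive 1.5-degree detector increments map to
# coarser and coarser range steps as the target gets farther away:
for a in (60.0, 61.5, 63.0):
    print(round(triangulation_range(0.3, a), 3))   # → 0.52, 0.553, 0.589
```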

  16. Multi-sensor radiation detection, imaging, and fusion

    Energy Technology Data Exchange (ETDEWEB)

    Vetter, Kai [Department of Nuclear Engineering, University of California, Berkeley, CA 94720 (United States); Nuclear Science Division, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States)

    2016-01-01

    Glenn Knoll was one of the leaders in the field of radiation detection and measurements and shaped this field through his outstanding scientific and technical contributions, as a teacher, his personality, and his textbook. His Radiation Detection and Measurement book guided me in my studies and is now the textbook in my classes in the Department of Nuclear Engineering at UC Berkeley. In the spirit of Glenn, I will provide an overview of our activities at the Berkeley Applied Nuclear Physics program reflecting some of the breadth of radiation detection technologies and their applications ranging from fundamental studies in physics to biomedical imaging and to nuclear security. I will conclude with a discussion of our Berkeley Radwatch and Resilient Communities activities as a result of the events at the Dai-ichi nuclear power plant in Fukushima, Japan more than 4 years ago. - Highlights: • Electron-tracking based gamma-ray momentum reconstruction. • 3D volumetric and 3D scene fusion gamma-ray imaging. • Nuclear Street View integrates and associates nuclear radiation features with specific objects in the environment. • Institute for Resilient Communities combines science, education, and communities to minimize impact of disastrous events.

  17. Color imaging via nearest neighbor hole coupling in plasmonic color filters integrated onto a complementary metal-oxide semiconductor image sensor.

    Science.gov (United States)

    Burgos, Stanley P; Yokogawa, Sozo; Atwater, Harry A

    2013-11-26

    State-of-the-art CMOS imagers are composed of very small pixels, so it is critical for plasmonic imaging to understand the optical response of finite-size hole arrays and their coupling efficiency to CMOS image sensor pixels. Here, we demonstrate that the transmission spectra of finite-size hole arrays can be accurately described by only accounting for up to the second nearest-neighbor scattering-absorption interactions of hole pairs, thus making hole arrays appealing for close-packed color filters for imaging applications. Using this model, we find that the peak transmission efficiency of a square-shaped hole array with a triangular lattice reaches ∼90% that of an infinite array at an extent of ∼6 × 6 μm², the smallest size array showing near-infinite array transmission properties. Finally, we experimentally validate our findings by investigating the transmission and imaging characteristics of a 360 × 320 pixel plasmonic color filter array composed of 5.6 × 5.6 μm² RGB color filters integrated onto a commercial black and white 1/2.8 in. CMOS image sensor, demonstrating full-color high resolution plasmonic imaging. Our results show good color fidelity with a 6-color-averaged color difference metric (ΔE) in the range of 16.6-19.3, after white balancing and color-matrix correcting raw images taken with f-numbers ranging from 1.8 to 16. The integrated peak filter transmission efficiencies are measured to be in the 50% range, with a FWHM of 200 nm for all three RGB filters, in good agreement with the spectral response of isolated unmounted color filters.
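The color-difference metric ΔE quoted above can be computed, in its simplest CIE76 form, as the Euclidean distance between two colors in CIELAB space (the paper's exact ΔE variant is not stated here, and the sample Lab values are invented):

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 color difference: Euclidean distance between (L*, a*, b*) triples."""
    return math.dist(lab1, lab2)

# A reference patch vs. the color reproduced through a filter (made-up values)
reference = (52.0, 42.5, 20.0)
measured = (55.0, 33.0, 5.0)
print(round(delta_e_cie76(reference, measured), 1))   # → 18.0
```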

  18. Performance analysis of gamma-ray-irradiated color complementary metal oxide semiconductor digital image sensors

    CERN Document Server

    Kang, A G; Liu, J Q; You, Z

    2003-01-01

    The performance parameters of dark output images captured from color complementary metal oxide semiconductor (CMOS) digital image sensors before and after gamma-ray irradiation were studied. The changes of the red, green and blue color parameters of dark output images with different gamma-ray doses and exposure times were analyzed with our computer software. The blue-channel response was significantly affected even at a lower dose. The dark current density of the sensors increases by three orders of magnitude at >60 krad compared to that of unirradiated sensors. The maximum and minimum analog output voltages all increase with irradiation dose, and are almost the same at >120 krad. The signal to noise ratio is 48 dB before irradiation and 35 dB after irradiation of 180 krad. The anti-radiation threshold for these sensors is about 100 krad. A primary explanation for the changes and the degradation of device performance parameters is presented. (author)

  19. Time comparison in image processing: APS sensors versus an artificial retina based vision system

    Science.gov (United States)

    Elouardi, A.; Bouaziz, S.; Dupret, A.; Lacassagne, L.; Klein, J. O.; Reynaud, R.

    2007-09-01

    To reduce the computational complexity of computer vision algorithms, one solution is to perform some low-level image processing on the sensor focal plane; the sensor then becomes a smart device called a retina. This concept makes vision systems more compact and increases performance thanks to the reduced data flow exchanged with external circuits. This paper presents a comparison between two different vision system architectures. The first involves a smart sensor including analogue processors allowing on-chip image processing; an external microprocessor is used to control the on-chip dataflow and the integrated operators. The second system implements a logarithmic CMOS/APS sensor interfaced to the same microprocessor, in which all computations are carried out. We have designed two vision systems as proof of concept. The comparison concerns image processing time.

  20. Zero-Transition Serial Encoding for Image Sensors

    OpenAIRE

    Jahier Pagliari, Daniele; Macii, Enrico; Poncino, Massimo

    2017-01-01

    Off-chip serial buses are the most common interfaces between sensors and processing elements in embedded systems. Due to their length, these connections dissipate a large amount of energy, contributing significantly to the total consumption of the system. The error-tolerant nature of many sensor applications can be leveraged to reduce this energy contribution by means of an approximate serial data encoding. In this paper, we propose one such encoding, called Serial T0, particularly effective...

  1. Biomedical Applications of the Information-efficient Spectral Imaging Sensor (ISIS)

    Energy Technology Data Exchange (ETDEWEB)

    Gentry, S.M.; Levenson, R.

    1999-01-21

    The Information-efficient Spectral Imaging Sensor (ISIS) approach to spectral imaging seeks to bridge the gap between tuned multispectral and fixed hyperspectral imaging sensors. By allowing the definition of completely general spectral filter functions, truly optimal measurements can be made for a given task. These optimal measurements significantly improve signal-to-noise ratio (SNR) and speed, and minimize data volume and data rate, while preserving classification accuracy. This paper investigates the application of the ISIS sensing approach in two sample biomedical applications: prostate and colon cancer screening. It is shown that in these applications, two to three optimal measurements are sufficient to capture the majority of classification information for critical sample constituents. In the prostate cancer example, the optimal measurements allow an 8% relative improvement in classification accuracy of critical cell constituents over a red, green, blue (RGB) sensor. In the colon cancer example, use of optimal measurements boosts the classification accuracy of critical cell constituents by 28% relative to the RGB sensor. In both cases, the optimal measurements match the performance achieved by the entire hyperspectral data set. The paper concludes that an ISIS-style spectral imager can acquire these optimal spectral images directly, allowing improved classification accuracy over an RGB sensor. Compared to a hyperspectral sensor, the ISIS approach can achieve similar classification accuracy using a significantly lower number of spectral samples, thus minimizing overall sample classification time and cost.
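The idea of replacing a full hyperspectral cube with a few optimal linear spectral measurements can be sketched with a matched-filter-like projection built from class mean spectra. This is an illustrative stand-in for the ISIS filter-design machinery, not the actual algorithm, and the 8-band spectra are made up:

```python
# Sketch: reduce 8-band spectra to one "optimal" linear measurement using the
# difference of class mean spectra as the filter (a matched-filter surrogate
# for the ISIS optimal-filter design; all spectra below are made up).
def mean_spectrum(samples):
    n = len(samples)
    return [sum(s[i] for s in samples) / n for i in range(len(samples[0]))]

def project(spectrum, filt):
    # One generalized spectral measurement: inner product with the filter
    return sum(x * w for x, w in zip(spectrum, filt))

class_a = [[0.9, 0.8, 0.6, 0.4, 0.3, 0.2, 0.2, 0.1],
           [0.8, 0.9, 0.7, 0.5, 0.3, 0.2, 0.1, 0.1]]
class_b = [[0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9, 0.8],
           [0.1, 0.2, 0.5, 0.6, 0.8, 0.9, 0.8, 0.9]]

mu_a, mu_b = mean_spectrum(class_a), mean_spectrum(class_b)
filt = [a - b for a, b in zip(mu_a, mu_b)]       # single optimal measurement
threshold = (project(mu_a, filt) + project(mu_b, filt)) / 2

# One projection per sample is enough to separate the two classes
for s in class_a:
    assert project(s, filt) > threshold          # classified as class A
for s in class_b:
    assert project(s, filt) < threshold          # classified as class B
```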

  2. Toward One Giga Frames per Second — Evolution of in Situ Storage Image Sensors

    Directory of Open Access Journals (Sweden)

    Edoardo Charbon

    2013-04-01

    Full Text Available The ISIS is an ultra-fast image sensor with in-pixel storage. The evolution of the ISIS, past and near-future, is reviewed and forecast. To cover the storage area with a light shield, the conventional frontside-illuminated ISIS has a limited fill factor. To achieve higher sensitivity, a backside-illuminated (BSI) ISIS was developed. To avoid direct intrusion of light into, and migration of signal electrons to, the storage area on the frontside, a cross-sectional sensor structure with thick pnpn layers was developed, named the “Tetratified structure”. By folding and looping the in-pixel storage CCDs, an image signal accumulation sensor (ISAS) is proposed. The ISAS offers a new function, in-pixel signal accumulation, in addition to ultra-high-speed imaging. To achieve a much higher frame rate, a multi-collection-gate (MCG) BSI image sensor architecture is proposed, in which the photoreceptive area forms a honeycomb-like shape. The performance of a hexagonal CCD-type MCG BSI sensor is examined by simulations. The highest frame rate is theoretically more than 1 Gfps. For the near future, a stacked hybrid CCD/CMOS MCG image sensor seems most promising. The associated problems are discussed. A fine TSV process is the key technology to realize this structure.

  3. A three-phase time-correlation image sensor using pinned photodiode active pixels

    Science.gov (United States)

    Han, Sangman; Iwahori, Tomohiro; Sawada, Tomonari; Kawahito, Shoji; Ando, Shigeru

    2010-01-01

    A time correlation (TC) image sensor is a device that produces 3-phase time-correlated signals between the incident light intensity and three reference signals. A conventional implementation of the TC image sensor using a standard CMOS technology works at low frequency and with low sensitivity. In order to achieve higher modulation frequency and high sensitivity, a TC image sensor with a dual potential structure using a pinned diode is proposed. The dual potential structure is created by changing the impurity doping concentration in the two different potential regions. In this structure, high-frequency modulation can be achieved while maintaining a sufficient light receiving area. A prototype TC image sensor with 366 × 390 pixels is implemented in 0.18-μm 1P4M CMOS image sensor technology. Each pixel, with a size of 12 μm × 12 μm, has one pinned photodiode with the dual potential structure, 12 transistors and 3 capacitors to implement three-parallel-output active pixel circuits. The fundamental operation of the implemented TC sensor is demonstrated.
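The three correlation outputs behave like a three-bucket lock-in: for modulated light I(t) = B + A·cos(ωt + φ), correlating against three references spaced 120° apart lets the amplitude and phase be recovered while the DC offset cancels. A minimal numeric sketch of that inversion (not the pixel circuit itself):

```python
import math

N = 360                       # samples over exactly one modulation period
A, B = 2.0, 5.0               # modulation amplitude and DC offset
PHI_TRUE = 0.7                # true phase of the incident light (radians)

# Three correlations c_k between the intensity and references 120° apart
c = [0.0, 0.0, 0.0]
for n in range(N):
    wt = 2 * math.pi * n / N
    intensity = B + A * math.cos(wt + PHI_TRUE)
    for k in range(3):
        c[k] += intensity * math.cos(wt - 2 * math.pi * k / 3)

# Three-bucket inversion: the DC offset B cancels in these combinations,
# since c_k = (A*N/2) * cos(phi + 2*pi*k/3)
U = 2 * c[0] - c[1] - c[2]            # proportional to cos(phi)
V = math.sqrt(3) * (c[2] - c[1])      # proportional to sin(phi)
phase = math.atan2(V, U)
amplitude = 2 * math.hypot(U, V) / (3 * N)
print(round(phase, 4), round(amplitude, 4))
```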

  4. Study of CT-based positron range correction in high resolution 3D PET imaging

    Energy Technology Data Exchange (ETDEWEB)

    Cal-Gonzalez, J., E-mail: jacobo@nuclear.fis.ucm.es [Grupo de Fisica Nuclear, Dpto. Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain); Herraiz, J.L. [Grupo de Fisica Nuclear, Dpto. Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain); Espana, S. [Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Vicente, E. [Grupo de Fisica Nuclear, Dpto. Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain); Instituto de Estructura de la Materia, Consejo Superior de Investigaciones Cientificas (CSIC), Madrid (Spain); Herranz, E. [Grupo de Fisica Nuclear, Dpto. Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain); Desco, M. [Unidad de Medicina y Cirugia Experimental, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Vaquero, J.J. [Dpto. de Bioingenieria e Ingenieria Espacial, Universidad Carlos III, Madrid (Spain); Udias, J.M. [Grupo de Fisica Nuclear, Dpto. Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain)

    2011-08-21

    Positron range limits the spatial resolution of PET images and has a different effect for different isotopes and positron propagation materials. Therefore it is important to consider it during image reconstruction, in order to obtain optimal image quality. Positron range distributions for the most common isotopes used in PET were computed, for different materials, using Monte Carlo simulations with PeneloPET. The range profiles were introduced into the 3D OSEM image reconstruction software FIRST and employed to blur the image either in the forward projection only or in both the forward and backward projections. The blurring introduced takes into account the different materials in which the positron propagates; information on these materials may be obtained, for instance, from a segmentation of a CT image. The results of introducing positron blurring in both the forward and backward projection operations were compared to using it only during forward projection. Further, the effect of different shapes of the positron range profile on the quality of the reconstructed images with positron range correction was studied. For high positron energy isotopes, the reconstructed images show significant improvement in spatial resolution when positron range is taken into account during reconstruction, compared to reconstructions without positron range modeling.
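The positron-range blurring applied in the forward projection amounts to convolving the activity image with an isotope- and material-dependent range kernel before projecting it. A 1D toy sketch (the kernels are made up for illustration, not PeneloPET output, and the "projection" is reduced to a single line-of-response sum):

```python
def convolve(image, kernel):
    # Symmetric 1D convolution with zero padding; kernel length must be odd
    half = len(kernel) // 2
    out = []
    for i in range(len(image)):
        acc = 0.0
        for j, k in enumerate(kernel):
            idx = i + j - half
            if 0 <= idx < len(image):
                acc += image[idx] * k
        out.append(acc)
    return out

def forward_project(image):
    # Trivial "projection": total counts along a single line of response
    return sum(image)

activity = [0, 0, 0, 10, 0, 0, 0]                       # point source
f18_kernel = [0.05, 0.2, 0.5, 0.2, 0.05]                # short range (made up)
ga68_kernel = [0.04, 0.1, 0.16, 0.4, 0.16, 0.1, 0.04]   # longer range (made up)

blurred_f18 = convolve(activity, f18_kernel)
blurred_ga68 = convolve(activity, ga68_kernel)

# Normalized kernels preserve total counts; the Ga-68-like image is wider
print(blurred_f18)
print(blurred_ga68)
print(forward_project(blurred_f18), forward_project(blurred_ga68))
```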

  5. Indoor Pedestrian Navigation Using Foot-Mounted IMU and Portable Ultrasound Range Sensors

    NARCIS (Netherlands)

    Girard, G.; Cote, S.; Zlatanova, S.; Barette, Y.; St-Pierre, J.; Van Oosterom, P.J.M.

    2011-01-01

    Many solutions have been proposed for indoor pedestrian navigation. Some rely on pre-installed sensor networks, which offer good accuracy but are limited to areas that have been prepared for that purpose, thus requiring an expensive and possibly time-consuming process. Such methods are therefore

  6. A novel capacitive detection principle for Coriolis mass flow sensors enabling range/sensitivity tuning

    NARCIS (Netherlands)

    Alveringh, Dennis; Groenesteijn, Jarno; Ma, Kechun; Wiegerink, Remco J.; Lötters, Joost Conrad

    2015-01-01

    We report on a novel capacitive detection principle for Coriolis mass flow sensors which allows for one order of magnitude increased sensitivity. The detection principle consists of two pairs of comb-structures: one pair produces two signals with a phase shift directly dependent on the mass flow,

  7. Image Quality Assessment of a CMOS/Gd2O2S:Pr,Ce,F X-Ray Sensor

    Directory of Open Access Journals (Sweden)

    Christos Michail

    2015-01-01

    Full Text Available The aim of the present study was to examine the image quality performance of a CMOS digital imaging optical sensor coupled to custom-made gadolinium oxysulfide powder scintillators doped with praseodymium, cerium, and fluorine (Gd2O2S:Pr,Ce,F). The screens, with coating thicknesses of 35.7 and 71.2 mg/cm², were prepared in our laboratory from Gd2O2S:Pr,Ce,F powder (Phosphor Technology, Ltd.) by sedimentation on silica substrates and were placed in direct contact with the optical sensor. Image quality was determined through a single-index parameter (information capacity, IC) and spatial-frequency-dependent parameters, by assessing the Modulation Transfer Function (MTF) and the Normalized Noise Power Spectrum (NNPS). The MTF was measured using the slanted-edge method. The CMOS sensor/Gd2O2S:Pr,Ce,F screen combinations were irradiated under the RQA-5 (IEC 62220-1) beam quality. The detector response function was linear over the exposure range under investigation. Under general radiography conditions, both Gd2O2S:Pr,Ce,F screen/CMOS combinations exhibited moderate imaging properties, in terms of IC, compared with previously published scintillators such as CsI:Tl, Gd2O2S:Tb, and Gd2O2S:Eu.
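The slanted-edge MTF measurement reduces, in essence, to differentiating the edge spread function (ESF) into a line spread function (LSF) and taking the normalized magnitude of its Fourier transform. A 1D sketch with a synthetic blurred edge (the real method also projects the slanted edge across rows to build an oversampled ESF, which is omitted here):

```python
import cmath

# Synthetic edge spread function: a blurred step (made-up detector response)
esf = [0.0, 0.0, 0.0, 0.05, 0.2, 0.5, 0.8, 0.95, 1.0, 1.0, 1.0, 1.0]

# Line spread function = discrete derivative of the ESF
lsf = [esf[i + 1] - esf[i] for i in range(len(esf) - 1)]

def mtf(lsf, f_cycles):
    # Normalized DFT magnitude of the LSF at f_cycles cycles per record
    n = len(lsf)
    val = sum(lsf[k] * cmath.exp(-2j * cmath.pi * f_cycles * k / n)
              for k in range(n))
    dc = sum(lsf)
    return abs(val) / abs(dc)

# MTF is 1 at zero frequency and falls off with increasing spatial frequency
for f in range(4):
    print(f, round(mtf(lsf, f), 3))
```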

  8. Sensitivity-Improved Strain Sensor over a Large Range of Temperatures Using an Etched and Regenerated Fiber Bragg Grating

    Directory of Open Access Journals (Sweden)

    Yupeng Wang

    2014-10-01

    Full Text Available A sensitivity-improved fiber-optic strain sensor using an etched and regenerated fiber Bragg grating (ER-FBG), suitable for a large range of temperature measurements, has been proposed and experimentally demonstrated. The process of chemical etching (from 125 µm to 60 µm) provides regenerated gratings (at a temperature of 680 °C) with a stronger reflective intensity (from 43.7% to 69.8%), together with an improved and linear strain sensitivity (from 0.9 pm/με to 4.5 pm/με) over a large temperature range (from room temperature to 800 °C), making it a useful strain sensor for high temperature environments.
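Converting a measured Bragg-wavelength shift into strain with the reported sensitivities is a one-line calculation; a sketch using the sensitivity figures from the abstract (the 450 pm shift is an arbitrary example value):

```python
# Strain (in microstrain) from a Bragg wavelength shift: eps = d_lambda / k
def wavelength_shift_to_strain(shift_pm, sensitivity_pm_per_ue):
    return shift_pm / sensitivity_pm_per_ue

SHIFT_PM = 450.0   # example measured Bragg wavelength shift, picometers

# Standard FBG vs. the etched/regenerated FBG (sensitivities from the abstract)
strain_standard = wavelength_shift_to_strain(SHIFT_PM, 0.9)   # 0.9 pm/ue
strain_er_fbg = wavelength_shift_to_strain(SHIFT_PM, 4.5)     # 4.5 pm/ue

# The same shift corresponds to 5x less strain on the more sensitive grating
print(strain_standard, strain_er_fbg)
```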

  9. Surface Plasmon Resonance sensor showing enhanced sensitivity for CO2 detection in the mid-infrared range.

    Science.gov (United States)

    Herminjard, Sylvain; Sirigu, Lorenzo; Herzig, Hans Peter; Studemann, Eric; Crottini, Andrea; Pellaux, Jean-Paul; Gresch, Tobias; Fischer, Milan; Faist, Jérôme

    2009-01-05

    We present the first optical sensor based on Surface Plasmon Resonance (SPR) operating in the mid-infrared range. The experimental setup is based on a Kretschmann geometry with Ti/Au layers deposited on a CaF2 prism, where light excitation is provided by a Quantum Cascade Laser (QCL) source. Evidence of SPR is presented and the sensing capability of the system is demonstrated by using CO2 and N2 mixtures as test samples. Due to the absorption of CO2 at this wavelength, it is shown that the sensitivity of this configuration is five times higher than a similar SPR sensor operating in the visible range of the spectrum.

  10. Study on enhancing dynamic range of CCD imaging based on digital micro-mirror device

    Science.gov (United States)

    Zhou, Wang

    2009-05-01

    A design using a digital micro-mirror device (DMD) as a spatial light modulator (SLM) for an area-array CCD is proposed in this paper. It addresses the problem of exposing high-contrast scenes with an ordinary CCD camera, where images appear over- or under-exposed and details of the photo are lost. The method uses a forecast image of the scene, from which exposure regions and exposure times for the CCD are purposely planned. Through the modulation function of the DMD micro-mirrors, the CCD is exposed by sub-region and by time-sharing, while a purposely designed image data structure enhances the dynamic range of the area CCD. Experiments show that this method not only improves the visible quality of an image, with clear details in backlit or highlighted areas, but also enhances the dynamic range of the image data. High-quality images and high-dynamic-range data are captured in real time, so that separate "fusion" software is no longer required.

  11. Predicted image quality of a CMOS APS X-ray detector across a range of mammographic beam qualities

    Science.gov (United States)

    Konstantinidis, A.

    2015-09-01

    Digital X-ray detectors based on Complementary Metal-Oxide-Semiconductor (CMOS) Active Pixel Sensor (APS) technology were introduced in the early 2000s for medical imaging applications. In a previous study the X-ray performance (i.e. presampling Modulation Transfer Function (pMTF), Normalized Noise Power Spectrum (NNPS), Signal-to-Noise Ratio (SNR) and Detective Quantum Efficiency (DQE)) of the Dexela 2923MAM CMOS APS X-ray detector was evaluated within the mammographic energy range using monochromatic synchrotron radiation (17-35 keV). In this study image simulation was used to predict how the mammographic beam quality affects image quality. In particular, the experimentally measured monochromatic pMTF, NNPS and SNR parameters were combined with various mammographic spectral shapes (Molybdenum/Molybdenum (Mo/Mo), Rhodium/Rhodium (Rh/Rh), Tungsten/Aluminium (W/Al) and Tungsten/Rhodium (W/Rh) anode/filtration combinations at 28 kV). The image quality was measured in terms of Contrast-to-Noise Ratio (CNR) using a synthetic breast phantom (4 cm thick with 50% glandularity). The results can be used to optimize the imaging conditions in order to minimize the patient's Mean Glandular Dose (MGD).
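The CNR figure of merit used here is conventionally computed from the mean signal in a target region, the mean signal in a background region, and the background noise. A minimal sketch with made-up pixel values (the study's exact ROI definitions are not reproduced):

```python
import math

def mean(xs):
    return sum(xs) / len(xs)

def std(xs):
    m = mean(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))

def cnr(target_roi, background_roi):
    # Contrast-to-Noise Ratio: signal difference over background noise
    return abs(mean(target_roi) - mean(background_roi)) / std(background_roi)

# Made-up pixel values from a simulated lesion and a background region
lesion = [120, 118, 122, 121, 119, 120]
background = [100, 102, 98, 101, 99, 100]

print(round(cnr(lesion, background), 2))
```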

  12. The application of camera calibration in range-gated 3D imaging technology

    Science.gov (United States)

    Liu, Xiao-quan; Wang, Xian-wei; Zhou, Yan

    2013-09-01

    Range-gated laser imaging technology was proposed in 1966 by LF Gillespie in the U.S. Army Night Vision Laboratory (NVL). Using a pulsed laser and an intensified charge-coupled device (ICCD) as light source and detector respectively, range-gated laser imaging can realize space-slice imaging while suppressing atmospheric backscatter, and in turn detect the target effectively, by controlling the delay between the laser pulse and the strobe. Owing to the constraints on the development of key components such as narrow-pulse lasers and gated imaging devices, research progressed slowly over the following decades. Since the beginning of this century, as the hardware technology has matured, the technique has developed rapidly in fields such as night vision, underwater imaging, biomedical imaging and three-dimensional imaging, especially range-gated three-dimensional (3-D) laser imaging aimed at acquiring target spatial information. 3-D reconstruction is the process of restoring the visible surface geometry of 3-D objects from two-dimensional (2-D) images. Range-gated laser imaging can achieve gated imaging of a slice of space to form a slice image, and in turn provide the distance information corresponding to that slice. But to invert this into 3-D spatial information, the imaging field of view of the system, that is, its focal length, must be known. Then, based on the distance information of the space slice, the spatial information of each unit of space corresponding to each pixel can be recovered. Camera calibration is an indispensable step in 3-D reconstruction, covering both the internal (intrinsic) camera parameters and the external (extrinsic) parameters. In order to meet the technical requirements of range-gated 3-D imaging, this paper studies the calibration of the zoom lens system. After summarizing camera calibration techniques comprehensively, a classic calibration method based on line is

  13. Gamma-ray irradiation tests of CMOS sensors used in imaging techniques

    Directory of Open Access Journals (Sweden)

    Cappello Salvatore G.

    2014-01-01

    Full Text Available Technologically enhanced electronic image sensors are used in various fields, such as diagnostic techniques in medicine or space applications. In the latter case the devices can be exposed to intense radiation fluxes over time, which may impair the functioning of the equipment. In this paper we report the results of gamma-ray irradiation tests on CMOS image sensors simulating space radiation over a long time period. The gamma-ray irradiation tests were carried out by means of the IGS-3 gamma irradiation facility of Palermo University, based on 60Co sources with different activities. To reduce the dose rate and realize a narrow gamma-ray beam, a lead collimation system was purposely built. It permits dose rates below 10 mGy/s and allows the CMOS image sensors to be irradiated during operation. The total ionizing dose to the CMOS image sensors was monitored in situ, during irradiation, up to 1000 Gy, and images were acquired every 25 Gy. At the end of the tests, the sensors continued to operate despite increased background noise and some completely saturated pixels. These effects, however, involve isolated pixels and therefore should not affect the image quality.

  14. Image sensor for security applications with on-chip data authentication

    Science.gov (United States)

    Stifter, P.; Eberhardt, K.; Erni, A.; Hofmann, K.

    2006-04-01

    Sensors in a networked environment which are used for security applications could be jeopardized by man-in-the-middle or address spoofing attacks. Such attacks can be thwarted by authenticating and securely transmitting the sensor's data stream, fusing the image sensor with the necessary digital encryption and authentication circuit, which fulfils the three standard requirements of cryptography: data integrity, confidentiality and non-repudiation. This paper presents the development done by AIM, which led to the unique sensor SECVGA, a high-performance monochrome (B/W) CMOS active pixel image sensor. The device captures still and motion images with a resolution of 800x600 active pixels and converts them into a digital data stream. Beyond standard imaging, the on-chip cryptographic engine can authenticate the sensor to the host, based on a one-way challenge/response protocol. The realized protocol uses the exchange of a session key to secure the subsequent video data transmission. To achieve this, a cryptographic checksum derived from a message authentication code (MAC) is calculated for a complete image frame. The imager is equipped with an EEPROM giving it the capability to be personalized with a unique and unchangeable identity. A two-wire I2C-compatible serial interface allows the functions of the imager to be programmed, i.e. the various operating modes, including the authentication procedure, the control of integration time, sub-frames and the frame rate.

  15. Multi-Sensor Image Fusion for Target Recognition in the Environment of Network Decision Support Systems

    Science.gov (United States)

    2015-12-01

    Indexed fragments only (no abstract available): E. Liggins, David L. Hall, Handbook of Multisensor Data Fusion - Theory and Practice, 2nd ed. Boca Raton, Florida: CRC Press, 2009; "multi-spectral image fusion of thermal and visual images for target recognition yielded the best classification"; Speeded-Up Robust Features (SURF); Multi-Sensor Data Fusion.

  16. A new methodology for in-flight radiometric calibration of the MIVIS imaging sensor

    Directory of Open Access Journals (Sweden)

    G. Lechi

    2006-06-01

    Full Text Available Sensor radiometric calibration is of great importance in computing physical radiance values of the investigated targets, but airborne scanners are often not equipped with any in-flight radiometric calibration facility. Consequently, the radiometric calibration of airborne systems usually relies only on pre-flight and vicarious calibration or on indirect approaches. This paper introduces an experimental approach that makes use of on-board calibration techniques to perform the radiometric calibration of the CNR's MIVIS (Multispectral Infrared and Visible Imaging Spectrometer) airborne scanner. The approach relies on an experimental optical test bench originally designed at Politecnico di Milano (Italy), called the MIVIS Flying Test Bench (MFTB), to perform the first On-The-Fly (OTF) calibration of the MIVIS reflective spectral bands (ranging from 430 nm to 2500 nm). The main task of this study is to estimate how large the effects introduced by aircraft motion (e.g., e.m. noise or vibrations) and by environmental conditions (e.g., ambient temperature) are on the radiance values measured by the MIVIS sensor during flight. Analysis of the results seems to point out limitations of the traditional radiometric calibration methodology based only on pre-flight approaches, with important implications for data quality assessment.

  17. VLC-Based Positioning System for an Indoor Environment Using an Image Sensor and an Accelerometer Sensor.

    Science.gov (United States)

    Huynh, Phat; Yoo, Myungsik

    2016-05-28

    It is expected that conventional lighting will be replaced by high-power LEDs, which are core components of visible light communication (VLC) systems. In this paper, taking advantage of VLC, we propose a novel design for an indoor positioning system using LEDs, an image sensor (IS) and an accelerometer sensor (AS) from mobile devices. The proposed system, which provides high-precision indoor positioning, consists of four LEDs mounted on the ceiling transmitting their own three-dimensional (3D) world coordinates and an IS at an unknown position receiving and demodulating the signals. Based on the 3D world coordinates and the 2D image coordinates of the LEDs, the position of the mobile device is determined. Compared to existing algorithms, the proposed algorithm requires only one IS. In addition, by using an AS, the mobile device is allowed to have arbitrary orientation. Last but not least, a mechanism for reducing image sensor noise is proposed to further improve the accuracy of the positioning algorithm. A simulation is conducted to verify the performance of the proposed algorithm.
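The geometric core of such a system can be sketched under strong simplifying assumptions: a level camera (as the accelerometer would report), a known ceiling height, and LED world coordinates decoded from the VLC signal. Each LED at height Z projects to u = f·(X − x)/Z, so the camera position follows by inverting per LED and averaging. This is an illustrative pinhole-model stand-in, not the paper's full algorithm with arbitrary orientation:

```python
# Simplified VLC positioning sketch: camera looks straight up, LEDs at known
# world coordinates on a ceiling at known height Z above the camera.
F = 800.0      # focal length in pixels (assumed known from calibration)
Z = 2.5        # ceiling height above the camera, meters

# Known LED world coordinates (x, y), decoded from the VLC signal
leds = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]

def project(led, cam):
    # Pinhole projection of an LED into image coordinates (u, v)
    u = F * (led[0] - cam[0]) / Z
    v = F * (led[1] - cam[1]) / Z
    return u, v

true_cam = (0.4, 0.7)
observed = [project(led, true_cam) for led in leds]

# Invert the model per LED and average (least squares for this linear case)
est_x = sum(led[0] - u * Z / F for led, (u, v) in zip(leds, observed)) / len(leds)
est_y = sum(led[1] - v * Z / F for led, (u, v) in zip(leds, observed)) / len(leds)
print(round(est_x, 3), round(est_y, 3))
```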

  18. Pesticide residue quantification analysis by hyperspectral imaging sensors

    Science.gov (United States)

    Liao, Yuan-Hsun; Lo, Wei-Sheng; Guo, Horng-Yuh; Kao, Ching-Hua; Chou, Tau-Meu; Chen, Junne-Jih; Wen, Chia-Hsien; Lin, Chinsu; Chen, Hsian-Min; Ouyang, Yen-Chieh; Wu, Chao-Cheng; Chen, Shih-Yu; Chang, Chein-I.

    2015-05-01

    Pesticide residue detection in agricultural crops is a challenging issue, and it is even more difficult to quantify pesticide residues in agricultural produce and fruits. This paper conducts a series of baseline experiments particularly designed for three specific pesticides commonly used in Taiwan. The materials used for the experiments are single leaves of vegetable produce contaminated with various concentrations of pesticides. Two sensors are used to collect data. One is Fourier Transform Infrared (FTIR) spectroscopy. The other is a hyperspectral sensor, the Geophysical and Environmental Research (GER) 2600 spectroradiometer, a battery-operated field-portable spectroradiometer with full real-time data acquisition from 350 nm to 2500 nm. In order to quantify data with different levels of pesticide residue concentration, several measures for spectral discrimination are developed. More specifically, new measures for calculating relative power between the two sensors are designed to evaluate the effectiveness of each sensor in quantifying the used pesticide residues. The experimental results show that the GER is a better sensor than FTIR for pesticide residue quantification.

  19. Data Transfer for Multiple Sensor Networks Over a Broad Temperature Range

    Science.gov (United States)

    Krasowski, Michael

    2013-01-01

    At extreme temperatures, cryogenic and above 300 °C, few electronic components are available to support intelligent data transfer over a common, linear combining medium. This innovation allows many sensors to operate on the same wire bus (or on the same airwaves or optical channel: any linearly combining medium), transmitting simultaneously but individually recoverable at a node in a cooler part of the test area. The innovation has been demonstrated using room-temperature silicon microcircuits as a proxy; the microcircuits have analog functionality comparable to componentry designed in silicon carbide. Given a common, linearly combining medium, multiple sending units may transmit information simultaneously. A listening node, using various techniques, can pick out the signal from a single sender if it has unique qualities, e.g. a voice. The problem being solved is commonly referred to as the cocktail party problem: the human brain uses the cocktail party effect when it is able to recognize and follow a single conversation in a party full of talkers and other noise sources. High-temperature sensors have been used in silicon carbide electronic oscillator circuits, where the frequency of the oscillator changes as a function of the sensed parameter, such as pressure. This change is analogous to changes in the pitch of a person's voice. The output of this oscillator and many others may be superimposed onto a single medium. This medium may be the power lines supplying current to the sensors, a third wire dedicated to data transmission, the airwaves through radio transmission, an optical medium, etc. However, with nothing to distinguish the identity of each source (that is, without source separation), such a system is useless. Using digital electronic functions, unique codes or patterns are created and used to modulate the output of each sensor.
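The "unique codes" that make the shared medium usable can be illustrated with orthogonal spreading codes: each sensor multiplies its value by its own ±1 chip sequence, the medium sums everything chip by chip, and the listener recovers one sender by correlating with that sender's code. A minimal sketch using length-4 Walsh codes (an illustration of the principle, not the specific silicon-carbide circuit):

```python
# Orthogonal Walsh codes of length 4: one per sensor
codes = [
    [1, 1, 1, 1],
    [1, -1, 1, -1],
    [1, 1, -1, -1],
    [1, -1, -1, 1],
]

# Each sensor's measurement (e.g., oscillator-derived pressure readings)
values = [3.0, -1.5, 0.25, 7.0]

# The linearly combining medium: all spread signals superimpose chip by chip
bus = [sum(v * code[chip] for v, code in zip(values, codes))
       for chip in range(4)]

def recover(bus, code):
    # Correlate the summed signal with one sensor's code; orthogonality
    # cancels every other sender, isolating this sensor's value
    return sum(b * c for b, c in zip(bus, code)) / len(code)

for i, code in enumerate(codes):
    print(f"sensor {i}: {recover(bus, code)}")
```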

  20. Imaging objects behind a partially reflective surface with a modified time-of-flight sensor

    Science.gov (United States)

    Geerardyn, D.; Kuijk, M.

    2014-05-01

    Time-of-Flight (ToF) methods are used for depth measurements in different applications. There are mainly two types of ToF measurement: pulsed Time-of-Flight and continuous-wave Time-of-Flight. Pulsed Time-of-Flight (PToF) techniques are mostly used in combination with a scanning mirror, which makes them less suited for imaging purposes. Continuous-wave Time-of-Flight (CWToF) techniques are mostly used wide-field; hence they are much faster and better suited for imaging, but cannot be used behind partially reflective surfaces. In commercial applications, both ToF methods require specific hardware, which cannot be exchanged. In this paper, we discuss the transformation of a CWToF sensor into a PToF camera that is able to image and measure the distances of objects behind a partially reflective surface, like the air-water interface in swimming pools when looking from above. We first created our own depth camera suitable for both CWToF and PToF. We describe the hardware components needed for a normal ToF camera and compare them with the adapted components that make it a range-gating depth imager. Afterwards, we modeled the distances and images of one or more objects positioned behind a partially reflective surface and combined them with measurement data of the optical pulse. A scene was virtualized and the rays from a raytracing software tool were exported to Matlab™. Subsequently, pulse deformations were calculated for every pixel, which resulted in the calculation of the depth information.
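Pulsed ToF depth recovery, and the range gating that lets the camera ignore the first return from the partially reflective surface, can be sketched as depth = c·t/2, keeping only echoes that arrive after the gate delay. The numbers are illustrative, and refraction at the interface is ignored:

```python
C = 299_792_458.0            # speed of light, m/s

def echo_time_to_depth(t_seconds):
    # Round-trip time: the pulse travels to the object and back
    return C * t_seconds / 2

# Illustrative echo arrival times for one pixel: the first return comes from
# the partially reflective surface, the second from the object behind it
echo_times = [6.67e-9, 16.7e-9]

gate_delay = 10e-9           # open the shutter only after the surface return
gated = [t for t in echo_times if t >= gate_delay]

# Only the object behind the surface remains after gating
depths = [round(echo_time_to_depth(t), 2) for t in gated]
print(depths)
```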

  1. Wireless wearable range-of-motion sensor system for upper and lower extremity joints: a validation study

    Science.gov (United States)

    Kumar, Yogaprakash; Yen, Shih-Cheng; Lee, Wangwei; Gao, Fan; Zhao, Ziyi; Li, Jingze; Hon, Benjamin; Tian-Ma Xu, Tim; Cheong, Angela; Koh, Karen; Ng, Yee-Sien; Chew, Effie; Koh, Gerald

    2015-01-01

    Range-of-motion (ROM) assessment is a critical assessment tool during the rehabilitation process. The conventional approach uses the goniometer, which remains the most reliable instrument, but it is usually time-consuming and subject to both intra- and inter-therapist measurement errors. An automated wireless wearable sensor system for the measurement of ROM has previously been developed by the current authors. Presented are the correlation and accuracy of the automated wireless wearable sensor system against a goniometer in measuring ROM in the major joints of the upper (UEs) and lower extremities (LEs) in 19 healthy subjects and 20 newly disabled inpatients, through intra-subject comparison of ROM assessments between the sensor system and goniometer measurements by physical therapists. In healthy subjects, ROM measurements using the new sensor system were highly correlated with goniometry, with 95% of the differences being < 20° and 25° for most movements in the major joints of the UE and LE, respectively. PMID:26609398

  2. AROSICS: An Automated and Robust Open-Source Image Co-Registration Software for Multi-Sensor Satellite Data

    Directory of Open Access Journals (Sweden)

    Daniel Scheffler

    2017-07-01

    Full Text Available Geospatial co-registration is a mandatory prerequisite when dealing with remote sensing data. Inter- or intra-sensoral misregistration will negatively affect any subsequent image analysis, specifically when processing multi-sensoral or multi-temporal data. In recent decades, many algorithms have been developed to enable manual, semi- or fully automatic displacement correction. Especially in the context of big data processing and the development of automated processing chains that aim to be applicable to different remote sensing systems, there is a strong need for efficient, accurate and generally usable co-registration. Here, we present AROSICS (Automated and Robust Open-Source Image Co-Registration Software), a Python-based open-source software including an easy-to-use user interface for automatic detection and correction of sub-pixel misalignments between various remote sensing datasets. It is independent of spatial or spectral characteristics and robust against high degrees of cloud coverage and spectral and temporal land cover dynamics. The co-registration is based on phase correlation for sub-pixel shift estimation in the frequency domain, utilizing the Fourier shift theorem in a moving-window manner. A dense grid of spatial shift vectors can be created and automatically filtered by combining various validation and quality estimation metrics. Additionally, the software supports the masking of, e.g., clouds and cloud shadows to exclude such areas from spatial shift detection. The software has been tested on more than 9000 satellite images acquired by different sensors. The results are evaluated exemplarily for two inter-sensoral and two intra-sensoral use cases and show registration results in the sub-pixel range, with root mean square errors around 0.3 pixels or better.
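The core of the shift estimation is phase correlation: the normalized cross-power spectrum of two image windows is inverse-transformed, and the correlation peak gives the displacement. A 1D pure-Python sketch with a small direct DFT, recovering an integer shift (AROSICS operates on 2D windows and refines the peak to sub-pixel precision, which is omitted here):

```python
import cmath

def dft(x, sign=-1):
    # Direct O(n^2) DFT; sign=-1 forward, sign=+1 inverse (unnormalized)
    n = len(x)
    return [sum(x[k] * cmath.exp(sign * 2j * cmath.pi * f * k / n)
                for k in range(n)) for f in range(n)]

def phase_correlation_shift(a, b):
    # Normalized cross-power spectrum, then inverse DFT; peak index = shift
    fa, fb = dft(a), dft(b)
    cross = []
    for x, y in zip(fa, fb):
        p = x.conjugate() * y
        cross.append(p / abs(p) if abs(p) > 1e-12 else 0j)
    corr = dft(cross, sign=+1)
    peaks = [c.real for c in corr]
    return peaks.index(max(peaks))

signal = [0, 1, 4, 9, 4, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
shifted = signal[-3:] + signal[:-3]     # circular shift right by 3 samples

print(phase_correlation_shift(signal, shifted))
```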

  3. Revolution of Sensors in Micro-Electromechanical Systems

    Science.gov (United States)

    Esashi, Masayoshi

    2012-08-01

    Microsensors realized by micro-electromechanical systems (MEMS) technology play a key role as the input devices of systems. In this report, the following sensors are reviewed: piezoresistive and capacitive pressure sensors, surface acoustic wave (SAW) wireless pressure sensors, tactile sensor networks for robots, accelerometers, angular velocity sensors (gyroscopes), range image sensors using optical scanners, infrared imagers, chemical sensing systems such as Fourier transform infrared (FTIR) spectroscopy and gas chromatography, flow sensors for fluids, and medical sensors such as ultrafine optical-fiber blood pressure sensors and implantable pressure sensors.

  4. Video image processing to create a speed sensor

    Science.gov (United States)

    1999-11-01

    Image processing has been applied to traffic analysis in recent years, with different goals. In the report, a new approach is presented for extracting vehicular speed information, given a sequence of real-time traffic images. We extract moving edges ...

  5. A novel design of subminiature star sensor's imaging system based on TMS320DM3730

    Science.gov (United States)

    Liu, Meiying; Wang, Hu; Wen, Desheng; Yang, Shaodong

    2017-02-01

    Development of next-generation star sensors is tending toward miniaturization, low cost and low power consumption, so imaging systems based on FPGAs alone can no longer meet these requirements. A novel design of a digital imaging system is discussed in this paper. Working from the MT9P031 CMOS image sensor's timing sequence and operating modes, the sensor driving circuit and image data memory circuit were implemented around the TMS320DM3730 main control unit. To keep the hardware small and light, a miniaturized design was adopted. Software simulation and experimental results demonstrated that the imaging system design is sound: tunable integration time and selectable window readout modes were realized, and communication with the host computer was exact. The system offers powerful image processing in a compact, stable, reliable and low-power package: the volume is 40 mm × 40 mm × 40 mm, the weight is 105 g, and the power consumption is below 1 W. This design provides a feasible solution for the imaging system of a subminiature star sensor.

  6. Construction, imaging, and analysis of FRET-based tension sensors in living cells.

    Science.gov (United States)

    LaCroix, Andrew S; Rothenberg, Katheryn E; Berginski, Matthew E; Urs, Aarti N; Hoffman, Brenton D

    2015-01-01

    Due to an increased appreciation for the importance of mechanical stimuli in many biological contexts, an interest in measuring the forces experienced by specific proteins in living cells has recently emerged. The development and use of Förster resonance energy transfer (FRET)-based molecular tension sensors has enabled these types of studies and led to important insights into the mechanisms that cells utilize to probe and respond to the mechanical nature of their surrounding environment. The process for creating and utilizing FRET-based tension sensors can be divided into three main parts: construction, imaging, and analysis. First, we review several methods for the construction of genetically encoded FRET-based tension sensors, including restriction enzyme-based methods as well as the more recently developed overlap extension or Gibson Assembly protocols. Next, we discuss the intricacies associated with imaging tension sensors, including optimizing imaging parameters as well as common techniques for estimating artifacts within standard imaging systems. Then, we detail the analysis of such data and describe how to extract useful information from a FRET experiment. Finally, we provide a discussion on identifying and correcting common artifacts in the imaging of FRET-based tension sensors. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. Experiment on digital CDS with 33-M pixel 120-fps super hi-vision image sensor

    Science.gov (United States)

    Yonai, J.; Yasue, T.; Kitamura, K.; Hayashida, T.; Watabe, T.; Shimamoto, H.; Kawahito, S.

    2014-03-01

    We have developed a CMOS image sensor with 33 million pixels and 120 frames per second (fps) for Super Hi-Vision (SHV: the 8K version of UHDTV). Fixed-pattern noise (FPN) in CMOS image sensors can be reduced by digital correlated double sampling (digital CDS), but digital CDS requires high-speed analog-to-digital conversion and has therefore not been applicable to conventional UHDTV image sensors, which are limited in speed. Our image sensor, on the other hand, has a very fast analog-to-digital converter (ADC) using a "two-stage cyclic ADC" architecture capable of being driven at 120 fps, double the normal frame rate for TV. In this experiment, we performed digital CDS using this high-frame-rate UHDTV image sensor. By reading the same row twice at 120 fps and subtracting dark pixel signals from accumulated pixel signals, we obtained a 60-fps-equivalent video signal with digital noise reduction. The results showed that the VFPN was effectively reduced from 24.25 e-rms to 0.43 e-rms.
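
    The digital CDS scheme described — reading the same row twice at 120 fps and subtracting the dark (reset) read from the accumulated read — reduces to a per-pixel subtraction that cancels any offset common to both reads. A toy NumPy sketch (array sizes and values are illustrative, not the sensor's):

    ```python
    import numpy as np

    def digital_cds(reset_frame, signal_frame):
        """Digital correlated double sampling: subtract the per-pixel
        reset (dark) read-out from the accumulated signal read-out,
        cancelling fixed-pattern offsets common to both reads."""
        return signal_frame.astype(np.int32) - reset_frame.astype(np.int32)

    # Two consecutive 120-fps reads of the same row combine into one
    # FPN-free 60-fps-equivalent output frame.
    rng = np.random.default_rng(1)
    fpn = rng.integers(0, 20, size=(4, 6))      # fixed-pattern offset per pixel
    scene = rng.integers(0, 200, size=(4, 6))   # photo-generated signal
    reset_read = fpn                            # first read: dark level only
    signal_read = scene + fpn                   # second read: signal + offset
    frame = digital_cds(reset_read, signal_read)
    ```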

  8. A bio-image sensor for simultaneous detection of multi-neurotransmitters.

    Science.gov (United States)

    Lee, You-Na; Okumura, Koichi; Horio, Tomoko; Iwata, Tatsuya; Takahashi, Kazuhiro; Hattori, Toshiaki; Sawada, Kazuaki

    2018-03-01

    We report here a new bio-image sensor for simultaneous detection of the spatial and temporal distribution of multiple neurotransmitters. It consists of multiple enzyme-immobilized membranes on a 128 × 128 pixel array with read-out circuit. Apyrase and acetylcholinesterase (AChE), as selective elements, are used to recognize adenosine 5'-triphosphate (ATP) and acetylcholine (ACh), respectively. To enhance the spatial resolution, hydrogen ion (H+) diffusion barrier layers are deposited on top of the bio-image sensor, and their prevention capability is demonstrated. The results are used to design the spacing among enzyme-immobilized pixels and the null H+ sensor to minimize the undesired signal overlap caused by H+ diffusion. Using this bio-image sensor, we can obtain H+ diffusion-independent imaging of concentration gradients of ATP and ACh in real time. The sensing characteristics, such as sensitivity and limit of detection, are determined experimentally. The proposed bio-image sensor opens the possibility of customizable monitoring of the activities of various neurochemicals by using different kinds of proton-consuming or proton-generating enzymes. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. An Ultrahigh-Resolution Digital Image Sensor with Pixel Size of 50 nm by Vertical Nanorod Arrays.

    Science.gov (United States)

    Jiang, Chengming; Song, Jinhui

    2015-07-01

    The pixel size limit of existing digital image sensors is successfully overcome by using vertically aligned semiconducting nanorods as the 3D photosensing pixels. On this basis, an unprecedentedly high-resolution digital image sensor with a pixel size of 50 nm and a resolution of 90 nm is fabricated. The ultrahigh-resolution digital image sensor can heavily impact the field of visual information. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Range-Gated LADAR Coherent Imaging Using Parametric Up-Conversion of IR and NIR Light for Imaging with a Visible-Range Fast-Shuttered Intensified Digital CCD Camera

    Energy Technology Data Exchange (ETDEWEB)

    YATES,GEORGE J.; MCDONALD,THOMAS E. JR.; BLISS,DAVID E.; CAMERON,STEWART M.; ZUTAVERN,FRED J.

    2000-12-20

    Research is presented on infrared (IR) and near-infrared (NIR) sensitive sensor technologies for use in a high-speed shuttered/intensified digital video camera system for range-gated imaging at ''eye-safe'' wavelengths in the region of 1.5 microns. The study is based upon nonlinear crystals used for second harmonic generation (SHG) in optical parametric oscillators (OPOs) for conversion of NIR and IR laser light to visible-range light for detection with generic S-20 photocathodes. The intensifiers are ''stripline''-geometry 18-mm-diameter microchannel plate intensifiers (MCPIIs), designed by Los Alamos National Laboratory and manufactured by Philips Photonics. The MCPIIs are designed for fast optical shuttering with exposures in the 100-200 ps range, and are coupled to a fast-readout CCD camera. Conversion efficiency and resolution for the wavelength conversion process are reported. Experimental set-ups for the wavelength shifting and the optical configurations for producing and transporting laser reflectance images are discussed.

  11. Novel Smart Pan/Tilt/Zoom Sensor for Launch Range Video Surveillance Project

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA has a pressing need for increasing the efficiency of launch range surveillance during mission launch operations. Difficulty in verifying a cleared range causes...

  12. Multiple image sensor data fusion through artificial neural networks

    Science.gov (United States)

    With multisensor data fusion technology, the data from multiple sensors are fused in order to make a more accurate estimation of the environment through measurement, processing and analysis. Artificial neural networks are the computational models that mimic biological neural networks. With high per...

  13. Electro-optic modulation methods in range-gated active imaging.

    Science.gov (United States)

    Chen, Zhen; Liu, Bo; Liu, Enhai; Peng, Zhangxian

    2016-01-20

    A time-resolved imaging method based on electro-optic modulation is proposed in this paper. To implement range resolution, two kinds of polarization-modulated methods are designed, and high spatial and range resolution can be achieved by the active imaging system. In the system, with polarization beam splitting, the incident light is split into two parts, one of which is modulated with a cos² function and the other with a sin² function. Afterward, a depth map can be obtained from the two simultaneously received images by dual electron-multiplying charge-coupled devices. Furthermore, an intensity image can also be obtained from the two images. Comparisons of the two polarization-modulated methods indicate that range accuracy is improved when the polarized light is modulated before beam splitting.
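
    A sketch of the demodulation implied by the abstract: since the two channels carry cos²- and sin²-modulated copies of the return, their sum recovers the intensity image and their ratio recovers the modulation phase, which maps to range. The linear phase-to-range mapping and the gate parameters below are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np

    def demodulate(i_cos, i_sin, theta_max=np.pi / 2, gate_depth_m=150.0):
        """Recover intensity and a depth map from the two simultaneously
        captured images of a cos^2 / sin^2 polarization-modulated gate.
        theta_max and gate_depth_m are illustrative assumptions."""
        intensity = i_cos + i_sin                    # cos^2 + sin^2 = 1
        theta = np.arctan2(np.sqrt(i_sin), np.sqrt(i_cos))
        depth = (theta / theta_max) * gate_depth_m   # assumed linear ramp
        return intensity, depth
    ```

    A pixel returning at modulation phase θ = 0.6 rad with intensity 100 is then recovered exactly from its two channel values.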

  14. Simulation and measurement of total ionizing dose radiation induced image lag increase in pinned photodiode CMOS image sensors

    Science.gov (United States)

    Liu, Jing; Chen, Wei; Wang, Zujun; Xue, Yuanyuan; Yao, Zhibin; He, Baoping; Ma, Wuying; Jin, Junshan; Sheng, Jiangkun; Dong, Guantao

    2017-06-01

    This paper presents an investigation of total ionizing dose (TID) induced image lag sources in pinned photodiode (PPD) CMOS image sensors, based on radiation experiments and TCAD simulation. The radiation experiments were carried out at a Cobalt-60 gamma-ray source. The experimental results show that image lag degradation becomes increasingly severe with increasing TID. Combined with the TCAD simulation results, we can confirm that the junction of the PPD and the transfer gate (TG) is an important region in the formation of image lag during irradiation. The simulations demonstrate that TID can generate a potential pocket leading to incomplete charge transfer.

  15. Simulation and measurement of total ionizing dose radiation induced image lag increase in pinned photodiode CMOS image sensors

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Jing [School of Materials Science and Engineering, Xiangtan University, Hunan (China); State Key Laboratory of Intense Pulsed Irradiation Simulation and Effect, Northwest Institute of Nuclear Technology, P.O.Box 69-10, Xi’an (China); Chen, Wei, E-mail: chenwei@nint.ac.cn [State Key Laboratory of Intense Pulsed Irradiation Simulation and Effect, Northwest Institute of Nuclear Technology, P.O.Box 69-10, Xi’an (China); Wang, Zujun, E-mail: wangzujun@nint.ac.cn [State Key Laboratory of Intense Pulsed Irradiation Simulation and Effect, Northwest Institute of Nuclear Technology, P.O.Box 69-10, Xi’an (China); Xue, Yuanyuan; Yao, Zhibin; He, Baoping; Ma, Wuying; Jin, Junshan; Sheng, Jiangkun; Dong, Guantao [State Key Laboratory of Intense Pulsed Irradiation Simulation and Effect, Northwest Institute of Nuclear Technology, P.O.Box 69-10, Xi’an (China)

    2017-06-01

    This paper presents an investigation of total ionizing dose (TID) induced image lag sources in pinned photodiode (PPD) CMOS image sensors, based on radiation experiments and TCAD simulation. The radiation experiments were carried out at a Cobalt-60 gamma-ray source. The experimental results show that image lag degradation becomes increasingly severe with increasing TID. Combined with the TCAD simulation results, we can confirm that the junction of the PPD and the transfer gate (TG) is an important region in the formation of image lag during irradiation. The simulations demonstrate that TID can generate a potential pocket leading to incomplete charge transfer.

  16. PROVE GOES-8 Images of Jornada Experimental Range, New Mexico, 1997

    Data.gov (United States)

    National Aeronautics and Space Administration — As part of the Prototype Validation Experiment (PROVE) at the Jornada Experimental Range, GOES-8 images were collected every 30 minutes for 15 days overlapping the...

  17. Sensors

    Energy Technology Data Exchange (ETDEWEB)

    Jensen, H. [PBI-Dansensor A/S (Denmark); Toft Soerensen, O. [Risoe National Lab., Materials Research Dept. (Denmark)

    1999-10-01

    A new type of ceramic oxygen sensor based on semiconducting oxides was developed in this project. The advantage of these sensors compared to standard ZrO{sub 2} sensors is that they do not require a reference gas and that they can be produced in small sizes. The sensor design and the techniques developed for production of these sensors are judged suitable by the participating industry for a niche production of a new generation of oxygen sensors. Materials research on new oxygen-ion-conducting materials, for applications both in oxygen sensors and in fuel cells, was also performed in this project, and finally a new process was developed for fabrication of ceramic tubes by dip-coating. (EHS)

  18. Sensors

    CERN Document Server

    Pigorsch, Enrico

    1997-01-01

    This is the 5th edition of the Metra Martech Directory "EUROPEAN CENTRES OF EXPERTISE - SENSORS." The entries represent a survey of European sensors development. The new edition contains 425 detailed profiles of companies and research institutions in 22 countries. This is reflected in the diversity of sensors development programmes described, from sensors for physical parameters to biosensors and intelligent sensor systems. We do not claim that all European organisations developing sensors are included, but this is a good cross section from an invited list of participants. If you see gaps or omissions, or would like your organisation to be included, please send details. The data base invites the formation of effective joint ventures by identifying and providing access to specific areas in which organisations offer collaboration. This issue is recognised to be of great importance and most entrants include details of collaboration offered and sought. We hope the directory on Sensors will help you to find the ri...

  19. Fusing range and intensity images for generating dense models of three-dimensional environments

    DEFF Research Database (Denmark)

    Ellekilde, Lars-Peter; Miró, Jaime Valls; Dissanayake., Gamini

    This paper presents a novel strategy for the construction of dense three-dimensional environment models by combining images from a conventional camera and a range imager. Robust data association is first accomplished by exploiting the Scale Invariant Feature Transformation (SIFT) technique on th...

  20. Imaging the tissue distribution of glucose in livers using a PARACEST sensor.

    Science.gov (United States)

    Ren, Jimin; Trokowski, Robert; Zhang, Shanrong; Malloy, Craig R; Sherry, A Dean

    2008-11-01

    Noninvasive imaging of glucose in tissues could provide important insights about glucose gradients in tissue, the origins of gluconeogenesis, or perhaps differences in tissue glucose utilization in vivo. Direct spectral detection of glucose in vivo by (1)H NMR is complicated by interfering signals from other metabolites and the much larger water signal. One potential way to overcome these problems is to use an exogenous glucose sensor that reports glucose concentrations indirectly through the water signal by chemical exchange saturation transfer (CEST). Such a method is demonstrated here in mouse liver perfused with a Eu(3+)-based glucose sensor containing two phenylboronate moieties as the recognition site. Activation of the sensor by applying a frequency-selective presaturation pulse at 42 ppm resulted in a 17% decrease in water signal in livers perfused with 10 mM sensor and 10 mM glucose compared with livers with the same amount of sensor but without glucose. It was shown that livers perfused with 5 mM sensor but no glucose can detect glucose exported from hepatocytes after hormonal stimulation of glycogenolysis. CEST images of livers perfused in the magnet responded to changes in glucose concentrations demonstrating that the method has potential for imaging the tissue distribution of glucose in vivo.
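
    The readout quantity in such a CEST experiment is the fractional drop in the water signal under frequency-selective presaturation, commonly reported per pixel as a magnetization-transfer-ratio asymmetry. A minimal sketch (the function name and normalization convention are the standard CEST ones, not code from this paper):

    ```python
    import numpy as np

    def mtr_asym(s_minus, s_plus, s0):
        """Magnetization-transfer-ratio asymmetry, the usual CEST contrast:
        (S(-delta_omega) - S(+delta_omega)) / S0, computed per pixel."""
        s0 = np.maximum(np.asarray(s0, dtype=float), 1e-12)
        return (np.asarray(s_minus, dtype=float)
                - np.asarray(s_plus, dtype=float)) / s0
    ```

    The 17% water-signal decrease quoted in the abstract corresponds to an asymmetry value of 0.17.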

  1. Data Collection and Analysis Using Wearable Sensors for Monitoring Knee Range of Motion after Total Knee Arthroplasty

    Directory of Open Access Journals (Sweden)

    Chih-Yen Chiang

    2017-02-01

    Full Text Available Total knee arthroplasty (TKA) is the most common treatment for degenerative osteoarthritis of that articulation. However, whether in rehabilitation clinics or in hospital wards, the knee range of motion (ROM) can currently only be assessed using a goniometer. In order to provide continuous and objective measurements of knee ROM, we propose the use of wearable inertial sensors to record the knee ROM during the recovery progress. Digitalized and objective data can assist surgeons to track the recovery status and flexibly adjust rehabilitation programs during the early acute inpatient stage. The more knee flexion ROM regained during the early inpatient period, the better the long-term knee recovery will be and the sooner early discharge can be achieved. The results of this work show that the proposed wearable sensor approach can provide an alternative for continuous monitoring and objective assessment of knee ROM recovery progress for TKA patients compared to traditional goniometer measurements.
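
    As a minimal sketch of what "continuous, objective ROM" means computationally: with one inertial sensor on the thigh and one on the shank, each reporting a sagittal-plane (pitch) angle, knee flexion is their difference and ROM is its span over the recording. Sensor fusion and calibration details are omitted, and the names are illustrative rather than taken from the paper.

    ```python
    import numpy as np

    def knee_rom_deg(thigh_pitch_deg, shank_pitch_deg):
        """Knee flexion = thigh pitch minus shank pitch (sagittal plane);
        range of motion = max - min flexion over the recording."""
        flexion = (np.asarray(thigh_pitch_deg, dtype=float)
                   - np.asarray(shank_pitch_deg, dtype=float))
        return float(flexion.max() - flexion.min())
    ```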

  2. Fast Hue and Range Preserving Histogram Specification: Theory and New Algorithms for Color Image Enhancement.

    Science.gov (United States)

    Nikolova, Mila; Steidl, Gabriele

    2014-07-16

    Color image enhancement is a complex and challenging task in digital imaging with abundant applications. Preserving the hue of the input image is crucial in a wide range of situations. We propose simple image enhancement algorithms which conserve the hue and preserve the range (gamut) of the R, G, B channels in an optimal way. In our setup, the intensity input image is transformed into a target intensity image whose histogram matches a specified, well-behaved histogram. We derive a new color assignment methodology where the resulting enhanced image fits the target intensity image. We analyse the obtained algorithms in terms of chromaticity improvement and compare them with the unique and quite popular histogram-based hue- and range-preserving algorithm of Naik and Murthy. Numerical tests confirm our theoretical results and show that our algorithms perform much better than the Naik-Murthy algorithm. In spite of their simplicity, they compete with well-established alternative methods for images where hue preservation is desired.
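
    A sketch of the hue- and range-preserving color assignment the abstract compares against (my reading of the Naik-Murthy construction, not the authors' code): scale R, G, B multiplicatively when the target intensity decreases — chromaticity, hence hue, is unchanged — and apply the complementary affine map toward white when it increases, so all channels stay within [0, 255].

    ```python
    import numpy as np

    def hue_preserving_rescale(rgb, target_intensity):
        """Naik-Murthy-style hue-preserving color assignment. rgb has
        shape (..., 3); target_intensity has shape (...). Intensity is
        taken as the channel mean."""
        rgb = np.asarray(rgb, dtype=float)
        f = np.maximum(rgb.mean(axis=-1), 1e-12)      # input intensity
        ft = np.asarray(target_intensity, dtype=float)
        shrink = ft <= f                              # multiplicative branch
        ratio = ft / f
        out = np.empty_like(rgb)
        for c in range(3):
            mult = rgb[..., c] * ratio                # hue-preserving scaling
            # affine map toward white keeps channels in gamut when ft > f
            aff = 255.0 - (255.0 - rgb[..., c]) * (255.0 - ft) / \
                np.maximum(255.0 - f, 1e-12)
            out[..., c] = np.where(shrink, mult, aff)
        return out
    ```

    Both branches reproduce the target intensity exactly; only the multiplicative branch preserves chromaticity strictly, which is the known limitation the paper's algorithms improve on.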

  3. Multimass velocity-map imaging with the Pixel Imaging Mass Spectrometry (PImMS) sensor: an ultra-fast event-triggered camera for particle imaging.

    Science.gov (United States)

    Clark, Andrew T; Crooks, Jamie P; Sedgwick, Iain; Turchetta, Renato; Lee, Jason W L; John, Jaya John; Wilman, Edward S; Hill, Laura; Halford, Edward; Slater, Craig S; Winter, Benjamin; Yuen, Wei Hao; Gardiner, Sara H; Lipciuc, M Laura; Brouard, Mark; Nomerotski, Andrei; Vallance, Claire

    2012-11-15

    We present the first multimass velocity-map imaging data acquired using a new ultrafast camera designed for time-resolved particle imaging. The PImMS (Pixel Imaging Mass Spectrometry) sensor allows particle events to be imaged with time resolution as high as 25 ns over data acquisition times of more than 100 μs. In photofragment imaging studies, this allows velocity-map images to be acquired for multiple fragment masses on each time-of-flight cycle. We describe the sensor architecture and present bench-testing data and multimass velocity-map images for photofragments formed in the UV photolysis of two test molecules: Br(2) and N,N-dimethylformamide.

  4. Color Restoration of RGBN Multispectral Filter Array Sensor Images Based on Spectral Decomposition

    Directory of Open Access Journals (Sweden)

    Chulhee Park

    2016-05-01

    Full Text Available A multispectral filter array (MSFA) image sensor with red, green, blue and near-infrared (NIR) filters is useful for various imaging applications with the advantages that it obtains color information and NIR information simultaneously. Because the MSFA image sensor needs to acquire invisible band information, it is necessary to remove the IR cut-off filter (IRCF). However, without the IRCF, the color of the image is desaturated by the interference of the additional NIR component of each RGB color channel. To overcome color degradation, a signal processing approach is required to restore natural color by removing the unwanted NIR contribution to the RGB color channels while the additional NIR information remains in the N channel. Thus, in this paper, we propose a color restoration method for an imaging system based on the MSFA image sensor with RGBN filters. To remove the unnecessary NIR component in each RGB color channel, spectral estimation and spectral decomposition are performed based on the spectral characteristics of the MSFA sensor. The proposed color restoration method estimates the spectral intensity in the NIR band and recovers hue and color saturation by decomposing the visible band component and the NIR band component in each RGB color channel. The experimental results show that the proposed method effectively restores natural color and minimizes angular errors.

  5. Color Restoration of RGBN Multispectral Filter Array Sensor Images Based on Spectral Decomposition.

    Science.gov (United States)

    Park, Chulhee; Kang, Moon Gi

    2016-05-18

    A multispectral filter array (MSFA) image sensor with red, green, blue and near-infrared (NIR) filters is useful for various imaging applications with the advantages that it obtains color information and NIR information simultaneously. Because the MSFA image sensor needs to acquire invisible band information, it is necessary to remove the IR cut-off filter (IRCF). However, without the IRCF, the color of the image is desaturated by the interference of the additional NIR component of each RGB color channel. To overcome color degradation, a signal processing approach is required to restore natural color by removing the unwanted NIR contribution to the RGB color channels while the additional NIR information remains in the N channel. Thus, in this paper, we propose a color restoration method for an imaging system based on the MSFA image sensor with RGBN filters. To remove the unnecessary NIR component in each RGB color channel, spectral estimation and spectral decomposition are performed based on the spectral characteristics of the MSFA sensor. The proposed color restoration method estimates the spectral intensity in the NIR band and recovers hue and color saturation by decomposing the visible band component and the NIR band component in each RGB color channel. The experimental results show that the proposed method effectively restores natural color and minimizes angular errors.

  6. Soft sensor design by multivariate fusion of image features and process measurements

    DEFF Research Database (Denmark)

    Lin, Bao; Jørgensen, Sten Bay

    2011-01-01

    This paper presents a multivariate data fusion procedure for design of dynamic soft sensors where suitably selected image features are combined with traditional process measurements to enhance the performance of data-driven soft sensors. A key issue of fusing multiple sensor data, i.e. to determine...... with a multivariate analysis technique from RGB pictures. The color information is also transformed to hue, saturation and intensity components. Both sets of image features are combined with traditional process measurements to obtain an inferential model by partial least squares (PLS) regression. A dynamic PLS model...... the weight of each regressor, is achieved through multivariate regression. The framework is described and illustrated with applications to cement kiln systems that are characterized by off-line quality measurements and on-line analyzers with limited reliability. Image features are extracted...
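
    The inferential-model step — regressing a quality variable on fused image features and process measurements by partial least squares — can be sketched with a minimal NIPALS PLS1 in NumPy. This is a generic textbook implementation on illustrative synthetic data, not the authors' soft-sensor code; the feature names in the comments are hypothetical.

    ```python
    import numpy as np

    def pls1_coefficients(X, y, n_components):
        """Minimal NIPALS PLS1. Returns (b, x_mean, y_mean) such that
        predictions are (X_new - x_mean) @ b + y_mean."""
        x_mean, y_mean = X.mean(axis=0), y.mean()
        Xk, yc = X - x_mean, y - y_mean
        W, P, Q = [], [], []
        for _ in range(n_components):
            w = Xk.T @ yc
            w = w / np.linalg.norm(w)          # weight vector
            t = Xk @ w                         # score vector
            tt = t @ t
            p = Xk.T @ t / tt                  # X loading
            q = (yc @ t) / tt                  # y loading
            Xk = Xk - np.outer(t, p)           # deflate X
            W.append(w); P.append(p); Q.append(q)
        W, P = np.array(W).T, np.array(P).T
        b = W @ np.linalg.solve(P.T @ W, np.array(Q))
        return b, x_mean, y_mean

    # Illustrative fused regressor matrix: 4 image features (e.g. mean
    # hue, saturation, intensity, a texture score) + 3 process measurements.
    rng = np.random.default_rng(0)
    X = np.hstack([rng.normal(size=(60, 4)), rng.normal(size=(60, 3))])
    y = X @ rng.normal(size=7)                 # synthetic quality variable
    b, xm, ym = pls1_coefficients(X, y, n_components=7)
    pred = (X - xm) @ b + ym
    ```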

  7. Imaging Spectroscopy Techniques for Rapid Assessment of Geologic and Cryospheric Science Data from future Satellite Sensors

    Science.gov (United States)

    Calvin, W. M.; Hill, R.

    2016-12-01

    Several efforts are currently underway to develop and launch the next generation of imaging spectrometer systems on satellite platforms for a wide range of Earth Observation goals. Systems that include the reflected solar wavelength range up to 2.5 μm will be capable of detailed mapping of the composition of the Earth's surface. Sensors under development include EnMAP, HISUI, PRISMA, HERO, and HyspIRI. These systems are expected to be able to provide global data for insights and constraints on fundamental geological processes, natural and anthropogenic hazards, and water, energy and mineral resource assessments. Coupled with the development of these sensors is the challenge of bringing a multi-channel user community (from Landsat, MODIS, and ASTER) into the rich science return available from imaging spectrometer systems. Most data end users will never be spectroscopy experts, so making the derived science products accessible to a wide user community is imperative. Simple band parameterizations have been developed for the CRISM instrument at Mars, including mafic and alteration minerals and frost and volatile ice indices. These products enhance and augment the use of that data set by a broader group of scientists. Summary products for terrestrial geologic and water resource applications would help build a wider user base for future satellite systems and rapidly key spectral experts to important regions for detailed spectral mapping. Summary products take advantage of imaging spectroscopy's narrow spectral channels with band depth calculations in addition to the band ratios commonly used by multi-channel systems (e.g. NDVI, NDWI, NDSI). We are testing summary products for Earth geologic and snow scenes over California using AVIRIS data at 18 m/pixel. This has resulted in several algorithms for rapid mineral discrimination and mapping, and data collects over the melting Sierra snowpack in spring 2016 are expected to generate algorithms for snow grain size and surface
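
    The "summary product" parameterizations mentioned above reduce to two primitive operations: normalized-difference band ratios (the NDVI/NDWI/NDSI family) and continuum-removed band depths. A generic sketch, with hypothetical wavelengths and reflectance values:

    ```python
    import numpy as np

    def normalized_diff(band_a, band_b):
        """Generic normalized-difference index, the form shared by NDVI,
        NDWI and NDSI: (a - b) / (a + b)."""
        a = np.asarray(band_a, dtype=float)
        b = np.asarray(band_b, dtype=float)
        return (a - b) / np.maximum(a + b, 1e-12)

    def band_depth(r_left, r_center, r_right, wl_left, wl_center, wl_right):
        """Continuum-removed band depth at wl_center: 1 - R_center / R_cont,
        with the continuum interpolated linearly between two shoulder
        channels at wl_left and wl_right."""
        frac = (wl_center - wl_left) / (wl_right - wl_left)
        r_cont = r_left + frac * (r_right - r_left)
        return 1.0 - r_center / r_cont
    ```

    With NIR and red reflectances this gives NDVI directly; with an absorption at (say) 2.2 μm between shoulders at 2.1 and 2.3 μm, the band depth quantifies the mineral absorption feature.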

  8. Luminescence imaging of water during proton-beam irradiation for range estimation

    Energy Technology Data Exchange (ETDEWEB)

    Yamamoto, Seiichi, E-mail: s-yama@met.nagoya-u.ac.jp; Okumura, Satoshi; Komori, Masataka [Radiological and Medical Laboratory Sciences, Nagoya University Graduate School of Medicine, Nagoya 461-8673 (Japan); Toshito, Toshiyuki [Department of Proton Therapy Physics, Nagoya Proton Therapy Center, Nagoya City West Medical Center, Nagoya 462-8508 (Japan)

    2015-11-15

    Purpose: Proton therapy has the ability to selectively deliver a dose to the target tumor, so the dose distribution should be accurately measured by a precise and efficient method. The authors found that luminescence was emitted from water during proton irradiation and conjectured that this phenomenon could be used for estimating the dose distribution. Methods: To achieve more accurate dose distribution measurement, the authors set water phantoms on a table with a spot scanning proton therapy system and measured the luminescence images of these phantoms with a high-sensitivity, cooled charge-coupled device camera during proton-beam irradiation. The authors imaged phantoms of pure water, a fluorescein solution, and an acrylic block. Results: The luminescence images of water phantoms taken during proton-beam irradiation showed clear Bragg peaks, and the proton ranges measured from the images were almost the same as those obtained with an ionization chamber. Furthermore, the image of the pure-water phantom showed almost the same distribution as the tap-water phantom, indicating that the luminescence image was not related to impurities in the water. The luminescence image of the fluorescein solution had ∼3 times higher intensity than water, with the same proton range as that of water. The luminescence image of the acrylic phantom had a 14.5% shorter proton range than that of water; the proton range in the acrylic phantom generally matched the calculated value. The luminescence images of the tap-water phantom during proton irradiation could be obtained in less than 2 s. Conclusions: Luminescence imaging during proton-beam irradiation is promising as an effective method for range estimation in proton therapy.
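
    Extracting a range from such a luminescence depth profile amounts to locating the distal falloff of the Bragg peak. A sketch of that step — the 50%-of-peak criterion and linear interpolation are common conventions assumed here, not taken from the paper:

    ```python
    import numpy as np

    def distal_range(depth_mm, signal, level=0.5):
        """Estimate the beam range as the depth distal to the Bragg peak
        where the profile falls to `level` x peak, interpolating linearly
        between samples."""
        depth_mm = np.asarray(depth_mm, dtype=float)
        signal = np.asarray(signal, dtype=float)
        i_peak = int(np.argmax(signal))
        thr = level * signal[i_peak]
        for i in range(i_peak, len(signal) - 1):
            if signal[i] >= thr > signal[i + 1]:
                f = (signal[i] - thr) / (signal[i] - signal[i + 1])
                return float(depth_mm[i] + f * (depth_mm[i + 1] - depth_mm[i]))
        return float(depth_mm[-1])   # threshold never crossed distally
    ```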

  9. Luminescence imaging of water during carbon-ion irradiation for range estimation

    Energy Technology Data Exchange (ETDEWEB)

    Yamamoto, Seiichi, E-mail: s-yama@met.nagoya-u.ac.jp; Komori, Masataka; Koyama, Shuji; Morishita, Yuki; Sekihara, Eri [Radiological and Medical Laboratory Sciences, Nagoya University Graduate School of Medicine, Higashi-ku, Nagoya, Aichi 461-8673 (Japan); Akagi, Takashi; Yamashita, Tomohiro [Hyogo Ion Beam Medical Center, Hyogo 679-5165 (Japan); Toshito, Toshiyuki [Department of Proton Therapy Physics, Nagoya Proton Therapy Center, Nagoya City West Medical Center, Aichi 462-8508 (Japan)]

    2016-05-15

    Purpose: The authors previously reported successful luminescence imaging of water during proton irradiation and its application to range estimation. However, since the feasibility of this approach for carbon-ion irradiation remained unclear, the authors conducted luminescence imaging during carbon-ion irradiation and estimated the ranges. Methods: The authors placed a pure-water phantom on the patient couch of a carbon-ion therapy system and measured the luminescence images with a high-sensitivity, cooled charge-coupled device camera during carbon-ion irradiation. The authors also carried out imaging of three other phantoms (tap water, an acrylic block, and a plastic scintillator) and compared their intensities and distributions with those of the pure-water phantom. Results: The luminescence images of pure-water phantoms during carbon-ion irradiation showed clear Bragg peaks, and the carbon-ion ranges measured from the images were almost the same as those obtained by simulation. The image of the tap-water phantom showed almost the same distribution as that of the pure-water phantom. The acrylic block phantom's luminescence image showed seven times higher luminescence intensity and a 13% shorter range than that of the water phantoms; the range in the acrylic phantom generally matched the calculated value. The plastic scintillator showed ∼15 000 times higher light output than water. Conclusions: Luminescence imaging of water during carbon-ion irradiation is not only possible but also a promising method for range estimation in carbon-ion therapy.

  10. Edge pixel response studies of edgeless silicon sensor technology for pixellated imaging detectors

    Science.gov (United States)

    Maneuski, D.; Bates, R.; Blue, A.; Buttar, C.; Doonan, K.; Eklund, L.; Gimenez, E. N.; Hynds, D.; Kachkanov, S.; Kalliopuska, J.; McMullen, T.; O'Shea, V.; Tartoni, N.; Plackett, R.; Vahanen, S.; Wraight, K.

    2015-03-01

    Silicon sensor technologies with reduced dead area at the sensor's perimeter are under development at a number of institutes. Several fabrication methods for sensors which are sensitive close to the physical edge of the device are under investigation utilising techniques such as active-edges, passivated edges and current-terminating rings. Such technologies offer the goal of a seamlessly tiled detection surface with minimum dead space between the individual modules. In order to quantify the performance of different geometries and different bulk and implant types, characterisation of several sensors fabricated using active-edge technology was performed at the B16 beam line of the Diamond Light Source. The sensors were fabricated by VTT and bump-bonded to Timepix ROICs. They were 100 and 200 μm thick sensors, with a last-pixel-to-edge distance of either 50 or 100 μm. The sensors were fabricated as either n-on-n or n-on-p type devices. Using 15 keV monochromatic X-rays with a beam spot of 2.5 μm, the performance at the outer edge and corner pixels of the sensors was evaluated at three bias voltages. The results indicate a significant change in the charge collection properties between the edge pixel and the 5th pixel from the edge (up to 275 μm) for the 200 μm thick n-on-n sensor. The edge pixel performance of the 100 μm thick n-on-p sensors is affected only for the last two pixels (up to 110 μm), subject to biasing conditions. Imaging characteristics of all sensor types investigated are stable over time and the non-uniformities can be minimised by flat-field corrections. The results from the synchrotron tests combined with lab measurements are presented along with an explanation of the observed effects.

  11. Plasmonics-Based Multifunctional Electrodes for Low-Power-Consumption Compact Color-Image Sensors.

    Science.gov (United States)

    Lin, Keng-Te; Chen, Hsuen-Li; Lai, Yu-Sheng; Chi, Yi-Min; Chu, Ting-Wei

    2016-03-01

    High pixel density, efficient color splitting, a compact structure, superior quantum efficiency, and low power consumption are all important features for contemporary color-image sensors. In this study, we developed a surface plasmonics-based color-image sensor displaying a high photoelectric response, a microlens-free structure, and a zero-bias working voltage. Our compact sensor comprised only (i) a multifunctional electrode based on a single-layer structured aluminum (Al) film and (ii) an underlying silicon (Si) substrate. This approach significantly simplifies the device structure and fabrication processes; for example, the red, green, and blue color pixels can be prepared simultaneously in a single lithography step. Moreover, such Schottky-based plasmonic electrodes perform multiple functions, including color splitting, optical-to-electrical signal conversion, and photogenerated carrier collection for color-image detection. Our multifunctional, electrode-based device could also avoid the interference phenomenon that degrades the color-splitting spectra found in conventional color-image sensors. Furthermore, the device took advantage of the near-field surface plasmonic effect around the Al-Si junction to enhance the optical absorption of Si, resulting in a significant photoelectric current output even under low-light surroundings and zero bias voltage. These plasmonic Schottky-based color-image devices could convert a photocurrent directly into a photovoltage and provided sufficient voltage output for color-image detection even under a light intensity of only several femtowatts per square micrometer. Unlike conventional color image devices, using voltage as the output signal decreases the area of the periphery read-out circuit because it does not require a current-to-voltage conversion capacitor or its related circuit. Therefore, this strategy has great potential for direct integration with complementary metal-oxide-semiconductor (CMOS)-compatible circuit

  12. Feasibility of fiber optic displacement sensor scanning system for imaging of dental cavity

    Science.gov (United States)

    Rahman, Husna Abdul; Che Ani, Adi Izhar; Harun, Sulaiman Wadi; Yasin, Moh.; Apsari, Retna; Ahmad, Harith

    2012-07-01

    The purpose of this study is to investigate the potential of intensity modulated fiber optic displacement sensor scanning system for the imaging of dental cavity. Here, we discuss our preliminary results in the imaging of cavities on various teeth surfaces, as well as measurement of the diameter of the cavities which are represented by drilled holes on the teeth surfaces. Based on the analysis of displacement measurement, the sensitivities and linear range for the molar, canine, hybrid composite resin, and acrylic surfaces are obtained at 0.09667 mV/mm and 0.45 mm; 0.775 mV/mm and 0.4 mm; 0.5109 mV/mm and 0.5 mm; and 0.25 mV/mm and 0.5 mm, respectively, with a good linearity of more than 99%. The results also show a clear distinction between the cavity and surrounding tooth region. The stability, simplicity of design, and low cost of fabrication make it suitable for restorative dentistry.
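
    As a minimal illustration of how such a displacement calibration is used (the sensitivity and linear range below are the canine-surface values quoted above; the voltage reading itself is hypothetical):

```python
def displacement_mm(v_mv, sensitivity_mv_per_mm, linear_range_mm):
    """Convert sensor output voltage (mV) to displacement (mm)
    inside the calibrated linear range."""
    d = v_mv / sensitivity_mv_per_mm
    if not 0.0 <= d <= linear_range_mm:
        raise ValueError("reading outside the calibrated linear range")
    return d

# Canine-surface calibration from the study: 0.775 mV/mm over a 0.4 mm range
d = displacement_mm(0.155, 0.775, 0.4)
```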

  13. Image Centroid Algorithms for Sun Sensors with Super Wide Field of View

    Directory of Open Access Journals (Sweden)

    ZHAN Yinhu

    2015-10-01

    Full Text Available Sun image centroid algorithms are one of the key technologies of celestial navigation using sun sensors, and they directly determine the precision of the sensors. Because conventional centroid algorithms are limited when handling the non-circular sun images of sun sensors with a large field of view, an ellipse fitting algorithm is first proposed for elliptical or near-elliptical sun images. A spherical circle fitting algorithm is then put forward. Based on the projection model and distortion model of the camera, the spherical circle fitting algorithm obtains the edge points of the sun in object space, and the centroid of the sun is then determined by fitting these edge points as a spherical circle. In order to estimate the precision of the spherical circle fitting algorithm, the centroid of the sun is projected back into image space. Theoretically, the spherical circle fitting algorithm no longer needs to take the shape of the sun image into account, so it is more precise. Results on real sun images demonstrate that the ellipse fitting algorithm is more suitable for sun images with a 70°~80.3° half angle of view, with a mean precision of about 0.075 pixels, while the spherical circle fitting algorithm is more suitable for sun images with a half angle of view larger than 80.3°, with a mean precision of about 0.082 pixels.
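
    The paper's spherical circle fit works in object space with a camera distortion model; as a simplified stand-in, the sketch below fits an algebraic least-squares (Kåsa) circle to noisy limb edge points in a plane. All data here are synthetic:

```python
import numpy as np

def fit_circle(x, y):
    """Algebraic least-squares (Kasa) circle fit to edge points.

    Solves x^2 + y^2 + a*x + b*y + c = 0 in the least-squares sense,
    then recovers the centre (cx, cy) and radius r.
    """
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = -(x ** 2 + y ** 2)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    cx, cy = -a / 2.0, -b / 2.0
    r = np.sqrt(cx ** 2 + cy ** 2 - c)
    return cx, cy, r

# Synthetic noisy edge points on a circular limb of radius 5 about (3, -1)
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2 * np.pi, 200)
x = 3.0 + 5.0 * np.cos(theta) + rng.normal(0.0, 0.02, 200)
y = -1.0 + 5.0 * np.sin(theta) + rng.normal(0.0, 0.02, 200)
cx, cy, r = fit_circle(x, y)
```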

  14. Image accuracy and representational enhancement through low-level, multi-sensor integration techniques

    Energy Technology Data Exchange (ETDEWEB)

    Baker, J.E.

    1994-09-01

    Multi-Sensor Integration (MSI) is the combining of data and information from more than one source in order to generate a more reliable and consistent representation of the environment. The need for MSI derives largely from basic ambiguities inherent in our current sensor imaging technologies. These ambiguities exist as long as the mapping from reality to image is not 1-to-1. That is, if different "realities" lead to identical images, a single image cannot reveal the particular reality which was the truth. MSI techniques attempt to resolve some of these ambiguities by appropriately coupling complementary images to eliminate possible inverse mappings. What constitutes the best MSI technique is dependent on the given application domain, available sensors, and task requirements. MSI techniques can be divided into three categories based on the relative information content of the original images with that of the desired representation: (1) "detail enhancement," wherein the relative information content of the original images is less rich than the desired representation; (2) "data enhancement," wherein the MSI techniques are concerned with improving the accuracy of the data rather than either increasing or decreasing the level of detail; and (3) "conceptual enhancement," wherein the image contains more detail than is desired, making it difficult to easily recognize objects of interest. In conceptual enhancement one must group pixels corresponding to the same conceptual object and thereby reduce the level of extraneous detail.

  15. Wide Range Temperature Sensors Based on One-Dimensional Photonic Crystal with a Single Defect

    Directory of Open Access Journals (Sweden)

    Arun Kumar

    2012-01-01

    Full Text Available The transmission characteristics of a one-dimensional photonic crystal structure with a defect have been studied. The transfer matrix method has been employed to find the transmission spectra of the proposed structure. We consider a Si/air multilayer system in which the refractive index of the Si layer is taken to be temperature dependent. Because the refractive index of the Si layer is a function of the temperature of the medium, the central wavelength of the defect mode is also a function of temperature, and variation in temperature shifts the defect modes. It is found that the average shift in the central wavelength of the defect modes is 0.064 nm/K. This property can be exploited in the design of a temperature sensor.
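
    The transfer matrix method mentioned above can be sketched as follows for normal incidence. The layer count, design wavelength, and Si index below are illustrative assumptions, not the paper's parameters: a quarter-wave Si/air mirror blocks the design wavelength, while inserting a half-wave defect layer opens a narrow defect mode there.

```python
import numpy as np

def layer_matrix(n, d, lam):
    """Characteristic 2x2 matrix of one homogeneous layer at normal incidence."""
    delta = 2 * np.pi * n * d / lam
    return np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                     [1j * n * np.sin(delta), np.cos(delta)]])

def transmittance(layers, lam, n_in=1.0, n_out=1.0):
    """Intensity transmittance of a stack; layers = [(index, thickness), ...]."""
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        M = M @ layer_matrix(n, d, lam)
    t = 2 * n_in / (n_in * M[0, 0] + n_in * n_out * M[0, 1]
                    + M[1, 0] + n_out * M[1, 1])
    return (n_out / n_in) * abs(t) ** 2

lam0 = 1550.0                     # design wavelength (nm), illustrative
nH, nL = 3.48, 1.0                # Si and air (temperature dependence omitted)
H, L = (nH, lam0 / (4 * nH)), (nL, lam0 / (4 * nL))
mirror = [H, L] * 10                                   # defect-free stack
cavity = [H, L] * 5 + [(nL, lam0 / 2)] + [L, H] * 5    # half-wave air defect
```

At λ0 the defect-free mirror is strongly reflecting, while the symmetric half-wave cavity transmits almost perfectly; a temperature-dependent index would shift that defect peak, which is the sensing mechanism described above.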

  16. Lidar multi-range integrated Dewar assembly (IDA) for active-optical vision navigation sensor

    Science.gov (United States)

    Mayner, Philip; Clemet, Ed; Asbrock, Jim; Chen, Isabel; Getty, Jonathan; Malone, Neil; De Loo, John; Giroux, Mark

    2013-09-01

    A multi-range focal plane was developed and delivered by Raytheon Vision Systems for a docking system that was demonstrated on STS-134. This required state-of-the-art focal-plane and electronics synchronization to capture nanosecond-length laser pulses and determine ranges with an accuracy of less than 1 inch.

  17. Waveguide piezoelectric micromachined ultrasonic transducer array for short-range pulse-echo imaging

    Science.gov (United States)

    Lu, Y.; Tang, H.; Wang, Q.; Fung, S.; Tsai, J. M.; Daneman, M.; Boser, B. E.; Horsley, D. A.

    2015-05-01

    This paper presents an 8 × 24 element, 100 μm-pitch, 20 MHz ultrasound imager based on a piezoelectric micromachined ultrasonic transducer (PMUT) array with integrated acoustic waveguides. The 70 μm diameter, 220 μm long waveguides function to direct acoustic waves, confine acoustic energy, and provide mechanical protection for the PMUT array in surface-imaging applications such as an ultrasonic fingerprint sensor. The imager consists of a PMUT array bonded to a CMOS ASIC using wafer-level conductive eutectic bonding. This construction gives each PMUT in the array a dedicated front-end receive amplifier, which, together with on-chip analog multiplexing, enables individual pixel read-out with high signal-to-noise ratio through minimized parasitic capacitance between the PMUT and the front-end amplifier. Finite element method simulations demonstrate that the waveguides preserve the pressure amplitude of acoustic pulses over distances of 600 μm. Moreover, the waveguide design enables pixel-by-pixel readout of the ultrasound image by improving the directivity of the PMUT and creating a pressure field with greater spatial uniformity at the end of the waveguide. Pulse-echo imaging experiments conducted using a one-dimensional steel grating demonstrate the array's ability to form a two-dimensional image of a target.

  18. Integrated sensor with frame memory and programmable resolution for light adaptive imaging

    Science.gov (United States)

    Zhou, Zhimin (Inventor); Fossum, Eric R. (Inventor); Pain, Bedabrata (Inventor)

    2004-01-01

    An image sensor operable to vary the output spatial resolution according to a received light level while maintaining a desired signal-to-noise ratio. Signals from neighboring pixels in a pixel patch with an adjustable size are added to increase both the image brightness and signal-to-noise ratio. One embodiment comprises a sensor array for receiving input signals, a frame memory array for temporarily storing a full frame, and an array of self-calibration column integrators for uniform column-parallel signal summation. The column integrators are capable of substantially canceling fixed pattern noise.
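
    A minimal sketch of the SNR argument behind patch summation (plain 2×2 binning of a synthetic read-noise-limited frame, not the patent's column-integrator circuit): summing k × k pixels multiplies the signal by k², but uncorrelated noise by only k, so SNR grows by k.

```python
import numpy as np

rng = np.random.default_rng(1)
signal = 50.0                     # mean photo-signal per pixel (arbitrary units)
read_noise = 10.0                 # rms read noise per pixel
frame = signal + rng.normal(0.0, read_noise, size=(256, 256))

def snr(img):
    """SNR of a flat scene: mean level over rms fluctuation."""
    return img.mean() / img.std()

# Sum each 2x2 patch: signal grows 4x, uncorrelated noise only 2x
binned = frame.reshape(128, 2, 128, 2).sum(axis=(1, 3))
gain = snr(binned) / snr(frame)   # expected close to 2 for 2x2 binning
```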

  19. Synthetic SAR Image Generation using Sensor, Terrain and Target Models

    DEFF Research Database (Denmark)

    Kusk, Anders; Abulaitijiang, Adili; Dall, Jørgen

    2016-01-01

    A tool to generate synthetic SAR images of objects set on a clutter background is described. The purpose is to generate images for training Automatic Target Recognition and Identification algorithms. The tool employs a commercial electromagnetic simulation program to calculate radar cross sections...

  20. CMOS active pixel sensor type imaging system on a chip

    Science.gov (United States)

    Fossum, Eric R. (Inventor); Nixon, Robert (Inventor)

    2011-01-01

    A single chip camera which includes an integrated image acquisition portion and control portion and which has double sampling/noise reduction capabilities thereon. Part of the integrated structure reduces the noise that is picked up during imaging.

  1. Three-dimensional near-field MIMO array imaging using range migration techniques.

    Science.gov (United States)

    Zhuge, Xiaodong; Yarovoy, Alexander G

    2012-06-01

    This paper presents a 3-D near-field imaging algorithm that is formulated for 2-D wideband multiple-input-multiple-output (MIMO) imaging array topology. The proposed MIMO range migration technique performs the image reconstruction procedure in the frequency-wavenumber domain. The algorithm is able to completely compensate the curvature of the wavefront in the near-field through a specifically defined interpolation process and provides extremely high computational efficiency by the application of the fast Fourier transform. The implementation aspects of the algorithm and the sampling criteria of a MIMO aperture are discussed. The image reconstruction performance and computational efficiency of the algorithm are demonstrated both with numerical simulations and measurements using 2-D MIMO arrays. Real-time 3-D near-field imaging can be achieved with a real-aperture array by applying the proposed MIMO range migration techniques.
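
    A toy monostatic (single transmit/receive) version of range migration can be sketched as follows; the paper's MIMO formulation generalizes this, and all parameters below are illustrative. The three steps are the aperture FFT, Stolt interpolation onto a uniform kz grid, and a 2-D inverse FFT:

```python
import numpy as np

c = 3e8
freqs = np.linspace(8e9, 12e9, 64)            # stepped-frequency sweep (Hz)
k = 2 * np.pi * freqs / c                     # free-space wavenumbers
Nx, dx = 128, 0.005                           # aperture samples and spacing (m)
xa = np.arange(Nx) * dx
xt, zt = 0.32, 0.5                            # point scatterer position (m)

# Simulated monostatic echo: two-way phase delay per position and frequency
R = np.sqrt((xa[:, None] - xt) ** 2 + zt ** 2)
s = np.exp(-2j * k[None, :] * R)              # shape (Nx, Nf)

# Step 1: FFT along the aperture into the spatial-frequency (kx) domain
S = np.fft.fft(s, axis=0)
kx = 2 * np.pi * np.fft.fftfreq(Nx, dx)

# Step 2: Stolt interpolation, resampling from k onto a uniform
# kz = sqrt(4k^2 - kx^2) grid (propagating components only)
Nz = 256
kz_grid = np.linspace(2 * k.min(), 2 * k.max(), Nz)
Skz = np.zeros((Nx, Nz), dtype=complex)
for i in range(Nx):
    kz2 = 4 * k ** 2 - kx[i] ** 2
    valid = kz2 > 0
    if valid.sum() < 2:
        continue                              # evanescent row: leave as zeros
    kzv = np.sqrt(kz2[valid])
    Skz[i] = (np.interp(kz_grid, kzv, S[i, valid].real, left=0.0, right=0.0)
              + 1j * np.interp(kz_grid, kzv, S[i, valid].imag, left=0.0, right=0.0))

# Step 3: 2-D inverse FFT yields the focused image; locate the peak
img = np.abs(np.fft.ifft2(Skz))
ix, iz = np.unravel_index(img.argmax(), img.shape)
dkz = kz_grid[1] - kz_grid[0]
x_est = ix * dx
z_est = iz * 2 * np.pi / (Nz * dkz)
```

The interpolation in step 2 is the wavefront-curvature compensation the abstract refers to; once the data lie on a uniform (kx, kz) grid, the whole reconstruction reduces to FFTs, which is the source of the algorithm's computational efficiency.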

  2. Ultra sensitive sensor with enhanced dynamic range for high speed detection of multi-color fluorescence radiation.

    Science.gov (United States)

    Tsupryk, A; Tovkach, I; Gavrilov, D; Kosobokova, O; Gudkov, G; Tyshko, G; Gorbovitski, B; Gorfinkel, V

    2008-05-15

    This paper describes the design of a new ultrasensitive sensor system for fluorescence detection applications. The system comprises two units: an optical spectra-separation unit and a detection unit. The optical unit of the sensor performs spatial spectral separation of the laser-excited fluorescence signal, and the resulting spectrum is collected in the detection part of the system. The optical part is built using a diffraction grating as the spectra-separation element. The detection part comprises a 32-channel photomultiplier tube working in single-photon-counting mode together with our 32-channel amplifier. Using the single-photon detection technique and specific signal-processing algorithms for the collected data, the proposed system achieves a unique combination of characteristics--high sensitivity, high detection speed, and wide linear dynamic range--compared to existing commercial instruments. DNA sequencing experiments with the new sensor as the detection device, using two types of lasers (Ar-ion and Nd-YAG), were carried out, yielding sequencing traces with a quality factor of 20 for read lengths as long as 650 base pairs.

  3. Intelligent Luminance Control of Lighting Systems Based on Imaging Sensor Feedback

    Directory of Open Access Journals (Sweden)

    Haoting Liu

    2017-02-01

    Full Text Available An imaging sensor-based intelligent Light Emitting Diode (LED) lighting system for desk use is proposed. In contrast to traditional intelligent lighting systems, such as those based on photosensitive resistance sensors or infrared sensors, the imaging sensor can realize a finer perception of the environmental light; thus it can guide a more precise lighting control. Before the system operates, a large set of typical imaging lighting data for the desk application is first accumulated. Second, a series of subjective and objective Lighting Effect Evaluation Metrics (LEEMs) are defined and assessed for these datasets. Then the cluster benchmarks of these objective LEEMs can be obtained. Third, both a single-LEEM-based control and a multiple-LEEMs-based control are developed to realize optimal luminance tuning. When the system operates, it first captures the lighting image using a wearable camera. Then it computes the objective LEEMs of the captured image and compares them with the cluster benchmarks of the objective LEEMs. Finally, the single-LEEM-based or the multiple-LEEMs-based control can be implemented to obtain an optimal lighting effect. Extensive experimental results show that the proposed system can tune the LED lamp automatically according to environment luminance changes.
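
    The feedback loop described above can be caricatured with a single objective metric; mean frame luminance stands in for the paper's LEEMs (which the abstract does not define), and the benchmark and gain values below are invented for illustration:

```python
import numpy as np

def luminance_metric(gray):
    """A stand-in single objective metric: mean luminance of the frame."""
    return float(np.mean(gray))

def adjust_duty_cycle(duty, gray, target=128.0, gain=0.002):
    """Proportional correction of the LED PWM duty cycle toward a benchmark."""
    error = target - luminance_metric(gray)
    return float(np.clip(duty + gain * error, 0.0, 1.0))

dark = np.full((480, 640), 40.0)     # under-lit desk scene (8-bit levels)
bright = np.full((480, 640), 220.0)  # over-lit scene
```

Feeding a dark frame raises the duty cycle, a bright frame lowers it; the paper's multiple-LEEMs control would replace the single metric with a vector compared against cluster benchmarks.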

  4. Intelligent Luminance Control of Lighting Systems Based on Imaging Sensor Feedback

    Science.gov (United States)

    Liu, Haoting; Zhou, Qianxiang; Yang, Jin; Jiang, Ting; Liu, Zhizhen; Li, Jie

    2017-01-01

    An imaging sensor-based intelligent Light Emitting Diode (LED) lighting system for desk use is proposed. In contrast to traditional intelligent lighting systems, such as those based on photosensitive resistance sensors or infrared sensors, the imaging sensor can realize a finer perception of the environmental light; thus it can guide a more precise lighting control. Before the system operates, a large set of typical imaging lighting data for the desk application is first accumulated. Second, a series of subjective and objective Lighting Effect Evaluation Metrics (LEEMs) are defined and assessed for these datasets. Then the cluster benchmarks of these objective LEEMs can be obtained. Third, both a single-LEEM-based control and a multiple-LEEMs-based control are developed to realize optimal luminance tuning. When the system operates, it first captures the lighting image using a wearable camera. Then it computes the objective LEEMs of the captured image and compares them with the cluster benchmarks of the objective LEEMs. Finally, the single-LEEM-based or the multiple-LEEMs-based control can be implemented to obtain an optimal lighting effect. Extensive experimental results show that the proposed system can tune the LED lamp automatically according to environment luminance changes. PMID:28208781

  5. Crosstalk in multi-collection-gate image sensors and its improvement

    Science.gov (United States)

    Nguyen, A. Q.; Dao, V. T. S.; Shimonomura, K.; Kamakura, Y.; Etoh, T. G.

    2017-02-01

    Crosstalk in the backside-illuminated multi-collection-gate (BSI-MCG) image sensor was analyzed by means of Monte Carlo simulation. The BSI-MCG image sensor was proposed to achieve a temporal resolution of 1 ns. In this sensor, signal electrons generated by incident light near the back side travel to the central area of the pixel on the front side. Most of the signal electrons are collected by the collecting gate, to which a higher voltage is applied than to the other collection gates. However, due to spatial and temporal diffusion, some of the signal electrons migrate to collection gates other than the collecting gate, resulting in spatiotemporal crosstalk, i.e., mixing of signal electrons at neighboring collection gates and/or pixels. To reduce the crosstalk, the BSI-MCG structure is modified and the performance is preliminarily evaluated by Monte Carlo simulation. An additional donut-shaped N-type implantation at the collection-gate area improves the potential gradient toward the collecting gate, which reduces the crosstalk caused by spatial diffusion. A multi-framing camera based on the BSI-MCG image sensor can be applied to Fluorescence Lifetime Imaging Microscopy (FLIM). In this case, crosstalk reduces the accuracy of estimating the lifetimes of fluorophore samples. The inaccuracy is compensated in post-processing based on a proposed impulse-response method.

  6. Noise suppression algorithm of short-wave infrared star image for daytime star sensor

    Science.gov (United States)

    Wang, Wenjie; Wei, Xinguo; Li, Jian; Wang, Gangyi

    2017-09-01

    As an important development trend in star sensor technology, research on daytime star sensors can expand the applications of star sensors from spacecraft to airborne vehicles. The biggest problem for a daytime star sensor is the detection of dim stars against strong atmospheric background radiation. The use of short-wave infrared (SWIR) technology has been proven to be an effective approach to this problem. However, SWIR star images inevitably contain stripe nonuniformity noise and defective pixels, which degrade the quality of the acquired images and seriously affect the subsequent star spot extraction and star centroiding accuracy. Because the characteristics of the stripe nonuniformity and defective pixels in SWIR star images change with time during long-term continuous operation, one-time off-line calibration is not applicable. To solve this problem, a noise suppression algorithm for SWIR star images is proposed. It first extracts non-background pixels by one-dimensional mean filtering. Then, through a one-dimensional feature point descriptor used to distinguish bright star spot pixels from defective pixels, various types of defective pixels are accurately detected. Finally, the method of moment matching is adopted to remove the stripe nonuniformity, and the defective pixels are compensated effectively. Simulation results indicate that the proposed algorithm can adaptively and effectively suppress the influence of stripe nonuniformity and defective pixels in SWIR star images, which is beneficial for obtaining higher star centroiding accuracy.
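
    The moment-matching destriping step can be sketched as follows. It assumes every column sees similar scene statistics; the paper first separates out star-spot and defective pixels, which this toy version omits, and the stripe model below is synthetic:

```python
import numpy as np

def moment_match(img):
    """Column-wise moment matching: force each column's mean and standard
    deviation to the frame's global statistics, removing column stripes."""
    mu_g, sd_g = img.mean(), img.std()
    mu_c = img.mean(axis=0, keepdims=True)
    sd_c = img.std(axis=0, keepdims=True)
    sd_c = np.where(sd_c < 1e-12, 1.0, sd_c)   # guard against flat columns
    return (img - mu_c) * (sd_g / sd_c) + mu_g

# Synthetic SWIR frame: flat sky background plus per-column gain/offset stripes
rng = np.random.default_rng(2)
cols = 320
scene = np.full((256, cols), 100.0) + rng.normal(0.0, 2.0, (256, cols))
gain = rng.uniform(0.8, 1.2, cols)
offset = rng.uniform(-10.0, 10.0, cols)
striped = scene * gain + offset
clean = moment_match(striped)
```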

  7. Intelligent Luminance Control of Lighting Systems Based on Imaging Sensor Feedback.

    Science.gov (United States)

    Liu, Haoting; Zhou, Qianxiang; Yang, Jin; Jiang, Ting; Liu, Zhizhen; Li, Jie

    2017-02-09

    An imaging sensor-based intelligent Light Emitting Diode (LED) lighting system for desk use is proposed. In contrast to traditional intelligent lighting systems, such as those based on photosensitive resistance sensors or infrared sensors, the imaging sensor can realize a finer perception of the environmental light; thus it can guide a more precise lighting control. Before the system operates, a large set of typical imaging lighting data for the desk application is first accumulated. Second, a series of subjective and objective Lighting Effect Evaluation Metrics (LEEMs) are defined and assessed for these datasets. Then the cluster benchmarks of these objective LEEMs can be obtained. Third, both a single-LEEM-based control and a multiple-LEEMs-based control are developed to realize optimal luminance tuning. When the system operates, it first captures the lighting image using a wearable camera. Then it computes the objective LEEMs of the captured image and compares them with the cluster benchmarks of the objective LEEMs. Finally, the single-LEEM-based or the multiple-LEEMs-based control can be implemented to obtain an optimal lighting effect. Extensive experimental results show that the proposed system can tune the LED lamp automatically according to environment luminance changes.

  8. Cross-Sensor Calibration of the GAI Long Range Detection Network

    Science.gov (United States)

    Boccippio, Dennis J.; Boeck, William; Goodman, Steven J.; Cummins, K.; Cramer, J.

    1999-01-01

    The long range component of the North American Lightning Detection Network has been providing experimental data products since July 1996, offering cloud-to-ground lightning coverage throughout the Atlantic and Western Pacific oceans, as well as south to the Intertropical Convergence Zone. The network experiences a strong decrease in detection efficiency with range, which is also significantly modulated by differential propagation under day, night and terminator-crossing conditions. A climatological comparison of total lightning data observed by the Optical Transient Detector (OTD) and CG lightning observed by the long range network is conducted, with strict quality control and allowance for differential network performance before and after the activation of the Canadian Lightning Detection Network. This yields a first-order geographic estimate of long range network detection efficiency and its spatial variability. Intercomparisons are also performed over the continental US, allowing large scale estimates of the midlatitude climatological IC:CG ratio and its possible dependence on latitude.

  9. Survey of Collision Avoidance and Ranging Sensors for Mobile Robots. Revision 1

    Science.gov (United States)

    1992-12-01

    Excerpts from the report's contents listing include: 3.1.21 Laser Systems Devices MR-101 Missile Rangefinder; 3.1.22 IBEO Pulsar Survey Series Rangefinders; 3.1.27 TRC Light Direction and Ranging System, LIDAR; 3.1.28 NOMADIC Sensus 300 Infrared Proximity System. Laser-based TOF ranging systems (also known as light or laser radar, LIDAR) first appeared in work performed at the Jet Propulsion Laboratory in the 1970s.

  10. PCA-based spatially adaptive denoising of CFA images for single-sensor digital cameras.

    Science.gov (United States)

    Zheng, Lei; Lukac, Rastislav; Wu, Xiaolin; Zhang, David

    2009-04-01

    Single-sensor digital color cameras use a process called color demosaicking to produce full color images from the data captured by a color filter array (CFA). The quality of demosaicked images is degraded by the sensor noise introduced during the image acquisition process. The conventional solution to combating CFA sensor noise is demosaicking first, followed by separate denoising processing. This strategy generates many noise-caused color artifacts in the demosaicking process, which are hard to remove in the denoising process. Few denoising schemes that work directly on CFA images have been presented because of the difficulties arising from the red, green, and blue interlaced mosaic pattern, yet a well-designed "denoising first and demosaicking later" scheme can have advantages such as fewer noise-caused color artifacts and cost-effective implementation. This paper presents a principal component analysis (PCA)-based spatially adaptive denoising algorithm, which works directly on the CFA data using a supporting window to analyze the local image statistics. By exploiting the spatial and spectral correlations existing in the CFA image, the proposed method can effectively suppress noise while preserving color edges and details. Experiments using both simulated and real CFA images indicate that the proposed scheme outperforms many existing approaches, including sophisticated demosaicking and denoising schemes, in terms of both objective measurement and visual evaluation.
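
    As an illustrative simplification of PCA-based denoising (ignoring the CFA mosaic, the supporting-window spatial adaptivity, and the paper's exact shrinkage rule), one can fit a PCA to local patches and keep only the components whose variance exceeds the noise floor:

```python
import numpy as np

def pca_denoise(noisy, sigma, patch=6, stride=2):
    """Patch-based PCA shrinkage on a grayscale image: project patches onto
    the principal components whose variance exceeds the noise floor,
    discard the rest, and average the overlapping reconstructions."""
    H, W = noisy.shape
    ys = range(0, H - patch + 1, stride)
    xs = range(0, W - patch + 1, stride)
    P = np.array([noisy[y:y+patch, x:x+patch].ravel() for y in ys for x in xs])
    mean = P.mean(axis=0)
    C = np.cov((P - mean).T)
    w, V = np.linalg.eigh(C)
    keep = w > 2.0 * sigma ** 2               # simple noise-floor threshold
    Pd = (P - mean) @ V[:, keep] @ V[:, keep].T + mean
    out = np.zeros_like(noisy)
    cnt = np.zeros_like(noisy)
    i = 0
    for y in ys:
        for x in xs:
            out[y:y+patch, x:x+patch] += Pd[i].reshape(patch, patch)
            cnt[y:y+patch, x:x+patch] += 1
            i += 1
    return out / np.maximum(cnt, 1)

# Smooth synthetic scene plus Gaussian noise of known sigma
yy, xx = np.mgrid[0:64, 0:64]
clean = 100.0 + 0.8 * xx + 0.5 * yy
rng = np.random.default_rng(3)
noisy = clean + rng.normal(0.0, 10.0, clean.shape)
den = pca_denoise(noisy, 10.0)
```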

  11. Adapting range migration techniques for imaging with metasurface antennas: analysis and limitations

    Science.gov (United States)

    Pulido Mancera, Laura; Fromenteze, Thomas; Sleasman, Timothy; Boyarsky, Michael; Imani, Mohammadreza F.; Reynolds, Matthew S.; Smith, David R.

    2017-04-01

    Dynamic metasurface antennas are planar structures with remarkable capabilities for controlling electromagnetic wavefronts, advantages that are particularly attractive for microwave imaging. These antennas exhibit strong frequency dispersion and produce diverse radiation patterns; such behavior presents unique challenges for integration with conventional imaging algorithms. We analyze an adapted version of the range migration algorithm (RMA) for image reconstruction with dynamic metasurfaces. We focus on the proposed pre-processing step, which ultimately allows fast processing of the backscattered signal in the spatial-frequency domain, from which the fast Fourier transform can efficiently reconstruct the scene. Numerical studies illustrate imaging performance using both conventional methods and the adapted RMA, demonstrating that the RMA can reconstruct images of comparable quality in a fraction of the time. In this paper, we demonstrate the capabilities of the algorithm as a fast reconstruction tool and analyze the limitations of the presented technique in terms of image quality.

  12. Energy-Efficient Transmission of Wavelet-Based Images in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Vincent Lecuire

    2007-01-01

    Full Text Available We propose a self-adaptive image transmission scheme driven by energy-efficiency considerations in order to be suitable for wireless sensor networks. It is based on the wavelet image transform and semireliable transmission to achieve energy conservation. The wavelet image transform provides data decomposition in multiple levels of resolution, so the image can be divided into packets with different priorities. Semireliable transmission enables priority-based packet discarding by intermediate nodes according to their battery's state of charge. Such an image transmission approach provides a graceful tradeoff between reconstructed image quality and the sensor nodes' lifetime. An analytical study in terms of dissipated energy is performed to compare the self-adaptive image transmission scheme to a fully reliable scheme. Since image processing is computationally intensive and operates on a large data set, the cost of the wavelet image transform is included in the energy consumption analysis. Results show up to an 80% reduction in energy consumption achieved by our proposal compared to a non-energy-aware one, while guaranteeing a lower bound on image quality.

  13. Supplementary Golay pair for range side lobe suppression in dual-frequency tissue harmonic imaging.

    Science.gov (United States)

    Shen, Che-Chou; Wu, Chi; Peng, Jun-Kai

    2015-02-01

    In dual-frequency (DF) harmonic imaging, the second harmonic signal at second harmonic (2f0) frequency and the inter-modulation harmonic signal at fundamental (f0) frequency are simultaneously imaged for spectral compounding. When the phase-encoded Golay pair is utilized to improve the harmonic signal-to-noise ratio (SNR), however, the DF imaging suffers from range side lobe artifacts due to spectral cross-talk with other harmonic components at DC and third harmonic (3f0) frequency. In this study, a supplementary Golay pair is developed to suppress the range side lobes in combination with the original Golay pair. Since the phase code of the DC interference cannot be manipulated, the supplementary Golay is designed to reverse the polarity of the 3f0 interference and the f0 signal while keeping the 2f0 signal unchanged. For 2f0 imaging, the echo summation of the supplementary and the original Golay can cancel the 3f0 interference. On the contrary, the echo difference between the two Golay pairs can eliminate the DC interference for f0 imaging. Hydrophone measurements indicate that the range side lobe level (RSLL) increases with the signal bandwidth of DF harmonic imaging. By using the combination of the two Golay pairs, the achievable suppression of RSLL can be 3 and 14 dB, respectively for the f0 and 2f0 harmonic signal. B-mode phantom imaging also verifies the presence of range side lobe artifacts when only the original Golay pair is utilized. In combination with the supplementary Golay pair, the artifacts are effectively suppressed. The corresponding range side lobe magnitude reduces by about 8 dB in 2f0 imaging but remains unchanged in f0 imaging. Meanwhile, the harmonic SNR improves by 8-10 dB and the contrast-to-noise ratio of harmonic image increases from about 1 to 1.2 by spectral compounding. For DF tissue harmonic imaging, the spectral cross-talk in Golay excitation results in severe range side lobe artifacts. To restore the image quality, two particular
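
    The sidelobe-cancellation principle behind complementary (Golay) excitation can be shown in a few lines: the autocorrelations of the two codes have equal and opposite sidelobes, so their sum is an ideal impulse. This illustrates the classic pair property, not the paper's specific dual-frequency pulse design:

```python
import numpy as np

def autocorr(x):
    """Full linear autocorrelation of a sequence."""
    return np.correlate(x, x, mode="full")

# A length-4 complementary (Golay) pair
a = np.array([1, 1, 1, -1])
b = np.array([1, 1, -1, 1])
summed = autocorr(a) + autocorr(b)   # sidelobes cancel; only the peak survives
```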

  14. Filter-free image sensor pixels comprising silicon nanowires with selective color absorption.

    Science.gov (United States)

    Park, Hyunsung; Dan, Yaping; Seo, Kwanyong; Yu, Young J; Duane, Peter K; Wober, Munib; Crozier, Kenneth B

    2014-01-01

    The organic dye filters of conventional color image sensors achieve the red/green/blue response needed for color imaging, but have disadvantages related to durability, low absorption coefficient, and fabrication complexity. Here, we report a new paradigm for color imaging based on all-silicon nanowire devices and no filters. We fabricate pixels consisting of vertical silicon nanowires with integrated photodetectors, demonstrate that their spectral sensitivities are governed by nanowire radius, and perform color imaging. Our approach is conceptually different from filter-based methods, as absorbed light is converted to photocurrent, ultimately presenting the opportunity for very high photon efficiency.

  15. Determination of visual range during fog and mist using digital camera images

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, John R; Moogan, Jamie C, E-mail: j.taylor@adfa.edu.a [School of Physical, Environmental and Mathematical Sciences, UNSW-ADFA, Canberra ACT, 2600 (Australia)

    2010-08-15

    During the winter of 2008, daily time series of images of five 'unit-cell chequerboard' targets were acquired using a digital camera. The camera and targets were located in the Majura Valley approximately 3 km from Canberra airport. We show how the contrast between the black and white sections of the targets is related to the meteorological range (or standard visual range), and compare estimates of this quantity derived from images acquired during fog and mist conditions with those from the Vaisala FD-12 visibility meter operated by the Bureau of Meteorology at Canberra Airport. The two sets of ranges are consistent but show the variability of visibility in the patchy fog conditions that often prevail in the Majura Valley. Significant spatial variations of the light extinction coefficient were found to occur over the longest 570 m optical path sampled by the imaging system. Visual ranges could be estimated out to ten times the distance to the furthest target, or approximately 6 km, in these experiments. Image saturation of the white sections of the targets was the major limitation on the quantitative interpretation of the images. In the future, the camera images will be processed in real time so that the camera exposure can be adjusted to avoid saturation.
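    The contrast-to-range relationship this abstract relies on is standard Koschmieder theory, which can be sketched as follows. This is a hedged illustration of the general method, not the authors' exact processing pipeline; the 5% contrast threshold and the example numbers are assumptions.

```python
import math

def meteorological_range(contrast_ratio, distance_m, threshold=0.05):
    """Estimate meteorological (standard visual) range from target contrast.

    contrast_ratio: observed black/white target contrast divided by its
    clear-air (intrinsic) contrast, i.e. C(d) / C0.
    Koschmieder's law gives C(d) = C0 * exp(-sigma * d); the meteorological
    range is the distance at which contrast falls to the threshold (5% here),
    so V = -ln(threshold) / sigma.
    """
    sigma = -math.log(contrast_ratio) / distance_m  # extinction coefficient, 1/m
    return -math.log(threshold) / sigma

# A target 570 m away whose contrast has dropped to 30% of its clear-air value:
print(round(meteorological_range(0.30, 570.0)))  # 1418 (metres)
```

    The saturation limitation noted above matters here: once the white sections clip, the measured contrast ratio is biased low and the inferred range is underestimated.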

  16. a Semi-Rigorous Sensor Model for Precision Geometric Processing of Mini-Rf Bistatic Radar Images of the Moon

    Science.gov (United States)

    Kirk, R. L.; Barrett, J. M.; Wahl, D. E.; Erteza, I.; Jackowatz, C. V.; Yocky, D. A.; Turner, S.; Bussey, D. B. J.; Paterson, G. W.

    2016-06-01

    The spaceborne synthetic aperture radar (SAR) instruments known as Mini-RF were designed to image shadowed areas of the lunar poles and assay the presence of ice deposits by quantitative polarimetry. We have developed radargrammetric processing techniques to enhance the value of these observations by removing spacecraft ephemeris errors and distortions caused by topographic parallax so the polarimetry can be compared with other data sets. Here we report on the extension of this capability from monostatic imaging (signal transmitted and received on the same spacecraft) to bistatic (transmission from Earth and reception on the spacecraft) which provides a unique opportunity to measure radar scattering at nonzero phase angles. In either case our radargrammetric sensor models first reconstruct the observed range and Doppler frequency from recorded image coordinates, then determine the ground location with a corrected trajectory on a more detailed topographic surface. The essential difference for bistatic radar is that range and Doppler shift depend on the transmitter as well as receiver trajectory. Incidental differences include the preparation of the images in a different (map projected) coordinate system and use of "squint" (i.e., imaging at nonzero rather than zero Doppler shift) to achieve the desired phase angle. Our approach to the problem is to reconstruct the time-of-observation, range, and Doppler shift of the image pixel by pixel in terms of rigorous geometric optics, then fit these functions with low-order polynomials accurate to a small fraction of a pixel. Range and Doppler estimated by using these polynomials can then be georeferenced rigorously on a new surface with an updated trajectory. This "semi-rigorous" approach (based on rigorous physics but involving fitting functions) speeds the calculation and avoids the need to manage both the original and adjusted trajectory data. We demonstrate the improvement in registration of the bistatic images for
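    The essential geometric difference the abstract describes, that bistatic range and Doppler depend on both the transmitter and the receiver trajectory, can be sketched as below. This is a hedged illustration of the geometry only, not the Mini-RF production code; the coordinates, velocities, and the assumed S-band wavelength are made-up values.

```python
import numpy as np

WAVELENGTH = 0.126  # m; assumed S-band wavelength for illustration

def bistatic_range(ground, tx, rx):
    """Total path length: transmitter -> ground point -> receiver."""
    return np.linalg.norm(ground - tx) + np.linalg.norm(ground - rx)

def bistatic_doppler(ground, tx, vtx, rx, vrx, wavelength=WAVELENGTH):
    """Doppler shift from the time derivative of the bistatic range."""
    u_tx = (ground - tx) / np.linalg.norm(ground - tx)  # tx -> ground unit vector
    u_rx = (ground - rx) / np.linalg.norm(ground - rx)  # rx -> ground unit vector
    # For a fixed ground point, d/dt |ground - tx| = -u_tx . vtx (same for rx).
    range_rate = -(np.dot(vtx, u_tx) + np.dot(vrx, u_rx))
    return -range_rate / wavelength  # closing geometry gives positive Doppler

ground = np.array([0.0, 0.0, 0.0])                    # ground point (arbitrary frame)
tx, vtx = np.array([0.0, 0.0, 3.8e8]), np.zeros(3)    # Earth-based transmitter
rx = np.array([0.0, 1.0e5, 0.0])                      # orbiting receiver
vrx = np.array([0.0, -1600.0, 0.0])                   # moving toward the ground point
print(bistatic_range(ground, tx, rx))                 # tx + rx path length (m)
print(bistatic_doppler(ground, tx, vtx, rx, vrx) > 0)  # True: closing geometry
```

    In the "semi-rigorous" approach these rigorously computed range and Doppler values would then be fitted, pixel by pixel, with low-order polynomials in the image coordinates.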

  17. Change Detection with GRASS GIS – Comparison of images taken by different sensors

    Directory of Open Access Journals (Sweden)

    Michael Fuchs

    2009-04-01

    Images from American military reconnaissance satellites of the 1960s (CORONA), in combination with modern sensors (SPOT, QuickBird), were used to detect changes in land use. The pilot area was located about 40 km northwest of Yemen's capital Sana'a and covered approximately 100 km². To produce comparable layers from images of distinctly different sources, the moving-window technique was applied using the diversity parameter. The resulting difference layers reveal plausible and interpretable change patterns, particularly in areas where urban sprawl occurs. The comparison of CORONA images with images taken by modern sensors proved to be an additional tool to visualize and quantify major changes in land use. The results should serve as additional basic data, e.g. in regional planning. The computation sequence was executed in GRASS GIS.
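    The moving-window diversity measure used here can be sketched in a few lines: for each cell, count the distinct class values in the surrounding window (this is what GRASS GIS's `r.neighbors` with `method=diversity` computes on a raster). The tiny land-use grid below is made up for illustration.

```python
# Hedged sketch of a moving-window "diversity" filter: the output value for
# each cell is the number of distinct classes in the size x size neighbourhood,
# clipped at the grid edges.
def diversity_filter(grid, size=3):
    half = size // 2
    rows, cols = len(grid), len(grid[0])
    out = [[0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            window = {
                grid[r][c]
                for r in range(max(0, i - half), min(rows, i + half + 1))
                for c in range(max(0, j - half), min(cols, j + half + 1))
            }
            out[i][j] = len(window)  # number of distinct land-use classes
    return out

land_use = [
    [1, 1, 2],
    [1, 2, 2],
    [3, 3, 2],
]
print(diversity_filter(land_use))  # [[2, 2, 2], [3, 3, 3], [3, 3, 2]]
```

    Applying the same filter to a CORONA-derived raster and to a modern-sensor raster yields layers on a common scale, whose difference highlights areas of changed land-use heterogeneity.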

  18. Class Energy Image Analysis for Video Sensor-Based Gait Recognition: A Review

    Directory of Open Access Journals (Sweden)

    Zhuowen Lv

    2015-01-01

    Gait is a unique biometric feature that is perceptible at larger distances, and the gait representation approach plays a key role in a video sensor-based gait recognition system. The Class Energy Image is one of the most important appearance-based gait representation methods and has received a great deal of attention. In this paper, we reviewed the expressions and meanings of various Class Energy Image approaches, and analyzed the information contained in Class Energy Images. Furthermore, the effectiveness and robustness of these approaches were compared on benchmark gait databases. We outlined the research challenges and provided promising future directions for the field. To the best of our knowledge, this is the first review that focuses on the Class Energy Image; it can provide a useful reference in the literature on video sensor-based gait representation approaches.
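    The best-known class energy image, the Gait Energy Image (GEI), is simply the pixel-wise mean of the aligned binary silhouettes over a gait cycle. The sketch below illustrates that computation on tiny made-up frames; real pipelines first segment, align, and size-normalize the silhouettes.

```python
# Hedged sketch of a Gait Energy Image: average N aligned binary silhouette
# frames pixel by pixel. Frequently-foreground pixels approach 1 (static body
# shape); intermittently-foreground pixels take intermediate values (motion).
def gait_energy_image(silhouettes):
    """Pixel-wise mean of aligned binary silhouette frames."""
    n = len(silhouettes)
    rows, cols = len(silhouettes[0]), len(silhouettes[0][0])
    return [[sum(frame[i][j] for frame in silhouettes) / n for j in range(cols)]
            for i in range(rows)]

frames = [
    [[0, 1], [1, 1]],
    [[1, 1], [0, 1]],
]
print(gait_energy_image(frames))  # [[0.5, 1.0], [0.5, 1.0]]
```

    Other class energy images reviewed in this family differ mainly in what is accumulated (e.g. motion or frequency information) rather than in this averaging structure.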

  19. Class Energy Image Analysis for Video Sensor-Based Gait Recognition: A Review

    Science.gov (United States)

    Lv, Zhuowen; Xing, Xianglei; Wang, Kejun; Guan, Donghai

    2015-01-01

    Gait is a unique biometric feature that is perceptible at larger distances, and the gait representation approach plays a key role in a video sensor-based gait recognition system. The Class Energy Image is one of the most important appearance-based gait representation methods and has received a great deal of attention. In this paper, we reviewed the expressions and meanings of various Class Energy Image approaches, and analyzed the information contained in Class Energy Images. Furthermore, the effectiveness and robustness of these approaches were compared on benchmark gait databases. We outlined the research challenges and provided promising future directions for the field. To the best of our knowledge, this is the first review that focuses on the Class Energy Image; it can provide a useful reference in the literature on video sensor-based gait representation approaches. PMID:25574935

  20. A Target Tracking System Based on Imaging Sensor Network with Wi-Fi

    Directory of Open Access Journals (Sweden)

    Aiqun Chen

    2014-05-01

    With the rapid development of network communication technology, a variety of networking and communication technologies have been integrated into our lives and work, bringing great convenience. Wireless sensor network technology is currently a popular research topic in communications because it enables communication between objects, and between people and things; its applications have greatly expanded people's ability to obtain information and are of significance to the development of individuals and society. Building on the capabilities of wireless sensors, this paper focuses on the design and implementation of a target tracking system based on an imaging sensor network with Wi-Fi.