WorldWideScience

Sample records for range image sensor

  1. High dynamic range imaging sensors and architectures

    CERN Document Server

    Darmont, Arnaud

    2013-01-01

    Illumination is a crucial element in many applications, matching the luminance of the scene with the operational range of a camera. When luminance cannot be adequately controlled, a high dynamic range (HDR) imaging system may be necessary. These systems are being increasingly used in automotive on-board systems, road traffic monitoring, and other industrial, security, and military applications. This book provides readers with an intermediate discussion of HDR image sensors and techniques for industrial and non-industrial applications. It describes various sensor and pixel architectures capable

  2. Characterization of modulated time-of-flight range image sensors

    Science.gov (United States)

    Payne, Andrew D.; Dorrington, Adrian A.; Cree, Michael J.; Carnegie, Dale A.

    2009-01-01

    A number of full field image sensors have been developed that are capable of simultaneously measuring intensity and distance (range) for every pixel in a given scene using an indirect time-of-flight measurement technique. A light source is intensity modulated at a frequency between 10-100 MHz, and an image sensor is modulated at the same frequency, synchronously sampling light reflected from objects in the scene (homodyne detection). The time of flight is manifested as a phase shift in the illumination modulation envelope, which can be determined from the sampled data simultaneously for each pixel in the scene. This paper presents a method of characterizing the high frequency modulation response of these image sensors, using a pico-second laser pulser. The characterization results allow the optimal operating parameters, such as the modulation frequency, to be identified in order to maximize the range measurement precision for a given sensor. A number of potential sources of error exist when using these sensors, including deficiencies in the modulation waveform shape, duty cycle, or phase, resulting in contamination of the resultant range data. From the characterization data these parameters can be identified and compensated for by modifying the sensor hardware or through post processing of the acquired range measurements.
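
    The indirect time-of-flight principle described above can be sketched numerically. In the common four-phase variant, each pixel samples the reflected modulation at 0°, 90°, 180° and 270° of the modulation period, and the phase shift (hence range) follows from an arctangent. This is a minimal sketch of that textbook recovery, not the paper's characterization method; sample and parameter names are illustrative:

```python
import math

C = 299792458.0  # speed of light, m/s

def tof_range(a0, a1, a2, a3, mod_freq_hz):
    """Range from four homodyne samples taken at 0, 90, 180 and 270
    degrees of the modulation period (a0..a3 are hypothetical pixel
    values; B + A*cos(phase + i*pi/2) in the ideal case)."""
    # Phase shift of the reflected modulation envelope.
    phase = math.atan2(a3 - a1, a0 - a2) % (2 * math.pi)
    # Light travels out and back, so one full phase wrap spans c/(2f).
    return (C * phase) / (4 * math.pi * mod_freq_hz)
```

    For a 20 MHz modulation frequency the unambiguous range is c/(2f) ≈ 7.5 m, which is one reason the choice of modulation frequency trades off against range measurement precision.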

  3. Introduction to sensors for ranging and imaging

    CERN Document Server

    Brooker, Graham

    2009-01-01

    ""This comprehensive text-reference provides a solid background in active sensing technology. It is concerned with active sensing, starting with the basics of time-of-flight sensors (operational principles, components), and going through the derivation of the radar range equation and the detection of echo signals, both fundamental to the understanding of radar, sonar and lidar imaging. Several chapters cover signal propagation of both electromagnetic and acoustic energy, target characteristics, stealth, and clutter. The remainder of the book introduces the range measurement process, active ima

  4. Range-Measuring Video Sensors

    Science.gov (United States)

    Howard, Richard T.; Briscoe, Jeri M.; Corder, Eric L.; Broderick, David

    2006-01-01

    Optoelectronic sensors of a proposed type would perform the functions of both electronic cameras and triangulation- type laser range finders. That is to say, these sensors would both (1) generate ordinary video or snapshot digital images and (2) measure the distances to selected spots in the images. These sensors would be well suited to use on robots that are required to measure distances to targets in their work spaces. In addition, these sensors could be used for all the purposes for which electronic cameras have been used heretofore. The simplest sensor of this type, illustrated schematically in the upper part of the figure, would include a laser, an electronic camera (either video or snapshot), a frame-grabber/image-capturing circuit, an image-data-storage memory circuit, and an image-data processor. There would be no moving parts. The laser would be positioned at a lateral distance d to one side of the camera and would be aimed parallel to the optical axis of the camera. When the range of a target in the field of view of the camera was required, the laser would be turned on and an image of the target would be stored and preprocessed to locate the angle (a) between the optical axis and the line of sight to the centroid of the laser spot.
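
    The triangulation geometry described above reduces to a one-line calculation: the centroid offset of the laser spot in the image gives the angle a, and the known baseline d gives the range. A minimal sketch under assumed parameter names (not from the article):

```python
import math

def triangulation_range(baseline_m, focal_len_m, pixel_pitch_m, spot_offset_px):
    """Range to the laser spot from its centroid offset in the image.
    The laser sits a lateral distance `baseline_m` from the camera and
    is aimed parallel to the optical axis, as in the article's figure."""
    # Angle between the optical axis and the line of sight to the spot.
    alpha = math.atan2(spot_offset_px * pixel_pitch_m, focal_len_m)
    # Parallel-beam geometry: baseline / range = tan(alpha).
    return baseline_m / math.tan(alpha)
```

    Note that range resolution degrades with distance: the same one-pixel centroid shift corresponds to a larger range interval when alpha is small.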

  5. Test of the Practicality and Feasibility of EDoF-Empowered Image Sensors for Long-Range Biometrics.

    Science.gov (United States)

    Hsieh, Sheng-Hsun; Li, Yung-Hui; Tien, Chung-Hao

    2016-11-25

    For many practical applications of image sensors, how to extend the depth-of-field (DoF) is an important research topic; if successfully implemented, it could be beneficial in various applications, from photography to biometrics. In this work, we examine the feasibility and practicability of a well-known "extended DoF" (EDoF) technique, or "wavefront coding," by building a real-time long-range iris recognition system and performing large-scale iris recognition. The key to the success of long-range iris recognition includes a long DoF and image quality invariance across various object distances, requirements strict and harsh enough to test the practicality and feasibility of EDoF-empowered image sensors. Besides image sensor modification, we also explored the possibility of varying enrollment/testing pairs. With 512 iris images from 32 Asian people as the database, 400-mm focal length and F/6.3 optics over a 3 m working distance, our results prove that a sophisticated coding design scheme plus homogeneous enrollment/testing setups can effectively overcome the blurring caused by phase modulation and omit Wiener-based restoration. In our experiments, which are based on 3328 iris images in total, the EDoF factor can reach 3.71 times that of the original system without a loss of recognition accuracy.

  6. Sensor assembly method using silicon interposer with trenches for three-dimensional binocular range sensors

    Science.gov (United States)

    Nakajima, Kazuhiro; Yamamoto, Yuji; Arima, Yutaka

    2018-04-01

    To easily assemble a three-dimensional binocular range sensor, we devised an alignment method for two image sensors using a silicon interposer with trenches. The trenches were formed using deep reactive ion etching (RIE) equipment. We produced a three-dimensional (3D) range sensor using the method and experimentally confirmed that sufficient alignment accuracy was realized. It was confirmed that the alignment accuracy of the two image sensors when using the proposed method is more than twice that of the alignment assembly method on a conventional board. In addition, as a result of evaluating the deterioration of the detection performance caused by the alignment accuracy, it was confirmed that the vertical deviation between the corresponding pixels in the two image sensors is substantially proportional to the decrease in detection performance. Therefore, we confirmed that the proposed method can realize more than twice the detection performance of the conventional method. Through these evaluations, the effectiveness of the 3D binocular range sensor aligned by the silicon interposer with the trenches was confirmed.

  7. Test of the Practicality and Feasibility of EDoF-Empowered Image Sensors for Long-Range Biometrics

    Directory of Open Access Journals (Sweden)

    Sheng-Hsun Hsieh

    2016-11-01

    Full Text Available For many practical applications of image sensors, how to extend the depth-of-field (DoF) is an important research topic; if successfully implemented, it could be beneficial in various applications, from photography to biometrics. In this work, we examine the feasibility and practicability of a well-known “extended DoF” (EDoF) technique, or “wavefront coding,” by building a real-time long-range iris recognition system and performing large-scale iris recognition. The key to the success of long-range iris recognition includes a long DoF and image quality invariance across various object distances, requirements strict and harsh enough to test the practicality and feasibility of EDoF-empowered image sensors. Besides image sensor modification, we also explored the possibility of varying enrollment/testing pairs. With 512 iris images from 32 Asian people as the database, 400-mm focal length and F/6.3 optics over a 3 m working distance, our results prove that a sophisticated coding design scheme plus homogeneous enrollment/testing setups can effectively overcome the blurring caused by phase modulation and omit Wiener-based restoration. In our experiments, which are based on 3328 iris images in total, the EDoF factor can reach 3.71 times that of the original system without a loss of recognition accuracy.

  8. Increasing Linear Dynamic Range of a CMOS Image Sensor

    Science.gov (United States)

    Pain, Bedabrata

    2007-01-01

    A generic design and a corresponding operating sequence have been developed for increasing the linear-response dynamic range of a complementary metal oxide/semiconductor (CMOS) image sensor. The design provides for linear calibrated dual-gain pixels that operate at high gain at a low signal level and at low gain at a signal level above a preset threshold. Unlike most prior designs for increasing the dynamic range of an image sensor, this design does not entail any increase in noise (including fixed-pattern noise), decrease in responsivity or linearity, or degradation of photometric calibration. The figure is a simplified schematic diagram showing the circuit of one pixel and pertinent parts of its column readout circuitry. The conventional part of the pixel circuit includes a photodiode having a small capacitance, CD. The unconventional part includes an additional larger capacitance, CL, that can be connected to the photodiode via a transfer gate controlled in part by a latch. In the high-gain mode, the signal labeled TSR in the figure is held low through the latch, which also helps to adapt the gain on a pixel-by-pixel basis. Light must be coupled to the pixel through a microlens or by back illumination in order to obtain a high effective fill factor; this is necessary to ensure high quantum efficiency, a loss of which would diminish the efficacy of the dynamic-range-enhancement scheme. Once the level of illumination of the pixel exceeds the threshold, TSR is turned on, causing the transfer gate to conduct, thereby adding CL to the pixel capacitance. The added capacitance reduces the conversion gain and increases the pixel electron-handling capacity, thereby providing an extension of the dynamic range. By use of an array of comparators at the bottom of the column, photocharge voltages on sampling capacitors in each column are compared with a reference voltage to determine whether it is necessary to switch from the high-gain to the low-gain mode.
Depending upon
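
    The dual-gain behaviour described above can be modelled in a few lines: conversion gain is set by the capacitance on the sense node, so switching CL in divides the gain and extends the full-well capacity. A hypothetical numeric model (capacitance values and names are illustrative, not from the article):

```python
def dual_gain_signal(electrons, cd_fF, cl_fF, threshold_e):
    """Output voltage (microvolts) of a linear dual-gain pixel.
    Below `threshold_e` electrons, the photodiode capacitance CD alone
    sets the (high) conversion gain; above it, CL is switched onto the
    sense node, lowering the gain and raising full-well capacity."""
    Q_E = 1.602e-19  # electron charge, coulombs
    if electrons <= threshold_e:
        gain_uV_per_e = Q_E / (cd_fF * 1e-15) * 1e6        # high-gain mode
        return electrons * gain_uV_per_e, "high"
    gain_uV_per_e = Q_E / ((cd_fF + cl_fF) * 1e-15) * 1e6  # low-gain mode
    return electrons * gain_uV_per_e, "low"
```

    Because both gains are known from calibration, the output can always be converted back to electrons in either mode, which is how the photometric response stays linear across the whole extended range.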

  9. Applications of the Integrated High-Performance CMOS Image Sensor to Range Finders — from Optical Triangulation to the Automotive Field

    Directory of Open Access Journals (Sweden)

    Joe-Air Jiang

    2008-03-01

    Full Text Available With their significant features, the applications of complementary metal-oxidesemiconductor (CMOS image sensors covers a very extensive range, from industrialautomation to traffic applications such as aiming systems, blind guidance, active/passiverange finders, etc. In this paper CMOS image sensor-based active and passive rangefinders are presented. The measurement scheme of the proposed active/passive rangefinders is based on a simple triangulation method. The designed range finders chieflyconsist of a CMOS image sensor and some light sources such as lasers or LEDs. Theimplementation cost of our range finders is quite low. Image processing software to adjustthe exposure time (ET of the CMOS image sensor to enhance the performance oftriangulation-based range finders was also developed. An extensive series of experimentswere conducted to evaluate the performance of the designed range finders. From theexperimental results, the distance measurement resolutions achieved by the active rangefinder and the passive range finder can be better than 0.6% and 0.25% within themeasurement ranges of 1 to 8 m and 5 to 45 m, respectively. Feasibility tests onapplications of the developed CMOS image sensor-based range finders to the automotivefield were also conducted. The experimental results demonstrated that our range finders arewell-suited for distance measurements in this field.

  10. Large area CMOS image sensors

    International Nuclear Information System (INIS)

    Turchetta, R; Guerrini, N; Sedgwick, I

    2011-01-01

    CMOS image sensors, also known as CMOS Active Pixel Sensors (APS) or Monolithic Active Pixel Sensors (MAPS), are today the dominant imaging devices. They are omnipresent in our daily life, as image sensors in cellular phones, web cams, digital cameras, ... In these applications, the pixels can be very small, in the micron range, and the sensors themselves tend to be limited in size. However, many scientific applications, like particle or X-ray detection, require large format, often with large pixels, as well as other specific performance, like low noise, radiation hardness or very fast readout. The sensors are also required to be sensitive to a broad spectrum of radiation: photons from the silicon cut-off in the IR down to UV and X- and gamma-rays through the visible spectrum as well as charged particles. This requirement calls for modifications to the substrate to be introduced to provide optimized sensitivity. This paper will review existing CMOS image sensors, whose size can be as large as a single CMOS wafer, and analyse the technical requirements and specific challenges of large format CMOS image sensors.

  11. High Dynamic Range Imaging at the Quantum Limit with Single Photon Avalanche Diode-Based Image Sensors

    Science.gov (United States)

    Mattioli Della Rocca, Francescopaolo

    2018-01-01

    This paper examines methods to best exploit the High Dynamic Range (HDR) of the single photon avalanche diode (SPAD) in a high fill-factor HDR photon counting pixel that is scalable to megapixel arrays. The proposed method combines multi-exposure HDR with temporal oversampling in-pixel. We present a silicon demonstration IC with a 96 × 40 array of 8.25 µm pitch, 66% fill-factor SPAD-based pixels achieving >100 dB dynamic range with 3 back-to-back exposures (short, mid, long). Each pixel sums 15 bit-planes or binary field images internally to constitute one frame, providing 3.75× data compression; hence the 1k frames per second (FPS) output off-chip represents 45,000 individual field images per second on chip. Two future projections of this work are described: scaling SPAD-based image sensors to HDR 1 MPixel formats and shrinking the pixel pitch to 1–3 µm. PMID:29641479
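
    The multi-exposure side of the pixel described above can be sketched as a simple merge rule: for each pixel, keep the longest exposure whose counter has not saturated and normalize by the exposure time. This is an illustrative reconstruction, not the chip's actual on-chip logic; all names are assumptions:

```python
def merge_exposures(counts, exposure_ratios, max_count):
    """Merge per-pixel photon counts from back-to-back exposures
    (short, mid, long) into one linear radiance estimate.
    `exposure_ratios` are exposure times relative to the shortest;
    a count of `max_count` means the counter saturated."""
    best = None
    # Walk short -> long so the longest unsaturated exposure wins
    # (more photons collected, hence lower relative shot noise).
    for c, ratio in zip(counts, exposure_ratios):
        if c < max_count:
            best = c / ratio  # normalize to unit exposure
    if best is None:  # every exposure saturated: clip at the shortest
        best = counts[0] / exposure_ratios[0]
    return best
```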

  12. Sampling Number Effects in 2D and Range Imaging of Range-gated Acquisition

    International Nuclear Information System (INIS)

    Kwon, Seong-Ouk; Park, Seung-Kyu; Baik, Sung-Hoon; Cho, Jai-Wan; Jeong, Kyung-Min

    2015-01-01

    In this paper, we analyzed how the number of sampled images affects the construction of a 2D image and a range image from acquired RGI images, using an RGI vision system. The results show that 2D image quality depended less on the number of sampling images than on how well efficient RGI images were extracted. However, the number of RGI images was important for making a range image, because range image quality was proportional to the number of RGI images. Image acquisition in a monitoring area of the nuclear industry is an important function for safety inspection and for preparing appropriate control plans. To overcome the non-visualization problem caused by airborne obstacle particles, vision systems need extra functions, such as active illumination that penetrates the disturbing airborne particles. One such powerful active vision system is the range-gated imaging system, which can acquire image data in rainy or smoky environments. Range-gated imaging (RGI) is a direct active visualization technique using a highly sensitive image sensor and a high-intensity illuminant. Currently, the range-gated imaging technique, which provides 2D and 3D images, is one of the emerging active vision technologies. The range-gated imaging system obtains vision information by summing time-sliced vision images. In the RGI system, a high-intensity illuminant flashes for an ultra-short time and a highly sensitive image sensor is gated with an ultra-short exposure time to capture only the illumination light. Here, the illuminant illuminates objects by flashing strong light through the disturbing airborne particles. Thus, in contrast to passive conventional vision systems, the RGI active vision technology is robust in low-visibility environments.
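
    The range-gated principle summarized above admits a compact numeric sketch: each gate delay selects a range slice (out-and-back travel gives a factor of two), and a range image can be estimated by combining the time-sliced intensities. The centroid combination below is illustrative, not the authors' algorithm:

```python
C = 299792458.0  # speed of light, m/s

def gate_delay_to_range(delay_s):
    """Centre range of a gate opened `delay_s` after the laser flash;
    the light travels out and back, hence the factor of two."""
    return C * delay_s / 2.0

def range_from_slices(intensities, delays_s):
    """Intensity-weighted range estimate for one pixel from a stack of
    range-gated images (one intensity sample per gate delay)."""
    total = sum(intensities)
    if total == 0:
        return None  # nothing reflected at any gate: no range estimate
    return sum(i * gate_delay_to_range(d)
               for i, d in zip(intensities, delays_s)) / total
```

    More gate delays give a finer sampling of the return pulse, which is consistent with the paper's finding that range image quality grows with the number of RGI images.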

  13. Sampling Number Effects in 2D and Range Imaging of Range-gated Acquisition

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Seong-Ouk; Park, Seung-Kyu; Baik, Sung-Hoon; Cho, Jai-Wan; Jeong, Kyung-Min [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)]

    2015-10-15

    In this paper, we analyzed how the number of sampled images affects the construction of a 2D image and a range image from acquired RGI images, using an RGI vision system. The results show that 2D image quality depended less on the number of sampling images than on how well efficient RGI images were extracted. However, the number of RGI images was important for making a range image, because range image quality was proportional to the number of RGI images. Image acquisition in a monitoring area of the nuclear industry is an important function for safety inspection and for preparing appropriate control plans. To overcome the non-visualization problem caused by airborne obstacle particles, vision systems need extra functions, such as active illumination that penetrates the disturbing airborne particles. One such powerful active vision system is the range-gated imaging system, which can acquire image data in rainy or smoky environments. Range-gated imaging (RGI) is a direct active visualization technique using a highly sensitive image sensor and a high-intensity illuminant. Currently, the range-gated imaging technique, which provides 2D and 3D images, is one of the emerging active vision technologies. The range-gated imaging system obtains vision information by summing time-sliced vision images. In the RGI system, a high-intensity illuminant flashes for an ultra-short time and a highly sensitive image sensor is gated with an ultra-short exposure time to capture only the illumination light. Here, the illuminant illuminates objects by flashing strong light through the disturbing airborne particles. Thus, in contrast to passive conventional vision systems, the RGI active vision technology is robust in low-visibility environments.

  14. Contactless respiratory monitoring system for magnetic resonance imaging applications using a laser range sensor

    Directory of Open Access Journals (Sweden)

    Krug Johannes W.

    2016-09-01

    Full Text Available During a magnetic resonance imaging (MRI) exam, a respiratory signal can be required for different purposes, e.g. for patient monitoring, motion compensation or for research studies such as functional MRI. In addition, respiratory information can be used as biofeedback for the patient in order to control breath holds or shallow breathing. To reduce patient preparation time and distortions of the MR imaging system, we propose a contactless approach for gathering information about respiratory activity. An experimental setup based on a commercially available laser range sensor was used to detect respiration-induced motion of the chest or abdomen. This setup was tested using a motion phantom and different human subjects in an MRI scanner. A nasal airflow sensor served as a reference. For both the phantom and the different human subjects, the motion frequency was precisely measured. These results show that a low-cost, contactless, laser-based approach can be used to obtain information about respiratory motion during an MRI exam.

  15. CMOS Active-Pixel Image Sensor With Intensity-Driven Readout

    Science.gov (United States)

    Langenbacher, Harry T.; Fossum, Eric R.; Kemeny, Sabrina

    1996-01-01

    Proposed complementary metal oxide/semiconductor (CMOS) integrated-circuit image sensor automatically provides readouts from pixels in order of decreasing illumination intensity. Sensor operated in integration mode. Particularly useful in number of image-sensing tasks, including diffractive laser range-finding, three-dimensional imaging, event-driven readout of sparse sensor arrays, and star tracking.

  16. High dynamic range coding imaging system

    Science.gov (United States)

    Wu, Renfan; Huang, Yifan; Hou, Guangqi

    2014-10-01

    We present a high dynamic range (HDR) imaging system design scheme based on the coded aperture technique. This scheme lets us obtain HDR images with extended depth of field. We adopt a sparse coding algorithm to design the coded patterns. Then we use the sensor unit to acquire coded images under different exposure settings. Guided by the multiple exposure parameters, a series of low dynamic range (LDR) coded images are reconstructed. We use existing algorithms to fuse those LDR images into an HDR image for display. We build an optical simulation model and obtain simulation images to verify the novel system.
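
    The fusion step mentioned above, turning several LDR exposures into one HDR value, is commonly done with an exposure-normalized weighted average. A minimal per-pixel sketch (the triangle weighting is a standard choice, not necessarily the algorithm used in the paper):

```python
def fuse_ldr(pixels, exposure_times, white=255):
    """Fuse co-registered LDR pixel values taken at different exposure
    times into one linear HDR radiance value, using a triangle weight
    that favours well-exposed mid-tones."""
    num = den = 0.0
    for z, t in zip(pixels, exposure_times):
        w = min(z, white - z)  # triangle weighting: 0 at black and clip
        if w <= 0:
            continue           # clipped or black pixel carries no information
        num += w * (z / t)     # radiance estimate from this exposure
        den += w
    return num / den if den else 0.0
```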

  17. Nanophotonic Image Sensors.

    Science.gov (United States)

    Chen, Qin; Hu, Xin; Wen, Long; Yu, Yan; Cumming, David R S

    2016-09-01

    The increasing miniaturization and resolution of image sensors bring challenges to conventional optical elements such as spectral filters and polarizers, the properties of which are determined mainly by the materials used, including dye polymers. Recent developments in spectral filtering and optical manipulation techniques based on nanophotonics have opened up the possibility of an alternative way to control light spectrally and spatially. By integrating these technologies into image sensors, it will become possible to achieve high compactness, improved process compatibility, robust stability and tunable functionality. In this Review, recent representative achievements in nanophotonic image sensors are presented and analyzed, including image sensors with nanophotonic color filters and polarizers, metamaterial-based THz image sensors, filter-free nanowire image sensors and nanostructure-based multispectral image sensors. This novel combination of cutting-edge photonics research and well-developed commercial products may not only lead to an important application of nanophotonics but also offer great potential for next-generation image sensors beyond Moore's Law expectations. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Establishing imaging sensor specifications for digital still cameras

    Science.gov (United States)

    Kriss, Michael A.

    2007-02-01

    Digital Still Cameras, DSCs, have now displaced conventional still cameras in most markets. The heart of a DSC is thought to be the imaging sensor, be it Full Frame CCD, and Interline CCD, a CMOS sensor or the newer Foveon buried photodiode sensors. There is a strong tendency by consumers to consider only the number of mega-pixels in a camera and not to consider the overall performance of the imaging system, including sharpness, artifact control, noise, color reproduction, exposure latitude and dynamic range. This paper will provide a systematic method to characterize the physical requirements of an imaging sensor and supporting system components based on the desired usage. The analysis is based on two software programs that determine the "sharpness", potential for artifacts, sensor "photographic speed", dynamic range and exposure latitude based on the physical nature of the imaging optics, sensor characteristics (including size of pixels, sensor architecture, noise characteristics, surface states that cause dark current, quantum efficiency, effective MTF, and the intrinsic full well capacity in terms of electrons per square centimeter). Examples will be given for consumer, pro-consumer, and professional camera systems. Where possible, these results will be compared to imaging system currently on the market.
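
    Two of the quantities the analysis above rests on, dynamic range and full-well capacity, follow directly from the listed sensor parameters. A small illustrative sketch (the relations are standard; the example numbers in use are hypothetical):

```python
import math

def dynamic_range_db(full_well_e, read_noise_e):
    """Intrinsic dynamic range: ratio of full-well capacity to the
    noise floor, expressed in dB."""
    return 20.0 * math.log10(full_well_e / read_noise_e)

def full_well_from_area(pixel_pitch_um, well_density_e_per_cm2):
    """Full-well estimate from pixel area and the intrinsic capacity
    in electrons per square centimeter mentioned in the abstract
    (square pixels assumed)."""
    area_cm2 = (pixel_pitch_um * 1e-4) ** 2
    return well_density_e_per_cm2 * area_cm2
```

    This is why shrinking pixels without raising well density costs dynamic range: halving the pitch quarters the full-well capacity, losing about 12 dB at a fixed noise floor.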

  19. Imaging in scattering media using correlation image sensors and sparse convolutional coding

    KAUST Repository

    Heide, Felix; Xiao, Lei; Kolb, Andreas; Hullin, Matthias B.; Heidrich, Wolfgang

    2014-01-01

    Correlation image sensors have recently become popular low-cost devices for time-of-flight, or range, cameras. They usually operate under the assumption of a single light path contributing to each pixel. We show that a more thorough analysis of the data from correlation sensors can be used to analyze the light transport in much more complex environments, including applications for imaging through scattering and turbid media. The key to our method is a new convolutional sparse coding approach for recovering transient (light-in-flight) images from correlation image sensors. This approach is enabled by an analysis of sparsity in complex transient images, and the derivation of a new physically motivated model for transient images with drastically improved sparsity.

  20. Imaging in scattering media using correlation image sensors and sparse convolutional coding

    KAUST Repository

    Heide, Felix

    2014-10-17

    Correlation image sensors have recently become popular low-cost devices for time-of-flight, or range, cameras. They usually operate under the assumption of a single light path contributing to each pixel. We show that a more thorough analysis of the data from correlation sensors can be used to analyze the light transport in much more complex environments, including applications for imaging through scattering and turbid media. The key to our method is a new convolutional sparse coding approach for recovering transient (light-in-flight) images from correlation image sensors. This approach is enabled by an analysis of sparsity in complex transient images, and the derivation of a new physically motivated model for transient images with drastically improved sparsity.

  1. CMOS Imaging Sensor Technology for Aerial Mapping Cameras

    Science.gov (United States)

    Neumann, Klaus; Welzenbach, Martin; Timm, Martin

    2016-06-01

    In June 2015 Leica Geosystems launched the first large format aerial mapping camera using CMOS sensor technology, the Leica DMC III. This paper describes the motivation to change from CCD sensor technology to CMOS for the development of this new aerial mapping camera. In 2002 the first-generation DMC was developed by Z/I Imaging. It was the first large format digital frame sensor designed for mapping applications. In 2009 Z/I Imaging designed the DMC II, which was the first digital aerial mapping camera using a single ultra-large CCD sensor to avoid stitching of smaller CCDs. The DMC III is now the third generation of large format frame sensor developed by Z/I Imaging and Leica Geosystems for the DMC camera family. It is an evolution of the DMC II using the same system design with one large monolithic PAN sensor and four multispectral camera heads for R, G, B and NIR. For the first time, a 391-megapixel CMOS sensor has been used as the panchromatic sensor, which is an industry record. CMOS technology brings a range of technical benefits: the dynamic range of the CMOS sensor is approximately twice that of a comparable CCD sensor, and the signal-to-noise ratio is significantly better than with CCDs. Finally, results from the first DMC III customer installations and test flights are presented and compared with other CCD-based aerial sensors.

  2. Calibration and control for range imaging in mobile robot navigation

    Energy Technology Data Exchange (ETDEWEB)

    Dorum, O.H. [Norges Tekniske Hoegskole, Trondheim (Norway). Div. of Computer Systems and Telematics; Hoover, A. [University of South Florida, Tampa, FL (United States). Dept. of Computer Science and Engineering; Jones, J.P. [Oak Ridge National Lab., TN (United States)

    1994-06-01

    This paper addresses some issues in the development of sensor-based systems for mobile robot navigation which use range imaging sensors as the primary source for geometric information about the environment. In particular, we describe a model of scanning laser range cameras which takes into account the properties of the mechanical system responsible for image formation and a calibration procedure which yields improved accuracy over previous models. In addition, we describe an algorithm which takes the limitations of these sensors into account in path planning and path execution. In particular, range imaging sensors are characterized by a limited field of view and a standoff distance -- a minimum distance nearer than which surfaces cannot be sensed. These limitations can be addressed by enriching the concept of configuration space to include information about what can be sensed from a given configuration, and using this information to guide path planning and path following.
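
    The sensing limitations named above, the standoff distance and the limited field of view, can be folded into path planning as a simple per-configuration visibility predicate. An illustrative sketch (parameter names are assumptions, not from the paper):

```python
def surface_sensed(range_m, bearing_rad, standoff_m, half_fov_rad, max_range_m):
    """Can a range camera at the origin sense a surface point at the
    given range and bearing? Encodes the two limitations named in the
    abstract: a standoff distance (surfaces nearer than it cannot be
    sensed) and a limited angular field of view."""
    return (standoff_m <= range_m <= max_range_m
            and abs(bearing_rad) <= half_fov_rad)
```

    Enriching configuration space with this predicate lets a planner prefer paths along which the surfaces it must verify actually remain sensable, which is the idea the abstract describes.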

  3. A contest of sensors in close range 3D imaging: performance evaluation with a new metric test object

    Directory of Open Access Journals (Sweden)

    M. Hess

    2014-06-01

    Full Text Available An independent means of 3D image quality assessment is introduced, addressing non-professional users of sensors and freeware, a field largely characterized by closed-source software and by the absence of quality metrics for processing steps such as alignment. A performance evaluation of commercially available, state-of-the-art close range 3D imaging technologies is demonstrated with the help of a newly developed Portable Metric Test Artefact. The use of this test object provides quality control through a quantitative assessment of 3D imaging sensors. It will enable users to specify precisely what spatial resolution and geometry recording they expect as the outcome of their 3D digitizing process. This will lead to the creation of high-quality 3D digital surrogates and 3D digital assets. The paper is presented in the form of a competition of teams, and a possible winner will emerge.

  4. POTENTIALS OF IMAGE BASED ACTIVE RANGING TO CAPTURE DYNAMIC SCENES

    Directory of Open Access Journals (Sweden)

    B. Jutzi

    2012-09-01

    Full Text Available Obtaining a 3D description of man-made and natural environments is a basic task in Computer Vision and Remote Sensing. To this end, laser scanning is currently one of the dominant techniques for gathering reliable 3D information. The scanning principle inherently needs a certain time interval to acquire the 3D point cloud. On the other hand, new active sensors provide the possibility of capturing range information as images with a single measurement. With this new technique, image-based active ranging is possible, which allows capturing dynamic scenes, e.g. walking pedestrians in a yard or moving vehicles. Unfortunately, most of these range imaging sensors have strong technical limitations and are not yet sufficient for airborne data acquisition. It can be seen from the recent development of highly specialized (far-range) imaging sensors, so-called flash-light lasers, that most of the limitations could be alleviated soon, so that future systems will be equipped with improved image size and potentially expanded operating range. The presented work is a first step towards the development of methods capable of applying range images in outdoor environments. To this end, an experimental setup was devised for investigating these possibilities. With this setup a measurement campaign was carried out, and first results are presented within this paper.

  5. High dynamic range vision sensor for automotive applications

    Science.gov (United States)

    Grenet, Eric; Gyger, Steve; Heim, Pascal; Heitger, Friedrich; Kaess, Francois; Nussbaum, Pascal; Ruedi, Pierre-Francois

    2005-02-01

    A 128 x 128 pixel, 120 dB vision sensor extracting at the pixel level the contrast magnitude and direction of local image features is used to implement a lane tracking system. The contrast representation (relative change of illumination) delivered by the sensor is independent of the illumination level. Together with the high dynamic range of the sensor, it ensures a very stable image feature representation even under high spatial and temporal inhomogeneities of the illumination. Image features are dispatched off-chip according to contrast magnitude, prioritizing features with high contrast. This drastically reduces the amount of data transmitted out of the chip, and hence the processing power required for subsequent processing stages. To compensate for the low fill factor (9%) of the sensor, micro-lenses have been deposited which increase the sensitivity by a factor of 5, corresponding to an equivalent of 2000 ASA. An algorithm exploiting the contrast representation output by the vision sensor has been developed to estimate the position of a vehicle relative to the road markings. The algorithm first detects the road markings based on the contrast direction map. It then performs quadratic fits on selected 3-by-3-pixel kernels to achieve sub-pixel accuracy in the estimation of the lane marking positions. The resulting precision of the vehicle lateral position estimate is 1 cm. The algorithm performs efficiently under a wide variety of environmental conditions, including night and rain.
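    The quadratic fit used above for sub-pixel localization can be sketched in one dimension with the standard parabola-through-three-samples formula (a minimal illustration, not the paper's 3-by-3 implementation):

```python
def subpixel_peak(y_m1, y_0, y_p1):
    """Sub-pixel offset of a peak from three samples.

    Fits y = a*x^2 + b*x + c at x = -1, 0, +1 and returns the vertex
    position relative to the centre sample (in pixels)."""
    denom = y_m1 - 2.0 * y_0 + y_p1
    if denom == 0.0:
        return 0.0  # flat profile: no refinement possible
    return 0.5 * (y_m1 - y_p1) / denom

# A contrast profile sampled at integer pixels; the true peak lies at x = 0.25.
profile = [-(x - 0.25) ** 2 for x in (-1, 0, 1)]
offset = subpixel_peak(*profile)
print(offset)  # 0.25 for an exactly quadratic profile
```

    The same formula applied along rows and columns of a 3-by-3 kernel yields a 2-D sub-pixel estimate.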

  6. Photon-counting image sensors

    CERN Document Server

    Teranishi, Nobukazu; Theuwissen, Albert; Stoppa, David; Charbon, Edoardo

    2017-01-01

    The field of photon-counting image sensors is advancing rapidly with the development of various solid-state image sensor technologies including single photon avalanche detectors (SPADs) and deep-sub-electron read noise CMOS image sensor pixels. This foundational platform technology will enable opportunities for new imaging modalities and instrumentation for science and industry, as well as new consumer applications. Papers discussing various photon-counting image sensor technologies and selected new applications are presented in this all-invited Special Issue.

  7. A Dynamic Range Enhanced Readout Technique with a Two-Step TDC for High Speed Linear CMOS Image Sensors

    Directory of Open Access Journals (Sweden)

    Zhiyuan Gao

    2015-11-01

    Full Text Available This paper presents a dynamic range (DR) enhanced readout technique with a two-step time-to-digital converter (TDC) for high-speed linear CMOS image sensors. A multi-capacitor, self-regulated capacitive trans-impedance amplifier (CTIA) structure is employed to extend the dynamic range. The gain of the CTIA is automatically adjusted by switching different capacitors onto the integration node asynchronously according to the output voltage. A column-parallel ADC based on a two-step TDC is utilized to improve the conversion rate. The conversion is divided into a coarse phase and a fine phase. An error calibration scheme is also proposed to correct quantization errors caused by propagation delay skew within −Tclk~+Tclk. A linear CMOS image sensor pixel array was designed in a 0.13 μm CMOS process to verify this DR-enhanced high-speed readout technique. Post-simulation results indicate that the dynamic range of the readout circuit is 99.02 dB and that the ADC achieves 60.22 dB SNDR and 9.71 bit ENOB at a conversion rate of 2 MS/s after calibration, an improvement of 14.04 dB and 2.4 bit over the uncalibrated case.
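    The coarse/fine principle behind a two-step TDC can be sketched numerically (a toy model; the clock period and number of fine bins are illustrative, not the paper's circuit parameters):

```python
def two_step_tdc(t, t_clk=1.0, fine_bins=16):
    """Digitize a time interval t in two steps: a coarse count of full
    clock periods, then fine interpolation of the residue with
    resolution t_clk / fine_bins."""
    coarse = int(t // t_clk)
    residue = t - coarse * t_clk
    fine = int(residue / (t_clk / fine_bins))
    quantized = coarse * t_clk + fine * (t_clk / fine_bins)
    return coarse, fine, quantized

print(two_step_tdc(5.3))  # (5, 4, 5.25): quantized to 1/16 of a clock period
```

    Splitting the conversion this way lets a slow counter cover a long range while the interpolator supplies the extra resolution, which is what raises the conversion rate.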

  8. Toward CMOS image sensor based glucose monitoring.

    Science.gov (United States)

    Devadhasan, Jasmine Pramila; Kim, Sanghyo

    2012-09-07

    Complementary metal oxide semiconductor (CMOS) image sensors are a powerful tool for biosensing applications. In this study, a CMOS image sensor has been exploited to detect glucose levels with high sensitivity from simple photon count variation. Various concentrations of glucose (100 mg dL(-1) to 1000 mg dL(-1)) were added onto a simple poly-dimethylsiloxane (PDMS) chip and the oxidation of glucose was catalyzed by an enzymatic reaction. Oxidized glucose produces a brown color with the help of a chromogen during the enzymatic reaction, and the color density varies with the glucose concentration. Photons pass through the PDMS chip with varying color density and hit the sensor surface. The photon count registered by the CMOS image sensor depends on the color density, and hence on the glucose concentration, and is converted into digital form. By correlating the digital output with glucose concentration, it is possible to measure a wide range of blood glucose levels with good linearity, making this technique promising for convenient point-of-care diagnosis.
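    The correlation step can be sketched as a linear calibration that is then inverted for an unknown sample (all numbers below are invented for illustration; the paper does not give its calibration data):

```python
def fit_line(xs, ys):
    """Least-squares slope and intercept for a linear calibration curve."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical calibration: photon count falls as the chromogen darkens
# with rising glucose concentration (mg/dL).
conc  = [100, 250, 500, 750, 1000]
count = [9000, 8250, 7000, 5750, 4500]   # made-up sensor readings
m, b = fit_line(conc, count)
unknown = (6000 - b) / m                 # invert: photon count -> concentration
print(round(unknown))  # 700
```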

  9. An improved triangulation laser rangefinder using a custom CMOS HDR linear image sensor

    Science.gov (United States)

    Liscombe, Michael

    3-D triangulation laser rangefinders are used in many modern applications, from terrain mapping to biometric identification. Although a wide variety of designs have been proposed, laser speckle noise still imposes a fundamental limit on range accuracy. This work proposes a new triangulation laser rangefinder designed specifically to mitigate the effects of laser speckle noise. The proposed rangefinder uses a precision linear translator to laterally reposition the imaging system (e.g., image sensor and imaging lens). For a given spatial location of the laser spot, capturing N spatially uncorrelated laser spot profiles is shown to improve range accuracy by a factor of √N. This technique has many advantages over past speckle-reduction technologies, such as a fixed system cost and form factor, and the ability to virtually eliminate laser speckle noise. These advantages are made possible through spatial diversity and come at the cost of increased acquisition time. The rangefinder makes use of the ICFYKWG1 linear image sensor, a custom CMOS sensor developed at the Vision Sensor Laboratory (York University). Tests are performed on the image sensor's innovative high dynamic range technology to determine its effects on range accuracy. As expected, experimental results show that the sensor provides a trade-off between dynamic range and range accuracy.
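    The √N improvement from averaging uncorrelated measurements can be verified with a small simulation (the noise model and numbers are invented for illustration, not taken from the paper):

```python
import random
import statistics

random.seed(0)

def spot_centroid(sigma=1.0):
    """One centroid estimate corrupted by uncorrelated noise (toy model)."""
    return 10.0 + random.gauss(0.0, sigma)

def averaged_estimate(n):
    """Average n spatially uncorrelated spot profiles."""
    return sum(spot_centroid() for _ in range(n)) / n

single = [spot_centroid() for _ in range(2000)]
avg16 = [averaged_estimate(16) for _ in range(2000)]
ratio = statistics.stdev(single) / statistics.stdev(avg16)
print(ratio)  # close to sqrt(16) = 4
```

    Averaging 16 uncorrelated profiles shrinks the estimate's standard deviation by about a factor of four, matching the √N scaling.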

  10. Real-time image processing of TOF range images using a reconfigurable processor system

    Science.gov (United States)

    Hussmann, S.; Knoll, F.; Edeler, T.

    2011-07-01

    In recent years, Time-of-Flight sensors have had a significant impact on research fields in machine vision. In comparison to stereo vision systems and laser range scanners, they combine the advantages of active sensors, providing accurate distance measurements, and of camera-based systems, recording a 2D matrix at a high frame rate. Moreover, low-cost 3D imaging has the potential to open a wide field of additional applications and solutions in markets like consumer electronics, multimedia, digital photography, robotics and medical technologies. This paper focuses on the 4-phase-shift algorithm currently implemented in this type of sensor. The most time-critical operation of the phase-shift algorithm is the arctangent function. In this paper a novel hardware implementation of the arctangent function using a reconfigurable processor system is presented and benchmarked against the state-of-the-art CORDIC arctangent algorithm. Experimental results show that the proposed algorithm is well suited for real-time processing of the range images of TOF cameras.
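    The 4-phase-shift algorithm, including the arctangent at its core, can be sketched as follows (a minimal illustration; the 20 MHz modulation frequency and the sample sign convention are assumptions, and conventions vary between sensors):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tof_range(a0, a1, a2, a3, f_mod=20e6):
    """Range from four samples of the modulation envelope taken at
    0, 90, 180 and 270 degrees (the 4-phase-shift algorithm).

    The phase shift comes from the arctangent highlighted in the paper;
    range follows from phase / (2*pi) * (c / (2*f_mod))."""
    phase = math.atan2(a3 - a1, a0 - a2) % (2.0 * math.pi)
    return phase / (2.0 * math.pi) * C / (2.0 * f_mod)

# Synthetic samples for an object at 3 m with 20 MHz modulation:
d_true = 3.0
phi = 2.0 * math.pi * d_true / (C / (2 * 20e6))
samples = [math.cos(phi + k * math.pi / 2) for k in range(4)]
print(round(tof_range(*samples), 3))  # 3.0
```

    The per-pixel cost of `atan2` is exactly what motivates the hardware implementation benchmarked against CORDIC in the paper.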

  11. Visualization of heavy ion-induced charge production in a CMOS image sensor

    CERN Document Server

    Végh, J; Klamra, W; Molnár, J; Norlin, LO; Novák, D; Sánchez-Crespo, A; Van der Marel, J; Fenyvesi, A; Valastyan, I; Sipos, A

    2004-01-01

    A commercial CMOS image sensor was irradiated with heavy ion beams in the several MeV energy range. The image sensor is equipped with a standard video output. The data were collected on-line through frame grabbing and analysed off-line after digitisation. It was shown that the response of the image sensor to the heavy ion bombardment varied with the type and energy of the projectiles. The sensor will be used for the CMS Barrel Muon Alignment system.

  12. Focus on image sensors

    NARCIS (Netherlands)

    Jos Gunsing; Daniël Telgen; Johan van Althuis; Jaap van de Loosdrecht; Mark Stappers; Peter Klijn

    2013-01-01

    Robots need sensors to operate properly. Using a single image sensor, various aspects of a robot operating in its environment can be measured or monitored. Over the past few years, image sensors have improved a lot: frame rate and resolution have increased, while prices have fallen. As a result,

  13. An Over 90 dB Intra-Scene Single-Exposure Dynamic Range CMOS Image Sensor Using a 3.0 μm Triple-Gain Pixel Fabricated in a Standard BSI Process.

    Science.gov (United States)

    Takayanagi, Isao; Yoshimura, Norio; Mori, Kazuya; Matsuo, Shinichiro; Tanaka, Shunsuke; Abe, Hirofumi; Yasuda, Naoto; Ishikawa, Kenichiro; Okura, Shunsuke; Ohsawa, Shinji; Otaka, Toshinori

    2018-01-12

    To respond to the high demand for high dynamic range imaging suitable for moving objects with few artifacts, we have developed a single-exposure high dynamic range image sensor by introducing a triple-gain pixel and a low-noise dual-gain readout circuit. The developed 3 μm pixel is capable of three conversion gains. By introducing a new split-pinned photodiode structure, the linear full well reaches 40 ke−. Readout noise under the highest pixel gain condition is 1 e− with the low-noise readout circuit. By merging two signals, one with high pixel gain and high analog gain and the other with low pixel gain and low analog gain, a single-exposure high dynamic range (SEHDR) signal is obtained. Using this technology, a 1/2.7", 2M-pixel CMOS image sensor has been developed and characterized. The image sensor also employs an on-chip linearization function, yielding a 16-bit linear signal at 60 fps, and an intra-scene dynamic range of higher than 90 dB was successfully demonstrated. This SEHDR approach inherently mitigates the artifacts from moving objects or time-varying light sources that can appear in the multiple-exposure high dynamic range (MEHDR) approach.
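    The dual-gain merge can be sketched as follows (a minimal sketch of single-exposure HDR merging; the gain ratio and saturation level are illustrative, not the paper's values):

```python
def merge_sehdr(high, low, gain_ratio=8.0, high_sat=4095):
    """Combine a high-gain and a low-gain readout of the same exposure
    into one linear value.

    Below saturation the low-noise high-gain path is used; once it
    clips, the low-gain path is rescaled onto the same linear axis."""
    if high < high_sat:
        return float(high)
    return float(low) * gain_ratio

print(merge_sehdr(2000, 250))   # 2000.0 (high-gain path, not clipped)
print(merge_sehdr(4095, 1500))  # 12000.0 (low-gain path, rescaled)
```

    Because both readouts come from one exposure, a moving object is sampled at a single instant, which is why the artifacts of multi-exposure HDR are avoided.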

  14. Commercial CMOS image sensors as X-ray imagers and particle beam monitors

    International Nuclear Information System (INIS)

    Castoldi, A.; Guazzoni, C.; Maffessanti, S.; Montemurro, G.V.; Carraresi, L.

    2015-01-01

    CMOS image sensors are widely used in several applications such as mobile handsets, webcams and digital cameras, among others. Furthermore, they are available across a wide range of resolutions with excellent spectral and chromatic responses. In order to fulfill the need for cheap beam monitors and high-resolution image sensors for scientific applications, we exploited the possibility of using commercial CMOS image sensors as X-ray and proton detectors. Two different sensors have been mounted and tested. An Aptina MT9V034, featuring 752 × 480 pixels with 6 μm × 6 μm pixel size, has been mounted and successfully tested as a bi-dimensional beam profile monitor, able to take pictures of the incoming proton bunches at the DeFEL beamline (1–6 MeV pulsed proton beam) of the LaBeC of INFN in Florence. The naked sensor is able to successfully detect the interactions of single protons. The sensor point-spread-function (PSF) has been qualified with 1 MeV protons and is equal to one pixel (6 μm) r.m.s. in both directions. A second sensor, an MT9M032, featuring 1472 × 1096 pixels with 2.2 μm × 2.2 μm pixel size, has been mounted on a dedicated board as a high-resolution imager to be used in X-ray imaging experiments with table-top generators. In order to ease and simplify the data transfer and the image acquisition, the system is controlled by a dedicated micro-processor board (DM3730 1 GHz SoC ARM Cortex-A8) on which a modified Linux kernel has been implemented. The paper presents the architecture of the sensor systems and the results of the experimental measurements.

  15. System overview and applications of a panoramic imaging perimeter sensor

    International Nuclear Information System (INIS)

    Pritchard, D.A.

    1995-01-01

    This paper presents an overview of the design and potential applications of a 360-degree scanning, multi-spectral intrusion detection sensor. This moderate-resolution, true panoramic imaging sensor is intended for exterior use at ranges from 50 to 1,500 meters. This Advanced Exterior Sensor (AES) simultaneously uses three sensing technologies (infrared, visible, and radar) along with advanced data processing methods to provide low false-alarm intrusion detection, tracking, and immediate visual assessment. The images from the infrared and visible detector sets and the radar range data are updated as the sensors rotate once per second. The radar provides range data with one-meter resolution. This sensor has been designed for easy use and rapid deployment to cover wide areas beyond or in place of typical perimeters, and tactical applications around fixed or temporary high-value assets. AES prototypes are in development. Applications discussed in this paper include replacements, augmentations, or new installations at fixed sites where topological features, atmospheric conditions, environmental restrictions, ecological regulations, and archaeological features limit the use of conventional security components and systems

  16. RADIANCE DOMAIN COMPOSITING FOR HIGH DYNAMIC RANGE IMAGING

    Directory of Open Access Journals (Sweden)

    M.R. Renu

    2013-02-01

    Full Text Available High dynamic range imaging aims at creating an image with a range of intensity variations larger than the range supported by a camera sensor. The most commonly used methods combine multiple-exposure low dynamic range (LDR) images to obtain the high dynamic range (HDR) image. Available methods typically neglect the noise term while finding appropriate weighting functions to estimate the camera response function as well as the radiance map. We look at the HDR imaging problem in a denoising framework and aim at reconstructing a low-noise radiance map from noisy low dynamic range images, which is tone mapped to get the LDR equivalent of the HDR image. We propose a maximum a posteriori probability (MAP) based reconstruction of the HDR image using a Gibbs prior to model the radiance map, with total variation (TV) as the prior to avoid unnecessary smoothing of the radiance field. To make the computation with the TV prior efficient, we extend the majorize-minimize method of upper bounding the total variation by a quadratic function to our case, which has a nonlinear term arising from the camera response function. A theoretical justification for performing radiance-domain denoising as opposed to image-domain denoising is also provided.
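    The baseline the paper improves on, a weighted combination of multiple exposures into a radiance map, can be sketched for one pixel (a minimal weighted-average sketch with a linear response and triangle weighting; this is not the paper's MAP/TV method):

```python
def radiance_estimate(pixels, exposures, sat=255):
    """Estimate scene radiance of one pixel from multiple exposures.

    Each LDR value is divided by its exposure time; a triangle weight
    down-weights values near the dark and saturated extremes."""
    num = den = 0.0
    for z, t in zip(pixels, exposures):
        w = min(z, sat - z)          # triangle weighting
        if w <= 0:
            continue                 # fully clipped sample carries no information
        num += w * (z / t)
        den += w
    return num / den if den else 0.0

# Pixel observed at exposure times 1, 4 and 16 (true radiance 10 units/s,
# linear camera response, clipping at 255):
obs = [min(255, 10 * t) for t in (1, 4, 16)]
print(radiance_estimate(obs, [1, 4, 16]))  # 10.0
```

    The paper's point is that such weighting ignores the noise statistics; its MAP formulation replaces this averaging with an explicit noise model and a TV-regularized radiance field.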

  17. Evaluation of a HDR image sensor with logarithmic response for mobile video-based applications

    Science.gov (United States)

    Tektonidis, Marco; Pietrzak, Mateusz; Monnin, David

    2017-10-01

    The performance of mobile video-based applications using conventional LDR (Low Dynamic Range) image sensors highly depends on the illumination conditions. As an alternative, HDR (High Dynamic Range) image sensors with logarithmic response are capable to acquire illumination-invariant HDR images in a single shot. We have implemented a complete image processing framework for a HDR sensor, including preprocessing methods (nonuniformity correction (NUC), cross-talk correction (CTC), and demosaicing) as well as tone mapping (TM). We have evaluated the HDR sensor for video-based applications w.r.t. the display of images and w.r.t. image analysis techniques. Regarding the display we have investigated the image intensity statistics over time, and regarding image analysis we assessed the number of feature correspondences between consecutive frames of temporal image sequences. For the evaluation we used HDR image data recorded from a vehicle on outdoor or combined outdoor/indoor itineraries, and we performed a comparison with corresponding conventional LDR image data.

  18. Thermal infrared panoramic imaging sensor

    Science.gov (United States)

    Gutin, Mikhail; Tsui, Eddy K.; Gutin, Olga; Wang, Xu-Ming; Gutin, Alexey

    2006-05-01

    Panoramic cameras offer true real-time, 360-degree coverage of the surrounding area, valuable for a variety of defense and security applications, including force protection, asset protection, asset control, security including port security, perimeter security, video surveillance, border control, airport security, coastguard operations, search and rescue, intrusion detection, and many others. Automatic detection, location, and tracking of targets outside protected area ensures maximum protection and at the same time reduces the workload on personnel, increases reliability and confidence of target detection, and enables both man-in-the-loop and fully automated system operation. Thermal imaging provides the benefits of all-weather, 24-hour day/night operation with no downtime. In addition, thermal signatures of different target types facilitate better classification, beyond the limits set by camera's spatial resolution. The useful range of catadioptric panoramic cameras is affected by their limited resolution. In many existing systems the resolution is optics-limited. Reflectors customarily used in catadioptric imagers introduce aberrations that may become significant at large camera apertures, such as required in low-light and thermal imaging. Advantages of panoramic imagers with high image resolution include increased area coverage with fewer cameras, instantaneous full horizon detection, location and tracking of multiple targets simultaneously, extended range, and others. The Automatic Panoramic Thermal Integrated Sensor (APTIS), being jointly developed by Applied Science Innovative, Inc. (ASI) and the Armament Research, Development and Engineering Center (ARDEC) combines the strengths of improved, high-resolution panoramic optics with thermal imaging in the 8 - 14 micron spectral range, leveraged by intelligent video processing for automated detection, location, and tracking of moving targets. 
The work in progress supports the Future Combat Systems (FCS) and the

  19. An Over 90 dB Intra-Scene Single-Exposure Dynamic Range CMOS Image Sensor Using a 3.0 μm Triple-Gain Pixel Fabricated in a Standard BSI Process

    Directory of Open Access Journals (Sweden)

    Isao Takayanagi

    2018-01-01

    Full Text Available To respond to the high demand for high dynamic range imaging suitable for moving objects with few artifacts, we have developed a single-exposure high dynamic range image sensor by introducing a triple-gain pixel and a low-noise dual-gain readout circuit. The developed 3 μm pixel is capable of three conversion gains. By introducing a new split-pinned photodiode structure, the linear full well reaches 40 ke−. Readout noise under the highest pixel gain condition is 1 e− with the low-noise readout circuit. By merging two signals, one with high pixel gain and high analog gain and the other with low pixel gain and low analog gain, a single-exposure high dynamic range (SEHDR) signal is obtained. Using this technology, a 1/2.7”, 2M-pixel CMOS image sensor has been developed and characterized. The image sensor also employs an on-chip linearization function, yielding a 16-bit linear signal at 60 fps, and an intra-scene dynamic range of higher than 90 dB was successfully demonstrated. This SEHDR approach inherently mitigates the artifacts from moving objects or time-varying light sources that can appear in the multiple-exposure high dynamic range (MEHDR) approach.

  20. Design and Fabrication of Vertically-Integrated CMOS Image Sensors

    Science.gov (United States)

    Skorka, Orit; Joseph, Dileepan

    2011-01-01

    Technologies to fabricate integrated circuits (IC) with 3D structures are an emerging trend in IC design. They are based on vertical stacking of active components to form heterogeneous microsystems. Electronic image sensors will benefit from these technologies because they allow increased pixel-level data processing and device optimization. This paper covers general principles in the design of vertically-integrated (VI) CMOS image sensors that are fabricated by flip-chip bonding. These sensors are composed of a CMOS die and a photodetector die. As a specific example, the paper presents a VI-CMOS image sensor that was designed at the University of Alberta, and fabricated with the help of CMC Microsystems and Micralyne Inc. To realize prototypes, CMOS dies with logarithmic active pixels were prepared in a commercial process, and photodetector dies with metal-semiconductor-metal devices were prepared in a custom process using hydrogenated amorphous silicon. The paper also describes a digital camera that was developed to test the prototype. In this camera, scenes captured by the image sensor are read using an FPGA board, and sent in real time to a PC over USB for data processing and display. Experimental results show that the VI-CMOS prototype has a higher dynamic range and a lower dark limit than conventional electronic image sensors. PMID:22163860

  1. A video-rate range sensor based on depth from defocus

    OpenAIRE

    Ghita, Ovidiu; Whelan, Paul F.

    2001-01-01

    Recovering the depth information derived from dynamic scenes implies real-time range estimation. This paper addresses the implementation of a bifocal range sensor which estimates the depth by measuring the relative blurring between two images captured with different focal settings. To recover the depth accurately even in cases when the scene is textureless, one possible solution is to project a structured light on the scene. As a consequence, in the scene's spectrum a spatial frequency derive...

  2. First Experiences with Kinect v2 Sensor for Close Range 3d Modelling

    Science.gov (United States)

    Lachat, E.; Macher, H.; Mittet, M.-A.; Landes, T.; Grussenmeyer, P.

    2015-02-01

    RGB-D cameras, also known as range imaging cameras, are a recent generation of sensors. As they are suitable for measuring distances to objects at high frame rates, such sensors are increasingly used for 3D acquisition, and more generally for applications in robotics and computer vision. This kind of sensor became popular especially after the Kinect v1 (Microsoft) arrived on the market in November 2010. In July 2014, Microsoft released a new sensor, the Kinect for Windows v2, based on a different technology from its first device. However, due to its initial development for video games, the quality assessment of this new device for 3D modelling represents a major investigation axis. In this paper first experiences with the Kinect v2 sensor are reported, and its suitability for close-range 3D modelling is investigated. For this purpose, error sources on output data as well as a calibration approach are presented.

  3. FIRST EXPERIENCES WITH KINECT V2 SENSOR FOR CLOSE RANGE 3D MODELLING

    Directory of Open Access Journals (Sweden)

    E. Lachat

    2015-02-01

    Full Text Available RGB-D cameras, also known as range imaging cameras, are a recent generation of sensors. As they are suitable for measuring distances to objects at high frame rates, such sensors are increasingly used for 3D acquisition, and more generally for applications in robotics and computer vision. This kind of sensor became popular especially after the Kinect v1 (Microsoft) arrived on the market in November 2010. In July 2014, Microsoft released a new sensor, the Kinect for Windows v2, based on a different technology from its first device. However, due to its initial development for video games, the quality assessment of this new device for 3D modelling represents a major investigation axis. In this paper first experiences with the Kinect v2 sensor are reported, and its suitability for close-range 3D modelling is investigated. For this purpose, error sources on output data as well as a calibration approach are presented.

  4. Characterization of the range effect in synthetic aperture radar images of concrete specimens for width estimation

    Science.gov (United States)

    Alzeyadi, Ahmed; Yu, Tzuyang

    2018-03-01

    Nondestructive evaluation (NDE) is an indispensable approach for the sustainability of critical civil infrastructure systems such as bridges and buildings. Recently, microwave/radar sensors have been widely used for assessing the condition of concrete structures. Among existing imaging techniques for microwave/radar sensors, synthetic aperture radar (SAR) imaging enables researchers to conduct surface and subsurface inspection of concrete structures in the range-cross-range representation of SAR images. The objective of this paper is to investigate the range effect for concrete specimens in SAR images at various ranges (15 cm, 50 cm, 75 cm, 100 cm, and 200 cm). One 30-cm-by-30-cm-by-5-cm concrete panel specimen (water-to-cement ratio = 0.45) was manufactured and scanned by a 10 GHz SAR imaging radar sensor inside an anechoic chamber. Scatterers in the SAR images representing two corners of the concrete panel were used to estimate the width of the panel. It was found that the range-dependent pattern of the corner scatterers can be used to predict the width of concrete panels. Also, the maximum SAR amplitude decreases as the range increases. An empirical model was also proposed for width estimation of concrete panels.
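    The corner-scatterer width estimate can be sketched as locating the two strongest returns in a cross-range cut of the SAR image (a toy illustration with invented amplitudes; the paper's empirical model additionally corrects for range dependence):

```python
def panel_width(cross_range, amplitude, n=2):
    """Width from the two strongest scatterers in a cross-range cut.

    Picks the n largest-amplitude samples (the corner returns) and
    returns the distance between the outermost two."""
    peaks = sorted(sorted(range(len(amplitude)),
                          key=lambda i: -amplitude[i])[:n])
    return abs(cross_range[peaks[-1]] - cross_range[peaks[0]])

# Cross-range positions (m) and made-up amplitudes with two corner returns:
x = [0.00, 0.05, 0.10, 0.15, 0.20, 0.25, 0.30, 0.35, 0.40]
a = [0.1, 0.9, 0.2, 0.1, 0.15, 0.1, 0.2, 0.95, 0.1]
print(round(panel_width(x, a), 2))  # 0.3
```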

  5. CMOS sensors for atmospheric imaging

    Science.gov (United States)

    Pratlong, Jérôme; Burt, David; Jerram, Paul; Mayer, Frédéric; Walker, Andrew; Simpson, Robert; Johnson, Steven; Hubbard, Wendy

    2017-09-01

    Recent European atmospheric imaging missions have seen a move towards the use of CMOS sensors for the visible and NIR parts of the spectrum. These applications have particular challenges that are completely different to those that have driven the development of commercial sensors for applications such as cell-phone or SLR cameras. This paper will cover the design and performance of general-purpose image sensors that are to be used in the MTG (Meteosat Third Generation) and MetImage satellites and the technology challenges that they have presented. We will discuss how CMOS imagers have been designed with 4T pixel sizes of up to 250 μm square achieving good charge transfer efficiency, or low lag, with signal levels up to 2M electrons and with high line rates. In both devices a low noise analogue read-out chain is used with correlated double sampling to suppress the readout noise and give a maximum dynamic range that is significantly larger than in standard commercial devices. Radiation hardness is a particular challenge for CMOS detectors and both of these sensors have been designed to be fully radiation hard with high latch-up and single-event-upset tolerances, which is now silicon proven on MTG. We will also cover the impact of ionising radiation on these devices. Because with such large pixels the photodiodes have a large open area, front illumination technology is sufficient to meet the detection efficiency requirements but with thicker than standard epitaxial silicon to give improved IR response (note that this makes latch up protection even more important). However with narrow band illumination reflections from the front and back of the dielectric stack on the top of the sensor produce Fabry-Perot étalon effects, which have been minimised with process modifications. We will also cover the addition of precision narrow band filters inside the MTG package to provide a complete imaging subsystem. Control of reflected light is also critical in obtaining the

  6. Temperature Sensors Integrated into a CMOS Image Sensor

    NARCIS (Netherlands)

    Abarca Prouza, A.N.; Xie, S.; Markenhof, Jules; Theuwissen, A.J.P.

    2017-01-01

    In this work, a novel approach is presented for measuring relative temperature variations inside the pixel array of a CMOS image sensor itself. This approach can give important information when compensation for dark (current) fixed pattern noise (FPN) is needed. The test image sensor consists of

  7. SENSOR CORRECTION AND RADIOMETRIC CALIBRATION OF A 6-BAND MULTISPECTRAL IMAGING SENSOR FOR UAV REMOTE SENSING

    Directory of Open Access Journals (Sweden)

    J. Kelcey

    2012-07-01

    Full Text Available The increased availability of unmanned aerial vehicles (UAVs) has resulted in their frequent adoption for a growing range of remote sensing tasks, including precision agriculture, vegetation surveying and fine-scale topographic mapping. The development and utilisation of UAV platforms requires broad technical skills covering the three major facets of remote sensing: data acquisition, data post-processing, and image analysis. In this study, UAV image data acquired by a miniature 6-band multispectral imaging sensor were corrected and calibrated using practical image-based data post-processing techniques. Data correction techniques included dark offset subtraction to reduce sensor noise, flat-field derived per-pixel look-up tables to correct vignetting, and implementation of the Brown-Conrady model to correct lens distortion. Radiometric calibration was conducted with an image-based empirical line model using pseudo-invariant features (PIFs). Sensor corrections and radiometric calibration improve the quality of the data, aiding quantitative analysis and generating consistency with other calibrated datasets.
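    The per-pixel correction and calibration chain can be sketched as follows (all numeric values are hypothetical; the paper's actual dark offsets, flat-field gains and empirical-line coefficients are derived from its imagery):

```python
def correct_pixel(raw, dark, flat_gain):
    """Dark-offset subtraction followed by a flat-field (vignetting) gain."""
    return (raw - dark) * flat_gain

def empirical_line(dn, slope, offset):
    """Empirical line model: corrected digital number -> surface reflectance,
    with slope/offset fitted to pseudo-invariant features (PIFs)."""
    return slope * dn + offset

# Hypothetical numbers: dark offset of 64 DN, corner pixel vignetted to 80%:
corrected = correct_pixel(raw=564, dark=64, flat_gain=1.0 / 0.8)
reflect = empirical_line(corrected, slope=0.0004, offset=0.01)
print(corrected, round(reflect, 3))  # 625.0 0.26
```

    Lens-distortion correction (the Brown-Conrady step) acts on pixel coordinates rather than values and is omitted here.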

  8. Event-Based Color Segmentation With a High Dynamic Range Sensor

    Directory of Open Access Journals (Sweden)

    Alexandre Marcireau

    2018-04-01

    Full Text Available This paper introduces a color asynchronous neuromorphic event-based camera and a methodology for processing the color output of the device to perform color segmentation and tracking at the native temporal resolution of the sensor (down to one microsecond). Our color vision sensor prototype is a combination of three Asynchronous Time-based Image Sensors, sensitive to absolute color information. We devise a color processing algorithm leveraging this information. It is designed to be computationally cheap, showing how low-level processing benefits from asynchronous acquisition and high-temporal-resolution data. The resulting color segmentation and tracking performance is assessed both with an indoor controlled scene and with two outdoor uncontrolled scenes. The tracker's mean error relative to the ground truth for objects in the outdoor scenes ranges from two to twenty pixels.

  9. Parametric Optimization of Lateral NIPIN Phototransistors for Flexible Image Sensors

    Directory of Open Access Journals (Sweden)

    Min Seok Kim

    2017-08-01

    Full Text Available Curved image sensors, a key component in bio-inspired imaging systems, have been widely studied because they can improve an imaging system in various respects, such as low optical aberrations, small form factor, and simple optics configuration. Many methods and materials for realizing a curvilinear imager have been proposed to address the drawbacks of conventional imaging/optical systems. However, there have been few theoretical studies, in terms of electronics, on the use of a lateral photodetector as a flexible image sensor. In this paper, we demonstrate the applicability of a Si-based lateral phototransistor as the pixel of a high-efficiency curved photodetector by conducting various electrical simulations with technology computer-aided design (TCAD). The single phototransistor is analyzed with different device parameters: the thickness of the active cell, doping concentration, and structure geometry. This work presents a method to improve the external quantum efficiency (EQE), linear dynamic range (LDR), and mechanical stability of the phototransistor. We also evaluated the dark current in a matrix of phototransistors to estimate the feasibility of the device as a flexible image sensor. Moreover, we fabricated and demonstrated an array of phototransistors based on our study. The theoretical study and design guidelines for the lateral phototransistor create new opportunities in flexible image sensors.

  10. Progress in sensor performance testing, modeling and range prediction using the TOD method: an overview

    Science.gov (United States)

    Bijl, Piet; Hogervorst, Maarten A.; Toet, Alexander

    2017-05-01

    The Triangle Orientation Discrimination (TOD) methodology includes i) a widely applicable, accurate end-to-end EO/IR sensor test, ii) an image-based sensor system model and iii) a Target Acquisition (TA) range model. The method has been extensively validated against TA field performance for a wide variety of well- and under-sampled imagers, systems with advanced image processing techniques such as dynamic super resolution and local adaptive contrast enhancement, and sensors showing smear or noise drift, for both static and dynamic test stimuli and as a function of target contrast. Recently, significant progress has been made in various directions. Dedicated visual and NIR test charts for lab and field testing are available and thermal test benches are on the market. Automated sensor testing using an objective synthetic human observer is within reach. Both an analytical and an image-based TOD model have recently been developed and are being implemented in the European Target Acquisition model ECOMOS and in the EOSTAR TDA. Further, the methodology is being applied for design optimization of high-end security camera systems. Finally, results from a recent perception study suggest that DRI ranges for real targets can be predicted by replacing the relevant distinctive target features by TOD test patterns of the same characteristic size and contrast, enabling a new TA modeling approach. This paper provides an overview.

  11. Automatic Generation of Wide Dynamic Range Image without Pseudo-Edge Using Integration of Multi-Steps Exposure Images

    Science.gov (United States)

    Migiyama, Go; Sugimura, Atsuhiko; Osa, Atsushi; Miike, Hidetoshi

    Digital cameras have advanced rapidly in recent years. However, the photographed image still differs from the sight image perceived when the same scenery is seen with the naked eye: when a scene of wide dynamic range is photographed, the image suffers from blown-out highlights and crushed blacks, problems that hardly arise in direct viewing. These artifacts are a major cause of the difference between the shot image and the sight image, and they stem from the gap in dynamic range between the human visual system and the image sensor installed in a digital camera, such as a CCD or CMOS device; the dynamic range of the shot image is far narrower than that of the sight image. To solve this problem, we propose an automatic method that decides an effective exposure range from the superposition of edges. We integrate multi-step exposure images using this method, and we additionally suppress pseudo-edges by a process that blends exposure values. As a result, a pseudo wide dynamic range image is obtained automatically.
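
    The integration step can be illustrated with a generic weighted multi-exposure fusion, in which mid-tone pixels are trusted and clipped pixels are discounted. This is a common baseline sketched for illustration, not the authors' edge-superposition method; the function name and the hat-shaped weight are assumptions:

```python
import numpy as np

def fuse_exposures(images, exposure_times):
    """Merge multiple exposures of a static scene into one radiance map.

    images: list of float arrays scaled to [0, 1], one per exposure.
    exposure_times: relative exposure time of each image.
    A hat-shaped weight (0 at black/white clipping, 1 at mid-gray)
    discounts blown-out highlights and crushed blacks per exposure.
    """
    num = np.zeros_like(images[0], dtype=float)
    den = np.zeros_like(images[0], dtype=float)
    for img, t in zip(images, exposure_times):
        w = 1.0 - np.abs(2.0 * img - 1.0)  # trust mid-tones most
        num += w * img / t                 # per-exposure radiance estimate
        den += w
    return num / np.maximum(den, 1e-8)     # weighted average radiance
```

    Pixels clipped in one exposure receive zero weight there, so their radiance is recovered from the other exposures.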

  12. Multiple-Event, Single-Photon Counting Imaging Sensor

    Science.gov (United States)

    Zheng, Xinyu; Cunningham, Thomas J.; Sun, Chao; Wang, Kang L.

    2011-01-01

    The single-photon counting imaging sensor is typically an array of silicon Geiger-mode avalanche photodiodes that are monolithically integrated with CMOS (complementary metal oxide semiconductor) readout, signal processing, and addressing circuits located in each pixel and the peripheral area of the chip. The major problem is its single-event method for photon count registration. A single-event single-photon counting imaging array only allows registration of at most one photon count in each of its pixels during a frame time, i.e., the interval between two successive pixel reset operations. Since the frame time cannot be made arbitrarily short, this leads to very low dynamic range and makes the sensor useful only in very-low-flux environments. The second problem of the prior technique is a limited fill factor resulting from consumption of chip area by the monolithically integrated CMOS readout in pixels. The resulting low photon collection efficiency substantially reduces the benefit gained from the very sensitive single-photon counting detection. The single-photon counting imaging sensor developed in this work has a novel multiple-event architecture, which allows each of its pixels to register one million or more photon-counting events during a frame time. Because of the consequently boosted dynamic range, the imaging array of the invention is capable of performing single-photon counting from ultra-low-light through high-flux environments. On the other hand, since the multiple-event architecture is implemented in a hybrid structure, back-illumination and a close-to-unity fill factor can be realized, and maximized quantum efficiency can also be achieved in the detector array.
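
    The dynamic-range argument is simple arithmetic: a pixel that can register N counts per frame spans 20·log10(N) dB between its smallest and largest per-frame signal (the usual image-sensor convention). A sketch with illustrative names:

```python
import math

def dynamic_range_db(max_counts_per_frame, min_detectable_counts=1):
    """Per-frame dynamic range of a photon-counting pixel, in dB."""
    return 20.0 * math.log10(max_counts_per_frame / min_detectable_counts)

# A single-event pixel saturates at 1 count per frame (0 dB per frame),
# while a multiple-event pixel with a 10^6-deep counter spans 120 dB.
```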

  13. Three dimensional multi perspective imaging with randomly distributed sensors

    International Nuclear Information System (INIS)

    DaneshPanah, Mehdi; Javidi, Bahram

    2008-01-01

    In this paper, we review a three dimensional (3D) passive imaging system that exploits the visual information captured from the scene from multiple perspectives to reconstruct the scene voxel by voxel in 3D space. The primary contribution of this work is to provide a computational reconstruction scheme based on randomly distributed sensor locations in space. In virtually all multi-perspective techniques (e.g. integral imaging, synthetic aperture integral imaging, etc.), there is an implicit assumption that the sensors lie on a simple, regular pickup grid. Here, we relax this assumption and suggest a computational reconstruction framework that unifies the available methods as its special cases. The importance of this work is that it enables three dimensional imaging technology to be implemented in a multitude of novel application domains, such as 3D aerial imaging, collaborative imaging and long range 3D imaging, where sustaining a regular pickup grid is not possible and/or the parallax requirements call for an irregular or sparse synthetic aperture mode. Although the sensors can be distributed in any random arrangement, we assume that the pickup position is measured at the time of capture of each elemental image. We demonstrate the feasibility of the methods proposed here by experimental results.
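
    The reconstruction concept for arbitrarily placed cameras can be sketched in one dimension with a pinhole model: each elemental image is sampled where a candidate scene point would project, given the measured pickup position, and the samples are averaged. All parameters and names here are illustrative assumptions, not the authors' formulation:

```python
import numpy as np

def reconstruct_plane(elemental, cam_x, focal, depth, pixel_pitch, x_grid):
    """Shift-and-sum reconstruction of the scene plane at `depth`.

    elemental: list of 1-D intensity profiles, one per pinhole camera.
    cam_x: measured lateral position of each (randomly placed) camera;
    positions need not lie on a regular pickup grid. A scene point at
    lateral position X maps to image-plane coordinate
    u = focal * (X - cam_x[i]) / depth in camera i, so sampling every
    elemental image there and averaging brings that plane into focus.
    """
    recon = np.zeros_like(x_grid, dtype=float)
    n = len(elemental[0])
    for img, cx in zip(elemental, cam_x):
        u = focal * (x_grid - cx) / depth             # image-plane coord
        px = np.clip(np.round(u / pixel_pitch).astype(int) + n // 2,
                     0, n - 1)                        # nearest pixel
        recon += img[px]
    return recon / len(elemental)
```

    At a mismatched depth the per-camera samples decorrelate, so out-of-plane content blurs out while the chosen plane stays sharp.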

  14. Light-Addressable Potentiometric Sensors for Quantitative Spatial Imaging of Chemical Species.

    Science.gov (United States)

    Yoshinobu, Tatsuo; Miyamoto, Ko-Ichiro; Werner, Carl Frederik; Poghossian, Arshak; Wagner, Torsten; Schöning, Michael J

    2017-06-12

    A light-addressable potentiometric sensor (LAPS) is a semiconductor-based chemical sensor, in which a measurement site on the sensing surface is defined by illumination. This light addressability can be applied to visualize the spatial distribution of pH or the concentration of a specific chemical species, with potential applications in the fields of chemistry, materials science, biology, and medicine. In this review, the features of this chemical imaging sensor technology are compared with those of other technologies. Instrumentation, principles of operation, and various measurement modes of chemical imaging sensor systems are described. The review discusses and summarizes state-of-the-art technologies, especially with regard to the spatial resolution and measurement speed; for example, a high spatial resolution in a submicron range and a readout speed in the range of several tens of thousands of pixels per second have been achieved with the LAPS. The possibility of combining this technology with microfluidic devices and other potential future developments are discussed.

  15. A Biologically Inspired CMOS Image Sensor

    CERN Document Server

    Sarkar, Mukul

    2013-01-01

    Biological systems are a source of inspiration in the development of small autonomous sensor nodes. The two major types of optical vision systems found in nature are the single-aperture human eye and the compound eye of insects. The latter are among the most compact and smallest vision sensors. The compound eye consists of individual lenses, each with its own photoreceptor array.  The visual system of insects allows them to fly with limited intelligence and brain processing power. A CMOS image sensor replicating the perception of vision in insects is discussed and designed in this book for industrial (machine vision) and medical applications. The CMOS metal layer is used to create an embedded micro-polarizer able to sense polarization information. This polarization information is shown to be useful in applications like real-time material classification and autonomous agent navigation. Further, the sensor is equipped with in-pixel analog and digital memories which allow variation of the dynamic range and in-pixel b...

  16. Dense range map reconstruction from a versatile robotic sensor system with an active trinocular vision and a passive binocular vision.

    Science.gov (United States)

    Kim, Min Young; Lee, Hyunkee; Cho, Hyungsuck

    2008-04-10

    One major research issue associated with 3D perception by robotic systems is the creation of efficient sensor systems that can generate dense range maps reliably. A visual sensor system for robotic applications is developed that is inherently equipped with two types of sensor, an active trinocular vision and a passive stereo vision. Unlike in conventional active vision systems that use a large number of images with variations of projected patterns for dense range map acquisition or from conventional passive vision systems that work well on specific environments with sufficient feature information, a cooperative bidirectional sensor fusion method for this visual sensor system enables us to acquire a reliable dense range map using active and passive information simultaneously. The fusion algorithms are composed of two parts, one in which the passive stereo vision helps active vision and the other in which the active trinocular vision helps the passive one. The first part matches the laser patterns in stereo laser images with the help of intensity images; the second part utilizes an information fusion technique using the dynamic programming method in which image regions between laser patterns are matched pixel-by-pixel with help of the fusion results obtained in the first part. To determine how the proposed sensor system and fusion algorithms can work in real applications, the sensor system is implemented on a robotic system, and the proposed algorithms are applied. A series of experimental tests is performed for a variety of configurations of robot and environments. The performance of the sensor system is discussed in detail.

  17. Automated Registration Of Images From Multiple Sensors

    Science.gov (United States)

    Rignot, Eric J. M.; Kwok, Ronald; Curlander, John C.; Pang, Shirley S. N.

    1994-01-01

    Images of terrain scanned in common by multiple Earth-orbiting remote sensors registered automatically with each other and, where possible, on geographic coordinate grid. Simulated image of terrain viewed by sensor computed from ancillary data, viewing geometry, and mathematical model of physics of imaging. In proposed registration algorithm, simulated and actual sensor images matched by area-correlation technique.

  18. X-ray imaging characterization of active edge silicon pixel sensors

    International Nuclear Information System (INIS)

    Ponchut, C; Ruat, M; Kalliopuska, J

    2014-01-01

    The aim of this work was the experimental characterization of edge effects in active-edge silicon pixel sensors, in the framework of X-ray pixel detector developments for synchrotron experiments. We produced a set of active edge pixel sensors with 300 to 500 μm thickness, edge widths ranging from 100 μm to 150 μm, and n or p pixel contact types. The sensors, with 256 × 256 pixels and 55 × 55 μm² pixel pitch, were then bump-bonded to Timepix readout chips for X-ray imaging measurements. The reduced edge widths make the edge pixels more sensitive to the electrical field distribution at the sensor boundaries. We characterized this effect by mapping the spatial response of the sensor edges with a finely focused X-ray synchrotron beam. One of the samples showed a distortion-free response on all four edges, whereas others showed variable degrees of distortion extending at most 300 μm from the sensor edge. An application of active edge pixel sensors to coherent diffraction imaging with synchrotron beams is described

  19. Image-based occupancy sensor

    Science.gov (United States)

    Polese, Luigi Gentile; Brackney, Larry

    2015-05-19

    An image-based occupancy sensor includes a motion detection module that receives and processes an image signal to generate a motion detection signal, a people detection module that receives the image signal and processes the image signal to generate a people detection signal, a face detection module that receives the image signal and processes the image signal to generate a face detection signal, and a sensor integration module that receives the motion detection signal from the motion detection module, receives the people detection signal from the people detection module, receives the face detection signal from the face detection module, and generates an occupancy signal using the motion detection signal, the people detection signal, and the face detection signal, with the occupancy signal indicating vacancy or occupancy, with an occupancy indication specifying that one or more people are detected within the monitored volume.
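
    The patent text does not spell out the rule the sensor integration module uses to combine the three detection signals; a simple configurable vote is one plausible policy, sketched here with hypothetical names:

```python
def integrate_occupancy(motion, people, face, threshold=1):
    """Combine motion, people, and face detection into one occupancy signal.

    motion, people, face: booleans from the respective detection modules.
    The fusion rule is an assumption: any `threshold` of the three
    detectors firing is taken as occupancy.
    """
    votes = int(motion) + int(people) + int(face)
    return "occupied" if votes >= threshold else "vacant"
```

    Raising the threshold trades sensitivity for robustness against a single false-firing detector.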

  20. Active Sensor for Microwave Tissue Imaging with Bias-Switched Arrays.

    Science.gov (United States)

    Foroutan, Farzad; Nikolova, Natalia K

    2018-05-06

    A prototype of a bias-switched active sensor was developed and measured to establish the achievable dynamic range in a new generation of active arrays for microwave tissue imaging. The sensor integrates a printed slot antenna, a low-noise amplifier (LNA) and an active mixer in a single unit, which is sufficiently small to enable inter-sensor separation distance as small as 12 mm. The sensor’s input covers the bandwidth from 3 GHz to 7.5 GHz. Its output intermediate frequency (IF) is 30 MHz. The sensor is controlled by a simple bias-switching circuit, which switches ON and OFF the bias of the LNA and the mixer simultaneously. It was demonstrated experimentally that the dynamic range of the sensor, as determined by its ON and OFF states, is 109 dB and 118 dB at resolution bandwidths of 1 kHz and 100 Hz, respectively.

  1. CMOS foveal image sensor chip

    Science.gov (United States)

    Bandera, Cesar (Inventor); Scott, Peter (Inventor); Sridhar, Ramalingam (Inventor); Xia, Shu (Inventor)

    2002-01-01

    A foveal image sensor integrated circuit comprising a plurality of CMOS active pixel sensors arranged both within and about a central fovea region of the chip. The pixels in the central fovea region have a smaller size than the pixels arranged in peripheral rings about the central region. A new photocharge normalization scheme and associated circuitry normalizes the output signals from the different size pixels in the array. The pixels are assembled into a multi-resolution rectilinear foveal image sensor chip using a novel access scheme to reduce the number of analog RAM cells needed. Localized spatial resolution declines monotonically with offset from the imager's optical axis, analogous to biological foveal vision.

  2. CMOS image sensor-based implantable glucose sensor using glucose-responsive fluorescent hydrogel.

    Science.gov (United States)

    Tokuda, Takashi; Takahashi, Masayuki; Uejima, Kazuhiro; Masuda, Keita; Kawamura, Toshikazu; Ohta, Yasumi; Motoyama, Mayumi; Noda, Toshihiko; Sasagawa, Kiyotaka; Okitsu, Teru; Takeuchi, Shoji; Ohta, Jun

    2014-11-01

    A CMOS image sensor-based implantable glucose sensor based on an optical-sensing scheme is proposed and experimentally verified. A glucose-responsive fluorescent hydrogel is used as the mediator in the measurement scheme. The wired implantable glucose sensor was realized by integrating a CMOS image sensor, hydrogel, UV light emitting diodes, and an optical filter on a flexible polyimide substrate. Feasibility of the glucose sensor was verified by both in vitro and in vivo experiments.

  3. Range-Image Acquisition for Discriminated Objects in a Range-gated Robot Vision System

    Energy Technology Data Exchange (ETDEWEB)

    Park, Seung-Kyu; Ahn, Yong-Jin; Park, Nak-Kyu; Baik, Sung-Hoon; Choi, Young-Soo; Jeong, Kyung-Min [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    The imaging capability of a surveillance vision system in harsh low-visibility environments, such as fire and detonation areas, is a key function for monitoring the safety of facilities. 2D and range image data acquired from a low-visibility environment are important data for assessing safety and preparing appropriate countermeasures. Passive vision systems, such as conventional cameras and binocular stereo vision systems, usually cannot acquire image information when the reflected light is highly scattered and absorbed by airborne particles such as fog. In addition, the resolution of images captured through low-density airborne particles is decreased because the image is blurred and dimmed by scattering, emission and absorption. Active vision systems, such as structured light vision and projected stereo vision, are usually more robust in harsh environments than passive vision systems. However, their performance decreases considerably in proportion to the density of the particles. The RGI system provides 2D and range image data from several RGI images, and it moreover provides clear images in low-visibility fog and smoke environments by using the sum of time-sliced images. Nowadays, Range-gated (RG) imaging is an emerging technology in the field of surveillance for security applications, especially in the visualization of night and fog environments. Although RGI viewing was first demonstrated in the 1960s, this technology is nowadays becoming more applicable by virtue of the rapid development of optical and sensor technologies. In particular, this system can be adopted in robot-vision systems by virtue of its compact, portable configuration. In contrast to passive vision systems, this technology enables operation even in harsh environments like fog and smoke. During the past decades, several applications of this technology have been demonstrated, such as target recognition and operation in harsh environments such as fog, and underwater vision. Also, this technology has been
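
    The timing behind range gating is round-trip light-travel arithmetic: the camera shutter opens 2R/c after the laser pulse to see a target at range R, and stays open 2·ΔR/c to image a depth slice of thickness ΔR. A sketch (function names are illustrative):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def gate_delay_ns(target_range_m):
    """Delay between laser pulse and shutter opening, in nanoseconds."""
    return 2.0 * target_range_m / C * 1e9

def gate_width_ns(slice_depth_m):
    """Shutter-open time needed to image a depth slice, in nanoseconds."""
    return 2.0 * slice_depth_m / C * 1e9
```

    For example, a target at about 150 m requires a delay of roughly 1 μs, which is why fast optical shutters and intensified sensors are needed.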

  4. Range-Image Acquisition for Discriminated Objects in a Range-gated Robot Vision System

    International Nuclear Information System (INIS)

    Park, Seung-Kyu; Ahn, Yong-Jin; Park, Nak-Kyu; Baik, Sung-Hoon; Choi, Young-Soo; Jeong, Kyung-Min

    2015-01-01

    The imaging capability of a surveillance vision system in harsh low-visibility environments, such as fire and detonation areas, is a key function for monitoring the safety of facilities. 2D and range image data acquired from a low-visibility environment are important data for assessing safety and preparing appropriate countermeasures. Passive vision systems, such as conventional cameras and binocular stereo vision systems, usually cannot acquire image information when the reflected light is highly scattered and absorbed by airborne particles such as fog. In addition, the resolution of images captured through low-density airborne particles is decreased because the image is blurred and dimmed by scattering, emission and absorption. Active vision systems, such as structured light vision and projected stereo vision, are usually more robust in harsh environments than passive vision systems. However, their performance decreases considerably in proportion to the density of the particles. The RGI system provides 2D and range image data from several RGI images, and it moreover provides clear images in low-visibility fog and smoke environments by using the sum of time-sliced images. Nowadays, Range-gated (RG) imaging is an emerging technology in the field of surveillance for security applications, especially in the visualization of night and fog environments. Although RGI viewing was first demonstrated in the 1960s, this technology is nowadays becoming more applicable by virtue of the rapid development of optical and sensor technologies. In particular, this system can be adopted in robot-vision systems by virtue of its compact, portable configuration. In contrast to passive vision systems, this technology enables operation even in harsh environments like fog and smoke. During the past decades, several applications of this technology have been demonstrated, such as target recognition and operation in harsh environments such as fog, and underwater vision. Also, this technology has been

  5. A protein-dye hybrid system as a narrow range tunable intracellular pH sensor.

    Science.gov (United States)

    Anees, Palapuravan; Sudheesh, Karivachery V; Jayamurthy, Purushothaman; Chandrika, Arunkumar R; Omkumar, Ramakrishnapillai V; Ajayaghosh, Ayyappanpillai

    2016-11-18

    Accurate monitoring of pH variations inside cells is important for the early diagnosis of diseases such as cancer. Even though a variety of different pH sensors are available, construction of a custom-made sensor array for measuring minute variations in a narrow biological pH window, using easily available constituents, is a challenge. Here we report two-component hybrid sensors derived from a protein and organic dye nanoparticles whose sensitivity range can be tuned by choosing different ratios of the components, to monitor the minute pH variations in a given system. The dye interacts noncovalently with the protein at lower pH and covalently at higher pH, triggering two distinguishable fluorescent signals at 700 and 480 nm, respectively. The pH sensitivity region of the probe can be tuned for every unit of the pH window resulting in custom-made pH sensors. These narrow range tunable pH sensors have been used to monitor pH variations in HeLa cells using the fluorescence imaging technique.

  6. Research on range-gated laser active imaging seeker

    Science.gov (United States)

    You, Mu; Wang, PengHui; Tan, DongJie

    2013-09-01

    Compared with other imaging methods such as millimeter wave imaging, infrared imaging and visible light imaging, laser imaging provides both a 2-D array of reflected intensity data and a 2-D array of range data, which is the most important data for use in autonomous target acquisition. In terms of application, it can be widely used in military fields such as radar, guidance and fuzing. In this paper, we present a laser active imaging seeker system based on range-gated laser transmitter and sensor technology. The seeker system presented here consists of two important parts. One is the laser imaging system, which uses a negative lens to diverge the light from a pulsed laser to flood-illuminate a target; return light is collected by a camera lens, and each laser pulse triggers the camera delay and shutter. The other is the stabilization gimbal, which is designed as a structure rotatable in both azimuth and elevation. The laser imaging system consists of a transmitter and a receiver. The transmitter is based on diode-pumped solid-state lasers that are passively Q-switched at 532 nm wavelength. A visible wavelength was chosen because the receiver uses a Gen III image intensifier tube with a spectral sensitivity limited to wavelengths less than 900 nm. The receiver couples the image intensifier tube's microchannel plate into a high-sensitivity charge-coupled device camera. Images have been taken at ranges over one kilometer, and much longer ranges are possible in better weather. The image frame frequency can be changed according to the requirements of guidance, with a modifiable range gate. The instantaneous field of view of the system was found to be 2 × 2 deg. Since completion of system integration, the seeker system has gone through a series of tests both in the lab and in the outdoor field.
    Two different kinds of buildings have been chosen as targets, located at ranges from 200 m up to 1000 m. To simulate the dynamic process of range change between missile and target, the seeker system has

  7. A 128 x 128 CMOS Active Pixel Image Sensor for Highly Integrated Imaging Systems

    Science.gov (United States)

    Mendis, Sunetra K.; Kemeny, Sabrina E.; Fossum, Eric R.

    1993-01-01

    A new CMOS-based image sensor that is intrinsically compatible with on-chip CMOS circuitry is reported. The new CMOS active pixel image sensor achieves low noise, high sensitivity, X-Y addressability, and has simple timing requirements. The image sensor was fabricated using a 2 micrometer p-well CMOS process, and consists of a 128 x 128 array of 40 micrometer x 40 micrometer pixels. The CMOS image sensor technology enables highly integrated smart image sensors, and makes the design, incorporation and fabrication of such sensors widely accessible to the integrated circuit community.

  8. CMOS Image Sensors: Electronic Camera On A Chip

    Science.gov (United States)

    Fossum, E. R.

    1995-01-01

    Recent advancements in CMOS image sensor technology are reviewed, including both passive pixel sensors and active pixel sensors. On-chip analog to digital converters and on-chip timing and control circuits permit realization of an electronic camera-on-a-chip. Highly miniaturized imaging systems based on CMOS image sensor technology are emerging as a competitor to charge-coupled devices for low cost uses.

  9. Image Sensor

    OpenAIRE

    Jerram, Paul; Stefanov, Konstantin

    2017-01-01

    An image sensor of the type for providing charge multiplication by impact ionisation has plurality of multiplication elements. Each element is arranged to receive charge from photosensitive elements of an image area and each element comprises a sequence of electrodes to move charge along a transport path. Each of the electrodes has an edge defining a boundary with a first electrode, a maximum width across the charge transport path and a leading edge that defines a boundary with a second elect...

  10. Range-Based Localization in Mobile Sensor Networks

    NARCIS (Netherlands)

    Dil, B.J.; Dil, B.; Dulman, S.O.; Havinga, Paul J.M.; Romer, K.; Karl, H.; Mattern, F.

    2006-01-01

    Localization schemes for wireless sensor networks can be classified as range-based or range-free. They differ in the information used for localization. Range-based methods use range measurements, while range-free techniques only use the content of the messages. None of the existing algorithms
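
    A canonical range-based step is trilateration from known anchor positions; the sketch below linearizes the range equations and solves them by least squares. The names and setup are illustrative, not taken from the paper:

```python
import numpy as np

def trilaterate(anchors, ranges):
    """Least-squares position estimate from ranges to known anchors.

    anchors: (n, 2) array of known anchor positions (n >= 3).
    ranges: measured distance from the mobile node to each anchor.
    Subtracting the first anchor's range equation from the others
    removes the quadratic term in the unknown position, leaving a
    linear system A x = b.
    """
    anchors = np.asarray(anchors, dtype=float)
    r = np.asarray(ranges, dtype=float)
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (r[0] ** 2 - r[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1)
         - np.sum(anchors[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos
```

    With noisy ranges the least-squares solution averages out measurement error across anchors.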

  11. Priority image transmission in wireless sensor networks

    International Nuclear Information System (INIS)

    Nasri, M.; Helali, A.; Sghaier, H.; Maaref, H.

    2011-01-01

    The emerging technology of recent years has allowed the development of new sensors equipped with wireless communication, which can be organized into a cooperative autonomous network. Some application areas for wireless sensor networks (WSNs) are home automation, health care services, the military domain, and environment monitoring. These nodes are constrained by limited processing capacity, limited storage capability, and, above all, limited energy. In addition, such nodes are tiny and battery-powered, so their lifetime is very limited. During image processing and transmission to the destination, the lifetime of the sensor network decreases quickly due to battery and processing power constraints. Therefore, digital image transmission is a significant challenge for image-sensor-based Wireless Sensor Networks (WSNs). Based on wavelet image compression, we propose a novel, robust and energy-efficient scheme, called Priority Image Transmission (PIT), in WSNs, providing various priority levels during image transmission. Different priorities in the compressed image are considered. The information for the significant wavelet coefficients is transmitted with higher quality assurance, whereas relatively less important coefficients are transmitted with lower overhead. Simulation results show that the proposed scheme prolongs the system lifetime and achieves higher energy efficiency in WSNs with an acceptable compromise on image quality.
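
    The priority idea can be illustrated with a single-level Haar transform: the low-frequency approximation subband concentrates most of the image energy, so it is queued at high priority while the detail subbands follow at low priority. The paper's actual wavelet codec and priority scheme are not reproduced here; this is an illustrative sketch:

```python
import numpy as np

def haar_level(img):
    """One level of a 2-D Haar transform: approximation + three details."""
    a  = (img[0::2, :] + img[1::2, :]) / 2.0   # vertical average
    d  = (img[0::2, :] - img[1::2, :]) / 2.0   # vertical difference
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0       # approximation subband
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0       # horizontal detail
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0       # vertical detail
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0       # diagonal detail
    return ll, (lh, hl, hh)

def prioritize(img):
    """Queue subbands for transmission: approximation first, details after."""
    ll, details = haar_level(img)
    return [("high", ll)] + [("low", band) for band in details]
```

    In a lossy network, delivering the high-priority packet reliably guarantees a coarse but complete image even if low-priority detail packets are dropped.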

  12. Beam imaging sensor and method for using same

    Energy Technology Data Exchange (ETDEWEB)

    McAninch, Michael D.; Root, Jeffrey J.

    2017-01-03

    The present invention relates generally to the field of sensors for beam imaging and, in particular, to a new and useful beam imaging sensor for use in determining, for example, the power density distribution of a beam including, but not limited to, an electron beam or an ion beam. In one embodiment, the beam imaging sensor of the present invention comprises, among other items, a circumferential slit that is either circular, elliptical or polygonal in nature. In another embodiment, the beam imaging sensor of the present invention comprises, among other things, a discontinuous partially circumferential slit. Also disclosed is a method for using the various beam sensor embodiments of the present invention.

  13. Image-based environmental monitoring sensor application using an embedded wireless sensor network.

    Science.gov (United States)

    Paek, Jeongyeup; Hicks, John; Coe, Sharon; Govindan, Ramesh

    2014-08-28

    This article discusses the experiences from the development and deployment of two image-based environmental monitoring sensor applications using an embedded wireless sensor network. Our system uses low-power image sensors and the Tenet general purpose sensing system for tiered embedded wireless sensor networks. It leverages Tenet's built-in support for reliable delivery of high rate sensing data, scalability and its flexible scripting language, which enables mote-side image compression and the ease of deployment. Our first deployment of a pitfall trap monitoring application at the James San Jacinto Mountain Reserve provided us with insights and lessons learned into the deployment of and compression schemes for these embedded wireless imaging systems. Our three month-long deployment of a bird nest monitoring application resulted in over 100,000 images collected from a 19-camera node network deployed over an area of 0.05 square miles, despite highly variable environmental conditions. Our biologists found the on-line, near-real-time access to images to be useful for obtaining data on answering their biological questions.

  14. Image-Based Environmental Monitoring Sensor Application Using an Embedded Wireless Sensor Network

    Directory of Open Access Journals (Sweden)

    Jeongyeup Paek

    2014-08-01

    Full Text Available This article discusses the experiences from the development and deployment of two image-based environmental monitoring sensor applications using an embedded wireless sensor network. Our system uses low-power image sensors and the Tenet general purpose sensing system for tiered embedded wireless sensor networks. It leverages Tenet’s built-in support for reliable delivery of high rate sensing data, scalability and its flexible scripting language, which enables mote-side image compression and the ease of deployment. Our first deployment of a pitfall trap monitoring application at the James San Jacinto Mountain Reserve provided us with insights and lessons learned into the deployment of and compression schemes for these embedded wireless imaging systems. Our three month-long deployment of a bird nest monitoring application resulted in over 100,000 images collected from a 19-camera node network deployed over an area of 0.05 square miles, despite highly variable environmental conditions. Our biologists found the on-line, near-real-time access to images to be useful for obtaining data on answering their biological questions.

  15. SNAPSHOT SPECTRAL AND COLOR IMAGING USING A REGULAR DIGITAL CAMERA WITH A MONOCHROMATIC IMAGE SENSOR

    Directory of Open Access Journals (Sweden)

    J. Hauser

    2017-10-01

Full Text Available Spectral imaging (SI) refers to the acquisition of the three-dimensional (3D) spectral cube of spatial and spectral data of a source object at a limited number of wavelengths in a given wavelength range. Snapshot spectral imaging (SSI) refers to the instantaneous acquisition (in a single shot) of the spectral cube, a process suitable for fast changing objects. Known SSI devices exhibit large total track length (TTL), weight and production costs, and relatively low optical throughput. We present a simple SSI camera based on a regular digital camera with (i) an added diffusing and dispersing phase-only static optical element at the entrance pupil (diffuser) and (ii) tailored compressed sensing (CS) methods for digital processing of the diffused and dispersed (DD) image recorded on the image sensor. The diffuser is designed to mix the spectral cube data spectrally and spatially and thus to enable convergence in its reconstruction by CS-based algorithms. In addition to performing SSI, this camera is capable of performing color imaging using a monochromatic or gray-scale image sensor without color filter arrays.

  16. Multi-image acquisition-based distance sensor using agile laser spot beam.

    Science.gov (United States)

    Riza, Nabeel A; Amin, M Junaid

    2014-09-01

We present a novel laser-based distance measurement technique that uses multiple-image-based spatial processing to enable distance measurements. Compared with the first-generation distance sensor using spatial processing, the modified sensor is no longer hindered by the classic Rayleigh axial resolution limit for the propagating laser beam at its minimum beam waist location. The proposed high-resolution distance sensor design uses an electronically controlled variable focus lens (ECVFL) in combination with an optical imaging device, such as a charge-coupled device (CCD), to produce and capture different laser spot size images on a target, with these beam spot sizes different from the minimal spot size possible at this target distance. By exploiting the unique relationship of the target-located spot sizes with the varying ECVFL focal length for each target distance, the proposed distance sensor can compute the target distance with a distance measurement resolution better than the axial resolution via the Rayleigh resolution criterion. Using a 30 mW 633 nm He-Ne laser coupled with an electromagnetically actuated liquid ECVFL, along with a 20 cm focal length bias lens, and using five spot images captured per target position by a CCD-based Nikon camera, a proof-of-concept distance sensor is successfully implemented in the laboratory over target ranges from 10 to 100 cm with a demonstrated sub-cm axial resolution, which is better than the axial Rayleigh resolution limit at these target distances. Applications for the proposed potentially cost-effective distance sensor are diverse and include industrial inspection and measurement and 3D object shape mapping and imaging.
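The spot-size-to-distance idea can be illustrated with a toy model. The geometric thin-lens approximation below (a collimated beam of radius w_in focused by a lens of focal length f has radius w_in*|1 - z/f| at distance z) is purely an illustrative assumption, not the paper's actual beam model, which must account for diffraction and the bias lens; the distance is taken as the candidate that best explains the spot radii measured at several ECVFL focal lengths.

```python
import numpy as np

def estimate_distance(focal_lengths, spot_radii, w_in, z_candidates):
    """Sketch: pick the target distance z that best explains the spot
    radii measured for several lens focal lengths, under the geometric
    model radius(z, f) = w_in * |1 - z / f|."""
    f = np.asarray(focal_lengths, dtype=float)
    w = np.asarray(spot_radii, dtype=float)
    # residual sum of squares for every candidate distance
    rss = [np.sum((w_in * np.abs(1.0 - z / f) - w) ** 2)
           for z in z_candidates]
    return z_candidates[int(np.argmin(rss))]
```

Because the model spot size varies with focal length differently at each distance, even a coarse grid search over candidate distances pins down z from a handful of spot measurements.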

  17. Low-voltage 96 dB snapshot CMOS image sensor with 4.5 nW power dissipation per pixel.

    Science.gov (United States)

    Spivak, Arthur; Teman, Adam; Belenky, Alexander; Yadid-Pecht, Orly; Fish, Alexander

    2012-01-01

Modern "smart" CMOS sensors have penetrated into various applications, such as surveillance systems, bio-medical applications, digital cameras, cellular phones and many others. Reducing the power of these sensors continuously challenges designers. In this paper, a low power global shutter CMOS image sensor with Wide Dynamic Range (WDR) ability is presented. This sensor features several power reduction techniques, including a dual voltage supply, a selective power down, transistors with different threshold voltages, a non-rationed logic, and a low voltage static memory. A combination of all these approaches has enabled the design of the low voltage "smart" image sensor, which is capable of reaching a remarkable dynamic range, while consuming very low power. The proposed power-saving solutions have allowed the maintenance of the standard architecture of the sensor, reducing both the time and the cost of the design. In order to maintain the image quality, a relation between the sensor performance and power has been analyzed and a mathematical model, describing the sensor Signal to Noise Ratio (SNR) and Dynamic Range (DR) as a function of the power supplies, is proposed. The described sensor was implemented in a 0.18 μm CMOS process and successfully tested in the laboratory. An SNR of 48 dB and DR of 96 dB were achieved with a power dissipation of 4.5 nW per pixel.

  18. Micro-digital sun sensor: an imaging sensor for space applications

    NARCIS (Netherlands)

    Xie, N.; Theuwissen, A.J.P.; Büttgen, B.; Hakkesteegt, H.C.; Jasen, H.; Leijtens, J.A.P.

    2010-01-01

The Micro-Digital Sun Sensor is an attitude sensor which senses the position of a micro-satellite relative to the sun in space. It is composed of a solar cell power supply, an RF communication block and an imaging chip which is called APS+. The APS+ integrates a CMOS Active Pixel Sensor (APS) of 512×512

  19. Passive ranging using a filter-based non-imaging method based on oxygen absorption.

    Science.gov (United States)

    Yu, Hao; Liu, Bingqi; Yan, Zongqun; Zhang, Yu

    2017-10-01

To solve the problem of poor real-time measurement caused by a hyperspectral imaging system and to simplify the design in passive ranging technology based on the oxygen absorption spectrum, a filter-based non-imaging ranging method is proposed. In this method, three bandpass filters are used to obtain the source radiation intensities located in the oxygen absorption band near 762 nm and the band's left and right non-absorption shoulders, and a photomultiplier tube is used as the non-imaging sensor of the passive ranging system. Range is estimated by comparing the calculated values of the band-average transmission due to oxygen absorption, τ_O2, against the predicted curve of τ_O2 versus range. The method is tested under short-range conditions. An accuracy of 6.5% is achieved with the designed experimental ranging system at a range of 400 m.
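The described processing chain can be sketched in two steps: estimate the band-average transmission from the three filter readings, then invert the predicted τ_O2-versus-range curve. The exponential calibration curve used in the test below is an illustrative assumption, not the paper's atmospheric model.

```python
import numpy as np

def band_transmission(i_left, i_abs, i_right):
    """Band-average O2 transmission: in-band intensity divided by the
    continuum baseline interpolated from the two shoulder filters."""
    return i_abs / (0.5 * (i_left + i_right))

def range_from_transmission(tau, tau_curve, range_curve):
    """Invert the predicted tau-vs-range curve.  tau decreases with
    range, so both arrays are reversed to satisfy np.interp's
    ascending-x requirement."""
    return float(np.interp(tau, tau_curve[::-1], range_curve[::-1]))
```

Since τ_O2 decreases monotonically with range, the inversion by interpolation is well defined wherever the predicted curve is sampled densely enough.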

  20. Fusion of Images from Dissimilar Sensor Systems

    National Research Council Canada - National Science Library

    Chow, Khin

    2004-01-01

    Different sensors exploit different regions of the electromagnetic spectrum; therefore a multi-sensor image fusion system can take full advantage of the complementary capabilities of individual sensors in the suit...

  1. Collaborative Image Coding and Transmission over Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Min Wu

    2007-01-01

Full Text Available Imaging sensors can provide intuitive visual information for quick recognition and decision-making. However, imaging sensors usually generate vast amounts of data. Therefore, processing and coding of image data collected in a sensor network for the purpose of energy-efficient transmission poses a significant technical challenge. In particular, multiple sensors may be collecting similar visual information simultaneously. We propose in this paper a novel collaborative image coding and transmission scheme to minimize the energy for data transmission. First, we apply a shape matching method to coarsely register images to find the maximal overlap, exploiting the spatial correlation between images acquired from neighboring sensors. For a given image sequence, we transmit the background image only once. A lightweight and efficient background subtraction method is employed to detect targets. Only the regions of the targets and their spatial locations are transmitted to the monitoring center. The whole image can then be reconstructed by fusing the background and the target images as well as their spatial locations. Experimental results show that the energy for image transmission can indeed be greatly reduced with collaborative image coding and transmission.
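The background-plus-target-region idea can be sketched as follows. The fixed threshold and bounding-box representation are illustrative assumptions (the paper also performs shape-matching registration across neighboring sensors, which is omitted here).

```python
import numpy as np

def extract_target(frame, background, thresh=25):
    """Sensor side sketch: lightweight background subtraction.  Only
    the bounding box of changed pixels and its position are kept for
    transmission; None means nothing moved."""
    diff = np.abs(frame.astype(int) - background.astype(int)) > thresh
    if not diff.any():
        return None
    ys, xs = np.nonzero(diff)
    y0, y1, x0, x1 = ys.min(), ys.max() + 1, xs.min(), xs.max() + 1
    return (y0, x0), frame[y0:y1, x0:x1]

def reconstruct(background, target):
    """Monitoring-center side: fuse the stored background with the
    received target patch at its reported location."""
    img = background.copy()
    if target is not None:
        (y0, x0), patch = target
        img[y0:y0 + patch.shape[0], x0:x0 + patch.shape[1]] = patch
    return img
```

The energy saving comes from transmitting the background once and afterwards only the (usually small) target patch per frame.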

  2. 3D CAPTURING PERFORMANCES OF LOW-COST RANGE SENSORS FOR MASS-MARKET APPLICATIONS

    Directory of Open Access Journals (Sweden)

    G. Guidi

    2016-06-01

Full Text Available Since the advent of the first Kinect as a motion controller device for the Microsoft XBOX platform (November 2010), several similar active and low-cost range sensing devices have been introduced on the mass market for various purposes, including gesture-based interfaces, 3D multimedia interaction, robot navigation, finger tracking, 3D body scanning for garment design, and proximity sensors for automotive applications. However, given their capability to generate a real-time stream of range images, these devices have also been used in some projects as general-purpose range devices, with performances that might be satisfying for some applications. This paper shows the working principle of the various devices, analyzing them in terms of systematic and random errors to explore their applicability to standard 3D capturing problems. Five actual devices have been tested, featuring three different technologies: (i) Kinect V1 by Microsoft, Structure Sensor by Occipital, and Xtion PRO by ASUS, all based on different implementations of the Primesense sensor; (ii) F200 by Intel/Creative, implementing the Realsense pattern projection technology; (iii) Kinect V2 by Microsoft, equipped with the Canesta TOF camera. A critical analysis of the results tries first of all to compare them, and secondarily to identify the range of applications for which such devices could actually work as a viable solution.
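A generic way to quantify the random error of such a range device, in the spirit of (though not necessarily identical to) the paper's analysis, is to scan a flat target and examine the residuals from a best-fit plane:

```python
import numpy as np

def plane_fit_errors(points):
    """Characterization sketch: fit a plane z = a*x + b*y + c to a scan
    of a flat target by least squares.  The standard deviation of the
    residuals estimates the sensor's random error; a spatial pattern in
    the residuals would indicate systematic error."""
    A = np.c_[points[:, 0], points[:, 1], np.ones(len(points))]
    coeffs, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    residuals = points[:, 2] - A @ coeffs
    return coeffs, residuals.std()
```

Repeating the fit at several target distances gives the error-versus-range behaviour that distinguishes the pattern-projection devices from the TOF camera.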

  3. An Imaging Sensor-Aided Vision Navigation Approach that Uses a Geo-Referenced Image Database.

    Science.gov (United States)

    Li, Yan; Hu, Qingwu; Wu, Meng; Gao, Yang

    2016-01-28

Vision navigation determines position and attitude via real-time processing of imaging-sensor data, without requiring a high-performance global positioning system (GPS) or inertial measurement unit (IMU). Vision navigation is widely used in indoor navigation, far space navigation, and multiple sensor-integrated mobile mapping. This paper proposes a novel vision navigation approach aided by imaging sensors that uses a high-accuracy geo-referenced image database (GRID) for high-precision navigation of multiple sensor platforms in environments with poor GPS coverage. First, the framework of GRID-aided vision navigation is developed with sequence images from land-based mobile mapping systems that integrate multiple sensors. Second, a highly efficient GRID storage management model is established based on the linear index of a road segment for fast image search and retrieval. Third, a robust image matching algorithm is presented to search and match a real-time image against the GRID. The image matched with the real-time scene is then used to calculate the 3D navigation parameters of the multiple sensor platforms. Experimental results show that the proposed approach retrieves images efficiently and achieves navigation accuracies of 1.2 m in plane and 1.8 m in height during a 5 min, 1500 m GPS outage.
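The linear road-segment index can be sketched as a sorted list of chainages (distances along the road centreline) with nearest-neighbour lookup; the class and method names below are hypothetical, not from the paper:

```python
import bisect

class GridIndex:
    """Sketch of a linear road-segment index: reference images keyed
    by their chainage along the road, retrieved by nearest chainage."""
    def __init__(self):
        self.chainages, self.image_ids = [], []

    def insert(self, chainage, image_id):
        # keep both parallel lists sorted by chainage
        i = bisect.bisect(self.chainages, chainage)
        self.chainages.insert(i, chainage)
        self.image_ids.insert(i, image_id)

    def nearest(self, chainage):
        # the nearest stored chainage is one of the two bisect neighbours
        i = bisect.bisect(self.chainages, chainage)
        cands = [j for j in (i - 1, i) if 0 <= j < len(self.chainages)]
        best = min(cands, key=lambda j: abs(self.chainages[j] - chainage))
        return self.image_ids[best]
```

A coarse position estimate restricts the candidate set to a few neighbouring images before the (more expensive) robust feature matching runs.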

  4. Low-Voltage 96 dB Snapshot CMOS Image Sensor with 4.5 nW Power Dissipation per Pixel

    Directory of Open Access Journals (Sweden)

    Orly Yadid-Pecht

    2012-07-01

Full Text Available Modern “smart” CMOS sensors have penetrated into various applications, such as surveillance systems, bio-medical applications, digital cameras, cellular phones and many others. Reducing the power of these sensors continuously challenges designers. In this paper, a low power global shutter CMOS image sensor with Wide Dynamic Range (WDR) ability is presented. This sensor features several power reduction techniques, including a dual voltage supply, a selective power down, transistors with different threshold voltages, a non-rationed logic, and a low voltage static memory. A combination of all these approaches has enabled the design of the low voltage “smart” image sensor, which is capable of reaching a remarkable dynamic range, while consuming very low power. The proposed power-saving solutions have allowed the maintenance of the standard architecture of the sensor, reducing both the time and the cost of the design. In order to maintain the image quality, the relation between sensor performance and power has been analyzed and a mathematical model, describing the sensor Signal to Noise Ratio (SNR) and Dynamic Range (DR) as a function of the power supplies, is proposed. The described sensor was implemented in a 0.18 μm CMOS process and successfully tested in the laboratory. An SNR of 48 dB and DR of 96 dB were achieved with a power dissipation of 4.5 nW per pixel.
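The reported 96 dB figure follows from the standard definition of dynamic range as the ratio of the saturation swing to the dark noise floor; a minimal sketch (the voltage values in the usage note are illustrative, not taken from the paper):

```python
import math

def dynamic_range_db(v_sat, v_noise_floor):
    """Dynamic range in dB: ratio of the saturation output swing to
    the noise floor in the dark (20*log10, since these are voltages)."""
    return 20.0 * math.log10(v_sat / v_noise_floor)
```

For example, with a hypothetical 1 V saturation swing, 96 dB corresponds to a noise floor of roughly 16 µV (1 V / 10^4.8), which is why the paper's model ties both SNR and DR to the chosen supply voltages.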

  5. New amorphous-silicon image sensor for x-ray diagnostic medical imaging applications

    Science.gov (United States)

    Weisfield, Richard L.; Hartney, Mark A.; Street, Robert A.; Apte, Raj B.

    1998-07-01

This paper introduces new high-resolution amorphous silicon (a-Si) image sensors specifically configured for demonstrating film-quality medical x-ray imaging capabilities. The device utilizes an x-ray phosphor screen coupled to an array of a-Si photodiodes for detecting visible light, and a-Si thin-film transistors (TFTs) for connecting the photodiodes to external readout electronics. We have developed imagers based on a pixel size of 127 μm × 127 μm with an approximately page-size imaging area of 244 mm × 195 mm, and an array size of 1,536 data lines by 1,920 gate lines, for a total of 2.95 million pixels. More recently, we have developed a much larger imager based on the same pixel pattern, which covers an area of approximately 406 mm × 293 mm, with 2,304 data lines by 3,200 gate lines, for a total of nearly 7.4 million pixels. This is very likely the largest image sensor array and highest pixel count detector fabricated on a single substrate. Both imagers connect to a standard PC and are capable of taking an image in a few seconds. Through design rule optimization we have achieved a light sensitive area of 57% and optimized quantum efficiency for x-ray phosphor output in the green part of the spectrum, yielding an average quantum efficiency between 500 and 600 nm of approximately 70%. At the same time, we have managed to reduce extraneous leakage currents on these devices to a few fA per pixel, which allows for very high dynamic range. We have characterized leakage currents as a function of photodiode bias, time, and temperature to demonstrate high stability over these large arrays. At the electronics level, we have adopted a new generation of low noise, charge-sensitive amplifiers coupled to 12-bit A/D converters. Considerable attention was given to reducing electronic noise in order to demonstrate a large dynamic range (over 4,000:1) for medical imaging applications.
Through a combination of low data lines capacitance

  6. Virtual View Image over Wireless Visual Sensor Network

    Directory of Open Access Journals (Sweden)

    Gamantyo Hendrantoro

    2011-12-01

Full Text Available In general, visual sensors are applied to build virtual view images. When the number of visual sensors increases, the quantity and quality of the information improve. However, virtual view image generation is a challenging task in a Wireless Visual Sensor Network environment due to energy restrictions, computation complexity, and bandwidth limitation. Hence this paper presents a new method of virtual view image generation from selected cameras on a Wireless Visual Sensor Network. The aim of the paper is to meet bandwidth and energy limitations without reducing information quality. The experimental results showed that this method could minimize the number of transmitted images while retaining sufficient information.

  7. A STEP TOWARDS DYNAMIC SCENE ANALYSIS WITH ACTIVE MULTI-VIEW RANGE IMAGING SYSTEMS

    Directory of Open Access Journals (Sweden)

    M. Weinmann

    2012-07-01

Full Text Available Obtaining an appropriate 3D description of the local environment remains a challenging task in photogrammetric research. As terrestrial laser scanners (TLSs) perform a highly accurate, but time-dependent spatial scanning of the local environment, they are only suited for capturing static scenes. In contrast, new types of active sensors provide the possibility of simultaneously capturing range and intensity information by images with a single measurement, and the high frame rate also allows for capturing dynamic scenes. However, due to the limited field of view, one observation is not sufficient to obtain full scene coverage and therefore, typically, multiple observations are collected from different locations. This can be achieved by either placing several fixed sensors at different known locations or by using a moving sensor. In the latter case, the relation between different observations has to be estimated by using information extracted from the captured data, and a limited field of view may lead to problems if there are too many moving objects within it. Hence, a moving sensor platform with multiple and coupled sensor devices offers the advantages of an extended field of view which results in a stabilized pose estimation, an improved registration of the recorded point clouds and an improved reconstruction of the scene. In this paper, a new experimental setup for investigating the potentials of such multi-view range imaging systems is presented which consists of a moving cable car equipped with two synchronized range imaging devices. The presented setup allows for monitoring in low altitudes and it is suitable for getting dynamic observations which might arise from moving cars or from moving pedestrians. Relying on both 3D geometry and 2D imagery, a reliable and fully automatic approach for co-registration of captured point cloud data is presented which is essential for a high quality of all subsequent tasks.
The approach involves using

  8. Imaging system design and image interpolation based on CMOS image sensor

    Science.gov (United States)

    Li, Yu-feng; Liang, Fei; Guo, Rui

    2009-11-01

    An image acquisition system is introduced, which consists of a color CMOS image sensor (OV9620), SRAM (CY62148), CPLD (EPM7128AE) and DSP (TMS320VC5509A). The CPLD implements the logic and timing control to the system. SRAM stores the image data, and DSP controls the image acquisition system through the SCCB (Omni Vision Serial Camera Control Bus). The timing sequence of the CMOS image sensor OV9620 is analyzed. The imaging part and the high speed image data memory unit are designed. The hardware and software design of the image acquisition and processing system is given. CMOS digital cameras use color filter arrays to sample different spectral components, such as red, green, and blue. At the location of each pixel only one color sample is taken, and the other colors must be interpolated from neighboring samples. We use the edge-oriented adaptive interpolation algorithm for the edge pixels and bilinear interpolation algorithm for the non-edge pixels to improve the visual quality of the interpolated images. This method can get high processing speed, decrease the computational complexity, and effectively preserve the image edges.
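The hybrid interpolation idea can be sketched for the green channel, which is sampled in a checkerboard pattern in a Bayer array. The gradient test and threshold below are illustrative assumptions, not the paper's exact algorithm: interpolate along the direction of smaller difference at edges, and fall back to a four-neighbour bilinear average in flat regions.

```python
import numpy as np

def interp_green(g, mask, edge_thresh=10.0):
    """Sketch of edge-oriented green interpolation.  g holds the green
    samples (0 where missing); mask is True at sampled positions.  In
    a Bayer checkerboard, all 4-neighbours of a missing green pixel
    are sampled greens."""
    out = g.astype(float).copy()
    h, w = g.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if mask[y, x]:
                continue
            dh = abs(out[y, x - 1] - out[y, x + 1])  # horizontal difference
            dv = abs(out[y - 1, x] - out[y + 1, x])  # vertical difference
            if dh + edge_thresh < dv:
                # smoother horizontally: average the horizontal neighbours
                out[y, x] = (out[y, x - 1] + out[y, x + 1]) / 2.0
            elif dv + edge_thresh < dh:
                # smoother vertically: average the vertical neighbours
                out[y, x] = (out[y - 1, x] + out[y + 1, x]) / 2.0
            else:
                # no clear edge orientation: bilinear 4-neighbour mean
                out[y, x] = (out[y, x - 1] + out[y, x + 1]
                             + out[y - 1, x] + out[y + 1, x]) / 4.0
    return out
```

Interpolating along, rather than across, an edge is what preserves the sharp boundary that a plain bilinear average would blur.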

  9. Smart CMOS image sensor for lightning detection and imaging.

    Science.gov (United States)

    Rolando, Sébastien; Goiffon, Vincent; Magnan, Pierre; Corbière, Franck; Molina, Romain; Tulet, Michel; Bréart-de-Boisanger, Michel; Saint-Pé, Olivier; Guiry, Saïprasad; Larnaudie, Franck; Leone, Bruno; Perez-Cuevas, Leticia; Zayer, Igor

    2013-03-01

    We present a CMOS image sensor dedicated to lightning detection and imaging. The detector has been designed to evaluate the potentiality of an on-chip lightning detection solution based on a smart sensor. This evaluation is performed in the frame of the predevelopment phase of the lightning detector that will be implemented in the Meteosat Third Generation Imager satellite for the European Space Agency. The lightning detection process is performed by a smart detector combining an in-pixel frame-to-frame difference comparison with an adjustable threshold and on-chip digital processing allowing an efficient localization of a faint lightning pulse on the entire large format array at a frequency of 1 kHz. A CMOS prototype sensor with a 256×256 pixel array and a 60 μm pixel pitch has been fabricated using a 0.35 μm 2P 5M technology and tested to validate the selected detection approach.
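The in-pixel detection step amounts to a thresholded frame-to-frame difference followed by localization of the flagged pixels; a software sketch of that logic (the actual implementation is, of course, mixed-signal hardware running at 1 kHz):

```python
import numpy as np

def detect_events(prev, curr, threshold):
    """In-pixel step sketch: flag pixels whose brightness jumps by more
    than an adjustable threshold between consecutive frames."""
    return (curr.astype(int) - prev.astype(int)) > threshold

def localize(events):
    """On-chip digital step sketch: report the coordinates of the
    flagged pixels so a faint pulse can be located on the array."""
    return list(zip(*np.nonzero(events)))
```

Because lightning pulses are brief and localized, the difference image is almost entirely empty, which is what makes on-chip localization cheap.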

  10. Robust Dehaze Algorithm for Degraded Image of CMOS Image Sensors

    Directory of Open Access Journals (Sweden)

    Chen Qu

    2017-09-01

Full Text Available The CMOS (Complementary Metal-Oxide-Semiconductor) sensor is a new type of solid-state image sensor device widely used in object tracking, object recognition, intelligent navigation, and other fields. However, images captured by outdoor CMOS sensor devices are usually affected by suspended atmospheric particles (such as haze), causing a reduction in image contrast, color distortion, and other problems. In view of this, we propose a novel dehazing approach based on a local consistent Markov random field (MRF) framework. The neighboring clique in traditional MRF is extended to the non-neighboring clique, which is defined on local consistent blocks based on two clues, where both the atmospheric light and the transmission map satisfy the character of local consistency. In this framework, our model can strengthen the restriction of the whole image while incorporating more sophisticated statistical priors, resulting in more expressive modeling power, thus solving inadequate detail recovery effectively and alleviating color distortion. Moreover, the local consistent MRF framework can obtain details while maintaining better dehazing results, which effectively improves the quality of images captured by the CMOS image sensor. Experimental results verified that the proposed method has the combined advantages of detail recovery and color preservation.

  11. Oriented Edge-Based Feature Descriptor for Multi-Sensor Image Alignment and Enhancement

    Directory of Open Access Journals (Sweden)

    Myung-Ho Ju

    2013-10-01

Full Text Available In this paper, we present an efficient image alignment and enhancement method for multi-sensor images. The shape of an object captured in multi-sensor images can be determined by comparing the variability of contrast of corresponding edges across the multi-sensor images. Using this cue, we construct a robust feature descriptor based on the magnitudes of the oriented edges. Our proposed method enables fast image alignment by identifying matching features in multi-sensor images. We enhance the aligned multi-sensor images through the fusion of the salient regions from each image. The results of stitching the multi-sensor images and their enhancement demonstrate that our proposed method can align and enhance multi-sensor images more efficiently than previous methods.
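A descriptor built from oriented edge magnitudes can be sketched as an orientation histogram weighted by gradient magnitude; the binning and normalization choices here are generic assumptions, not the authors' exact construction:

```python
import numpy as np

def oriented_edge_descriptor(patch, bins=8):
    """Sketch: histogram of gradient magnitudes binned by edge
    orientation (modulo pi, since edge polarity flips across sensor
    modalities while edge orientation tends to survive)."""
    gy, gx = np.gradient(patch.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)  # orientation in [0, pi)
    hist, _ = np.histogram(ang, bins=bins, range=(0.0, np.pi), weights=mag)
    n = np.linalg.norm(hist)
    return hist / n if n else hist
```

Folding orientations modulo pi and normalizing the histogram are the standard ways to make such a descriptor tolerant of the contrast reversals that occur between, say, visible and infrared images.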

  12. Contact CMOS imaging of gaseous oxygen sensor array.

    Science.gov (United States)

    Daivasagaya, Daisy S; Yao, Lei; Yi Yung, Ka; Hajj-Hassan, Mohamad; Cheung, Maurice C; Chodavarapu, Vamsy P; Bright, Frank V

    2011-10-01

We describe a compact luminescent gaseous oxygen (O₂) sensor microsystem based on the direct integration of sensor elements with a polymeric optical filter, placed on a low power complementary metal-oxide semiconductor (CMOS) imager integrated circuit (IC). The sensor operates on the measurement of excited-state emission intensity of O₂-sensitive luminophore molecules tris(4,7-diphenyl-1,10-phenanthroline)ruthenium(II) ([Ru(dpp)₃]²⁺) encapsulated within sol-gel derived xerogel thin films. The polymeric optical filter is made with polydimethylsiloxane (PDMS) that is mixed with a dye (Sudan-II). The PDMS membrane surface is molded to incorporate arrays of trapezoidal microstructures that serve to focus the optical sensor signals onto the imager pixels. The molded PDMS membrane is then attached to the PDMS color filter. The xerogel sensor arrays are contact printed on top of the PDMS trapezoidal lens-like microstructures. The CMOS imager uses a 32 × 32 (1024 elements) array of active pixel sensors and each pixel includes a high-gain phototransistor to convert the detected optical signals into electrical currents. Correlated double sampling circuit, pixel address, digital control and signal integration circuits are also implemented on-chip. The CMOS imager data is read out as a serial coded signal. The CMOS imager consumes a static power of 320 µW and an average dynamic power of 625 µW when operating at 100 Hz sampling frequency and 1.8 V DC. This CMOS sensor system provides a useful platform for the development of miniaturized optical chemical gas sensors.

  13. A Short-Range Distance Sensor with Exceptional Linearity

    Science.gov (United States)

    Simmons, Steven; Youngquist, Robert

    2013-01-01

A sensor has been demonstrated that can measure distance over a total range of about 300 microns to an accuracy of about 0.1 nm (resolution of about 0.01 nm). This represents an exceptionally large dynamic range of operation - over 1,000,000. The sensor is optical in nature, and requires the attachment of a mirror to the object whose distance is being measured. This work resulted from actively developing a white light interferometric system to be used to measure the depths of defects in the Space Shuttle Orbiter windows. The concept was then applied to measuring distance, and later expanded to include spectrometer calibration. In summary, broadband (i.e., white) light is launched into a Michelson interferometer, one mirror of which is fixed and one of which is attached to the object whose distance is to be measured. The light emerging from the interferometer has traveled one of two distances: either the distance to the fixed mirror and back, or the distance to the moving mirror and back. These two light beams mix and produce an interference pattern in which some wavelengths interfere constructively and some destructively. Sending this light into a spectrometer allows this interference pattern to be analyzed, yielding the net distance difference between the two paths. The unique feature of this distance sensor is its ability to accurately measure distance over a dynamic range of more than one million, the ratio of its range (about 300 microns) to its accuracy (about 0.1 nanometer). Such a large linear operating range is rare and arises here because both amplitude and phase-matching algorithms contribute to the performance. The sensor is limited by the need to attach a mirror of some kind to the object being tracked, and by the fairly small total range, but the exceptional dynamic range should make it of interest.
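The measurement principle can be sketched numerically. Sampling the channelled spectrum uniformly in wavenumber k, the interference term cos(OPD·k) appears as a single fringe frequency whose FFT-bin location gives the optical path difference between the two arms. This FFT-peak estimator is only illustrative; the actual sensor combines amplitude and phase-matching algorithms to reach its 0.1 nm accuracy.

```python
import numpy as np

def opd_from_spectrum(intensity, dk):
    """Sketch: estimate the optical path difference between the two
    Michelson arms from a channelled spectrum I(k) ~ 1 + cos(OPD * k)
    sampled uniformly in wavenumber k with spacing dk.  The fringe
    frequency in the k domain is OPD / (2*pi) cycles per unit k."""
    n = len(intensity)
    spec = np.abs(np.fft.rfft(intensity - np.mean(intensity)))
    b = 1 + int(np.argmax(spec[1:]))     # dominant non-DC bin
    return 2.0 * np.pi * b / (n * dk)    # bin -> path difference
```

Refining the integer-bin estimate with the fringe phase is what would push the resolution far below one bin, in the spirit of the combined amplitude/phase approach the abstract describes.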

  14. Multi-sensor image fusion and its applications

    CERN Document Server

    Blum, Rick S

    2005-01-01

    Taking another lesson from nature, the latest advances in image processing technology seek to combine image data from several diverse types of sensors in order to obtain a more accurate view of the scene: very much the same as we rely on our five senses. Multi-Sensor Image Fusion and Its Applications is the first text dedicated to the theory and practice of the registration and fusion of image data, covering such approaches as statistical methods, color-related techniques, model-based methods, and visual information display strategies.After a review of state-of-the-art image fusion techniques,

  15. Displacement damage effects on CMOS APS image sensors induced by neutron irradiation from a nuclear reactor

    International Nuclear Information System (INIS)

    Wang, Zujun; Huang, Shaoyan; Liu, Minbo; Xiao, Zhigang; He, Baoping; Yao, Zhibin; Sheng, Jiangkun

    2014-01-01

The experiments on displacement damage effects in CMOS APS image sensors induced by neutron irradiation from a nuclear reactor are presented. The CMOS APS image sensors are manufactured in the standard 0.35 μm CMOS technology. The flux of the neutron beam was about 1.33 × 10⁸ n/cm²·s. Three samples were exposed to 1 MeV neutron equivalent-fluences of 1 × 10¹¹, 5 × 10¹¹, and 1 × 10¹² n/cm², respectively. The mean dark signal (K_D), dark signal spike, dark signal non-uniformity (DSNU), noise (V_N), saturation output signal voltage (V_S), and dynamic range (DR) versus neutron fluence are investigated. The degradation mechanisms of CMOS APS image sensors are analyzed. The mean dark signal increase due to neutron displacement damage appears to be proportional to displacement damage dose. Dark images from the CMOS APS image sensors irradiated by neutrons are presented to investigate the generation of dark signal spikes.
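The reported proportionality between mean dark signal increase and displacement damage dose suggests a simple linear model K_D = K_D0 + c·Φ over the tested fluences; a fitting sketch (the rescaling is a numerical-conditioning detail of this sketch, not something from the paper):

```python
import numpy as np

def dark_signal_model(fluences, dark_signals):
    """Sketch: fit K_D = K_D0 + c * Phi, i.e. mean dark signal growing
    linearly with 1 MeV-equivalent neutron fluence.  Fluences are
    rescaled before the fit so the 1e11-1e12 n/cm2 range stays well
    conditioned, then the slope is scaled back."""
    f = np.asarray(fluences, dtype=float)
    scale = f.max()
    c_scaled, k0 = np.polyfit(f / scale, np.asarray(dark_signals, dtype=float), 1)
    return k0, c_scaled / scale
```

With only three fluence points, such a fit mainly checks consistency with the proportional-damage hypothesis rather than establishing it.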

  16. Near-IR Two-Photon Fluorescent Sensor for K(+) Imaging in Live Cells.

    Science.gov (United States)

    Sui, Binglin; Yue, Xiling; Kim, Bosung; Belfield, Kevin D

    2015-08-19

    A new two-photon excited fluorescent K(+) sensor is reported. The sensor comprises three moieties, a highly selective K(+) chelator as the K(+) recognition unit, a boron-dipyrromethene (BODIPY) derivative modified with phenylethynyl groups as the fluorophore, and two polyethylene glycol chains to afford water solubility. The sensor displays very high selectivity (>52-fold) in detecting K(+) over other physiological metal cations. Upon binding K(+), the sensor switches from nonfluorescent to highly fluorescent, emitting red to near-IR (NIR) fluorescence. The sensor exhibited a good two-photon absorption cross section, 500 GM at 940 nm. Moreover, it is not sensitive to pH in the physiological pH range. Time-dependent cell imaging studies via both one- and two-photon fluorescence microscopy demonstrate that the sensor is suitable for dynamic K(+) sensing in living cells.

  17. Fully wireless pressure sensor based on endoscopy images

    Science.gov (United States)

    Maeda, Yusaku; Mori, Hirohito; Nakagawa, Tomoaki; Takao, Hidekuni

    2018-04-01

In this paper, the result of developing a fully wireless pressure sensor based on endoscopy images for endoscopic surgery is reported for the first time. The sensor device has structural color with a nm-scale narrow gap, and the gap changes with air pressure. The structural color of the sensor is acquired from camera images, so pressure detection can be realized with existing endoscope configurations only. The inner air pressure of the human body is measured under flexible-endoscope operation using the sensor. Air pressure monitoring has two important purposes. The first is to quantitatively measure tumor size under a constant air pressure for treatment selection. The second is to prevent endangering the patient by transmitting too much air. The developed sensor was evaluated, and the detection principle based only on endoscopy images has been successfully demonstrated.

  18. Hardware test program for evaluation of baseline range/range rate sensor concept

    Science.gov (United States)

    Pernic, E.

    1985-01-01

The test program Phase II effort provides additional design information in terms of range and range rate (R/R) sensor performance when observing and tracking a typical spacecraft target. The target used in the test program was a one-third scale model of the Hubble Space Telescope (HST) available at the MSFC test site where the tests were performed. A modified Bendix millimeter wave radar served as the R/R sensor test bed for evaluation of range and range rate tracking performance and generation of radar signature characteristics of the spacecraft target. A summary of program test results and conclusions is presented along with a detailed description of the Bendix test bed radar and accompanying instrumentation. The MSFC test site and facilities are described. The test procedures used to establish background levels, and the calibration procedures used in the range accuracy tests and RCS (radar cross section) signature measurements, are presented, and a condensed version of the daily log kept during the 5 September through 17 September test period is also presented. The test program results are given, starting with the RCS signature measurements, continuing with the range measurement accuracy test results, and finally the range and range rate tracking accuracy test results.

  19. 3D-LSI technology for image sensor

    International Nuclear Information System (INIS)

    Motoyoshi, Makoto; Koyanagi, Mitsumasa

    2009-01-01

    Recently, the development of three-dimensional large-scale integration (3D-LSI) technologies has accelerated, advancing from the research and limited-production stage to investigations aimed at mass production. By separating 3D-LSI technology into elementary technologies such as (1) through-silicon via (TSV) formation, (2) bump formation, (3) wafer thinning, (4) chip/wafer alignment, and (5) chip/wafer stacking, and then reconstructing the entire process and structure, many methods of realizing 3D-LSI devices can be devised. However, when a specific application, the supply chain of base wafers, and the purpose of 3D integration are considered, only a few suitable combinations can be identified. In this paper, we focus on the application of 3D-LSI technologies to image sensors. We describe the process and structure of the chip-size package (CSP), developed on the basis of current and advanced 3D-LSI technologies, for use in CMOS image sensors. Using current LSI technologies, CSPs for 1.3 M-, 2 M-, and 5 M-pixel CMOS image sensors were successfully fabricated without any performance degradation. 3D-LSI devices can potentially be employed in high-performance focal-plane-array image sensors. We propose a high-speed image sensor with an optical fill factor of 100% to be developed using next-generation 3D-LSI technology and fabricated using micro(μ)-bumps and micro(μ)-TSVs.

  20. Image acquisition system using on sensor compressed sampling technique

    Science.gov (United States)

    Gupta, Pravir Singh; Choi, Gwan Seong

    2018-01-01

    Advances in CMOS technology have made high-resolution image sensors possible. These image sensors pose significant challenges in terms of the amount of raw data generated, energy efficiency, and frame rate. This paper presents a design methodology for an imaging system and a simplified image sensor pixel design to be used in the system so that the compressed sensing (CS) technique can be implemented easily at the sensor level. This results in significant energy savings as it not only cuts the raw data rate but also reduces transistor count per pixel; decreases pixel size; increases fill factor; simplifies analog-to-digital converter, JPEG encoder, and JPEG decoder design; decreases wiring; and reduces the decoder size by half. Thus, CS has the potential to increase the resolution of image sensors for a given technology and die size while significantly decreasing the power consumption and design complexity. We show that it has potential to reduce power consumption by about 23% to 65%.
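As a sketch of the compressed-sensing idea this abstract builds on (not the authors' pixel design), the following simulates random on-sensor measurements of a sparse scene and a greedy reconstruction; the ±1 measurement matrix and orthogonal matching pursuit (OMP) are standard CS ingredients chosen here for illustration.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick the atom most
    correlated with the residual, then least-squares refit."""
    r, support = y.astype(float).copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ r))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        r = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

rng = np.random.default_rng(0)

# Sparse "scene": a length-64 signal with 3 nonzero pixels.
n, k, m = 64, 3, 24
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.uniform(1.0, 2.0, k)

# On-sensor CS readout: each of the m measurements is a random +/-1
# combination of all pixels, so only m numbers leave the sensor.
A = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)
y = A @ x

x_hat = omp(A, y, k)
print("relative error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```

With m well below n, the data rate leaving the sensor drops accordingly, which is the energy argument the abstract makes.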

  1. Proximity gettering technology for advanced CMOS image sensors using carbon cluster ion-implantation technique. A review

    Energy Technology Data Exchange (ETDEWEB)

    Kurita, Kazunari; Kadono, Takeshi; Okuyama, Ryousuke; Shigemastu, Satoshi; Hirose, Ryo; Onaka-Masada, Ayumi; Koga, Yoshihiro; Okuda, Hidehiko [SUMCO Corporation, Saga (Japan)

    2017-07-15

    A new technique is described for manufacturing advanced silicon wafers with the highest capability yet reported for gettering transition-metal, oxygen, and hydrogen impurities in CMOS image sensor fabrication processes. Carbon and hydrogen are localized in the projection range of the silicon wafer by implantation of ion clusters from a hydrocarbon molecular gas source. Furthermore, during heat treatment these wafers can getter, into the carbon cluster ion projection range, oxygen impurities that would otherwise out-diffuse from the Czochralski-grown silicon wafer substrate to the device active regions. They can therefore reduce the formation of transition-metal and oxygen-related defects in the device active regions and improve electrical performance characteristics, such as dark current, white-spot defects, pn-junction leakage current, and image lag. The new technique enables the formation of high-gettering-capability sinks for transition-metal, oxygen, and hydrogen impurities under the device active regions of CMOS image sensors. Wafers formed by this technique have the potential to significantly improve the electrical performance characteristics of advanced CMOS image sensors. (copyright 2017 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  2. A Wildlife Monitoring System Based on Wireless Image Sensor Networks

    Directory of Open Access Journals (Sweden)

    Junguo Zhang

    2014-10-01

    Full Text Available The survival and development of wildlife sustain the balance and stability of the entire ecosystem. Wildlife monitoring can provide a wealth of information, such as wildlife species, quantity, habits, quality of life, and habitat conditions, helping researchers grasp the status and dynamics of wildlife resources and providing a basis for their effective protection, sustainable use, and scientific management. Wildlife monitoring is the foundation of wildlife protection and management. Wireless Sensor Network (WSN) technology has become one of the most popular technologies in the information field. With advances in CMOS image sensor technology, wireless sensor networks combined with image sensors, namely Wireless Image Sensor Network (WISN) technology, have emerged as an alternative for monitoring applications, and wildlife monitoring is one of the most promising among them. In this paper, the system architecture of a wildlife monitoring system based on wireless image sensor networks is presented to overcome the shortcomings of traditional monitoring methods. Specifically, key issues including the design of wireless image sensor nodes and the software process design are studied and presented. A self-powered, rotatable, wireless infrared image sensor node based on ARM and an aggregation node designed for large amounts of data were developed, and their corresponding software was designed. The proposed system is able to monitor wildlife accurately, automatically, and remotely in all weather conditions, which lays the foundation for applications of wireless image sensor networks in wildlife monitoring.

  3. Passive long range acousto-optic sensor

    Science.gov (United States)

    Slater, Dan

    2006-08-01

    Alexander Graham Bell's photophone of 1880 was a simple free space optical communication device that used the sun to illuminate a reflective acoustic diaphragm. A selenium photocell located 213 m (700 ft) away converted the acoustically modulated light beam back into sound. A variation of the photophone is presented here that uses naturally formed free space acousto-optic communications links to provide passive multichannel long range acoustic sensing. This system, called RAS (remote acoustic sensor), functions as a long range microphone with a demonstrated range in excess of 40 km (25 miles).

  4. Visual Image Sensor Organ Replacement

    Science.gov (United States)

    Maluf, David A.

    2014-01-01

    This innovation is a system that augments human vision through a technique called "Sensing Super-position" using a Visual Instrument Sensory Organ Replacement (VISOR) device. The VISOR device translates the output of visual and other sensors (e.g., thermal) into sounds to enable very difficult sensing tasks. Three-dimensional spatial brightness and multi-spectral maps of a sensed image are processed using real-time image processing techniques (e.g., histogram normalization) and transformed into a two-dimensional map of an audio signal as a function of frequency and time. Because the human hearing system is capable of learning to process and interpret extremely complicated and rapidly changing auditory patterns, the translation of images into sounds reduces the risk of accidentally filtering out important clues. The VISOR device was developed to augment current state-of-the-art head-mounted (helmet) display systems. It provides the ability to sense beyond the human visible light range, to increase human sensing resolution, to use wider-angle visual perception, and to improve the ability to sense distances. It also allows compensation for movement by the human or changes in the scene being viewed.

  5. A Full Parallel Event Driven Readout Technique for Area Array SPAD FLIM Image Sensors

    Directory of Open Access Journals (Sweden)

    Kaiming Nie

    2016-01-01

    Full Text Available This paper presents a fully parallel event-driven readout method implemented in an area-array single-photon avalanche diode (SPAD) image sensor for high-speed fluorescence lifetime imaging microscopy (FLIM). By adopting the fully parallel event-driven readout method, the sensor records and reads out only effective time and position information, reducing the amount of data. The image sensor comprises four 8 × 8 pixel arrays. In each array, four time-to-digital converters (TDCs) quantize the arrival times of photons, and two address-record modules record the column and row information. In this work, Monte Carlo simulations were performed in Matlab to assess the pile-up effect induced by the readout method. The sensor's resolution is 16 × 16. The time resolution of the TDCs is 97.6 ps and the quantization range is 100 ns. The readout frame rate is 10 Mfps, and the maximum imaging frame rate is 100 fps. The chip's output bandwidth is 720 MHz with an average power of 15 mW. The resolvable lifetime range is 5-20 ns, and the average error of the estimated fluorescence lifetimes is below 1% when the center-of-mass method (CMM) is employed to estimate lifetimes.
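The CMM lifetime estimation the abstract relies on can be illustrated with a toy simulation. The 100 ns window and ~10 ns lifetime follow the figures in the record; the photon count and the one-step finite-window bias correction are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical mono-exponential decay: tau = 10 ns, observed over the
# 100 ns quantization window quoted in the record.
tau_true, T = 10.0, 100.0
t = rng.exponential(tau_true, size=200_000)
t = t[t < T]  # photons arriving outside the window are never time-stamped

# Center-of-mass method: the mean arrival time estimates the lifetime.
tau_cmm = t.mean()

# Finite-window bias correction, E[t | t < T] = tau - T / (exp(T/tau) - 1),
# applied in one step with tau_cmm standing in for tau on the right.
tau_hat = tau_cmm + T / (np.exp(T / tau_cmm) - 1.0)
print("CMM estimate:", tau_hat, "ns")
```

With T ten times the lifetime, the correction term is tiny, which is why the raw mean already lands close to the true value.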

  6. A time-resolved image sensor for tubeless streak cameras

    Science.gov (United States)

    Yasutomi, Keita; Han, SangMan; Seo, Min-Woong; Takasawa, Taishi; Kagawa, Keiichiro; Kawahito, Shoji

    2014-03-01

    This paper presents a time-resolved CMOS image sensor with draining-only modulation (DOM) pixels for tubeless streak cameras. Although the conventional streak camera has high time resolution, it requires a high voltage and a bulky system because of its vacuum-tube structure. The proposed time-resolved imager with simple optics realizes a streak camera without any vacuum tubes. The proposed image sensor comprises DOM pixels, a delay-based pulse generator, and readout circuitry. The delay-based pulse generator, in combination with in-pixel logic, allows us to create and deliver a short gating clock to the pixel array. A prototype time-resolved CMOS image sensor with the proposed pixel was designed and implemented using 0.11-μm CMOS image sensor technology. The image array has 30 (vertical) × 128 (memory length) pixels with a pixel pitch of 22.4 μm.

  7. Hardware test program for evaluation of baseline range-range rate sensor concept

    Science.gov (United States)

    1985-01-01

    The baseline range/range rate sensor concept was evaluated. Testing of the Interrupted CW (ICW) mode of operation continued, with emphasis on establishing the sensitivity of the video portion of the receiver, which was found to be 7 dB less than the theoretical value. This departs from test results of previous implementations, in which the achieved sensitivity was within 1.5 to 2 dB of the theoretical value. Several potential causes of this discrepancy in performance were identified and are scheduled for further investigation. Results indicate that savings in both per-unit and program costs are realizable by eliminating one of the modes of operation. An acquisition (total program) cost savings of approximately 10% is projected by eliminating the CW mode of operation. The modified R/R sensor would operate in the ICW mode only and would provide coverage from initial acquisition at 12 nmi to within a few hundred feet of the OMV. If the ICW-only mode were selected, an accompanying sensor would be required to provide coverage from a few hundred feet to docking.

  8. Improved Denoising via Poisson Mixture Modeling of Image Sensor Noise.

    Science.gov (United States)

    Zhang, Jiachao; Hirakawa, Keigo

    2017-04-01

    This paper describes a study aimed at comparing the real image sensor noise distribution to the noise models often assumed in image denoising designs. A quantile analysis in the pixel, wavelet transform, and variance stabilization domains reveals that the tails of the Poisson, signal-dependent Gaussian, and Poisson-Gaussian models are too short to capture real sensor noise behavior. A new Poisson mixture noise model is proposed to correct the mismatch in tail behavior. Based on the fact that noise model mismatch results in image denoising that undersmooths real sensor data, we propose a mixture-of-Poisson denoising method to remove the denoising artifacts without affecting image details, such as edges and textures. Experiments with real sensor data verify that denoising of real image sensor data is indeed improved by this new technique.
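The variance-stabilization domain used in the quantile analysis can be illustrated with the classical Anscombe transform; this is a generic sketch of why such domains are useful for signal-dependent noise, not the paper's Poisson-mixture model.

```python
import numpy as np

rng = np.random.default_rng(2)

def anscombe(x):
    """Anscombe transform: maps Poisson counts to roughly unit-variance
    Gaussian-like values, removing the signal dependence of the noise."""
    return 2.0 * np.sqrt(np.asarray(x, dtype=float) + 3.0 / 8.0)

# Poisson "pixels" at several signal levels: the raw variance grows with
# the mean (signal-dependent noise), the stabilized variance stays near 1.
for lam in (5, 20, 80):
    x = rng.poisson(lam, size=100_000)
    print(f"mean {lam:3d}: raw var {x.var():6.2f}, "
          f"stabilized var {anscombe(x).var():.3f}")
```

Denoisers designed for additive white Gaussian noise can then run in the stabilized domain; the paper's quantile analysis checks how well such model assumptions actually hold for real sensor data.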

  9. Self-Similarity Superresolution for Resource-Constrained Image Sensor Node in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Yuehai Wang

    2014-01-01

    Full Text Available Wireless sensor networks, in combination with image sensors, open up a grand field of sensing applications. Recovering a high-resolution (HR) image from its low-resolution (LR) counterpart is a challenging problem, especially for low-cost, resource-constrained image sensors with limited resolution. Sparse representation-based techniques have been developed and applied increasingly to solve this ill-posed inverse problem. Most of these solutions are based on an external dictionary learned from a huge image gallery, and consequently require tremendous iteration and long matching times. In this paper, we explore the self-similarity inside the image itself and propose a new combined self-similarity superresolution (SR) solution with low computation cost and high recovery performance. In the self-similarity image super-resolution model (SSIR), a small sparse dictionary is learned from the image itself by methods such as K-SVD. The most similar patch is searched for and specially combined during the sparse regularization iteration. Detailed information, such as edge sharpness, is preserved more faithfully and clearly. Experimental results confirm the effectiveness and efficiency of this double self-learning method for image super-resolution.
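The in-image patch search at the heart of self-similarity SR can be sketched as an exhaustive similar-patch lookup; real SSIR couples such a search with a learned sparse dictionary and the regularization iteration, both omitted in this minimal sketch.

```python
import numpy as np

def best_internal_match(img, y, x, size=8):
    """Find the patch elsewhere in `img` most similar (by sum of squared
    differences) to the reference patch at (y, x). Exhaustive search;
    practical systems restrict the search window for speed."""
    ref = img[y:y + size, x:x + size]
    best_ssd, best_pos = np.inf, None
    H, W = img.shape
    for i in range(H - size + 1):
        for j in range(W - size + 1):
            if (i, j) == (y, x):
                continue  # skip the reference patch itself
            ssd = np.sum((img[i:i + size, j:j + size] - ref) ** 2)
            if ssd < best_ssd:
                best_ssd, best_pos = ssd, (i, j)
    return best_pos, best_ssd
```

Natural images contain many repeated structures (edges, textures), so the best internal match is often close enough to guide reconstruction without any external gallery, which is the point the abstract makes.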

  10. The Performance Evaluation of Multi-Image 3d Reconstruction Software with Different Sensors

    Science.gov (United States)

    Mousavi, V.; Khosravi, M.; Ahmadi, M.; Noori, N.; Naveh, A. Hosseini; Varshosaz, M.

    2015-12-01

    Today, multi-image 3D reconstruction is an active research field, and generating three-dimensional models of objects is one of the most discussed issues in photogrammetry and computer vision; it can be accomplished using range-based or image-based methods. The very accurate and dense point clouds generated by range-based methods, such as structured-light systems and laser scanners, have established them as reliable tools in industry. Image-based 3D digitization methodologies offer the option of reconstructing an object from a set of unordered images that depict it from different viewpoints. As their hardware requirements are narrowed down to a digital camera and a computer system, they compose an attractive 3D digitization approach: although range-based methods are generally more accurate, image-based methods are low-cost and can easily be used by non-professional users. One factor affecting the accuracy of the obtained model in image-based methods is the software and algorithm used to generate the three-dimensional model. These algorithms are provided in the form of commercial software, open-source software, and web-based services. Another important factor in the accuracy of the obtained model is the type of sensor used. Given the availability of mobile sensors to the public, the popularity of professional sensors, and the advent of stereo sensors, a comparison of these three sensor types plays an effective role in evaluating and finding the optimal method of generating three-dimensional models. Much research has been carried out to identify suitable software and algorithms for achieving an accurate and complete model, but little attention has been paid to the type of sensor used and its effect on the quality of the final model. The purpose of this paper is the deliberation and introduction of an appropriate combination of sensor and software to provide a complete model with the highest accuracy.
To do this, different software, used in previous studies, were compared and

  11. High-speed imaging using CMOS image sensor with quasi pixel-wise exposure

    Science.gov (United States)

    Sonoda, T.; Nagahara, H.; Endo, K.; Sugiyama, Y.; Taniguchi, R.

    2017-02-01

    Several recent studies in compressive video sensing have realized scene capture beyond the fundamental trade-off between spatial resolution and temporal resolution by using random space-time sampling. However, most of these studies showed higher-frame-rate video produced by simulation experiments or by an optically simulated random-sampling camera, because no commercially available image sensor currently offers random exposure or sampling capabilities. We fabricated a prototype complementary metal oxide semiconductor (CMOS) image sensor with quasi pixel-wise exposure timing that can realize nonuniform space-time sampling. The prototype sensor can reset exposures independently by column and fix the amount of exposure by row for each 8 × 8 pixel block. This CMOS sensor is not fully controllable pixel by pixel and has line-dependent controls, but it offers flexibility compared with regular CMOS or charge-coupled device sensors with global or rolling shutters. We propose a method that uses this flexibility to realize pseudo-random sampling for high-speed video acquisition, and we reconstruct the high-speed video sequence from the images produced by pseudo-random sampling using an over-complete dictionary.
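The line-dependent control the abstract describes (column-wise exposure reset, row-wise exposure length, per 8 × 8 block) can be sketched as a space-time sampling mask. The subframe count and the random draws below are illustrative assumptions, not the sensor's actual timing.

```python
import numpy as np

rng = np.random.default_rng(3)

# One 8x8 block over 8 subframes: a pixel at (row r, col c) integrates
# subframes [start[c], start[c] + length[r]), clipped to the frame, so
# exposure start is a column property and exposure length a row property.
B, T = 8, 8
start = rng.integers(0, T, size=B)       # column-wise reset times
length = rng.integers(1, T + 1, size=B)  # row-wise exposure lengths

mask = np.zeros((B, B, T), dtype=bool)   # (row, col, subframe) sampling cube
for r in range(B):
    for c in range(B):
        s = start[c]
        e = min(start[c] + length[r], T)
        mask[r, c, s:e] = True
```

The resulting mask is pseudo-random but constrained along lines, which is exactly the gap between this sensor and the fully random sampling assumed in most compressive video work.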

  12. High-speed Imaging of Global Surface Temperature Distributions on Hypersonic Ballistic-Range Projectiles

    Science.gov (United States)

    Wilder, Michael C.; Reda, Daniel C.

    2004-01-01

    The NASA-Ames ballistic range provides a unique capability for aerothermodynamic testing of configurations in hypersonic, real-gas, free-flight environments. The facility can closely simulate conditions at any point along practically any trajectory of interest experienced by a spacecraft entering an atmosphere. Sub-scale models of blunt atmospheric entry vehicles are accelerated by a two-stage light-gas gun to speeds as high as 20 times the speed of sound and fly ballistic trajectories through a 24-m-long vacuum-rated test section. The test-section pressure (effective altitude), the launch velocity of the model (flight Mach number), and the test-section working gas (planetary atmosphere) are independently variable. The model travels at hypersonic speed through a quiescent test gas, creating a strong bow-shock wave and real-gas effects that closely match conditions achieved during actual atmospheric entry. The challenge with ballistic-range experiments is to obtain quantitative surface measurements from a model traveling at hypersonic speed. The models are relatively small (less than 3.8 cm in diameter), which limits the spatial resolution possible with surface-mounted sensors. Furthermore, since the model is in flight, surface-mounted sensors require some form of on-board telemetry, which must survive the massive acceleration loads experienced during launch (up to 500,000 gravities). Finally, the model and any on-board instrumentation are destroyed at the terminal wall of the range. For these reasons, optical measurement techniques are the most practical means of acquiring data. High-speed thermal imaging has been employed in the Ames ballistic range to measure global surface temperature distributions and to visualize the onset of transition to turbulent flow on the forward regions of hypersonic blunt bodies. Both visible-wavelength and infrared high-speed cameras are in use. The visible-wavelength cameras are intensified CCD imagers capable of integration

  13. Thermoelectric infrared imaging sensors for automotive applications

    Science.gov (United States)

    Hirota, Masaki; Nakajima, Yasushi; Saito, Masanori; Satou, Fuminori; Uchiyama, Makoto

    2004-07-01

    This paper describes three low-cost thermoelectric infrared imaging sensors, having 1,536-, 2,304-, and 10,800-element thermoelectric focal plane arrays (FPAs) respectively, and two experimental automotive application systems. The FPAs are fabricated with a basically conventional IC process plus micromachining technologies and have low-cost potential. Among these sensors, the 2,304-element sensor provides a high responsivity of 5,500 V/W in a very small package by adopting a vacuum-sealed package integrated with a wide-angle ZnS lens; this performance is suitable for consumer electronics as well as automotive applications. One experimental system, incorporated in the Nissan ASV-2, is a blind-spot pedestrian warning system that employs four infrared imaging sensors. This system helps alert the driver to the presence of a pedestrian in a blind spot by detecting the infrared radiation emitted from the person's body, and it can also prevent the vehicle from moving in the direction of the pedestrian. The other is a rearview camera system with an infrared detection function, consisting of a visible camera and infrared sensors, which helps alert the driver to the presence of a pedestrian in a rear blind spot. Various issues that will need to be addressed to expand the automotive applications of IR imaging sensors in the future are also summarized.

  14. CMOS image sensor-based immunodetection by refractive-index change.

    Science.gov (United States)

    Devadhasan, Jasmine P; Kim, Sanghyo

    2012-01-01

    A complementary metal oxide semiconductor (CMOS) image sensor is an intriguing technology for the development of novel biosensors. Indeed, the mechanism by which a CMOS image sensor detects antigen-antibody (Ag-Ab) interactions at the nanoscale has so far been ambiguous, and more extensive research has been necessary to achieve point-of-care diagnostic devices. This research demonstrates CMOS image sensor-based analysis of Ag-Ab interactions for cardiovascular disease markers, such as C-reactive protein (CRP) and troponin I, on indium nanoparticle (InNP) substrates through simple photon-count variation. The developed sensor can detect proteins even at fg/mL concentrations under ordinary room light. Possible mechanisms, such as dielectric-constant and refractive-index changes, have been studied and proposed. A dramatic change in refractive index after protein adsorption on the InNP substrate was observed to be the predominant factor in CMOS image sensor-based immunoassay.

  15. Development of integrated semiconductor optical sensors for functional brain imaging

    Science.gov (United States)

    Lee, Thomas T.

    Optical imaging of neural activity is a widely accepted technique for imaging brain function in the field of neuroscience research, and has been used to study the cerebral cortex in vivo for over two decades. Maps of brain activity are obtained by monitoring intensity changes in back-scattered light, called Intrinsic Optical Signals (IOS), that correspond to fluctuations in blood oxygenation and volume associated with neural activity. Current imaging systems typically employ bench-top equipment including lamps and CCD cameras to study animals using visible light. Such systems require the use of anesthetized or immobilized subjects with craniotomies, which imposes limitations on the behavioral range and duration of studies. The ultimate goal of this work is to overcome these limitations by developing a single-chip semiconductor sensor using arrays of sources and detectors operating at near-infrared (NIR) wavelengths. A single-chip implementation, combined with wireless telemetry, will eliminate the need for immobilization or anesthesia of subjects and allow in vivo studies of free behavior. NIR light offers additional advantages because it experiences less absorption in animal tissue than visible light, which allows for imaging through superficial tissues. This, in turn, reduces or eliminates the need for traumatic surgery and enables long-term brain-mapping studies in freely-behaving animals. This dissertation concentrates on key engineering challenges of implementing the sensor. This work shows the feasibility of using a GaAs-based array of vertical-cavity surface emitting lasers (VCSELs) and PIN photodiodes for IOS imaging. I begin with in-vivo studies of IOS imaging through the skull in mice, and use these results along with computer simulations to establish minimum performance requirements for light sources and detectors. I also evaluate the performance of a current commercial VCSEL for IOS imaging, and conclude with a proposed prototype sensor.

  16. Noise analysis of a novel hybrid active-passive pixel sensor for medical X-ray imaging

    International Nuclear Information System (INIS)

    Safavian, N.; Izadi, M.H.; Sultana, A.; Wu, D.; Karim, K.S.; Nathan, A.; Rowlands, J.A.

    2009-01-01

    The passive pixel sensor (PPS) is one of the most widely used architectures in large-area amorphous silicon (a-Si) flat-panel imagers. It consists of a detector and a thin-film transistor (TFT) acting as a readout switch. While the PPS is advantageous in providing a simple, small architecture suitable for high-resolution imaging, it directly exposes the signal to the noise of the data line and external readout electronics, causing a significant increase in the minimum readable sensor input signal. In this work we present the operation and noise performance of a hybrid 3-TFT current-programmed, current-output active pixel sensor (APS) suitable for real-time X-ray imaging. The pixel circuit extends the application of the a-Si TFT from a conventional switching element to an on-pixel amplifier for enhanced signal-to-noise ratio and higher imager dynamic range. The capability of operating in both passive and active modes, as well as compensating for the inherent instabilities of the TFTs, makes the architecture a good candidate for X-ray imaging modalities with a wide range of incoming X-ray intensities. Measurements and theoretical calculations reveal an input-referred noise below the 1000-electron noise limit for real-time fluoroscopy. (copyright 2009 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)

  17. Range-Free Localization Schemes for Large Scale Sensor Networks

    National Research Council Canada - National Science Library

    He, Tian; Huang, Chengdu; Blum, Brain M; Stankovic, John A; Abdelzaher, Tarek

    2003-01-01

    .... Because coarse accuracy is sufficient for most sensor network applications, solutions in range-free localization are being pursued as a cost-effective alternative to more expensive range-based approaches...

  18. Optimization of CMOS image sensor utilizing variable temporal multisampling partial transfer technique to achieve full-frame high dynamic range with superior low light and stop motion capability

    Science.gov (United States)

    Kabir, Salman; Smith, Craig; Armstrong, Frank; Barnard, Gerrit; Schneider, Alex; Guidash, Michael; Vogelsang, Thomas; Endsley, Jay

    2018-03-01

    Differential binary pixel technology is a threshold-based timing, readout, and image reconstruction method that utilizes the subframe partial charge transfer technique in a standard four-transistor (4T) pixel CMOS image sensor to achieve high-dynamic-range video with stop motion. This technology improves the low-light signal-to-noise ratio (SNR) by up to 21 dB. The method is verified in silicon using a 1-megapixel test-chip array in Taiwan Semiconductor Manufacturing Company's 65-nm, 1.1-μm-pixel technology and is compared with a traditional 4× oversampling technique using full charge transfer to show the low-light SNR superiority of the presented technology.

  19. Scintillator high-gain avalanche rushing photoconductor active-matrix flat panel imager: zero-spatial frequency x-ray imaging properties of the solid-state SHARP sensor structure.

    Science.gov (United States)

    Wronski, M; Zhao, W; Tanioka, K; Decrescenzo, G; Rowlands, J A

    2012-11-01

    The authors are investigating the feasibility of a new type of solid-state x-ray imaging sensor with programmable avalanche gain: the scintillator high-gain avalanche rushing photoconductor active matrix flat panel imager (SHARP-AMFPI). The purpose of the present work is to investigate the inherent x-ray detection properties of SHARP and demonstrate its wide dynamic range through programmable gain. A distributed resistive layer (DRL) was developed to maintain stable avalanche-gain operation in a solid-state HARP. The signal and noise properties of the HARP-DRL for optical photon detection were investigated as a function of avalanche gain, both theoretically and experimentally, and the results were compared with the HARP tube (with electron-beam readout) used in previous investigations of the zero-spatial-frequency performance of SHARP. For this new investigation, a solid-state SHARP x-ray image sensor was formed by direct optical coupling of the HARP-DRL with a structured cesium iodide (CsI) scintillator. The x-ray sensitivity of this sensor was measured as a function of avalanche gain, and the results were compared with the sensitivity of the HARP-DRL measured optically. The dynamic range of the HARP-DRL with variable avalanche gain was investigated for the entire exposure range encountered in radiography/fluoroscopy (R/F) applications. The signal from the HARP-DRL as a function of electric field showed stable avalanche gain, and the noise associated with the avalanche process agrees well with theory and with previous measurements from a HARP tube. This result indicates that when coupled with CsI for x-ray detection, the additional noise associated with avalanche gain in the HARP-DRL is negligible. The x-ray sensitivity measurements using the SHARP sensor produced the same avalanche-gain dependence on electric field as the optical measurements with the HARP-DRL. Adjusting the avalanche multiplication gain in the HARP-DRL enabled a very wide dynamic range which encompassed all clinically relevant

  20. Scintillator high-gain avalanche rushing photoconductor active-matrix flat panel imager: Zero-spatial frequency x-ray imaging properties of the solid-state SHARP sensor structure

    International Nuclear Information System (INIS)

    Wronski, M.; Zhao, W.; Tanioka, K.; DeCrescenzo, G.; Rowlands, J. A.

    2012-01-01

    Purpose: The authors are investigating the feasibility of a new type of solid-state x-ray imaging sensor with programmable avalanche gain: scintillator high-gain avalanche rushing photoconductor active matrix flat panel imager (SHARP-AMFPI). The purpose of the present work is to investigate the inherent x-ray detection properties of SHARP and demonstrate its wide dynamic range through programmable gain. Methods: A distributed resistive layer (DRL) was developed to maintain stable avalanche gain operation in a solid-state HARP. The signal and noise properties of the HARP-DRL for optical photon detection were investigated as a function of avalanche gain both theoretically and experimentally, and the results were compared with HARP tube (with electron beam readout) used in previous investigations of zero spatial frequency performance of SHARP. For this new investigation, a solid-state SHARP x-ray image sensor was formed by direct optical coupling of the HARP-DRL with a structured cesium iodide (CsI) scintillator. The x-ray sensitivity of this sensor was measured as a function of avalanche gain and the results were compared with the sensitivity of HARP-DRL measured optically. The dynamic range of HARP-DRL with variable avalanche gain was investigated for the entire exposure range encountered in radiography/fluoroscopy (R/F) applications. Results: The signal from HARP-DRL as a function of electric field showed stable avalanche gain, and the noise associated with the avalanche process agrees well with theory and previous measurements from a HARP tube. This result indicates that when coupled with CsI for x-ray detection, the additional noise associated with avalanche gain in HARP-DRL is negligible. The x-ray sensitivity measurements using the SHARP sensor produced identical avalanche gain dependence on electric field as the optical measurements with HARP-DRL. Adjusting the avalanche multiplication gain in HARP-DRL enabled a very wide dynamic range which encompassed all

  1. Vision communications based on LED array and imaging sensor

    Science.gov (United States)

    Yoo, Jong-Ho; Jung, Sung-Yoon

    2012-11-01

    In this paper, we propose a brand new communication concept, called "vision communication", based on an LED array and an image sensor. The system consists of an LED array as the transmitter and a digital device containing an image sensor, such as a CCD or CMOS sensor, as the receiver. To transmit data, the proposed communication scheme simultaneously uses digital image processing and optical wireless communication. Cognitive communication is therefore possible with the help of the recognition techniques used in vision systems. To increase the data rate, our scheme can use an LED array consisting of several multi-spectral LEDs. Because each LED in the array can emit a multi-spectral optical signal (visible, infrared, or ultraviolet light), the data rate can be increased in a manner similar to the WDM and MIMO techniques used in traditional optical and wireless communications. In addition, this multi-spectral capability also makes it possible to avoid optical noise in the communication environment. In our vision communication scheme, each data packet is composed of sync data and information data. The sync data are used to detect the transmitter area and to calibrate the distorted image snapshots obtained by the image sensor. By matching the optical rate of the LED array to the frame rate (frames per second) of the image sensor, we can decode the information data contained in each image snapshot using image processing and optical wireless communication techniques. Through experiments on a practical test-bed system, we confirm the feasibility of the proposed vision communication based on an LED array and an image sensor.
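    The decoding step described above — dividing each rectified snapshot into regions matching the LED array layout and thresholding each region into a bit — can be sketched as follows. The grid size, threshold, and frame layout are illustrative assumptions, not parameters from the paper:

```python
import numpy as np

def decode_led_frame(frame, grid=(4, 4), threshold=128):
    """Split one image snapshot into grid cells (one per LED) and
    read each cell as bit 1 if its mean intensity exceeds the
    threshold, else bit 0."""
    h, w = frame.shape
    rows, cols = grid
    bits = []
    for r in range(rows):
        for c in range(cols):
            cell = frame[r * h // rows:(r + 1) * h // rows,
                         c * w // cols:(c + 1) * w // cols]
            bits.append(1 if cell.mean() > threshold else 0)
    return bits

# Synthetic snapshot: only the top-left LED is lit
frame = np.zeros((64, 64))
frame[:16, :16] = 255
bits = decode_led_frame(frame)  # bits[0] == 1, all others 0
```

    In a real system this would run once per frame, after the sync data has been used to locate and rectify the transmitter region.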

  2. An ultrasensitive method of real time pH monitoring with complementary metal oxide semiconductor image sensor.

    Science.gov (United States)

    Devadhasan, Jasmine Pramila; Kim, Sanghyo

    2015-02-09

    CMOS sensors are becoming a powerful tool in the biological and chemical fields. In this work, we introduce a new approach to quantifying various pH solutions with a CMOS image sensor. The CMOS image sensor based pH measurement produces high-accuracy analysis, making it a truly portable and user-friendly system. A pH-indicator-blended hydrogel matrix was fabricated as a thin film for accurate color development. A distinct red, green, and blue (RGB) color change develops in the hydrogel film when various pH solutions (pH 1-14) are applied. A semi-quantitative pH estimate was acquired by visual readout. Further, the CMOS image sensor captures the RGB color intensity of the film, and the hue value is converted into digital numbers with the aid of an analog-to-digital converter (ADC) to determine the pH range of a solution. A chromaticity diagram and Euclidean distances represent the RGB color space and the differentiation of pH ranges, respectively. This technique is applicable to sensing various toxic chemicals and chemical vapors by in situ sensing. Ultimately, the entire approach can be integrated into a smartphone and operated in a user-friendly manner. Copyright © 2014 Elsevier B.V. All rights reserved.
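    The Euclidean-distance step implies a nearest-colour classification, which can be sketched as below. The reference colours are invented placeholders, not the paper's calibration data:

```python
import colorsys
import math

def classify_ph(rgb, references):
    """Return the reference pH whose RGB colour is closest to the
    measured pixel colour (Euclidean distance in RGB space)."""
    return min(references, key=lambda ph: math.dist(rgb, references[ph]))

# Hypothetical reference colours for three pH values
refs = {1: (200, 40, 40), 7: (60, 180, 60), 14: (40, 40, 200)}

r, g, b = 190, 50, 45                       # measured film colour
hue = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)[0] * 360  # hue angle, degrees
ph = classify_ph((r, g, b), refs)           # nearest reference is pH 1
```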

  3. CMOS-sensors for energy-resolved X-ray imaging

    International Nuclear Information System (INIS)

    Doering, D.; Amar-Youcef, S.; Deveaux, M.; Linnik, B.; Müntz, C.; Stroth, Joachim; Baudot, J.; Dulinski, W.; Kachel, M.

    2016-01-01

    Due to their low noise, CMOS Monolithic Active Pixel Sensors are suited to sensing X-rays with quantum energies of a few keV, which is of interest for high-resolution X-ray imaging. Moreover, the good energy resolution of the silicon sensors might be used to measure this quantum energy. Combining both features with the good spatial resolution of CMOS sensors opens the potential to build "color-sensitive" X-ray cameras. Taking such colored images is hampered by the need to operate the CMOS sensors in a single-photon-counting mode, which restricts the photon flux capability of the sensors. More importantly, charge sharing between the pixels smears the potentially good energy resolution of the sensors. Based on our experience with CMOS sensors for charged-particle tracking, we studied techniques to overcome the latter by means of offline processing of the data obtained from a CMOS sensor prototype. We found that the energy resolution of the pixels can be recovered at the expense of reduced quantum efficiency. We introduce the results of our study and discuss the feasibility of taking colored X-ray pictures with CMOS sensors.
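    A common offline remedy for charge sharing is seed-and-cluster summation: locate pixels above a seed threshold and sum the charge in a small neighbourhood around each. The sketch below illustrates this generic technique; it is not necessarily the authors' exact algorithm:

```python
import numpy as np

def cluster_energy(frame, seed_thr, neigh=1):
    """Sum the charge in a (2*neigh+1)^2 window around every pixel
    exceeding the seed threshold, recovering energy smeared across
    neighbouring pixels by charge sharing."""
    energies = []
    for y, x in zip(*np.where(frame > seed_thr)):
        window = frame[max(0, y - neigh):y + neigh + 1,
                       max(0, x - neigh):x + neigh + 1]
        energies.append(window.sum())
    return energies

frame = np.zeros((5, 5))
frame[2, 2], frame[2, 3] = 4.0, 1.9   # one photon's charge split over two pixels
energies = cluster_energy(frame, seed_thr=3.0)  # -> [5.9]
```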

  4. Advanced data visualization and sensor fusion: Conversion of techniques from medical imaging to Earth science

    Science.gov (United States)

    Savage, Richard C.; Chen, Chin-Tu; Pelizzari, Charles; Ramanathan, Veerabhadran

    1993-01-01

    Hughes Aircraft Company and the University of Chicago propose to transfer existing medical imaging registration algorithms to the area of multi-sensor data fusion. The University of Chicago's algorithms have been successfully demonstrated to provide pixel-by-pixel comparison capability for medical sensors with different characteristics. The research will attempt to fuse GOES (Geostationary Operational Environmental Satellite), AVHRR (Advanced Very High Resolution Radiometer), and SSM/I (Special Sensor Microwave Imager) sensor data, which will benefit a wide range of researchers. The algorithms will utilize data visualization and algorithm development tools created by Hughes in its EOSDIS (Earth Observing System Data and Information System) prototyping. This will maximize the work on the fusion algorithms, since support software (e.g. input/output routines) will already exist. The research will produce a portable software library with documentation for use by other researchers.

  5. Study of photoconductor-based radiological image sensors

    International Nuclear Information System (INIS)

    Beaumont, Francois

    1989-01-01

    Because medical imaging techniques are evolving toward digital systems, it is necessary to replace radiological film, which has many drawbacks, with a detector just as efficient that quickly provides a digitizable signal. The purpose of this thesis is to find new X-ray digital imaging processes using photoconductor materials such as amorphous selenium. After reviewing the principle of direct radiology and the functions to be served by the X-ray sensor (i.e. detection, memory, addressing, visualization), we explain the specifications. We especially show the constraints due to the object to be radiographed (condition of minimal exposure) and to the readout signal (electronic detection noise associated with a given readout frequency). As a result of this study, a first photoconductor sensor could be designed. Its principle is based on photo-carrier trapping at the dielectric-photoconductor interface. The readout system requires scanning a laser beam over the sensor surface. The dielectric-photoconductor structure enabled us to estimate the possibilities offered by the sensor and to build a complete X-ray imaging system. The originality of the thermo-dielectric sensor, which was studied next, is that it allows thermally addressed readout. The chosen system consists in varying the capacitance of a ferroelectric polymer whose dielectric permittivity is weak at room temperature. The thermo-dielectric material was studied by thermal or Joule-effect stimulation. During our experiments, trapping was found in a sensor made of amorphous selenium between two electrodes. This new effect was characterized and enabled us to propose a first interpretation. Finally, the comparison of these new sensor concepts with radiological film shows the advantages of the proposed solution. (author) [fr

  6. Hydrogen peroxide sensor: Uniformly decorated silver nanoparticles on polypyrrole for wide detection range

    Science.gov (United States)

    Nia, Pooria Moozarm; Meng, Woi Pei; Alias, Y.

    2015-12-01

    Electrochemically synthesized polypyrrole (PPy) decorated with silver nanoparticles (AgNPs) was prepared and used as a nonenzymatic sensor for hydrogen peroxide (H2O2) detection. Polypyrrole was fabricated through electrodeposition, while silver nanoparticles were deposited on the polypyrrole by the same technique. The field emission scanning electron microscopy (FESEM) images showed that the electrodeposited AgNPs were aligned uniformly along the PPy and that the mean AgNP particle size was around 25 nm. The electrocatalytic activity of AgNPs-PPy-GCE toward H2O2 was studied using chronoamperometry and cyclic voltammetry. The first linear section was in the range of 0.1-5 mM with a limit of detection of 0.115 μmol l-1, and the second linear section extended up to 120 mM with a detection limit of 0.256 μmol l-1 (S/N = 3). Moreover, the sensor presented excellent stability, selectivity, repeatability and reproducibility. These excellent performances make AgNPs-PPy/GCE an ideal nonenzymatic H2O2 sensor.
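    The S/N = 3 detection-limit criterion quoted above is conventionally computed from the calibration slope and the blank noise; a sketch with invented numbers:

```python
import numpy as np

# Synthetic calibration: amperometric response vs. H2O2 concentration
conc = np.array([0.1, 0.5, 1.0, 2.0, 5.0])     # mM (first linear range)
curr = np.array([0.5, 2.5, 5.0, 10.0, 25.0])   # uA, assumed perfectly linear

slope, intercept = np.polyfit(conc, curr, 1)   # least-squares calibration line
sigma_blank = 0.05                             # std. dev. of blank signal (assumed)
lod = 3 * sigma_blank / slope                  # S/N = 3 detection limit, in mM
```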

  7. Fabricating Optical Fiber Imaging Sensors Using Inkjet Printing Technology: a pH Sensor Proof-of-Concept

    Energy Technology Data Exchange (ETDEWEB)

    Carter, J C; Alvis, R M; Brown, S B; Langry, K C; Wilson, T S; McBride, M T; Myrick, M L; Cox, W R; Grove, M E; Colston, B W

    2005-03-01

    We demonstrate the feasibility of using Drop-on-Demand microjet printing technology for fabricating imaging sensors by reproducibly printing an array of photopolymerizable sensing elements, containing a pH sensitive indicator, on the surface of an optical fiber image guide. The reproducibility of the microjet printing process is excellent for microdot (i.e. micron-sized polymer) sensor diameter (92.2 ± 2.2 microns), height (35.0 ± 1.0 microns), and roundness (0.00072 ± 0.00023). pH sensors were evaluated in terms of pH sensing ability (≤2% sensor variation), response time, and hysteresis using a custom fluorescence imaging system. In addition, the microjet technique has distinct advantages over other fabrication methods, which are discussed in detail.

  8. CMOS Imaging of Pin-Printed Xerogel-Based Luminescent Sensor Microarrays.

    Science.gov (United States)

    Yao, Lei; Yung, Ka Yi; Khan, Rifat; Chodavarapu, Vamsy P; Bright, Frank V

    2010-12-01

    We present the design and implementation of a luminescence-based miniaturized multisensor system using pin-printed xerogel materials which act as host media for chemical recognition elements. We developed a CMOS imager integrated circuit (IC) to image the luminescence response of the xerogel-based sensor array. The imager IC uses a 26 × 20 (520 elements) array of active pixel sensors, and each active pixel includes a high-gain phototransistor to convert the detected optical signals into electrical currents. The imager includes a correlated double sampling circuit and a pixel address/digital control circuit; the image data are read out as a coded serial signal. The sensor system uses a light-emitting diode (LED) to excite the target analyte responsive luminophores doped within discrete xerogel-based sensor elements. As a prototype, we developed a 4 × 4 (16 elements) array of oxygen (O2) sensors. Each group of 4 sensor elements in the array (arranged in a row) is designed to provide a different and specific sensitivity to the target gaseous O2 concentration. This property of multiple sensitivities is achieved by using a strategic mix of two oxygen-sensitive luminophores ([Ru(dpp)3]2+ and [Ru(bpy)3]2+) in each pin-printed xerogel sensor element. The CMOS imager consumes an average power of 8 mW operating at 1 kHz sampling frequency driven at 5 V. The developed prototype demonstrates a low-cost and miniaturized luminescence multisensor system.

  9. Lightning Imaging Sensor (LIS) on TRMM Science Data V4

    Data.gov (United States)

    National Aeronautics and Space Administration — The Lightning Imaging Sensor (LIS) Science Data was collected by the Lightning Imaging Sensor (LIS), which was an instrument on the Tropical Rainfall Measurement...

  10. Fingerprint image reconstruction for swipe sensor using Predictive Overlap Method

    Directory of Open Access Journals (Sweden)

    Mardiansyah Ahmad Zafrullah

    2018-01-01

    The swipe sensor is one of many types of biometric authentication sensors widely applied in embedded devices. The sensor produces an overlap in every pixel block of the image, so the picture requires a reconstruction process before the feature extraction process. Conventional reconstruction methods require extensive computation, making them difficult to apply to embedded devices with limited computing power. In this paper, image reconstruction using a predictive overlap method is proposed, which determines the image block shift from the previous set of shift data. The experiments were performed using 36 images generated by a swipe sensor with an area of 128 x 8 pixels, where each image has an overlap in each block. The results reveal that computation speed can increase by up to 86.44% compared with conventional methods, with accuracy decreasing by only 0.008% on average.
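    A minimal version of overlap estimation and stitching is sketched below; the paper's predictive method additionally reuses shift estimates from earlier blocks to cut the search cost, and the block sizes here are invented:

```python
import numpy as np

def estimate_overlap(image, block, max_overlap=3):
    """Return the overlap k (in rows) minimizing the mean squared
    difference between the end of the image built so far and the
    start of the new block."""
    limit = min(max_overlap, len(block) - 1, len(image))
    best, best_err = 0, float('inf')
    for k in range(1, limit + 1):
        err = np.mean((image[-k:] - block[:k]) ** 2)
        if err < best_err:
            best, best_err = k, err
    return best

def stitch(blocks):
    """Reconstruct the swipe image by dropping each block's
    overlapping leading rows."""
    image = blocks[0]
    for blk in blocks[1:]:
        k = estimate_overlap(image, blk)
        image = np.vstack([image, blk[k:]])
    return image

# Synthetic swipe: a 10 x 8 fingerprint read as four blocks overlapping by 2 rows
full = np.arange(80, dtype=float).reshape(10, 8)
blocks = [full[i:i + 4] for i in (0, 2, 4, 6)]
```

    For this synthetic swipe, stitch(blocks) reproduces the full image exactly.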

  11. The challenge of sCMOS image sensor technology to EMCCD

    Science.gov (United States)

    Chang, Weijing; Dai, Fang; Na, Qiyue

    2018-02-01

    In the field of low-illumination image sensors, the noise of the latest scientific-grade CMOS (sCMOS) image sensors is close to that of EMCCDs, and the industry believes sCMOS has the potential to compete with and even replace EMCCD. We therefore selected several typical sCMOS and EMCCD image sensors and cameras and compared their performance parameters. The results show that the signal-to-noise ratio of sCMOS is close to that of EMCCD, and the other parameters are superior. But the signal-to-noise ratio is very important for low-illumination imaging, and the actual imaging results of sCMOS are not ideal. EMCCD is still the first choice in high-performance application fields.
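    The trade-off the comparison rests on can be illustrated with the standard shot-plus-read-noise model: EM gain suppresses read noise at the cost of an excess-noise factor of about √2, so an EMCCD can still win at the very lowest signal levels even against low-read-noise sCMOS. The parameters below are typical textbook values, not data from the compared cameras:

```python
import math

def snr_cmos(signal_e, read_noise_e):
    """SNR of a (s)CMOS pixel: shot noise plus read noise."""
    return signal_e / math.sqrt(signal_e + read_noise_e ** 2)

def snr_emccd(signal_e, read_noise_e, gain, excess=math.sqrt(2)):
    """SNR of an EMCCD pixel: EM gain divides the read noise, but the
    multiplication process adds an excess-noise factor (~sqrt(2) at
    high gain)."""
    return signal_e / math.sqrt(excess ** 2 * signal_e
                                + (read_noise_e / gain) ** 2)

# Typical numbers: 1.5 e- read noise sCMOS vs. 50 e- read noise EMCCD at gain 300.
# At a mean signal of 1 electron the EMCCD still leads; by 5 electrons
# the low-read-noise sCMOS pulls ahead.
```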

  12. Imaging intracellular pH in live cells with a genetically encoded red fluorescent protein sensor.

    Science.gov (United States)

    Tantama, Mathew; Hung, Yin Pun; Yellen, Gary

    2011-07-06

    Intracellular pH affects protein structure and function, and proton gradients underlie the function of organelles such as lysosomes and mitochondria. We engineered a genetically encoded pH sensor by mutagenesis of the red fluorescent protein mKeima, providing a new tool to image intracellular pH in live cells. This sensor, named pHRed, is the first ratiometric, single-protein red fluorescent sensor of pH. Fluorescence emission of pHRed peaks at 610 nm while exhibiting dual excitation peaks at 440 and 585 nm that can be used for ratiometric imaging. The intensity ratio responds with an apparent pKa of 6.6 and a >10-fold dynamic range. Furthermore, pHRed has a pH-responsive fluorescence lifetime that changes by ~0.4 ns over physiological pH values and can be monitored with single-wavelength two-photon excitation. After characterizing the sensor, we tested pHRed's ability to monitor intracellular pH by imaging energy-dependent changes in cytosolic and mitochondrial pH.
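    A ratiometric readout like pHRed's is typically calibrated with a sigmoidal titration curve and then inverted to recover pH. The sketch below uses the abstract's apparent pKa of 6.6; the Rmin/Rmax calibration constants are assumed, not the paper's values:

```python
import math

def ph_from_ratio(R, Rmin=1.0, Rmax=11.0, pKa=6.6):
    """Invert the sigmoidal calibration
    R(pH) = Rmin + (Rmax - Rmin) / (1 + 10**(pKa - pH))."""
    return pKa + math.log10((R - Rmin) / (Rmax - R))

# Forward model for the same calibration, to check the inversion:
ratio_of = lambda pH: 1.0 + 10.0 / (1 + 10 ** (6.6 - pH))
recovered = ph_from_ratio(ratio_of(7.2))  # -> 7.2
```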

  13. A bio-image sensor for simultaneous detection of multi-neurotransmitters.

    Science.gov (United States)

    Lee, You-Na; Okumura, Koichi; Horio, Tomoko; Iwata, Tatsuya; Takahashi, Kazuhiro; Hattori, Toshiaki; Sawada, Kazuaki

    2018-03-01

    We report here a new bio-image sensor for simultaneous detection of the spatial and temporal distribution of multiple neurotransmitters. It consists of multiple enzyme-immobilized membranes on a 128 × 128 pixel array with read-out circuit. Apyrase and acetylcholinesterase (AChE), as selective elements, are used to recognize adenosine 5'-triphosphate (ATP) and acetylcholine (ACh), respectively. To enhance the spatial resolution, hydrogen ion (H+) diffusion barrier layers are deposited on top of the bio-image sensor, and their prevention capability is demonstrated. The results are used to design the spacing between enzyme-immobilized pixels and the null H+ sensor so as to minimize the undesired signal overlap caused by H+ diffusion. Using this bio-image sensor, we can obtain H+ diffusion-independent imaging of concentration gradients of ATP and ACh in real time. The sensing characteristics, such as sensitivity and limit of detection, are determined experimentally. The proposed bio-image sensor offers the possibility of customizable monitoring of the activities of various neurochemicals by using different kinds of proton-consuming or proton-generating enzymes. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Range Information Characterization of the Hokuyo UST-20LX LIDAR Sensor

    Directory of Open Access Journals (Sweden)

    Matthew A. Cooper

    2018-05-01

    This paper presents a study of the data measurements that the Hokuyo UST-20LX Laser Rangefinder produces, which compiles into an overall characterization of the LIDAR sensor relative to indoor environments. The range measurements, beam divergence, angular resolution, error effects due to some common painted and wooden surfaces, and the error due to target surface orientation are analyzed. It was shown that using a statistical average of sensor measurements provides a more accurate range measurement. It was also shown that the major source of error for the Hokuyo UST-20LX sensor was caused by something that will be referred to as "mixed pixels". Additional error sources are the target surface material and the range relative to the sensor. The purpose of this paper was twofold: (1) to describe a series of tests that can be performed to characterize various aspects of a LIDAR system from a user perspective, and (2) to present a detailed characterization of the commonly-used Hokuyo UST-20LX LIDAR sensor.
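    The statistical-averaging recommendation amounts to reporting the mean of repeated readings together with their spread, so that mixed-pixel outliers show up as an inflated standard deviation. The readings below are synthetic:

```python
import statistics

# Repeated range readings (mm) to a single static target, synthetic
readings_mm = [2003, 1998, 2001, 2000, 1999, 2002]

mean_range = statistics.mean(readings_mm)   # best range estimate
spread = statistics.stdev(readings_mm)      # precision indicator
```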

  15. Design and Performance Analysis of an Intrinsically Safe Ultrasonic Ranging Sensor.

    Science.gov (United States)

    Zhang, Hongjuan; Wang, Yu; Zhang, Xu; Wang, Dong; Jin, Baoquan

    2016-06-13

    In flammable or explosive environments, an ultrasonic sensor for distance measurement poses an important engineering safety challenge, because the driving circuit uses an intermediate frequency transformer as an impedance transformation element, in which the produced heat or spark can cause ignition. In this paper, an intrinsically safe ultrasonic ranging sensor is designed and implemented. A waterproof piezoelectric transducer with an integrated transceiver is chosen as the energy transducing element. A novel transducer driving circuit, designed using an impedance matching method that respects safe-spark parameters, replaces the intermediate frequency transformer. An energy-limiting circuit is then developed to achieve dual-level over-voltage and over-current protection. Detailed calculations and evaluations are carried out, and the electrical characteristics are analyzed to verify the intrinsic safety of the driving circuit. Finally, an experimental platform of the ultrasonic ranging sensor system, which includes short-circuit protection, is constructed. Experimental results show that the proposed ultrasonic ranging sensor is excellent in both ranging performance and intrinsic safety.

  16. Experimental single-chip color HDTV image acquisition system with 8M-pixel CMOS image sensor

    Science.gov (United States)

    Shimamoto, Hiroshi; Yamashita, Takayuki; Funatsu, Ryohei; Mitani, Kohji; Nojiri, Yuji

    2006-02-01

    We have developed an experimental single-chip color HDTV image acquisition system using an 8M-pixel CMOS image sensor. The sensor has 3840 × 2160 effective pixels and is progressively scanned at 60 frames per second. We describe the color filter array and interpolation method used to improve image quality with a high-pixel-count single-chip sensor. We also describe an experimental image acquisition system used to measure spatial frequency characteristics in the horizontal direction. The results indicate good prospects for achieving a high-quality single-chip HDTV camera that reduces pseudo signals and maintains high spatial frequency characteristics within the HDTV frequency band.

  17. IR sensitivity enhancement of CMOS Image Sensor with diffractive light trapping pixels.

    Science.gov (United States)

    Yokogawa, Sozo; Oshiyama, Itaru; Ikeda, Harumi; Ebiko, Yoshiki; Hirano, Tomoyuki; Saito, Suguru; Oinoue, Takashi; Hagimoto, Yoshiya; Iwamoto, Hayato

    2017-06-19

    We report on the IR sensitivity enhancement of a back-illuminated CMOS Image Sensor (BI-CIS) with a 2-dimensional diffractive inverted pyramid array structure (IPA) on crystalline silicon (c-Si) and deep trench isolation (DTI). FDTD simulations of semi-infinitely thick c-Si having 2D IPAs on its surface with pitches over 400 nm show more than 30% improvement of light absorption at λ = 850 nm, and a maximum enhancement of 43% at that wavelength is confirmed with the 540 nm pitch. A prototype BI-CIS sample with a pixel size of 1.2 μm square containing 400 nm pitch IPAs shows 80% sensitivity enhancement at λ = 850 nm compared to the reference sample with a flat surface. This is due to diffraction by the IPA and total reflection at the pixel boundary. The NIR images taken by the demo camera equipped with a C-mount lens show 75% sensitivity enhancement in the λ = 700-1200 nm wavelength range with negligible spatial resolution degradation. Light-trapping CIS pixel technology promises to improve NIR sensitivity and appears to be applicable to many different image sensor applications including security cameras, personal authentication, and range-finding Time-of-Flight cameras with IR illumination.

  18. RSA/Legacy Wind Sensor Comparison. Part 2; Eastern Range

    Science.gov (United States)

    Short, David A.; Wheeler, Mark M.

    2006-01-01

    This report describes a comparison of data from ultrasonic and propeller-and-vane anemometers on 5 wind towers at Kennedy Space Center and Cape Canaveral Air Force Station. The ultrasonic sensors are scheduled to replace the Legacy propeller-and-vane sensors under the Range Standardization and Automation (RSA) program. Because previous studies have noted differences between peak wind speeds reported by mechanical and ultrasonic wind sensors, the latter having no moving parts, the 30th and 45th Weather Squadrons wanted to understand possible differences between the two sensor types. The period-of-record was 13-30 May 2005. A total of 357,626 readings of 1-minute average and peak wind speed/direction from each sensor type were used. Statistics of differences in speed and direction were used to identify 15 out of 19 RSA sensors having the most consistent performance with respect to the Legacy sensors. RSA average wind speed data from these 15 showed a small positive bias of 0.38 kts. A slightly larger positive bias of 0.94 kts was found in the RSA peak wind speed.
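    The bias statistics come from paired 1-minute readings: subtract the Legacy value from the RSA value for each minute, then average. A sketch with invented sample numbers:

```python
import statistics

# Paired 1-minute average wind speeds (knots) from co-located sensors (synthetic)
rsa    = [10.2, 12.5,  8.9, 15.1, 11.0]
legacy = [ 9.8, 12.1,  8.6, 14.7, 10.5]

diffs = [r - l for r, l in zip(rsa, legacy)]
bias = statistics.mean(diffs)     # positive -> RSA reads higher on average
spread = statistics.stdev(diffs)  # consistency of the two sensor types
```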

  19. Performance study of double SOI image sensors

    Science.gov (United States)

    Miyoshi, T.; Arai, Y.; Fujita, Y.; Hamasaki, R.; Hara, K.; Ikegami, Y.; Kurachi, I.; Nishimura, R.; Ono, S.; Tauchi, K.; Tsuboyama, T.; Yamada, M.

    2018-02-01

    Double silicon-on-insulator (DSOI) sensors composed of two thin silicon layers and one thick silicon layer have been developed since 2011. The thick substrate consists of high resistivity silicon with p-n junctions while the thin layers are used as SOI-CMOS circuitry and as shielding to reduce the back-gate effect and crosstalk between the sensor and the circuitry. In 2014, a high-resolution integration-type pixel sensor, INTPIX8, was developed based on the DSOI concept. This device is fabricated using a Czochralski p-type (Cz-p) substrate in contrast to a single SOI (SSOI) device having a single thin silicon layer and a Float Zone p-type (FZ-p) substrate. In the present work, X-ray spectra of both DSOI and SSOI sensors were obtained using an Am-241 radiation source at four gain settings. The gain of the DSOI sensor was found to be approximately three times that of the SSOI device because the coupling capacitance is reduced by the DSOI structure. An X-ray imaging demonstration was also performed and high spatial resolution X-ray images were obtained.
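    The roughly threefold gain increase follows from the pixel conversion gain being inversely proportional to the sense-node capacitance, G = q/C. The capacitance values below are illustrative assumptions, not measurements from the paper:

```python
Q_E = 1.602e-19  # electron charge in coulombs

def conversion_gain_uV_per_e(node_capacitance_fF):
    """Pixel conversion gain G = q/C, in microvolts per electron."""
    return Q_E / (node_capacitance_fF * 1e-15) * 1e6

# Hypothetical capacitances: the DSOI shielding layer cuts the
# coupling capacitance to about a third of the SSOI value.
g_ssoi = conversion_gain_uV_per_e(4.8)  # SSOI sense node
g_dsoi = conversion_gain_uV_per_e(1.6)  # DSOI node, ~1/3 the capacitance
ratio = g_dsoi / g_ssoi                 # -> 3.0, mirroring the reported gain
```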

  20. Special Sensor Microwave Imager/Sounder (SSMIS) Sensor Data Record (SDR) in netCDF

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Special Sensor Microwave Imager/Sounder (SSMIS) is a series of passive microwave conically scanning imagers and sounders onboard the DMSP satellites beginning...

  1. APPLICATION OF SENSOR FUSION TO IMPROVE UAV IMAGE CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    S. Jabari

    2017-08-01

    Image classification is one of the most important tasks of remote sensing projects, including those based on UAV images. Improving the quality of UAV images directly affects the classification results and can save a huge amount of time and effort in this area. In this study, we show that sensor fusion can improve image quality, which results in increased accuracy of image classification. Here, we tested two sensor fusion configurations by using a Panchromatic (Pan) camera along with either a colour camera or a four-band multi-spectral (MS) camera. We use the Pan camera to benefit from its higher sensitivity and the colour or MS camera to benefit from its spectral properties. The resulting images are then compared to ones acquired by a high resolution single Bayer-pattern colour camera (here referred to as HRC). We assessed the quality of the output images by performing image classification tests. The outputs prove that the proposed sensor fusion configurations can achieve higher accuracies compared to the images of the single Bayer-pattern colour camera. Therefore, incorporating a Pan camera on board in UAV missions and performing image fusion can help achieve higher quality images and accordingly higher accuracy classification results.
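    One simple way to inject a Pan camera's spatial detail into lower-resolution spectral bands is the classic Brovey transform. The paper does not state which fusion algorithm it uses, so this is only an illustration:

```python
import numpy as np

def brovey_fuse(pan, ms):
    """Scale each multispectral band by pan / mean(ms bands), so the
    fused bands inherit the Pan image's spatial detail while keeping
    the bands' relative spectral proportions."""
    intensity = ms.mean(axis=0)
    return ms * (pan / (intensity + 1e-12))

rng = np.random.default_rng(0)
pan = rng.random((8, 8))        # high-resolution panchromatic image
ms = rng.random((3, 8, 8))      # spectral bands upsampled to the Pan grid
fused = brovey_fuse(pan, ms)    # per-pixel band mean now matches pan
```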

  2. Application of Sensor Fusion to Improve Uav Image Classification

    Science.gov (United States)

    Jabari, S.; Fathollahi, F.; Zhang, Y.

    2017-08-01

    Image classification is one of the most important tasks of remote sensing projects, including those based on UAV images. Improving the quality of UAV images directly affects the classification results and can save a huge amount of time and effort in this area. In this study, we show that sensor fusion can improve image quality, which results in increased accuracy of image classification. Here, we tested two sensor fusion configurations by using a Panchromatic (Pan) camera along with either a colour camera or a four-band multi-spectral (MS) camera. We use the Pan camera to benefit from its higher sensitivity and the colour or MS camera to benefit from its spectral properties. The resulting images are then compared to ones acquired by a high resolution single Bayer-pattern colour camera (here referred to as HRC). We assessed the quality of the output images by performing image classification tests. The outputs prove that the proposed sensor fusion configurations can achieve higher accuracies compared to the images of the single Bayer-pattern colour camera. Therefore, incorporating a Pan camera on board in UAV missions and performing image fusion can help achieve higher quality images and accordingly higher accuracy classification results.

  3. Two-Level Evaluation on Sensor Interoperability of Features in Fingerprint Image Segmentation

    Directory of Open Access Journals (Sweden)

    Ya-Shuo Li

    2012-03-01

    Features used in fingerprint segmentation significantly affect the segmentation performance. Various features exhibit different discriminating abilities on fingerprint images derived from different sensors. A feature which has better discriminating ability on images derived from a certain sensor may not adapt to segmenting images derived from other sensors, which degrades the segmentation performance. This paper empirically analyzes the sensor interoperability problem of segmentation features, which refers to a feature's ability to adapt to the raw fingerprints captured by different sensors. To address this issue, this paper presents a two-level feature evaluation method, including a first-level feature evaluation based on segmentation error rate and a second-level feature evaluation based on decision trees. The proposed method is performed on a number of fingerprint databases obtained from various sensors. Experimental results show that the proposed method can effectively evaluate the sensor interoperability of features, and the features with good evaluation results acquire better segmentation accuracies on images originating from different sensors.

  4. Miniature large range multi-axis force-torque sensor for biomechanical applications

    International Nuclear Information System (INIS)

    Brookhuis, R A; Sanders, R G P; Ma, K; Lammerink, T S J; De Boer, M J; Krijnen, G J M; Wiegerink, R J

    2015-01-01

    A miniature force sensor for the measurement of forces and moments at a human fingertip is designed and realized. Thin silicon pillars inside the sensor provide in-plane guidance for shear force measurement and provide the spring constant in normal direction. A corrugated silicon ring around the force sensitive area provides the spring constant in shear direction and seals the interior of the sensor. To detect all load components, capacitive read-out is used. A novel electrode pattern results in a large shear force sensitivity. The fingertip force sensor has a wide force range of up to 60 N in normal direction, ± 30 N in shear direction and a torque range of ± 25 N mm. (paper)

  5. Simplified wide dynamic range CMOS image sensor with 3T APS reset-drain actuation

    OpenAIRE

    Carlos Augusto de Moraes Cruz

    2014-01-01

    An image sensor is an array of small photosensitive cells called pixel sensors. A pixel, from picture (pix) element (el), is the smallest portion of an image. The pixel sensor is thus the smallest cell of an image sensor, capable of detecting a single point of the image. This point is then used to reconstruct a complete image frame. CMOS image sensors are currently widely used both in professional cameras and in mobile devices in general, such as cell...

  6. Enhanced Strain Measurement Range of an FBG Sensor Embedded in Seven-Wire Steel Strands.

    Science.gov (United States)

    Kim, Jae-Min; Kim, Chul-Min; Choi, Song-Yi; Lee, Bang Yeon

    2017-07-18

    FBG sensors offer many advantages, such as insensitivity to electromagnetic waves, small size, high durability, and high sensitivity. However, when embedded in steel strands, their maximum strain measurement range is lower than the yield strain (about 1.0%) of the strands. This study proposes a new FBG sensing technique in which an FBG sensor is recoated with polyimide and protected by a polyimide tube in an effort to enhance the maximum strain measurement range of FBG sensors embedded in strands. The validation test results showed that the proposed FBG sensing technique has a maximum strain measurement range of 1.73% on average, which is 1.73 times higher than the yield strain of the strands. It was confirmed that recoating the FBG sensor with polyimide and protecting it with a polyimide tube can effectively enhance the maximum strain measurement range of FBG sensors embedded in strands.
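    For reference, strain in an FBG is read out from the Bragg wavelength shift through the standard relation Δλ/λ = (1 − p_e)·ε at constant temperature. The numbers below are illustrative, not the paper's measurements:

```python
def strain_from_bragg_shift(lambda_b_nm, d_lambda_nm, p_e=0.22):
    """Strain from a Bragg wavelength shift, with p_e the effective
    photo-elastic coefficient (~0.22 for silica fibre)."""
    return d_lambda_nm / (lambda_b_nm * (1 - p_e))

# Reaching the reported 1.73% range on a 1550 nm grating implies a ~20.9 nm shift:
strain = strain_from_bragg_shift(1550.0, 20.9)   # -> about 0.0173 (1.73%)
```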

  7. An ultra-low power CMOS image sensor with on-chip energy harvesting and power management capability.

    Science.gov (United States)

    Cevik, Ismail; Huang, Xiwei; Yu, Hao; Yan, Mei; Ay, Suat U

    2015-03-06

    An ultra-low power CMOS image sensor with on-chip energy harvesting and power management capability is introduced in this paper. The photodiode pixel array can not only capture images but also harvest solar energy. As such, the CMOS image sensor chip is able to switch between imaging and harvesting modes toward self-powered operation. Moreover, an on-chip maximum power point tracking (MPPT)-based power management system (PMS) is designed for the dual-mode image sensor to further improve the energy efficiency. A new isolated P-well energy harvesting and imaging (EHI) pixel with very high fill factor is introduced. Several ultra-low power design techniques, such as reset and select boosting, have been utilized to maintain a wide pixel dynamic range. The chip was designed and fabricated in a 1.8 V, 1P6M 0.18 µm CMOS process. Total power consumption of the imager is 6.53 µW for a 96 × 96 pixel array with a 1 V supply at a 5 fps frame rate. Up to 30 μW of power could be generated by the new EHI pixels. The PMS is capable of providing 3× the power required during imaging mode with 50% efficiency, allowing energy-autonomous operation with a 72.5% duty cycle.

  8. Research-grade CMOS image sensors for demanding space applications

    Science.gov (United States)

    Saint-Pé, Olivier; Tulet, Michel; Davancens, Robert; Larnaudie, Franck; Magnan, Pierre; Corbière, Franck; Martin-Gonthier, Philippe; Belliot, Pierre

    2017-11-01

    Imaging detectors are key elements of optical instruments and sensors on board space missions dedicated to Earth observation (high-resolution imaging, atmosphere spectroscopy...), Solar System exploration (micro cameras, guidance for autonomous vehicles...) and Universe observation (space telescope focal planes, guiding sensors...). This market was long dominated by CCD technology. Since the mid-90s, CMOS Image Sensors (CIS) have been competing with CCDs in more and more consumer domains (webcams, cell phones, digital cameras...). Featuring significant advantages over CCD sensors for space applications (lower power consumption, smaller system size, better radiation behaviour...), CMOS technology is also expanding in this field, justifying specific R&D and development programs funded by national and European space agencies (mainly CNES, DGA, and ESA). Throughout the 90s, and thanks to steadily improving performance, CIS came to be used successfully for more and more demanding applications, from vision and control functions requiring low-level performance to guidance applications requiring medium-level performance. Recent technology improvements have made it possible to manufacture research-grade CIS that are able to compete with CCDs in the high-performance arena. After an introduction outlining the growing interest of optical instrument designers in CMOS image sensors, this talk will present the existing and foreseen ways to reach high-level electro-optical performance for CIS. The development of CIS prototypes built using an imaging CMOS process, and of devices based on improved designs, will be presented.

  9. Autonomous vision networking: miniature wireless sensor networks with imaging technology

    Science.gov (United States)

    Messinger, Gioia; Goldberg, Giora

    2006-09-01

    The recent emergence of integrated PicoRadio technology and the rise of low-power, low-cost System-on-Chip (SoC) CMOS imagers, coupled with the fast evolution of networking protocols and digital signal processing (DSP), have created a unique opportunity to deploy large-scale, low-cost, intelligent, ultra-low-power distributed wireless sensor networks for visualizing the environment. Of all sensing modalities, vision is the most desired, but its application in distributed sensor networks has so far been elusive. The practicality and viability of ultra-low-power vision networking has now been proven, and its applications are countless: from security and chemical analysis to industrial monitoring, asset tracking and visual recognition, vision networking represents a truly disruptive technology applicable to many industries. The presentation discusses some of the critical components and technologies necessary to make these networks and products affordable and ubiquitous: specifically PicoRadios, CMOS imagers, imaging DSP, networking, and overall wireless sensor network (WSN) system concepts. The paradigm shift from large, centralized and expensive sensor platforms to small, low-cost, distributed sensor networks is possible due to the emergence and convergence of a few innovative technologies. Avaak has developed a vision network that is aided by other sensors such as motion, acoustic and magnetic, and plans to deploy it in military and commercial applications. Compared with other sensors, imagers produce large data files that require pre-processing and a degree of compression before transmission to a network server, in order to minimize the load on the network. Some of the most innovative chemical detectors currently in development are based on sensors that change color or pattern in the presence of the desired analytes. These changes are easily recorded and analyzed by a CMOS imager and an on-board DSP processor.

  10. Single-event transient imaging with an ultra-high-speed temporally compressive multi-aperture CMOS image sensor.

    Science.gov (United States)

    Mochizuki, Futa; Kagawa, Keiichiro; Okihara, Shin-ichiro; Seo, Min-Woong; Zhang, Bo; Takasawa, Taishi; Yasutomi, Keita; Kawahito, Shoji

    2016-02-22

    In the work described in this paper, an image reproduction scheme with an ultra-high-speed temporally compressive multi-aperture CMOS image sensor was demonstrated. The sensor captures an object by compressing a sequence of images with focal-plane temporally random-coded shutters, followed by reconstruction of time-resolved images. Because signals are modulated pixel-by-pixel during capturing, the maximum frame rate is defined only by the charge transfer speed and can thus be higher than those of conventional ultra-high-speed cameras. The frame rate and optical efficiency of the multi-aperture scheme are discussed. To demonstrate the proposed imaging method, a 5×3 multi-aperture image sensor was fabricated. The average rising and falling times of the shutters were 1.53 ns and 1.69 ns, respectively. The maximum skew among the shutters was 3 ns. The sensor observed plasma emission by compressing it to 15 frames, and a series of 32 images at 200 Mfps was reconstructed. In the experiment, by correcting disparities and considering temporal pixel responses, artifacts in the reconstructed images were reduced. An improvement in PSNR from 25.8 dB to 30.8 dB was confirmed in simulations.
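
    As an aside on the capture model above: per pixel, each aperture integrates the light through its own temporally random shutter code, and the time-resolved frames are then recovered from those coded sums. The NumPy sketch below is purely illustrative, using more measurements than unknowns so plain least squares suffices; the actual sensor works in the underdetermined regime (15 coded frames reconstructed into 32 images) with sparsity-based reconstruction.

```python
import numpy as np

rng = np.random.default_rng(0)

T = 8    # time-resolved frames to recover (the paper reconstructs 32)
K = 12   # coded measurements per pixel (kept > T here; the paper uses 15 < 32)

# Random binary shutter codes S[k, t]: 1 when aperture k's shutter is
# open during frame t. Regenerate until the toy system is solvable.
S = rng.integers(0, 2, size=(K, T)).astype(float)
while np.linalg.matrix_rank(S) < T:
    S = rng.integers(0, 2, size=(K, T)).astype(float)

x_true = rng.random(T)   # true temporal signal at one pixel
y = S @ x_true           # coded (compressed) measurements

# Recover the frame sequence; the real sensor needs a sparsity prior here.
x_hat, *_ = np.linalg.lstsq(S, y, rcond=None)
print(np.allclose(x_hat, x_true, atol=1e-6))
```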

  11. Ageing effects on image sensors due to terrestrial cosmic radiation

    NARCIS (Netherlands)

    Nampoothiri, G.G.; Horemans, M.L.R.; Theuwissen, A.J.P.

    2011-01-01

    We analyze the “ageing” effect on image sensors introduced by neutrons present in natural (terrestrial) cosmic environment. The results obtained at sea level are corroborated for the first time with accelerated neutron beam tests and for various image sensor operation conditions. The results reveal

  12. CMOS Imaging of Temperature Effects on Pin-Printed Xerogel Sensor Microarrays.

    Science.gov (United States)

    Lei Yao; Ka Yi Yung; Chodavarapu, Vamsy P; Bright, Frank V

    2011-04-01

    In this paper, we study the effect of temperature on the operation and performance of a xerogel-based sensor microarray coupled to a complementary metal-oxide-semiconductor (CMOS) imager integrated circuit (IC) that images the photoluminescence response of the sensor microarray. The CMOS imager uses a 32 × 32 (1024-element) array of active pixel sensors; each pixel includes a high-gain phototransistor to convert the detected optical signals into electrical currents. A correlated double sampling circuit and pixel address/digital control/signal integration circuits are also implemented on-chip. The CMOS imager data are read out as a serially coded signal. The sensor system uses a light-emitting diode to excite target-analyte-responsive organometallic luminophores doped within discrete xerogel-based sensor elements. As a prototype, we developed a 3 × 3 (9-element) array of oxygen (O2) sensors. Each group of three sensor elements in the array (arranged in a column) is designed to provide a different, specific sensitivity to the target gaseous O2 concentration. This property of multiple sensitivities is achieved by using a mix of two O2-sensitive luminophores in each pin-printed xerogel sensor element. The CMOS imager is designed to be low noise and consumes a static power of 320.4 μW and an average dynamic power of 624.6 μW when operating at a 100-Hz sampling frequency and 1.8-V dc power supply.

  13. Hydrogen peroxide sensor: Uniformly decorated silver nanoparticles on polypyrrole for wide detection range

    International Nuclear Information System (INIS)

    Nia, Pooria Moozarm; Meng, Woi Pei; Alias, Y.

    2015-01-01

    Graphical abstract: - Highlights: • Electrochemical methods were used for depositing silver nanoparticles and polypyrrole. • Silver nanoparticles (25 nm) were uniformly decorated on electrodeposited polypyrrole. • The Ag(NH3)2OH precursor showed better electrochemical performance than AgNO3. • The sensor showed superior performance toward H2O2. - Abstract: Electrochemically synthesized polypyrrole (PPy) decorated with silver nanoparticles (AgNPs) was prepared and used as a nonenzymatic sensor for hydrogen peroxide (H2O2) detection. Polypyrrole was fabricated through electrodeposition, while silver nanoparticles were deposited onto the polypyrrole by the same technique. Field emission scanning electron microscopy (FESEM) images showed that the electrodeposited AgNPs were aligned uniformly along the PPy, with a mean AgNP particle size of around 25 nm. The electrocatalytic activity of AgNPs-PPy-GCE toward H2O2 was studied using chronoamperometry and cyclic voltammetry. The first linear section was in the range of 0.1–5 mM with a limit of detection of 0.115 μmol l−1, and the second linear section extended to 120 mM with a limit of detection of 0.256 μmol l−1 (S/N of 3). Moreover, the sensor presented excellent stability, selectivity, repeatability and reproducibility. These performances make AgNPs-PPy/GCE an ideal nonenzymatic H2O2 sensor.

  14. ANALYSIS OF SPECTRAL CHARACTERISTICS AMONG DIFFERENT SENSORS BY USE OF SIMULATED RS IMAGES

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    This research used an RS image-simulation method to generate apparent-reflectance images at sensor level and ground-reflectance images for the corresponding bands of SPOT-HRV, CBERS-CCD, Landsat-TM and NOAA14-AVHRR. These images were used to analyze inter-sensor differences caused by spectral sensitivity and atmospheric effects. The differences were analyzed in terms of the Normalized Difference Vegetation Index (NDVI). The results showed that differences in the sensors' spectral characteristics change their NDVI and reflectance values; when data from multiple sensors are used in digital analysis, this error should be taken into account. Atmospheric effects make NDVI smaller, and atmospheric correction tends to increase NDVI values. The reflectances and NDVIs of different sensors can be used to analyze the differences among sensor characteristics. The spectral analysis method based on simulated RS images can provide a new way to design the spectral characteristics of new sensors.
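
    As a concrete illustration of the inter-sensor discrepancy analyzed above, the standard index is NDVI = (NIR − Red)/(NIR + Red), so even small band-response differences shift its value. The reflectance numbers below are hypothetical, chosen only to show the effect, and are not taken from the study:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index, computed per pixel."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

# The same scene viewed through two hypothetical sensors whose spectral
# responses differ slightly, shifting band reflectance and hence NDVI.
sensor_a = ndvi(nir=[0.50, 0.42], red=[0.08, 0.10])
sensor_b = ndvi(nir=[0.48, 0.40], red=[0.09, 0.11])
print(sensor_a - sensor_b)  # inter-sensor NDVI discrepancy per pixel
```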

  15. Real time three-dimensional space video rate sensors for millimeter waves imaging based very inexpensive plasma LED lamps

    Science.gov (United States)

    Levanon, Assaf; Yitzhaky, Yitzhak; Kopeika, Natan S.; Rozban, Daniel; Abramovich, Amir

    2014-10-01

    In recent years, much effort has been invested in developing inexpensive but sensitive Millimeter Wave (MMW) detectors that can be used in focal plane arrays (FPAs) to implement real-time MMW imaging. Real-time MMW imaging systems are required for varied applications in fields such as homeland security, medicine, communications, military products and space technology, mainly because this radiation penetrates well through dust storms, fog, heavy rain, dielectric materials, biological tissue, and diverse other materials. Moreover, atmospheric attenuation in this range of the spectrum is relatively low, and scattering is also low compared to the NIR and VIS bands. The lack of inexpensive room-temperature imaging systems makes it difficult to provide a suitable MMW system for many of the above applications. In the last few years we have advanced the research and development of sensors using very inexpensive (30-50 cents) Glow Discharge Detector (GDD) plasma indicator lamps as MMW detectors. This paper presents three kinds of GDD-lamp-based sensor focal plane arrays (FPAs). The three cameras differ in the number of detectors, scanning operation, and detection method. The 1st and 2nd generations are an 8 × 8 pixel array and an 18 × 2 mono-rail scanner array, respectively, both for direct detection and limited to fixed imaging. The latest sensor is a multiplexed 16 × 16 GDD FPA. It permits real-time video-rate imaging at 30 frames/sec and comprehensive 3D MMW imaging. The principle of detection in this sensor is a frequency-modulated continuous-wave (FMCW) scheme in which each of the 16 GDD pixel lines is sampled simultaneously. Direct detection is also possible and can be done through a user-friendly interface. This FPA sensor is built from 256 commercial GDD lamps (International Light, Inc., Peabody, MA, model 527 Ne indicator lamps, 3 mm diameter) as pixel detectors. All three sensors are fully supported

  16. Full Waveform Analysis for Long-Range 3D Imaging Laser Radar

    Directory of Open Access Journals (Sweden)

    Wallace, Andrew M.

    2010-01-01

    Full Text Available The new generation of 3D imaging systems based on laser radar (ladar offers significant advantages in defense and security applications. In particular, it is possible to retrieve 3D shape information directly from the scene and separate a target from background or foreground clutter by extracting a narrow depth range from the field of view by range gating, either in the sensor or by postprocessing. We discuss and demonstrate the applicability of full-waveform ladar to produce multilayer 3D imagery, in which each pixel produces a complex temporal response that describes the scene structure. Such complexity caused by multiple and distributed reflection arises in many relevant scenarios, for example in viewing partially occluded targets, through semitransparent materials (e.g., windows and through distributed reflective media such as foliage. We demonstrate our methodology on 3D image data acquired by a scanning time-of-flight system, developed in our own laboratories, which uses the time-correlated single-photon counting technique.
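
    A minimal sketch of the multi-return idea behind full-waveform processing: each pixel's temporal response (here a toy photon-count histogram) can contain several peaks, one per surface along the line of sight, and each peak's time of flight converts to range. The 50 ps bin width, amplitudes, and threshold below are illustrative assumptions, not values from the paper:

```python
import numpy as np

C = 299_792_458.0   # speed of light, m/s
BIN = 50e-12        # assumed timing-bin width: 50 ps

def returns_from_waveform(w, threshold):
    """Indices of local maxima above threshold: one per surface return."""
    w = np.asarray(w, dtype=float)
    is_peak = (w[1:-1] > w[:-2]) & (w[1:-1] >= w[2:]) & (w[1:-1] > threshold)
    return np.flatnonzero(is_peak) + 1

# Toy histogram with two returns (e.g. foliage, then an occluded target).
t = np.arange(200)
w = 40 * np.exp(-0.5 * ((t - 60) / 3.0) ** 2) \
  + 25 * np.exp(-0.5 * ((t - 140) / 3.0) ** 2)

peaks = returns_from_waveform(w, threshold=5.0)
ranges_m = peaks * BIN * C / 2.0   # round-trip time of flight -> distance
print(peaks, ranges_m)
```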

  17. Spatially digitized tactile pressure sensors with tunable sensitivity and sensing range.

    Science.gov (United States)

    Choi, Eunsuk; Sul, Onejae; Hwang, Soonhyung; Cho, Joonhyung; Chun, Hyunsuk; Kim, Hongjun; Lee, Seung-Beck

    2014-10-24

    When developing an electronic skin with touch sensation, an array of tactile pressure sensors with various ranges of pressure detection need to be integrated. This requires low noise, highly reliable sensors with tunable sensing characteristics. We demonstrate the operation of tactile pressure sensors that utilize the spatial distribution of contact electrodes to detect various ranges of tactile pressures. The device consists of a suspended elastomer diaphragm, with a carbon nanotube thin-film on the bottom, which makes contact with the electrodes on the substrate with applied pressure. The electrodes separated by set distances become connected in sequence with tactile pressure, enabling consecutive electrodes to produce a signal. Thus, the pressure is detected not by how much of a signal is produced but by which of the electrodes is registering an output. By modulating the diaphragm diameter, and suspension height, it was possible to tune the pressure sensitivity and sensing range. Also, adding a fingerprint ridge structure enabled the sensor to detect the periodicity of sub-millimeter grating patterns on a silicon wafer.
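
    The read-out principle above, where pressure is inferred from *which* electrode registers an output rather than from an analog amplitude, can be sketched as follows. The sequential-contact decoding logic is an illustrative assumption, not the authors' circuit:

```python
def pressure_level(electrode_states):
    """Return the highest contacted electrode index (-1 if none).

    The diaphragm contacts the electrodes in sequence as pressure
    grows, so the pressure band is read digitally from the furthest
    electrode that is active.
    """
    level = -1
    for i, contacted in enumerate(electrode_states):
        if contacted:
            level = i
    return level

print(pressure_level([False, False, False]))  # no contact yet
print(pressure_level([True, False, False]))   # light touch
print(pressure_level([True, True, True]))     # firm press
```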

  18. Planoconcave optical microresonator sensors for photoacoustic imaging: pushing the limits of sensitivity (Conference Presentation)

    Science.gov (United States)

    Guggenheim, James A.; Zhang, Edward Z.; Beard, Paul C.

    2016-03-01

    Most photoacoustic scanners use piezoelectric detectors, but these have two key limitations. Firstly, they are optically opaque, inhibiting backward-mode operation. Secondly, it is difficult to achieve adequate detection sensitivity with the small element sizes needed to provide the near-omnidirectional response required for tomographic imaging. Planar Fabry-Perot (FP) ultrasound sensing etalons can overcome both of these limitations and have proved extremely effective for superficial imaging, where the etalon is read out by a focused interrogation beam. However, beam walk-off due to the divergence of the beam fundamentally limits the etalon finesse and thus sensitivity; in essence, the problem is one of insufficient optical confinement. To overcome this, novel planoconcave micro-resonator sensors have been fabricated using precision ink-jet printed polymer domes with curvatures matching that of the laser wavefront. By providing near-perfect beam confinement, we show that it is possible to approach the maximum theoretical limit for finesse (f) imposed by the etalon mirror reflectivities (e.g., f=400 for R=99.2%, in contrast to typical planar sensor values). Free of beam walk-off, viable sensors can be made with significantly greater thickness than planar FP sensors. This provides an additional sensitivity gain for deep-tissue imaging applications such as breast imaging, where detection bandwidths in the low MHz can be tolerated. For example, for a 250 μm thick planoconcave sensor with a -3 dB bandwidth of 5 MHz, the measured NEP was 4 Pa. This NEP is comparable to that provided by mm-scale piezoelectric detectors used for breast imaging applications, but with more uniform frequency response characteristics and an order-of-magnitude smaller element size. Following previous proof-of-concept work, several important advances towards practical application have been made. A family of sensors with bandwidths ranging from 3 MHz to 20 MHz have been fabricated and characterised. A novel interrogation scheme based on

  19. Ultrahigh sensitivity endoscopic camera using a new CMOS image sensor: providing with clear images under low illumination in addition to fluorescent images.

    Science.gov (United States)

    Aoki, Hisae; Yamashita, Hiromasa; Mori, Toshiyuki; Fukuyo, Tsuneo; Chiba, Toshio

    2014-11-01

    We developed a new ultrahigh-sensitivity CMOS camera using a specific sensor that has a wide range of spectral sensitivity. The objective of this study is to present our updated endoscopic technology, which successfully integrates two innovative functions: ultrasensitive imaging and advanced fluorescence viewing. Two different experiments were conducted: one to evaluate the function of the ultrahigh-sensitivity camera, the other to test the availability of the newly developed sensor and its performance as a fluorescence endoscope. In both studies, the distance from the endoscope tip to the target was varied, and endoscopic images in each setting were taken for comparison. In the first experiment, the 3-CCD camera failed to display clear images under low illumination, and the target was hardly visible. In contrast, the CMOS camera was able to display the targets regardless of the camera-target distance under low illumination. Under high illumination, the imaging quality of the two cameras was quite similar. In the second experiment, as a fluorescence endoscope, the CMOS camera was capable of clearly showing the fluorescently activated organs. The ultrahigh-sensitivity CMOS HD endoscopic camera is expected to provide clear images under low illumination, in addition to fluorescence images under high illumination, in the field of laparoscopic surgery.

  20. Novel birefringence interrogation for Sagnac loop interferometer sensor with unlimited linear measurement range.

    Science.gov (United States)

    He, Haijun; Shao, Liyang; Qian, Heng; Zhang, Xinpu; Liang, Jiawei; Luo, Bin; Pan, Wei; Yan, Lianshan

    2017-03-20

    A novel demodulation method for Sagnac loop interferometer based sensors has been proposed and demonstrated, which unwraps the phase changes through birefringence interrogation. A temperature sensor based on a Sagnac loop interferometer was used to verify the feasibility of the proposed method. Several tests over a 40 °C temperature range were accomplished with an excellent linearity of 0.9996 over the full range. The proposed scheme is universal for all Sagnac loop interferometer based sensors, and it has an unlimited linear measurable range, outperforming the conventional demodulation method based on peak/dip tracing. Furthermore, the influence of the wavelength sampling interval and wavelength span on the demodulation error is discussed in this work. The proposed interrogation method is of great significance for Sagnac loop interferometer sensors and might greatly enhance the practical applicability of this type of sensor.
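
    The generic step that gives such schemes an unlimited linear range is phase unwrapping: removing the 2π ambiguities from a wrapped phase trace so that a monotonic measurand maps to a monotonic phase. This NumPy sketch shows only that standard step, not the authors' birefringence interrogation:

```python
import numpy as np

# True interferometric phase drifting monotonically past pi (as a
# measurand such as temperature increases), then folded to (-pi, pi]
# as it would appear when traced from the spectrum.
true_phase = np.linspace(0.0, 6 * np.pi, 50)
wrapped = np.angle(np.exp(1j * true_phase))

# Unwrapping restores the unbounded linear phase from the wrapped trace.
unwrapped = np.unwrap(wrapped)
print(np.allclose(unwrapped, true_phase))
```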

  1. Energy-Efficient Algorithm for Sensor Networks with Non-Uniform Maximum Transmission Range

    Directory of Open Access Journals (Sweden)

    Yimin Yu

    2011-06-01

    Full Text Available In wireless sensor networks (WSNs), the energy hole problem is a key factor affecting network lifetime. In a circular multi-hop sensor network (modeled as concentric coronas), optimizing the transmission ranges of all coronas can effectively improve network lifetime. In this paper, we investigate WSNs with non-uniform maximum transmission ranges, where sensor nodes deployed in different regions may differ in their maximum transmission range. We then propose an Energy-efficient algorithm for Non-uniform Maximum Transmission range (ENMT), which searches for approximately optimal transmission ranges for all coronas in order to prolong network lifetime. Simulation results indicate that ENMT performs better than other algorithms.
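
    The energy-hole effect that motivates per-corona transmission ranges can be sketched with a toy load model: inner coronas relay all outer traffic, so with uniform ranges their per-node energy drain is highest. The node counts, path-loss exponent, and cost model below are illustrative assumptions, not ENMT itself:

```python
def per_node_energy(nodes, ranges, alpha=2.0):
    """Relative per-node energy in each corona: the traffic a corona
    relays (its own plus everything farther out) times an r**alpha
    transmit cost, divided among its nodes.

    nodes[i]  -- sensor count in corona i (index 0 = innermost)
    ranges[i] -- transmission range assigned to corona i
    """
    outward = [sum(nodes[i:]) for i in range(len(nodes))]
    return [outward[i] / nodes[i] * ranges[i] ** alpha
            for i in range(len(nodes))]

# Uniform ranges overload the innermost corona (the energy hole)...
print(per_node_energy([10, 30, 50], ranges=[1.0, 1.0, 1.0]))
# ...while shorter inner-corona hops even out the drain.
print(per_node_energy([10, 30, 50], ranges=[0.6, 0.9, 1.0]))
```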

  2. CSRQ: Communication-Efficient Secure Range Queries in Two-Tiered Sensor Networks

    Directory of Open Access Journals (Sweden)

    Hua Dai

    2016-02-01

    Full Text Available In recent years, we have seen many applications of secure queries in two-tiered wireless sensor networks, where storage nodes are responsible for storing data from nearby sensor nodes and answering queries from the Sink. It is critical to protect data security from a compromised storage node. In this paper, the Communication-efficient Secure Range Query (CSRQ), a privacy- and integrity-preserving range query protocol, is proposed to prevent attackers from gaining information about either the data collected by sensor nodes or the queries issued by the Sink. To preserve privacy and integrity, in addition to employing encoding mechanisms, a novel data structure called the encrypted constraint chain is proposed, which embeds the information needed for integrity verification. The Sink can use this encrypted constraint chain to verify the query result. Performance evaluation shows that CSRQ has lower communication cost than current range query protocols.
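
    A greatly simplified sketch of the constraint-chain idea: chaining each sorted value to its predecessor under a keyed MAC lets the Sink detect a storage node that silently drops an item from a query result. The key, encoding, and boundary handling below are illustrative assumptions; the real CSRQ protocol also encrypts the data and handles range endpoints:

```python
import hashlib
import hmac

KEY = b"shared-sensor-sink-key"   # assumed pre-shared sensor/Sink key

def tag(prev, cur):
    """MAC binding a value to its predecessor in the sorted data."""
    return hmac.new(KEY, f"{prev}|{cur}".encode(), hashlib.sha256).hexdigest()

def build_chain(sorted_vals):
    # One tag per value, each chained to the previous one (None = start).
    return [tag(p, c) for p, c in zip([None] + sorted_vals[:-1], sorted_vals)]

def verify(vals, tags):
    # Sink-side check: recompute the chain; an omitted value breaks a link.
    return tags == [tag(p, c) for p, c in zip([None] + vals[:-1], vals)]

data = [3, 7, 12, 20]
chain = build_chain(data)
print(verify(data, chain))                                 # intact result
print(verify([3, 7, 20], [chain[0], chain[1], chain[3]]))  # 12 dropped
```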

  3. Wearable Wide-Range Strain Sensors Based on Ionic Liquids and Monitoring of Human Activities

    Directory of Open Access Journals (Sweden)

    Shao-Hui Zhang

    2017-11-01

    Full Text Available Wearable sensors for detecting human activities have encouraged the development of highly elastic sensors. In particular, capturing both subtle and large-scale body motion calls for stretchable, wide-range strain sensors, which remain a challenge. Herein, a highly stretchable and transparent strain sensor based on ionic liquids and an elastic polymer has been developed. The as-obtained sensor exhibits impressive stretchability over a wide strain range (from 0.1% to 400%), good bending properties and high sensitivity, with a gauge factor reaching 7.9. Importantly, the sensors show excellent biological compatibility and succeed in monitoring diverse human activities, ranging from complex large-scale multidimensional motions to subtle signals, including wrist, finger and elbow joint bending, finger touch, breathing, speech, swallowing and the pulse wave.
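
    The quoted sensitivity figure is the gauge factor, GF = (ΔR/R0)/ε, i.e. relative resistance change per unit strain. The resistance readings below are hypothetical numbers chosen only to reproduce a GF of about 7.9, not measurements from the paper:

```python
def gauge_factor(r0, r_strained, strain):
    """GF = (dR / R0) / strain, the standard strain-sensor figure of merit."""
    return (r_strained - r0) / r0 / strain

# Hypothetical example: resistance rising from 1.00 kOhm to 1.79 kOhm
# at 10% strain corresponds to the GF ~ 7.9 quoted in the abstract.
print(gauge_factor(1.00e3, 1.79e3, 0.10))
```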

  4. Study of x-ray CCD image sensor and application

    Science.gov (United States)

    Wang, Shuyun; Li, Tianze

    2008-12-01

    In this paper, we describe the composition, characteristics, parameters, and working process of charge-coupled devices (CCDs), along with key techniques and methods for CCD binarization (two-value) processing. The processing chain for CCD video signal quantification is explained, and the structure and function of the X-ray image intensifier, together with the technique for coupling the X-ray image intensifier to the CCD, are analyzed. We analyzed two effective methods of reducing the harm to humans when X-rays are used in medical imaging. One is to reduce the X-ray dose and intensify the transmitted image to obtain the same visual effect; the other is to use an image sensor to transfer the images to a safe area for observation. On this basis, a new method is presented in which a CCD image sensor and an X-ray image intensifier are combined organically. A practical medical X-ray optoelectronic system was designed that can record and time transmission images of the human body. The system mainly consists of the medical X-ray source, X-ray image intensifier, high-resolution CCD camera, image processor, and display. Its characteristics are: it converts the invisible X-ray image into a visible-light image; it outputs vivid images; and it has a short image recording time. We also analyzed the main factors affecting the system's resolution. A medical optoelectronic system using an X-ray image sensor can sharply reduce X-ray harm to humans when used in medical diagnosis. Finally, we discuss the system's prospective applications in medical engineering and related fields.

  5. A Wireless Sensor Network for Vineyard Monitoring That Uses Image Processing

    Science.gov (United States)

    Lloret, Jaime; Bosch, Ignacio; Sendra, Sandra; Serrano, Arturo

    2011-01-01

    The first step in detecting whether a vineyard has any type of deficiency, pest or disease is to observe its stems, its grapes and/or its leaves. Placing a sensor on each leaf of every vine is obviously not feasible in terms of cost and deployment, so new methods are needed to detect these symptoms precisely and economically. In this paper, we present a wireless sensor network where each sensor node takes images of the field and internally uses image processing techniques to detect any unusual status in the leaves, which could be caused by a deficiency, pest, disease or other harmful agent. When a symptom is detected, the sensor node sends a message to a sink node through the wireless sensor network in order to notify the farmer of the problem. The wireless sensor uses the IEEE 802.11 a/b/g/n standard, which allows connections over large distances in open air. This paper describes the wireless sensor network design, the sensor deployment, how each node processes the images in order to monitor the vineyard, and the network traffic obtained from a test bed performed in a flat vineyard in Spain. Although the system is not able to distinguish between deficiencies, pests, diseases or other harmful agents, a symptom image database and a neural network could be added in order to learn from experience and provide an accurate problem diagnosis. PMID:22163948

  7. RSA/Legacy Wind Sensor Comparison. Part 1; Western Range

    Science.gov (United States)

    Short, David A.; Wheeler, Mark M.

    2006-01-01

    This report describes a comparison of data from ultrasonic and cup-and-vane anemometers on 5 wind towers at Vandenberg AFB. The ultrasonic sensors are scheduled to replace the Legacy cup-and-vane sensors under the Range Standardization and Automation (RSA) program. Because previous studies have noted differences between peak wind speeds reported by mechanical and ultrasonic wind sensors (the latter having no moving parts), the 30th and 45th Weather Squadrons wanted to understand possible differences between the two sensor types. The period of record was 13-30 May 2005. A total of 153,961 readings of 1-minute average and peak wind speed/direction from each sensor type were used. Statistics of differences in speed and direction were used to identify 18 of the 34 RSA sensors with the most consistent performance relative to the Legacy sensors. Data from these 18 were used to form a composite comparison. A small positive bias in the composite RSA average wind speed increased from +0.5 kts at 15 kts to +1 kt at 25 kts. A slightly larger positive bias in the RSA peak wind speed increased from +1 kt at 15 kts to +2 kts at 30 kts.
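
    The bias statistics in such a comparison reduce to simple summaries of paired differences between collocated sensors. The wind speed values below are illustrative stand-ins, not the report's data:

```python
import statistics

# Paired 1-minute peak speeds (kts) from a Legacy cup-and-vane sensor
# and the collocated RSA ultrasonic sensor (illustrative values only).
legacy = [14.0, 15.5, 21.0, 25.0, 30.0]
rsa    = [15.0, 16.5, 22.5, 26.5, 32.0]

diffs = [r - l for r, l in zip(rsa, legacy)]
bias = statistics.mean(diffs)       # positive => RSA reads higher
spread = statistics.stdev(diffs)    # sample variability of the difference
print(bias, spread)
```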

  8. A High-Dynamic-Range Optical Remote Sensing Imaging Method for Digital TDI CMOS

    Directory of Open Access Journals (Sweden)

    Taiji Lan

    2017-10-01

    Full Text Available The digital time delay integration (digital TDI) technology of the complementary metal-oxide-semiconductor (CMOS) image sensor has been widely adopted and developed in the optical remote sensing field. However, the details of targets with low illumination or low contrast in high-contrast scenes are often drowned out, because the superposition of multi-stage images in the digital domain multiplies the read noise and the dark noise, limiting the imaging dynamic range. Through an in-depth analysis of the information transfer model of digital TDI, this paper explores effective ways to overcome this issue. Based on the evaluation and analysis of multi-stage images, the entropy-maximized adaptive histogram equalization (EMAHE) algorithm is proposed to improve the ability of images to express the details of dark or low-contrast targets. Furthermore, an image fusion method is utilized, based on gradient pyramid decomposition and entropy weighting of images from different TDI stages, which can improve the detection ability of digital TDI CMOS sensors in complex high-contrast scenes and obtain images that are suitable for recognition by the human eye. The experimental results show that the proposed methods can effectively improve the high-dynamic-range imaging (HDRI) capability of digital TDI CMOS sensors. The obtained images have greater entropy and larger average gradients.
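
    The measure that both the entropy-maximization step and the entropy-weighted fusion rely on is the Shannon entropy of the grey-level histogram. The sketch below shows only that measure, under assumed 8-bit grey levels; it is not the EMAHE algorithm itself:

```python
import numpy as np

def image_entropy(img, bins=256):
    """Shannon entropy (bits) of an image's grey-level histogram.

    Higher entropy means grey levels are used more evenly, i.e. more
    expressed detail; entropy-weighted fusion favours such images.
    """
    hist, _ = np.histogram(img, bins=bins, range=(0, bins))
    p = hist / hist.sum()
    p = p[p > 0]                      # 0 * log(0) terms contribute nothing
    return float(-(p * np.log2(p)).sum())

flat = np.full((32, 32), 128)                  # one grey level: no detail
rich = np.arange(1024).reshape(32, 32) % 256   # all 256 levels, uniformly
print(image_entropy(flat), image_entropy(rich))
```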

  9. Low-power high-accuracy micro-digital sun sensor by means of a CMOS image sensor

    NARCIS (Netherlands)

    Xie, N.; Theuwissen, A.J.P.

    2013-01-01

    A micro-digital sun sensor (µDSS) is a sun detector which senses a satellite's instantaneous attitude angle with respect to the sun. The core of this sensor is a system-on-chip imaging chip referred to as APS+. The APS+ integrates a CMOS active pixel sensor (APS) array of 368 × 368 pixels, a

  10. Camera sensor arrangement for crop/weed detection accuracy in agronomic images.

    Science.gov (United States)

    Romeo, Juan; Guerrero, José Miguel; Montalvo, Martín; Emmi, Luis; Guijarro, María; Gonzalez-de-Santos, Pablo; Pajares, Gonzalo

    2013-04-02

In Precision Agriculture, images coming from camera-based sensors are commonly used for weed identification and crop line detection, either to apply specific treatments or for vehicle guidance purposes. Accuracy of identification and detection is an important issue to be addressed in image processing. There are two main types of parameters affecting the accuracy of the images, namely: (a) extrinsic, related to the sensor's positioning on the tractor; (b) intrinsic, related to the sensor specifications, such as CCD resolution, focal length or iris aperture, among others. Moreover, in agricultural applications the uncontrolled illumination of outdoor environments is also an important factor affecting image accuracy. This paper is focused on two main issues, always with the goal of achieving the highest image accuracy in Precision Agriculture applications, making the following two main contributions: (a) camera sensor arrangement, to adjust extrinsic parameters; and (b) design of strategies for controlling adverse illumination effects.

  11. An Ultra-Low Power CMOS Image Sensor with On-Chip Energy Harvesting and Power Management Capability

    Directory of Open Access Journals (Sweden)

    Ismail Cevik

    2015-03-01

Full Text Available An ultra-low power CMOS image sensor with on-chip energy harvesting and power management capability is introduced in this paper. The photodiode pixel array can not only capture images but also harvest solar energy. As such, the CMOS image sensor chip is able to switch between imaging and harvesting modes toward self-powered operation. Moreover, an on-chip maximum power point tracking (MPPT) based power management system (PMS) is designed for the dual-mode image sensor to further improve the energy efficiency. A new isolated P-well energy harvesting and imaging (EHI) pixel with very high fill factor is introduced. Several ultra-low power design techniques, such as reset and select boosting, have been utilized to maintain a wide pixel dynamic range. The chip was designed and fabricated in a 1.8 V, 1P6M 0.18 µm CMOS process. Total power consumption of the imager is 6.53 µW for a 96 × 96 pixel array at a 1 V supply and a 5 fps frame rate. Up to 30 µW of power can be generated by the new EHI pixels. The PMS is capable of providing 3× the power required during imaging mode with 50% efficiency, allowing energy-autonomous operation with a 72.5% duty cycle.
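The reported figures imply a simple energy balance between harvesting and imaging time. The sketch below is my simplification, not the paper's analysis; the published 72.5% duty cycle depends on mode-switching details the balance does not model.

```python
def max_imaging_duty_cycle(p_harvest_uw, pms_efficiency, p_imaging_uw):
    """Fraction of time the imager can run purely from harvested energy.
    Energy balance over one cycle: duty * P_imaging <= (1 - duty) * P_harvest * eff.
    """
    usable = p_harvest_uw * pms_efficiency
    return usable / (usable + p_imaging_uw)

# Figures from the abstract: 30 uW harvested, 50% PMS efficiency, 6.53 uW imaging
duty = max_imaging_duty_cycle(30.0, 0.5, 6.53)   # roughly 0.70 under this model
```

With these numbers the simple balance lands near 70%, the same order as the reported duty cycle.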

  12. Space-based infrared sensors of space target imaging effect analysis

    Science.gov (United States)

    Dai, Huayu; Zhang, Yasheng; Zhou, Haijun; Zhao, Shuang

    2018-02-01

Target identification is one of the core problems of a ballistic missile defense system, and infrared imaging simulation is an important means of target detection and recognition. This paper first establishes a point-source imaging model for space-based infrared sensors observing ballistic targets above the atmosphere; it then simulates the infrared imaging of an exo-atmospheric ballistic target from two aspects, the space-based sensor's camera parameters and the target's characteristics, and analyzes the effects of camera line-of-sight jitter, camera system noise, and different wavebands on the target image.

  13. Expanding the dynamic measurement range for polymeric nanoparticle pH sensors

    DEFF Research Database (Denmark)

    Sun, Honghao; Almdal, Kristoffer; Andresen, Thomas Lars

    2011-01-01

    Conventional optical nanoparticle pH sensors that are designed for ratiometric measurements in cells have been based on utilizing one sensor fluorophore and one reference fluorophore in each nanoparticle, which results in a relatively narrow dynamic measurement range. This results in substantial...

  14. Single Photon Counting Performance and Noise Analysis of CMOS SPAD-Based Image Sensors

    Science.gov (United States)

    Dutton, Neale A. W.; Gyongy, Istvan; Parmesan, Luca; Henderson, Robert K.

    2016-01-01

    SPAD-based solid state CMOS image sensors utilising analogue integrators have attained deep sub-electron read noise (DSERN) permitting single photon counting (SPC) imaging. A new method is proposed to determine the read noise in DSERN image sensors by evaluating the peak separation and width (PSW) of single photon peaks in a photon counting histogram (PCH). The technique is used to identify and analyse cumulative noise in analogue integrating SPC SPAD-based pixels. The DSERN of our SPAD image sensor is exploited to confirm recent multi-photon threshold quanta image sensor (QIS) theory. Finally, various single and multiple photon spatio-temporal oversampling techniques are reviewed. PMID:27447643
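The peak separation and width (PSW) idea can be sketched numerically: with deep sub-electron read noise the photon-counting histogram resolves discrete single-photon peaks, the mean peak spacing gives the conversion gain, and the peak width divided by that gain gives the read noise in electrons. This is a simplified illustration, not the authors' full estimator.

```python
def read_noise_electrons(peak_centers, peak_sigma):
    """Read noise (e- rms) from single-photon peaks in a photon-counting
    histogram: gain = mean peak spacing (DN per e-), noise = width / gain."""
    spacings = [b - a for a, b in zip(peak_centers, peak_centers[1:])]
    gain = sum(spacings) / len(spacings)
    return peak_sigma / gain

# Peaks for 0..3 photons spaced 2.0 DN apart, each with sigma = 0.5 DN
noise = read_noise_electrons([10.0, 12.0, 14.0, 16.0], 0.5)  # 0.25 e- rms
```

A result well below 0.5 e- rms is what makes the individual photon peaks separable in the first place.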

  15. The lucky image-motion prediction for simple scene observation based soft-sensor technology

    Science.gov (United States)

    Li, Yan; Su, Yun; Hu, Bin

    2015-08-01

High resolution is important for Earth remote sensors, while vibration of the remote sensor platform is a major factor restricting high-resolution imaging. Image-motion prediction and real-time compensation are key technologies for solving this problem. Because the traditional autocorrelation image algorithm cannot meet the demands of simple-scene image stabilization, this paper proposes utilizing soft-sensor technology for image-motion prediction and focuses on optimizing the prediction algorithm. Simulation results indicate that an improved lucky image-motion stabilization algorithm combining a back-propagation neural network (BP NN) and a support vector machine (SVM) is the most suitable for simple-scene image stabilization. The relative error of the image-motion prediction based on soft-sensor technology is below 5%, and the training and computation speed of the mathematical prediction model is fast enough for real-time image stabilization in aerial photography.

  16. Hydrogen peroxide sensor: Uniformly decorated silver nanoparticles on polypyrrole for wide detection range

    Energy Technology Data Exchange (ETDEWEB)

    Nia, Pooria Moozarm, E-mail: pooriamn@yahoo.com; Meng, Woi Pei, E-mail: pmwoi@um.edu.my; Alias, Y., E-mail: yatimah70@um.edu.my

    2015-12-01

Graphical abstract: - Highlights: • An electrochemical method was used to deposit both the silver nanoparticles and the polypyrrole. • Silver nanoparticles (25 nm) were uniformly decorated on the electrodeposited polypyrrole. • The Ag(NH₃)₂OH precursor showed better electrochemical performance than AgNO₃. • The sensor showed superior performance toward H₂O₂. - Abstract: Electrochemically synthesized polypyrrole (PPy) decorated with silver nanoparticles (AgNPs) was prepared and used as a nonenzymatic sensor for hydrogen peroxide (H₂O₂) detection. The polypyrrole was fabricated through electrodeposition, and the silver nanoparticles were deposited on the polypyrrole by the same technique. Field emission scanning electron microscopy (FESEM) images showed that the electrodeposited AgNPs were aligned uniformly along the PPy, with a mean particle size of around 25 nm. The electrocatalytic activity of AgNPs-PPy-GCE toward H₂O₂ was studied using chronoamperometry and cyclic voltammetry. The first linear section was in the range of 0.1–5 mM with a limit of detection of 0.115 μmol l⁻¹, and the second linear section extended up to 120 mM with a detection limit of 0.256 μmol l⁻¹ (S/N = 3). Moreover, the sensor presented excellent stability, selectivity, repeatability and reproducibility. These qualities make AgNPs-PPy/GCE an ideal nonenzymatic H₂O₂ sensor.
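Amperometric calibration of this kind reduces to a straight-line fit of current versus concentration, with the detection limit conventionally taken at S/N = 3. A generic sketch follows; the data values are illustrative, not from the paper.

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

def detection_limit(sd_blank, slope):
    """IUPAC-style limit of detection at S/N = 3."""
    return 3.0 * sd_blank / slope

conc = [0.1, 1.0, 2.0, 5.0]                  # mM (illustrative)
current = [2.0 * c + 0.3 for c in conc]      # uA, perfectly linear here
slope, intercept = linear_fit(conc, current)
```

In practice the residual scatter of the calibration points, not a perfect line, sets how trustworthy the extrapolated detection limit is.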

  17. Nitrogen-rich functional groups carbon nanoparticles based fluorescent pH sensor with broad-range responding for environmental and live cells applications.

    Science.gov (United States)

    Shi, Bingfang; Su, Yubin; Zhang, Liangliang; Liu, Rongjun; Huang, Mengjiao; Zhao, Shulin

    2016-08-15

A fluorescent pH sensor based on nitrogen-rich functional group carbon nanoparticles (N-CNs), with a broad-range response, was prepared by one-pot hydrothermal treatment of melamine and triethanolamine. The as-prepared N-CNs exhibited excellent photoluminescence properties with an absolute quantum yield (QY) of 11.0%. Furthermore, the N-CNs possessed a broad-range pH response: the linear range was pH 3.0 to 12.0, much wider than that of previously reported fluorescent pH sensors. The pH-sensitive response of the N-CNs is tentatively ascribed to photoinduced electron transfer (PET). Cell toxicity experiments showed that the as-prepared N-CNs exhibit low cytotoxicity and excellent biocompatibility, with cell viabilities of more than 87%. The proposed N-CN-based pH sensor was used for pH monitoring of environmental water samples and for pH fluorescence imaging of live T24 cells. The N-CNs are promising as a convenient and general fluorescent pH sensor for environmental monitoring and bioimaging applications. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Laser Doppler Blood Flow Imaging Using a CMOS Imaging Sensor with On-Chip Signal Processing

    Directory of Open Access Journals (Sweden)

    Cally Gill

    2013-09-01

Full Text Available The first fully integrated 2D CMOS imaging sensor with on-chip signal processing for applications in laser Doppler blood flow (LDBF) imaging has been designed and tested. Obtaining a space-efficient design over 64 × 64 pixels means that the standard processing electronics normally used off-chip cannot be implemented at each pixel. The analog signal processing at each pixel is therefore a tailored design for LDBF signals, with balanced optimization for signal-to-noise ratio and silicon area. This custom-made sensor offers key advantages over conventional sensors: the analog signal processing at the pixel level carries out signal normalization; the AC amplification in combination with an anti-aliasing filter allows analog-to-digital conversion with a low number of bits; low-resource implementation of the digital processor enables on-chip processing; and the data bottleneck that exists between the detector and processing electronics has been overcome. The sensor demonstrates good agreement with simulation at each design stage. The measured optical performance of the sensor is demonstrated using modulated light signals and in vivo blood flow experiments. Images showing blood flow changes with arterial occlusion and an inflammatory response to a histamine skin-prick demonstrate that the sensor array is capable of detecting blood flow signals from tissue.

  19. Laser doppler blood flow imaging using a CMOS imaging sensor with on-chip signal processing.

    Science.gov (United States)

    He, Diwei; Nguyen, Hoang C; Hayes-Gill, Barrie R; Zhu, Yiqun; Crowe, John A; Gill, Cally; Clough, Geraldine F; Morgan, Stephen P

    2013-09-18

The first fully integrated 2D CMOS imaging sensor with on-chip signal processing for applications in laser Doppler blood flow (LDBF) imaging has been designed and tested. Obtaining a space-efficient design over 64 × 64 pixels means that the standard processing electronics normally used off-chip cannot be implemented at each pixel. The analog signal processing at each pixel is therefore a tailored design for LDBF signals, with balanced optimization for signal-to-noise ratio and silicon area. This custom-made sensor offers key advantages over conventional sensors: the analog signal processing at the pixel level carries out signal normalization; the AC amplification in combination with an anti-aliasing filter allows analog-to-digital conversion with a low number of bits; low-resource implementation of the digital processor enables on-chip processing; and the data bottleneck that exists between the detector and processing electronics has been overcome. The sensor demonstrates good agreement with simulation at each design stage. The measured optical performance of the sensor is demonstrated using modulated light signals and in vivo blood flow experiments. Images showing blood flow changes with arterial occlusion and an inflammatory response to a histamine skin-prick demonstrate that the sensor array is capable of detecting blood flow signals from tissue.
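Laser Doppler flowmetry conventionally estimates flux from the first moment of the Doppler power spectrum: faster flow shifts scattered power to higher frequencies. The digital sketch below shows that standard algorithm; the chip itself performs the equivalent processing in tailored per-pixel analog electronics, and the normalization shown is one common choice.

```python
def ldf_flux(freqs_hz, power):
    """Laser Doppler flux estimate: first moment of the power spectrum,
    normalized by total power."""
    moment = sum(f * p for f, p in zip(freqs_hz, power))
    total = sum(power)
    return moment / total

freqs = [100.0, 200.0, 400.0]
slow = [2.0, 1.0, 0.0]    # power concentrated at low Doppler shifts
fast = [0.0, 1.0, 2.0]    # power shifted to higher frequencies
```

The same spectrum with its power moved toward higher frequencies yields a larger flux value, which is what occlusion and histamine response modulate in the reported images.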

  20. Transducer-based fiber Bragg grating high-temperature sensor with enhanced range and stability

    Science.gov (United States)

    Mamidi, Venkata Reddy; Kamineni, Srimannarayana; Ravinuthala, Lakshmi Narayana Sai Prasad; Tumu, Venkatappa Rao

    2017-09-01

A fiber Bragg grating (FBG) based high-temperature sensor with enhanced temperature range and stability has been developed and tested. The sensor consists of an FBG and a mechanical transducer that imposes a linear temperature-dependent tensile strain on the FBG by means of the differential linear thermal expansion of two different ceramic materials. The designed sensor is tested over the range 20°C to 1160°C and is expected to measure up to 1500°C.
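Because the transducer is designed to make the Bragg wavelength shift linearly with temperature, readout reduces to inverting that line. The sketch below assumes a hypothetical sensitivity of 13 pm/°C, a typical order of magnitude for FBGs near 1550 nm; the paper's actual calibration will differ.

```python
def fbg_temperature_c(delta_lambda_nm, sensitivity_pm_per_c, t_ref_c=20.0):
    """Temperature from the Bragg wavelength shift, assuming the linear
    response the mechanical transducer is engineered to provide."""
    return t_ref_c + (delta_lambda_nm * 1000.0) / sensitivity_pm_per_c

# A 14.82 nm shift at 13 pm/degC maps to 1160 degC from a 20 degC reference
t = fbg_temperature_c(14.82, 13.0)
```

The transducer matters precisely because a bare FBG's intrinsic thermo-optic response is neither this large nor this linear over such a wide range.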

  1. Quality Factor Effect on the Wireless Range of Microstrip Patch Antenna Strain Sensors

    Directory of Open Access Journals (Sweden)

    Ali Daliri

    2014-01-01

Full Text Available Recently introduced passive wireless strain sensors based on microstrip patch antennas have shown great potential for reliable health and usage monitoring in the aerospace and civil industries. However, the wireless interrogation range of these sensors is limited to a few centimeters, which restricts their practical application. This paper presents an investigation of the effect of circular microstrip patch antenna (CMPA) design on the quality factor and the maximum practical wireless reading range of the sensor. The results reveal that by using appropriate substrate materials the interrogation distance of the CMPA sensor can be increased four-fold, from the previously reported 5 cm to 20 cm, thus considerably improving the viability of this type of wireless sensor for strain measurement and damage detection.

  2. Quality Factor Effect on the Wireless Range of Microstrip Patch Antenna Strain Sensors

    Science.gov (United States)

    Daliri, Ali; Galehdar, Amir; Rowe, Wayne S. T.; John, Sabu; Wang, Chun H.; Ghorbani, Kamran

    2014-01-01

Recently introduced passive wireless strain sensors based on microstrip patch antennas have shown great potential for reliable health and usage monitoring in the aerospace and civil industries. However, the wireless interrogation range of these sensors is limited to a few centimeters, which restricts their practical application. This paper presents an investigation of the effect of circular microstrip patch antenna (CMPA) design on the quality factor and the maximum practical wireless reading range of the sensor. The results reveal that by using appropriate substrate materials the interrogation distance of the CMPA sensor can be increased four-fold, from the previously reported 5 cm to 20 cm, thus considerably improving the viability of this type of wireless sensor for strain measurement and damage detection. PMID:24451457

  3. Wireless image-data transmission from an implanted image sensor through a living mouse brain by intra body communication

    Science.gov (United States)

    Hayami, Hajime; Takehara, Hiroaki; Nagata, Kengo; Haruta, Makito; Noda, Toshihiko; Sasagawa, Kiyotaka; Tokuda, Takashi; Ohta, Jun

    2016-04-01

Intra-body communication technology allows the fabrication of more compact implantable biomedical sensors than RF wireless technology. In this paper, we report the fabrication of an implantable image sensor of 625 µm width and 830 µm length and demonstrate wireless image-data transmission through the brain tissue of a living mouse. The sensor was designed to transmit pixel values as pulse width modulated (PWM) output signals. The PWM signals transmitted from the sensor through the brain tissue were detected by a receiver electrode. Wireless transmission of a two-dimensional image was successfully demonstrated in a living mouse brain. The technique reported here is expected to provide a useful method of data transmission for micro-sized implantable biomedical sensors.
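Pulse width modulation maps each pixel value onto the duration of a pulse, a quantity that survives an attenuating, noisy body channel better than an analog amplitude. The toy encoder/decoder below is my illustration; the chip's actual clocking and signaling details are not given in the abstract.

```python
def pwm_encode(value, period=256):
    """8-bit pixel value -> pulse held high for `value` of `period` ticks."""
    return [1] * value + [0] * (period - value)

def pwm_decode(wave):
    """Recover the pixel value by measuring the pulse width (high ticks)."""
    return sum(wave)

pixel = 137
wave = pwm_encode(pixel)   # round-trips through pwm_decode
```

Even if the received amplitude is distorted, thresholding the waveform and counting high samples recovers the value exactly, which is the appeal of PWM for this channel.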

  4. CMOS SPAD-based image sensor for single photon counting and time of flight imaging

    OpenAIRE

    Dutton, Neale Arthur William

    2016-01-01

The facility to capture the arrival of a single photon is the fundamental limit to the detection of quantised electromagnetic radiation. An image sensor capable of capturing a picture with this ultimate optical and temporal precision is the pinnacle of photo-sensing. The creation of high spatial resolution, single photon sensitive, and time-resolved image sensors in complementary metal oxide semiconductor (CMOS) technology offers numerous benefits in a wide field of applications....

  5. An ultrasensitive strain sensor with a wide strain range based on graphene armour scales.

    Science.gov (United States)

    Yang, Yi-Fan; Tao, Lu-Qi; Pang, Yu; Tian, He; Ju, Zhen-Yi; Wu, Xiao-Ming; Yang, Yi; Ren, Tian-Ling

    2018-06-12

    An ultrasensitive strain sensor with a wide strain range based on graphene armour scales is demonstrated in this paper. The sensor shows an ultra-high gauge factor (GF, up to 1054) and a wide strain range (ε = 26%), both of which present an advantage compared to most other flexible sensors. Moreover, the sensor is developed by a simple fabrication process. Due to the excellent performance, this strain sensor can meet the demands of subtle, large and complex human motion monitoring, which indicates its tremendous application potential in health monitoring, mechanical control, real-time motion monitoring and so on.

  6. The effect of split pixel HDR image sensor technology on MTF measurements

    Science.gov (United States)

    Deegan, Brian M.

    2014-03-01

    Split-pixel HDR sensor technology is particularly advantageous in automotive applications, because the images are captured simultaneously rather than sequentially, thereby reducing motion blur. However, split pixel technology introduces artifacts in MTF measurement. To achieve a HDR image, raw images are captured from both large and small sub-pixels, and combined to make the HDR output. In some cases, a large sub-pixel is used for long exposure captures, and a small sub-pixel for short exposures, to extend the dynamic range. The relative size of the photosensitive area of the pixel (fill factor) plays a very significant role in the output MTF measurement. Given an identical scene, the MTF will be significantly different, depending on whether you use the large or small sub-pixels i.e. a smaller fill factor (e.g. in the short exposure sub-pixel) will result in higher MTF scores, but significantly greater aliasing. Simulations of split-pixel sensors revealed that, when raw images from both sub-pixels are combined, there is a significant difference in rising edge (i.e. black-to-white transition) and falling edge (white-to-black) reproduction. Experimental results showed a difference of ~50% in measured MTF50 between the falling and rising edges of a slanted edge test chart.
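MTF50 is read off the measured MTF curve as the frequency where contrast falls to 50%; with a split-pixel sensor, rising and falling edges produce different curves, so each edge polarity yields its own MTF50. The interpolation sketch below is generic and the sample values are illustrative, not from the paper.

```python
def mtf50(freqs, mtf):
    """Frequency (cycles/pixel) where the MTF first crosses 0.5,
    by linear interpolation between adjacent samples."""
    for f0, m0, f1, m1 in zip(freqs, mtf, freqs[1:], mtf[1:]):
        if m0 >= 0.5 >= m1:
            return f0 + (m0 - 0.5) * (f1 - f0) / (m0 - m1)
    return None  # curve never falls to 0.5 over the sampled range

rising = mtf50([0.0, 0.25, 0.5], [1.0, 0.6, 0.2])   # 0.3125 cycles/pixel
```

Comparing the value computed from a black-to-white edge against one from a white-to-black edge quantifies the ~50% asymmetry the paper reports.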

  7. Technical guidance for the development of a solid state image sensor for human low vision image warping

    Science.gov (United States)

    Vanderspiegel, Jan

    1994-01-01

This report surveys different technologies and approaches for realizing sensors for image warping. The goal is to study the feasibility, technical aspects, and limitations of making an electronic camera with special geometries which implements certain transformations for image warping. This work was inspired by the research done by Dr. Juday at NASA Johnson Space Center on image warping. The study looked into different solid-state technologies for fabricating image sensors. Among the available technologies, CMOS is found to be preferred over CCD: it provides more flexibility to design different functions into the sensor, is more widely available, and is a lower-cost solution. By using an architecture with row and column decoders, one has the added flexibility of addressing pixels at random or reading out only part of the image.

  8. Sparse Detector Imaging Sensor with Two-Class Silhouette Classification

    Directory of Open Access Journals (Sweden)

    David Russomanno

    2008-12-01

Full Text Available This paper presents the design and test of a simple active near-infrared sparse detector imaging sensor. The prototype of the sensor is novel in that it can capture remarkable silhouettes or profiles of a wide variety of moving objects, including humans, animals, and vehicles, using a sparse detector array comprised of only sixteen sensing elements deployed in a vertical configuration. The prototype sensor was built to collect silhouettes for a variety of objects and to evaluate several algorithms for classifying the data obtained from the sensor into two classes: human versus non-human. Initial tests show that the classification of individually sensed objects into two classes can be achieved with accuracy greater than ninety-nine percent (99%) with a subset of the sixteen detectors, using a representative dataset consisting of 512 signatures. The prototype also includes a Web service interface so that the sensor can be tasked in a network-centric environment. The sensor appears to be a low-cost alternative to traditional, high-resolution focal plane array imaging sensors for some applications. After a power optimization study, appropriate packaging, and testing with more extensive datasets, the sensor may be a good candidate for deployment in vast geographic regions for a myriad of intelligent electronic fence and persistent surveillance applications, including perimeter security scenarios.

  9. Highly curved image sensors: a practical approach for improved optical performance.

    Science.gov (United States)

    Guenter, Brian; Joshi, Neel; Stoakley, Richard; Keefe, Andrew; Geary, Kevin; Freeman, Ryan; Hundley, Jake; Patterson, Pamela; Hammon, David; Herrera, Guillermo; Sherman, Elena; Nowak, Andrew; Schubert, Randall; Brewer, Peter; Yang, Louis; Mott, Russell; McKnight, Geoff

    2017-06-12

The significant optical and size benefits of using a curved focal surface for imaging systems have been well studied yet never brought to market for lack of a high-quality, mass-producible, curved image sensor. In this work we demonstrate that commercial silicon CMOS image sensors can be thinned and formed into accurate, highly curved optical surfaces with undiminished functionality. Our key development is a pneumatic forming process that avoids rigid mechanical constraints and suppresses wrinkling instabilities. A combination of forming-mold design, pressure membrane elastic properties, and controlled friction forces enables us to gradually contact the die at the corners and smoothly press the sensor into a spherical shape. Allowing the die to slide into the concave target shape enables a threefold increase in the spherical curvature over prior approaches having mechanical constraints that resist deformation, and create a high-stress, stretch-dominated state. Our process creates a bridge between the high-precision, low-cost, but planar CMOS process and ideal non-planar component shapes such as spherical imagers for improved optical systems. We demonstrate these curved sensors in prototype cameras with custom lenses, measuring exceptional resolution of 3220 line-widths per picture height at an aperture of f/1.2 and nearly 100% relative illumination across the field. Though we use a 1/2.3" format image sensor in this report, we also show this process is generally compatible with many state-of-the-art imaging sensor formats. As an example, we report photogrammetry test data for an APS-C sized silicon die formed to a 30° subtended spherical angle. These gains in sharpness and relative illumination enable a new generation of ultra-high performance, manufacturable, digital imaging systems for scientific, industrial, and artistic use.

  10. Shack-Hartmann centroid detection method based on high dynamic range imaging and normalization techniques

    International Nuclear Information System (INIS)

    Vargas, Javier; Gonzalez-Fernandez, Luis; Quiroga, Juan Antonio; Belenguer, Tomas

    2010-01-01

In the optical quality measuring process of an optical system, including diamond-turned components, the use of a laser light source can produce an undesirable speckle effect in a Shack-Hartmann (SH) CCD sensor. This speckle noise can deteriorate the precision and accuracy of the wavefront sensor measurement. Here we present an SH centroid detection method founded on computer-based techniques and capable of measurement in the presence of strong speckle noise. The method extends the dynamic range imaging capabilities of the SH sensor through the use of a set of different CCD integration times. The resultant extended-range spot map is normalized to accurately obtain the spot centroids. The proposed method has been applied to measure the optical quality of the main optical system (MOS) of the mid-infrared instrument telescope simulator. The wavefront at the exit of this optical system is affected by speckle noise when it is illuminated by a laser source, and by air turbulence because it has a long back focal length (3017 mm). Using the proposed technique, the MOS wavefront error was measured and satisfactory results were obtained.
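The core of the method (merging frames taken at several CCD integration times into one extended-range spot map, then computing intensity-weighted centroids) can be sketched as follows. The per-pixel merge rule below is a simplification of my own; the paper's normalization is more elaborate.

```python
def hdr_merge(frames, times, sat=255):
    """Per pixel, keep the longest unsaturated exposure, scaled to unit
    integration time (a simplified extended-range spot map)."""
    merged = []
    for px in zip(*frames):
        for v, t in sorted(zip(px, times), key=lambda vt: -vt[1]):
            if v < sat:
                merged.append(v / t)
                break
        else:
            merged.append(sat / max(times))  # every exposure saturated
    return merged

def centroid(img, width):
    """Intensity-weighted spot centroid (x, y) on a row-major image."""
    total = sum(img)
    cx = sum(v * (i % width) for i, v in enumerate(img)) / total
    cy = sum(v * (i // width) for i, v in enumerate(img)) / total
    return cx, cy

# 2x2 patch: the spot pixel saturates in the long exposure (t = 10)
long_exp = [0, 255, 0, 0]
short_exp = [0, 100, 0, 0]
spot_map = hdr_merge([long_exp, short_exp], [10.0, 1.0])
```

Merging before centroiding keeps bright speckled spots from clipping while dim spots still get the low-noise long exposure, which is what stabilizes the centroid estimates.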

  11. Self-amplified CMOS image sensor using a current-mode readout circuit

    Science.gov (United States)

    Santos, Patrick M.; de Lima Monteiro, Davies W.; Pittet, Patrick

    2014-05-01

The feature size of CMOS processes has decreased over the past few years, and problems such as reduced dynamic range have become more significant in voltage-mode pixels, even as integrating more functionality inside the pixel has become easier. This work contributes on both fronts: a high signal excursion range using current-mode circuits, together with added functionality through signal amplification inside the pixel. The classic 3T pixel architecture was rebuilt with small modifications to integrate a transconductance amplifier providing a current as output. A matrix of these new pixels operates as one large transistor sourcing an amplified current that is used for signal processing. This current is controlled by the intensity of the light received by the matrix, modulated pixel by pixel. The output current can be adjusted by the biasing circuits to achieve a very large range of output signal levels. It can also be controlled through the matrix size, which permits a very high degree of freedom in the signal level, subject to the current densities allowed inside the integrated circuit. In addition, the matrix can operate at very short integration times. Its applications would be those in which fast image processing and high signal amplification are required and low resolution is not a major problem, such as UV image sensors. Simulation results are presented to support the operation, control, design, signal excursion levels and linearity of a pixel matrix conceived using this new sensor concept.

  12. Dark current spectroscopy of space and nuclear environment induced displacement damage defects in pinned photodiode based CMOS image sensors

    International Nuclear Information System (INIS)

    Belloir, Jean-Marc

    2016-01-01

CMOS image sensors are envisioned for an increasing number of high-end scientific imaging applications such as space imaging or nuclear experiments. Indeed, the performance of high-end CMOS image sensors has dramatically increased in the past years thanks to the unceasing improvements of microelectronics, and these image sensors have substantial advantages over CCDs which make them great candidates to replace CCDs in future space missions. However, in space and nuclear environments, CMOS image sensors must face harsh radiation, which can rapidly degrade their electro-optical performance. In particular, the protons, electrons and ions travelling in space, or the fusion neutrons from nuclear experiments, can displace silicon atoms in the pixels and break the crystalline structure. These displacement damage effects lead to the formation of stable defects and to the introduction of states in the forbidden bandgap of silicon, which can allow the thermal generation of electron-hole pairs. Consequently, non-ionizing radiation leads to a permanent increase of the dark current of the pixels and thus a decrease of the image sensor sensitivity and dynamic range. The aim of the present work is to extend the understanding of the effect of displacement damage on the dark current increase of CMOS image sensors. In particular, this work focuses on the shape of the dark current distribution depending on the particle type, energy and fluence, but also on the image sensor's physical parameters. Thanks to the many conditions tested, an empirical model for the prediction of the dark current distribution induced by displacement damage in nuclear or space environments is experimentally validated and physically justified. Another central part of this work consists in using the dark current spectroscopy technique for the first time on irradiated CMOS image sensors to detect and characterize radiation-induced silicon bulk defects. Many types of defects are detected, and two of them are identified.

  13. RADIOMETRIC NORMALIZATION OF LARGE AIRBORNE IMAGE DATA SETS ACQUIRED BY DIFFERENT SENSOR TYPES

    Directory of Open Access Journals (Sweden)

    S. Gehrke

    2016-06-01

Full Text Available Generating seamless mosaics of aerial images is a particularly challenging task when the mosaic comprises a large number of images, collected over longer periods of time and with different sensors under varying imaging conditions. Such large mosaics typically consist of very heterogeneous image data, both spatially (different terrain types and atmosphere) and temporally (unstable atmospheric properties and even changes in land coverage). We present a new radiometric normalization or, respectively, radiometric aerial triangulation approach that takes advantage of our knowledge about each sensor's properties. The current implementation supports medium and large format airborne imaging sensors of the Leica Geosystems family, namely the ADS line-scanner as well as DMC and RCD frame sensors. A hierarchical modelling, with parameters for the overall mosaic, the sensor type, different flight sessions, strips and individual images, allows for adaptation to each sensor's geometric and radiometric properties. Additional parameters at different hierarchy levels can compensate for radiometric differences of various origins, making up for shortcomings of the preceding radiometric sensor calibration as well as BRDF and atmospheric corrections. The final, relative normalization is based on radiometric tie points in overlapping images, absolute radiometric control points and image statistics. It is computed in a global least squares adjustment for the entire mosaic by altering each image's histogram using a location-dependent mathematical model. This model involves contrast and brightness corrections at radiometric fix points, with bilinear interpolation for corrections in between. The distribution of the radiometric fixes is adaptive to each image and generally increases with image size, enabling optimal local adaptation even for very long image strips as typically captured by a line-scanner sensor. The normalization approach is implemented in
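At its simplest, the relative normalization solves for a contrast (gain) and brightness (offset) per image that best match radiometric tie points. The single-image least-squares sketch below illustrates that step only; the paper solves globally over the whole mosaic with a hierarchical, location-dependent model, which this does not attempt.

```python
def gain_offset_to_reference(tie_ref, tie_img):
    """Least-squares gain/offset mapping an image's tie-point values onto
    the reference values: ref ~= gain * img + offset."""
    n = len(tie_ref)
    mr, mi = sum(tie_ref) / n, sum(tie_img) / n
    sxx = sum((x - mi) ** 2 for x in tie_img)
    sxy = sum((x - mi) * (r - mr) for x, r in zip(tie_img, tie_ref))
    gain = sxy / sxx
    return gain, mr - gain * mi

# Tie-point grey values observed in the reference and in the image to adjust
gain, offset = gain_offset_to_reference([25.0, 45.0, 65.0], [10.0, 20.0, 30.0])
```

Applying `gain * pixel + offset` to the second image brings its tie points onto the reference, which is the per-image building block the global adjustment ties together.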

  14. Indoor and Outdoor Depth Imaging of Leaves With Time-of-Flight and Stereo Vision Sensors

    DEFF Research Database (Denmark)

    Kazmi, Wajahat; Foix, Sergi; Alenya, Guilliem

    2014-01-01

    In this article we analyze the response of Time-of-Flight (ToF) cameras (active sensors) for close range imaging under three different illumination conditions and compare the results with stereo vision (passive) sensors. ToF cameras are sensitive to ambient light and have low resolution but deliver...... poorly under sunlight. Stereo vision is comparatively more robust to ambient illumination and provides high resolution depth data but is constrained by texture of the object along with computational efficiency. Graph cut based stereo correspondence algorithm can better retrieve the shape of the leaves...

  15. Photon detection with CMOS sensors for fast imaging

    International Nuclear Information System (INIS)

    Baudot, J.; Dulinski, W.; Winter, M.; Barbier, R.; Chabanat, E.; Depasse, P.; Estre, N.

    2009-01-01

    Pixel detectors employed in high energy physics aim to detect single minimum ionizing particles with micrometric positioning resolution. Monolithic CMOS sensors succeed in this task thanks to a low equivalent noise charge per pixel of around 10 to 15 e⁻, and a pixel pitch varying from 10 to a few tens of microns. Additionally, due to the possibility of integrating some data treatment in the sensor itself, readout times of 100 μs have been reached for 100 kilo-pixel sensors. These aspects of CMOS sensors are attractive for applications in photon imaging. For X-rays of a few keV, the efficiency is limited to a few % due to the thin sensitive volume. For visible photons, the back-thinned version of the CMOS sensor is sensitive to low intensity sources of a few hundred photons. When a back-thinned CMOS sensor is combined with a photo-cathode, a new hybrid detector results (EBCMOS) that operates as a fast single photon imager. The first EBCMOS was produced in 2007 and demonstrated single photon counting with low dark current in laboratory conditions. It has been compared, in two different biological laboratories, with existing CCD-based 2D cameras for fluorescence microscopy. The current EBCMOS sensitivity and frame rate are comparable to existing EMCCDs. On-going developments aim at increasing this frame rate by at least an order of magnitude. We report, in conclusion, the first test of a new CMOS sensor, LUCY, which reaches 1000 frames per second.

  16. Imaging moving objects from multiply scattered waves and multiple sensors

    International Nuclear Information System (INIS)

    Miranda, Analee; Cheney, Margaret

    2013-01-01

    In this paper, we develop a linearized imaging theory that combines the spatial, temporal and spectral components of multiply scattered waves as they scatter from moving objects. In particular, we consider the case of multiple fixed sensors transmitting and receiving information from multiply scattered waves. We use a priori information about the multipath background. We use a simple model for multiple scattering, namely scattering from a fixed, perfectly reflecting (mirror) plane. We base our image reconstruction and velocity estimation technique on a modification of a filtered backprojection method that produces a phase-space image. We plot examples of point-spread functions for different geometries and waveforms, and from these plots, we estimate the resolution in space and velocity. Through this analysis, we are able to identify how the imaging system depends on parameters such as bandwidth and number of sensors. We ultimately show that enhanced phase-space resolution for a distribution of moving and stationary targets in a multipath environment may be achieved using multiple sensors. (paper)

  17. Implementation of large area CMOS image sensor module using the precision align inspection

    International Nuclear Information System (INIS)

    Kim, Byoung Wook; Kim, Toung Ju; Ryu, Cheol Woo; Lee, Kyung Yong; Kim, Jin Soo; Kim, Myung Soo; Cho, Gyu Seong

    2014-01-01

    This paper describes the implementation of a large area CMOS image sensor module using a precision align inspection program. This work is needed because wafer cutting systems do not always have high precision. The program checks more than 8 points on the sensor edges and aligns the sensors with a moving table. The size of a 2×1 butted CMOS image sensor module, excluding the PCB, is 170 mm×170 mm. The pixel size is 55 μm×55 μm and the number of pixels is 3,072×3,072. The gap between the two CMOS image sensors was arranged to be less than one pixel size.

  18. Implementation of large area CMOS image sensor module using the precision align inspection

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Byoung Wook; Kim, Toung Ju; Ryu, Cheol Woo [Radiation Imaging Technology Center, JBTP, Iksan (Korea, Republic of); Lee, Kyung Yong; Kim, Jin Soo [Nano Sol-Tech INC., Iksan (Korea, Republic of); Kim, Myung Soo; Cho, Gyu Seong [Dept. of Nuclear and Quantum Engineering, KAIST, Daejeon (Korea, Republic of)

    2014-12-15

    This paper describes the implementation of a large area CMOS image sensor module using a precision align inspection program. This work is needed because wafer cutting systems do not always have high precision. The program checks more than 8 points on the sensor edges and aligns the sensors with a moving table. The size of a 2×1 butted CMOS image sensor module, excluding the PCB, is 170 mm×170 mm. The pixel size is 55 μm×55 μm and the number of pixels is 3,072×3,072. The gap between the two CMOS image sensors was arranged to be less than one pixel size.

  19. Optical and Electric Multifunctional CMOS Image Sensors for On-Chip Biosensing Applications

    Directory of Open Access Journals (Sweden)

    Kiyotaka Sasagawa

    2010-12-01

    Full Text Available In this review, the concept, design, performance, and a functional demonstration of multifunctional complementary metal-oxide-semiconductor (CMOS) image sensors dedicated to on-chip biosensing applications are described. We developed a sensor architecture that allows flexible configuration of a sensing pixel array consisting of optical and electric sensing pixels, and designed multifunctional CMOS image sensors that can sense light intensity and electric potential or apply a voltage to an on-chip measurement target. We describe the sensors’ architecture on the basis of the type of electric measurement or imaging functionalities.

  20. Improved laser-based triangulation sensor with enhanced range and resolution through adaptive optics-based active beam control.

    Science.gov (United States)

    Reza, Syed Azer; Khwaja, Tariq Shamim; Mazhar, Mohsin Ali; Niazi, Haris Khan; Nawab, Rahma

    2017-07-20

    Various existing target ranging techniques are limited in terms of the dynamic range of operation and measurement resolution. These limitations arise as a result of a particular measurement methodology, the finite processing capability of the hardware components deployed within the sensor module, and the medium through which the target is viewed. Generally, improving the sensor range adversely affects its resolution and vice versa. Often, a distance sensor is designed for an optimal range/resolution setting depending on its intended application. Optical triangulation is broadly classified as a spatial-signal-processing-based ranging technique and measures target distance from the location of the reflected spot on a position sensitive detector (PSD). In most triangulation sensors that use lasers as a light source, beam divergence, which severely affects sensor measurement range, is often ignored in calculations. In this paper, we first discuss in detail the limitations to ranging imposed by beam divergence, which, in effect, sets the sensor dynamic range. Next, we show how the resolution of laser-based triangulation sensors is limited by the interpixel pitch of a finite-sized PSD. Through the use of tunable focus lenses (TFLs), we then propose a novel design of a triangulation-based optical rangefinder that improves both the sensor resolution and its dynamic range through adaptive electronic control of beam propagation parameters. We present the theory and operation of the proposed sensor and clearly demonstrate a range and resolution improvement with the use of TFLs. Experimental results in support of our claims are shown to be in strong agreement with theory.
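
    The geometric principle described in this record, inferring range from the reflected spot location on the PSD, can be sketched for an idealized parallel-axis triangulation geometry. The function names and the thin-lens simplification below are illustrative assumptions, not the authors' design:

    ```python
    def triangulation_range(spot_pos_m, baseline_m, focal_len_m):
        """Idealized laser triangulation: the laser beam and the lens axis
        are parallel, separated by baseline_m; a target at range z images
        the laser spot at spot_pos_m from the optical axis on the PSD."""
        if spot_pos_m <= 0:
            raise ValueError("spot must fall on the detector side of the axis")
        return focal_len_m * baseline_m / spot_pos_m


    def range_resolution(range_m, pixel_pitch_m, baseline_m, focal_len_m):
        """Smallest resolvable range change for a one-pixel shift of the
        spot, showing how a finite interpixel pitch limits resolution."""
        return range_m ** 2 * pixel_pitch_m / (focal_len_m * baseline_m)
    ```

    Note how the resolution degrades with the square of the distance in this simple model, which is the range/resolution trade-off the record describes.
    
    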

  1. X-ray detectors based on image sensors

    International Nuclear Information System (INIS)

    Costa, A.P.R.

    1983-01-01

    X-ray detectors based on image sensors are described and a comparison is made between the advantages and the disadvantages of such a kind of detectors with the position sensitive detectors. (L.C.) [pt

  2. Multi sensor satellite imagers for commercial remote sensing

    Science.gov (United States)

    Cronje, T.; Burger, H.; Du Plessis, J.; Du Toit, J. F.; Marais, L.; Strumpfer, F.

    2005-10-01

    This paper will discuss and compare recent refractive and catadioptric imager designs developed and manufactured at SunSpace for Multi Sensor Satellite Imagers with panchromatic, multi-spectral, area and hyperspectral sensors on a single Focal Plane Array (FPA). These satellite optical systems were designed with applications such as food supply monitoring, crop yield estimation and disaster monitoring in mind. The aim of these imagers is to achieve medium to high resolution (2.5 m to 15 m) spatial sampling, wide swaths (up to 45 km) and noise equivalent reflectance (NER) values of less than 0.5%. State-of-the-art FPA designs are discussed, addressing the choice of detectors to achieve these performances. Special attention is given to thermal robustness and compactness, the use of folding prisms to place multiple detectors in a large FPA, and a specially developed process to customize the spectral selection while minimizing mass, power and cost. A refractive imager with up to 6 spectral bands (6.25 m GSD) and a catadioptric imager with panchromatic (2.7 m GSD), multi-spectral (6 bands, 4.6 m GSD) and hyperspectral (400 nm to 2.35 μm, 200 bands, 15 m GSD) sensors on the same FPA will be discussed. Both of these imagers are also equipped with real time video view finding capabilities. The electronic units can be subdivided into the Front-End Electronics and Control Electronics with analogue and digital signal processing. A dedicated Analogue Front-End is used for Correlated Double Sampling (CDS), black level correction, variable gain, up to 12-bit digitization and a high speed LVDS data link to a mass memory unit.

  3. Improved linearity using harmonic error rejection in a full-field range imaging system

    Science.gov (United States)

    Payne, Andrew D.; Dorrington, Adrian A.; Cree, Michael J.; Carnegie, Dale A.

    2008-02-01

    Full field range imaging cameras are used to simultaneously measure the distance for every pixel in a given scene using an intensity modulated illumination source and a gain modulated receiver array. The light is reflected from an object in the scene, and the modulation envelope experiences a phase shift proportional to the target distance. Ideally the waveforms are sinusoidal, allowing the phase, and hence object range, to be determined from four measurements using an arctangent function. In practice these waveforms are often not perfectly sinusoidal, and in some cases square waveforms are instead used to simplify the electronic drive requirements. The waveforms therefore commonly contain odd harmonics which contribute a nonlinear error to the phase determination, and therefore an error in the range measurement. We have developed a unique sampling method to cancel the effect of these harmonics, with the results showing an order of magnitude improvement in the measurement linearity without the need for calibration or lookup tables, while the acquisition time remains unchanged. The technique can be applied to existing range imaging systems without having to change or modify the complex illumination or sensor systems, instead only requiring a change to the signal generation and timing electronics.
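
    The four-measurement arctangent step described above can be sketched as follows. This is a minimal model of the ideal sinusoidal case, with names of my choosing; it does not include the harmonic-rejection sampling that is the paper's contribution:

    ```python
    import math

    C = 299792458.0  # speed of light, m/s

    def range_from_four_samples(a0, a1, a2, a3, mod_freq_hz):
        """Recover the modulation-envelope phase from four correlation
        samples taken 90 degrees apart, then convert phase to range.
        Valid only for an ideal sinusoidal envelope; odd harmonics bias
        this estimate, which is the nonlinearity the record addresses."""
        phase = math.atan2(a3 - a1, a0 - a2)
        if phase < 0.0:
            phase += 2.0 * math.pi
        # Light travels to the target and back, so one full phase cycle
        # corresponds to half a modulation wavelength.
        return phase * C / (4.0 * math.pi * mod_freq_hz)
    ```

    At a 30 MHz modulation frequency the unambiguous range is C/(2f), or about 5 m; any target beyond that aliases back into the interval.
    
    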

  4. Miniature infrared hyperspectral imaging sensor for airborne applications

    Science.gov (United States)

    Hinnrichs, Michele; Hinnrichs, Bradford; McCutchen, Earl

    2017-05-01

    Pacific Advanced Technology (PAT) has developed an infrared hyperspectral camera, both MWIR and LWIR, small enough to serve as a payload on a miniature unmanned aerial vehicle. The optical system has been integrated into the cold-shield of the sensor, enabling the small size and weight of the sensor. This new and innovative approach to the infrared hyperspectral imaging spectrometer uses micro-optics and will be explained in this paper. The micro-optics are made up of an area array of diffractive optical elements where each element is tuned to image a different spectral region on a common focal plane array. The lenslet array is embedded in the cold-shield of the sensor and actuated with a miniature piezo-electric motor. This approach enables rapid infrared spectral imaging, with multiple spectral images collected and processed simultaneously in each frame of the camera. This paper will present our optical-mechanical design approach, which results in an infrared hyperspectral imaging system that is small enough for a payload on a mini-UAV or commercial quadcopter. The diffractive optical elements used in the lenslet array are blazed gratings where each lenslet is tuned for a different spectral bandpass. The lenslets are configured in an area array placed a few millimeters above the focal plane and embedded in the cold-shield to reduce the background signal normally associated with the optics. We have developed various systems using different numbers of lenslets in the area array. The size of the focal plane and the diameter of the lenslet array determine the spatial resolution. A 2 × 2 lenslet array images four different spectral images of the scene each frame and, when coupled with a 512 × 512 focal plane array, gives a spatial resolution of 256 × 256 pixels for each spectral image. Another system that we developed uses a 4 × 4 lenslet array on a 1024 × 1024 pixel element focal plane array, which gives 16 spectral images of 256 × 256 pixel resolution each.

  5. High-content analysis of single cells directly assembled on CMOS sensor based on color imaging.

    Science.gov (United States)

    Tanaka, Tsuyoshi; Saeki, Tatsuya; Sunaga, Yoshihiko; Matsunaga, Tadashi

    2010-12-15

    A complementary metal oxide semiconductor (CMOS) image sensor was applied to high-content analysis of single cells which were assembled closely or directly onto the CMOS sensor surface. The direct assembling of cell groups on CMOS sensor surface allows large-field (6.66 mm×5.32 mm in entire active area of CMOS sensor) imaging within a second. Trypan blue-stained and non-stained cells in the same field area on the CMOS sensor were successfully distinguished as white- and blue-colored images under white LED light irradiation. Furthermore, the chemiluminescent signals of each cell were successfully visualized as blue-colored images on CMOS sensor only when HeLa cells were placed directly on the micro-lens array of the CMOS sensor. Our proposed approach will be a promising technique for real-time and high-content analysis of single cells in a large-field area based on color imaging. Copyright © 2010 Elsevier B.V. All rights reserved.

  6. Precipitable water and surface humidity over global oceans from special sensor microwave imager and European Center for Medium Range Weather Forecasts

    Science.gov (United States)

    Liu, W. T.; Tang, Wenqing; Wentz, Frank J.

    1992-01-01

    Global fields of precipitable water W from the special sensor microwave imager were compared with those from the European Center for Medium Range Weather Forecasts (ECMWF) model. They agree over most ocean areas; both data sets capture the two annual cycles examined and the interannual anomalies during an ENSO episode. They show significant differences in the dry air masses over the eastern tropical-subtropical oceans, particularly in the Southern Hemisphere. In these regions, comparisons with radiosonde data indicate that overestimation by the ECMWF model accounts for a large part of the differences. As a check on the W differences, surface-level specific humidity Q derived from W, using a statistical relation, was compared with Q from the ECMWF model. The differences in Q were found to be consistent with the differences in W, indirectly validating the Q-W relation. In both W and Q, SSMI was able to discern clearly the equatorial extension of the tongues of dry air in the eastern tropical ocean, while both ECMWF and climatological fields have reduced spatial gradients and weaker intensity.

  7. Highly Sensitive Multifilament Fiber Strain Sensors with Ultrabroad Sensing Range for Textile Electronics.

    Science.gov (United States)

    Lee, Jaehong; Shin, Sera; Lee, Sanggeun; Song, Jaekang; Kang, Subin; Han, Heetak; Kim, SeulGee; Kim, Seunghoe; Seo, Jungmok; Kim, DaeEun; Lee, Taeyoon

    2018-05-22

    Highly stretchable fiber strain sensors are one of the most important components for various applications in wearable electronics, electronic textiles, and biomedical electronics. Herein, we present a facile approach for fabricating highly stretchable and sensitive fiber strain sensors by embedding Ag nanoparticles into a stretchable fiber with a multifilament structure. The multifilament structure and Ag-rich shells of the fiber strain sensor enable the sensor to simultaneously achieve both a high sensitivity and a wide sensing range despite its simple fabrication process and components. The fiber strain sensor simultaneously exhibits ultrahigh gauge factors (∼9.3 × 10⁵ and ∼659 in the first stretching and subsequent stretching, respectively), a very broad strain-sensing range (450% and 200% for the first and subsequent stretching, respectively), and high durability for more than 10,000 stretching cycles. The fiber strain sensors can also be readily integrated into a glove to control a hand robot and effectively applied to monitor the large volume expansion of a balloon and a pig bladder for an artificial bladder system, thereby demonstrating the potential of the fiber strain sensors as candidates for electronic textiles, wearable electronics, and biomedical engineering.
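
    The gauge factor quoted above is the standard figure of merit for strain sensors: relative resistance change per unit strain. A quick illustrative calculation (the resistance values are hypothetical, not from the paper):

    ```python
    def gauge_factor(r_unstrained_ohm, r_strained_ohm, strain):
        """Gauge factor GF = (dR / R0) / strain, dimensionless."""
        delta_r = r_strained_ohm - r_unstrained_ohm
        return (delta_r / r_unstrained_ohm) / strain

    # A resistance that doubles at 50% strain gives GF = 2;
    # the record's GF of ~9.3e5 implies an enormous resistance
    # change over its 450% sensing range.
    ```
    
    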

  8. Thin-Film Quantum Dot Photodiode for Monolithic Infrared Image Sensors.

    Science.gov (United States)

    Malinowski, Pawel E; Georgitzikis, Epimitheas; Maes, Jorick; Vamvaka, Ioanna; Frazzica, Fortunato; Van Olmen, Jan; De Moor, Piet; Heremans, Paul; Hens, Zeger; Cheyns, David

    2017-12-10

    Imaging in the infrared wavelength range has been fundamental in scientific, military and surveillance applications. Currently, it is a crucial enabler of new industries such as autonomous mobility (for obstacle detection), augmented reality (for eye tracking) and biometrics. Ubiquitous deployment of infrared cameras (on a scale similar to visible cameras) is however prevented by the high manufacturing cost and low resolution related to the need of using image sensors based on flip-chip hybridization. One way to enable monolithic integration is to replace expensive, small-scale III-V-based detector chips with narrow bandgap thin films compatible with 8- and 12-inch full-wafer processing. This work describes a CMOS-compatible pixel stack based on lead sulfide quantum dots (PbS QD) with a tunable absorption peak. A photodiode with a 150-nm thick absorber in an inverted architecture shows a dark current of 10⁻⁶ A/cm² at −2 V reverse bias and EQE above 20% at 1440 nm wavelength. Optical modeling for a top illumination architecture can improve the contact transparency to 70%. Additional cooling (193 K) can improve the sensitivity to 60 dB. This stack can be integrated on a CMOS ROIC, enabling an order-of-magnitude cost reduction for infrared sensors.

  9. A 75-ps Gated CMOS Image Sensor with Low Parasitic Light Sensitivity.

    Science.gov (United States)

    Zhang, Fan; Niu, Hanben

    2016-06-29

    In this study, a 40 × 48 pixel global shutter complementary metal-oxide-semiconductor (CMOS) image sensor with an adjustable shutter time as low as 75 ps was implemented using a 0.5-μm mixed-signal CMOS process. The implementation consisted of a continuous contact ring around each p+/n-well photodiode in the pixel array in order to apply sufficient light shielding. The parasitic light sensitivity of the in-pixel storage node was measured to be 1/(8.5 × 10⁷) when illuminated by a 405-nm diode laser and 1/(1.4 × 10⁴) when illuminated by a 650-nm diode laser. The pixel pitch was 24 μm, the size of the square p+/n-well photodiode in each pixel was 7 μm per side, the measured random readout noise was 217 e⁻ rms, and the measured dynamic range of the pixel of the designed chip was 5500:1. The type of gated CMOS image sensor (CIS) that is proposed here can be used in ultra-fast framing cameras to observe non-repeatable fast-evolving phenomena.

  10. Object-Oriented Hierarchy Radiation Consistency for Different Temporal and Different Sensor Images

    Directory of Open Access Journals (Sweden)

    Nan Su

    2018-02-01

    Full Text Available In this paper, we propose a novel object-oriented hierarchy radiation consistency method for dense matching of different temporal and different sensor data in 3D reconstruction. For different temporal images, our illumination consistency method solves both the illumination uniformity for a single image and the relative illumination normalization for image pairs. In the relative illumination normalization step in particular, singular value equalization and a linear relationship of the invariant pixels are combined for the initial global illumination normalization and the object-oriented refined illumination normalization, respectively. For different sensor images, we propose the union group sparse method, which is based on improving the original group sparse model. The different sensor images are set to a similar smoothness level by the same threshold of singular values from the union group matrix. Our method comprehensively considers the factors influencing the dense matching of different temporal and different sensor stereoscopic image pairs to simultaneously improve the illumination consistency and the smoothness consistency. The radiation consistency experimental results verify the effectiveness and superiority of the proposed method by comparison with two other methods. Moreover, in the dense matching experiment on mixed stereoscopic image pairs, our method shows more advantages for objects in urban areas.

  11. Intelligent Luminance Control of Lighting Systems Based on Imaging Sensor Feedback

    Directory of Open Access Journals (Sweden)

    Haoting Liu

    2017-02-01

    Full Text Available An imaging sensor-based intelligent Light Emitting Diode (LED) lighting system for desk use is proposed. In contrast to traditional intelligent lighting systems, such as those based on photosensitive resistance sensors or infrared sensors, the imaging sensor can realize a finer perception of the environmental light and thus guide a more precise lighting control. Before the system operates, a large amount of typical imaging lighting data for the desk application is first accumulated. Second, a series of subjective and objective Lighting Effect Evaluation Metrics (LEEMs) are defined and assessed for these datasets. Then the cluster benchmarks of the objective LEEMs can be obtained. Third, both a single LEEM-based control and a multiple LEEMs-based control are developed to realize optimal luminance tuning. When the system operates, it first captures the lighting image using a wearable camera. Then it computes the objective LEEMs of the captured image and compares them with the cluster benchmarks of the objective LEEMs. Finally, the single LEEM-based or the multiple LEEMs-based control can be implemented to obtain an optimal lighting effect. Experimental results have shown that the proposed system can tune the LED lamp automatically according to environmental luminance changes.

  12. Retinal fundus imaging with a plenoptic sensor

    Science.gov (United States)

    Thurin, Brice; Bloch, Edward; Nousias, Sotiris; Ourselin, Sebastien; Keane, Pearse; Bergeles, Christos

    2018-02-01

    Vitreoretinal surgery is moving towards 3D visualization of the surgical field. This requires an acquisition system capable of recording such 3D information. We propose a proof-of-concept imaging system based on a light-field camera, where an array of micro-lenses is placed in front of a conventional sensor. With a single snapshot, a stack of images focused at different depths is produced on the fly, which provides enhanced depth perception for the surgeon. Difficulty in depth localization of features and frequent focus changes during surgery make current vitreoretinal heads-up surgical imaging systems cumbersome to use. To improve depth perception and eliminate the need to manually refocus on the instruments during surgery, we designed and implemented a proof-of-concept ophthalmoscope equipped with a commercial light-field camera. The sensor of our camera is composed of an array of micro-lenses which projects an array of overlapping micro-images. We show that with a single light-field snapshot we can digitally refocus between the retina and a tool located in front of the retina, or display an extended depth-of-field image where everything is in focus. The design and system performance of the plenoptic fundus camera are detailed. We conclude by showing in vivo data recorded with our device.

  13. Two-dimensional pixel image lag simulation and optimization in a 4-T CMOS image sensor

    Energy Technology Data Exchange (ETDEWEB)

    Yu Junting; Li Binqiao; Yu Pingping; Xu Jiangtao [School of Electronics Information Engineering, Tianjin University, Tianjin 300072 (China); Mou Cun, E-mail: xujiangtao@tju.edu.c [Logistics Management Office, Hebei University of Technology, Tianjin 300130 (China)

    2010-09-15

    Pixel image lag in a 4-T CMOS image sensor is analyzed and simulated in a two-dimensional model. Strategies for reducing image lag are discussed in terms of transfer gate channel threshold voltage doping adjustment, PPD N-type doping dose/implant tilt adjustment and transfer gate operation voltage adjustment for signal electron transfer. With the computer analysis tool ISE-TCAD, simulation results show that minimum image lag can be obtained at a pinned photodiode n-type doping dose of 7.0 × 10¹² cm⁻², an implant tilt of −2°, a transfer gate channel doping dose of 3.0 × 10¹² cm⁻² and an operation voltage of 3.4 V. The conclusions of this theoretical analysis can serve as a guideline for pixel design to improve the performance of 4-T CMOS image sensors. (semiconductor devices)

  14. 77 FR 26787 - Certain CMOS Image Sensors and Products Containing Same; Notice of Receipt of Complaint...

    Science.gov (United States)

    2012-05-07

    ... INTERNATIONAL TRADE COMMISSION [Docket No. 2895] Certain CMOS Image Sensors and Products.... International Trade Commission has received a complaint entitled Certain CMOS Image Sensors and Products... importation, and the sale within the United States after importation of certain CMOS image sensors and...

  15. CMOS image sensor with contour enhancement

    Science.gov (United States)

    Meng, Liya; Lai, Xiaofeng; Chen, Kun; Yuan, Xianghui

    2010-10-01

    Imitating the signal acquisition and processing of the vertebrate retina, a CMOS image sensor with a bionic pre-processing circuit is designed. Integrating the signal-processing circuit on-chip can reduce the bandwidth and precision requirements of the subsequent interface circuit and simplify the design of the computer-vision system. This signal pre-processing circuit consists of an adaptive photoreceptor, a spatial filtering resistive network and an Op-Amp calculation circuit. The adaptive photoreceptor unit, with a dynamic range of approximately 100 dB, adapts well to transient changes in light intensity rather than the intensity level itself. The spatial low-pass filtering resistive network, used to mimic the function of horizontal cells, is composed of the horizontal resistor (HRES) circuit and an OTA (Operational Transconductance Amplifier) circuit. The HRES circuit, imitating the dendrite of the neuron cell, comprises two series MOS transistors operated in the weak inversion region. Appending two diode-connected n-channel transistors to a simple transconductance amplifier forms the OTA Op-Amp circuit, which provides a stable bias voltage for the gates of the MOS transistors in the HRES circuit, while serving as an OTA voltage follower to provide the input voltage for the network nodes. The Op-Amp calculation circuit, with a simple two-stage Op-Amp, performs image contour enhancement. By adjusting the bias voltage of the resistive network, the smoothing effect can be tuned to change the degree of contour enhancement. Simulations of the cell circuit and a 16×16 2D circuit array are implemented using a CSMC 0.5 μm DPTM CMOS process.

  16. Design and Implementation of a Novel Compatible Encoding Scheme in the Time Domain for Image Sensor Communication

    Directory of Open Access Journals (Sweden)

    Trang Nguyen

    2016-05-01

    Full Text Available This paper presents a modulation scheme in the time domain based on On-Off Keying and proposes various compatibility supports for different types of image sensors. The content of this article is a sub-proposal to the IEEE 802.15.7r1 Task Group (TG7r1) aimed at Optical Wireless Communication (OWC) using an image sensor as the receiver. Compatibility support is indispensable for Image Sensor Communications (ISC) because the rolling shutter image sensors currently available have different frame rates, shutter speeds, sampling rates, and resolutions. Focusing on unidirectional communications (i.e., data broadcasting, beacons), an asynchronous communication prototype is also discussed in the paper. Due to the physical limitations associated with typical image sensors (including low and varying frame rates, long exposures, and low shutter speeds), the link speed performance is critically considered. Based on practical measurements of the camera response to modulated light, an operating frequency range is suggested along with the system architecture, decoding procedure, and algorithms. A significant feature of our novel data frame structure is that it can support both typical frame rate cameras (in the oversampling mode) and very low frame rate cameras (in the error detection mode, for a camera whose frame rate is lower than the transmission packet rate). A high frame rate camera, i.e., no less than 20 fps, is supported in an oversampling mode in which a majority voting scheme for decoding data is applied. A low frame rate camera, i.e., one whose frame rate drops to less than 20 fps at certain times, is supported by an error detection mode in which any missing data sub-packet is detected during decoding and later corrected by an external code. Numerical results and valuable analysis are also included to indicate the capability of the proposed schemes.
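
    The oversampling-mode majority vote described for high frame rate cameras can be sketched as follows; the frame representation (one tentative bit string per camera frame) is my simplification of the scheme, not the proposal's exact packet format:

    ```python
    from collections import Counter

    def majority_vote_decode(frame_bits):
        """Each camera frame oversamples the same packet and yields one
        tentative bit string; the decoded bit at each position is the
        majority value across frames."""
        length = len(frame_bits[0])
        decoded = []
        for pos in range(length):
            votes = Counter(bits[pos] for bits in frame_bits)
            decoded.append(votes.most_common(1)[0][0])
        return "".join(decoded)
    ```

    With three frames, a single corrupted reading at any position is outvoted by the two correct ones.
    
    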

  17. Methods and apparatuses for detection of radiation with semiconductor image sensors

    Science.gov (United States)

    Cogliati, Joshua Joseph

    2018-04-10

    A semiconductor image sensor is repeatedly exposed to high-energy photons while a visible light obstructer is in place to block visible light from impinging on the sensor to generate a set of images from the exposures. A composite image is generated from the set of images with common noise substantially removed so the composite image includes image information corresponding to radiated pixels that absorbed at least some energy from the high-energy photons. The composite image is processed to determine a set of bright points in the composite image, each bright point being above a first threshold. The set of bright points is processed to identify lines with two or more bright points that include pixels therebetween that are above a second threshold and identify a presence of the high-energy particles responsive to a number of lines.
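
    The two-threshold test in this record, bright points above a first threshold joined by intervening pixels above a second, can be sketched in one dimension. The row-wise simplification and the names below are mine, not the patent's method:

    ```python
    def candidate_tracks(row, bright_threshold, track_threshold):
        """Find pixels above bright_threshold, then keep pairs of bright
        pixels whose intervening pixels all exceed track_threshold: a
        1-D stand-in for the line test used to flag high-energy hits."""
        bright = [i for i, v in enumerate(row) if v > bright_threshold]
        tracks = []
        for start, end in zip(bright, bright[1:]):
            if all(row[k] > track_threshold for k in range(start + 1, end)):
                tracks.append((start, end))
        return tracks
    ```
    
    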

  18. Median filters as a tool to determine dark noise thresholds in high resolution smartphone image sensors for scientific imaging

    Science.gov (United States)

    Igoe, Damien P.; Parisi, Alfio V.; Amar, Abdurazaq; Rummenie, Katherine J.

    2018-01-01

    An evaluation of the use of median filters to reduce dark noise in high resolution smartphone image sensors is presented. The Sony Xperia Z1 employed has a maximum image sensor resolution of 20.7 Mpixels, with each pixel having a side length of just over 1 μm. The large number of photosites gives the image sensor very high sensitivity, but also makes it prone to noise effects such as hot pixels. Similar to earlier research with older smartphone models, no appreciable temperature effects were observed in the overall average pixel values for images taken at ambient temperatures between 5 °C and 25 °C. In this research, hot pixels are defined as pixels with intensities above a specific threshold. The threshold is determined from the distribution of pixel values of a set of images with uniform statistical properties associated with the application of median filters of increasing size. An image with uniform statistics was employed as a training set from 124 dark images, and the threshold was determined to be 9 digital numbers (DN). The threshold remained constant across multiple resolutions and did not appreciably change even after a year of extensive field use and exposure to solar ultraviolet radiation. Although the uniformity of temperature effects masked an increase in hot-pixel occurrences, the total number of occurrences represented less than 0.1% of the image. Hot pixels were removed by applying a median filter, with an optimum filter size of 7 × 7; similar trends were observed for four additional smartphone image sensors used for validation. Hot pixels were also reduced by decreasing the image resolution. This research provides a methodology to characterise the dark noise behaviour of high resolution image sensors for use in scientific investigations, especially as pixel sizes decrease.
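
The hot-pixel procedure described in this abstract (flag dark-frame pixels above a threshold of 9 DN, then suppress them with a median filter, 7 × 7 found optimal in the paper) can be sketched with plain Python lists. This is a hedged illustration of the general technique, not the authors' code; the edge-clamping behaviour and function names are assumptions.

```python
def median_filter(img, k):
    """Apply a k x k median filter (k odd) with edges clamped to the border."""
    h, w = len(img), len(img[0])
    r = k // 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = sorted(
                img[min(max(y + dy, 0), h - 1)][min(max(x + dx, 0), w - 1)]
                for dy in range(-r, r + 1) for dx in range(-r, r + 1))
            out[y][x] = vals[len(vals) // 2]
    return out

def find_hot_pixels(dark_img, threshold=9):
    """Hot pixels: dark-frame values above the threshold (9 DN in the paper)."""
    return [(y, x) for y, row in enumerate(dark_img)
            for x, v in enumerate(row) if v > threshold]

def remove_hot_pixels(dark_img, k=7):
    """Suppress hot pixels by median filtering (the paper found 7 x 7 optimal)."""
    return median_filter(dark_img, k)
```

Because a hot pixel is an isolated outlier, the median of its neighbourhood restores the local dark level.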

  19. Giga-pixel lensfree holographic microscopy and tomography using color image sensors.

    Directory of Open Access Journals (Sweden)

    Serhan O Isikman

    Full Text Available We report Giga-pixel lensfree holographic microscopy and tomography using color sensor-arrays such as CMOS imagers that exhibit Bayer color filter patterns. Without physically removing these color filters coated on the sensor chip, we synthesize pixel super-resolved lensfree holograms, which are then reconstructed to achieve ~350 nm lateral resolution, corresponding to a numerical aperture of ~0.8, across a field-of-view of ~20.5 mm². This constitutes a digital image with ~0.7 billion effective pixels in both amplitude and phase channels (i.e., ~1.4 Giga-pixels total). Furthermore, by changing the illumination angle (e.g., ±50°) and scanning a partially-coherent light source across two orthogonal axes, super-resolved images of the same specimen from different viewing angles are created, which are then digitally combined to synthesize tomographic images of the object. Using this dual-axis lensfree tomographic imager running on a color sensor-chip, we achieve a 3D spatial resolution of ~0.35 µm × 0.35 µm × ~2 µm, in x, y and z, respectively, creating an effective voxel size of ~0.03 µm³ across a sample volume of ~5 mm³, which is equivalent to >150 billion voxels. We demonstrate the proof-of-concept of this lensfree optical tomographic microscopy platform on a color CMOS image sensor by creating tomograms of micro-particles as well as a wild-type C. elegans nematode.

  20. CMOS Active-Pixel Image Sensor With Simple Floating Gates

    Science.gov (United States)

    Fossum, Eric R.; Nakamura, Junichi; Kemeny, Sabrina E.

    1996-01-01

    Experimental complementary metal-oxide/semiconductor (CMOS) active-pixel image sensor integrated circuit features simple floating-gate structure, with metal-oxide/semiconductor field-effect transistor (MOSFET) as active circuit element in each pixel. Provides flexibility of readout modes, no kTC noise, and relatively simple structure suitable for high-density arrays. Features desirable for "smart sensor" applications.

  1. Development of a 750x750 pixels CMOS imager sensor for tracking applications

    Science.gov (United States)

    Larnaudie, Franck; Guardiola, Nicolas; Saint-Pé, Olivier; Vignon, Bruno; Tulet, Michel; Davancens, Robert; Magnan, Pierre; Corbière, Franck; Martin-Gonthier, Philippe; Estribeau, Magali

    2017-11-01

    Solid-state optical sensors are now commonly used in space applications (navigation cameras, astronomy imagers, tracking sensors...). Although charge-coupled devices are still widely used, the CMOS image sensor (CIS), whose performance is continuously improving, is a strong challenger for Guidance, Navigation and Control (GNC) systems. This paper describes a 750x750 pixels CMOS image sensor that has been specially designed and developed for star tracker and tracking sensor applications. The detector, which features a smart architecture enabling very simple and powerful operation, is built using the AMIS 0.5μm CMOS technology. It contains 750x750 rectangular pixels with a 20μm pitch. The geometry of the pixel sensitive zone is optimized for applications based on centroiding measurements. The main feature of this device is the on-chip control and timing function, which makes the device easier to operate by drastically reducing the number of clocks to be applied. This powerful function allows the user to operate the sensor with high flexibility: measurement of dark level from masked lines, direct access to the windows of interest… A temperature probe is also integrated within the CMOS chip, allowing a very precise measurement through the video stream. A complete electro-optical characterization of the sensor has been performed. The major parameters have been evaluated: dark current and its uniformity, read-out noise, conversion gain, Fixed Pattern Noise, Photo Response Non Uniformity, quantum efficiency, Modulation Transfer Function, intra-pixel scanning. The characterization tests are detailed in the paper. Co60 and proton irradiation tests have also been carried out on the image sensor and the results are presented. The specific features of the 750x750 image sensor such as low power CMOS design (3.3V, power consumption<100mW), natural windowing (that allows efficient and robust tracking algorithms), simple proximity electronics (because of the on

  2. 77 FR 74513 - Certain CMOS Image Sensors and Products Containing Same; Investigations: Terminations...

    Science.gov (United States)

    2012-12-14

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-846] Certain CMOS Image Sensors and Products Containing Same; Investigations: Terminations, Modifications and Rulings AGENCY: U.S... United States after importation of certain CMOS image sensors and products containing the same based on...

  3. Cloud Classification in Wide-Swath Passive Sensor Images Aided by Narrow-Swath Active Sensor Data

    Directory of Open Access Journals (Sweden)

    Hongxia Wang

    2018-05-01

    Full Text Available It is a challenge to distinguish between different cloud types because of the complexity and diversity of cloud coverage, which is a significant clutter source that impacts target detection and identification in the images of space-based infrared sensors. In this paper, a novel strategy for cloud classification in wide-swath passive sensor images is developed, aided by narrow-swath active sensor data. The strategy consists of three steps: orbit registration, selection of the most-matching donor pixel, and cloud type assignment for each recipient pixel. A new criterion for orbit registration is proposed to improve the matching accuracy. The most-matching donor pixel is selected via the Euclidean distance and the square sum of the relative radiance differences between the recipient and the potential donor pixels. Each recipient pixel is then assigned the cloud type of its most-matching donor. Cloud classification of Moderate Resolution Imaging Spectroradiometer (MODIS) images is performed with the aid of data from the Cloud Profiling Radar (CPR). The results are compared with the CloudSat product 2B-CLDCLASS, as well as with those obtained using the method of the International Satellite Cloud Climatology Project (ISCCP), demonstrating the superior classification performance of the proposed strategy.
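
The donor-selection step described in this abstract combines a Euclidean distance with the square sum of relative radiance differences. A minimal sketch follows; the equal weighting of the two criteria, the function names, and the data layout are assumptions for illustration only, since the abstract does not state how the two measures are combined.

```python
import math

def donor_score(recipient, donor):
    """Combined matching score between two radiance vectors:
    Euclidean distance plus the square sum of relative differences
    (equal weighting assumed here)."""
    eucl = math.sqrt(sum((r - d) ** 2 for r, d in zip(recipient, donor)))
    rel = sum(((r - d) / r) ** 2 for r, d in zip(recipient, donor) if r != 0)
    return eucl + rel

def assign_cloud_type(recipient, donors):
    """Assign the recipient pixel the cloud type of its most-matching donor.

    donors: list of (radiance_vector, cloud_type) pairs from the
    narrow-swath active sensor track.
    """
    best_vector, best_type = min(donors,
                                 key=lambda d: donor_score(recipient, d[0]))
    return best_type
```

Each wide-swath recipient pixel would be scored against the registered narrow-swath donors and labelled with the best match's type.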

  4. Nanocomposite-Based Microstructured Piezoresistive Pressure Sensors for Low-Pressure Measurement Range

    Directory of Open Access Journals (Sweden)

    Vasileios Mitrakos

    2018-01-01

    Full Text Available Piezoresistive pressure sensors capable of detecting ranges of low compressive stresses have been successfully fabricated and characterised. The 5.5 × 5 × 1.6 mm³ sensors consist of a planar aluminium top electrode and a microstructured bottom electrode containing a two-by-two array of truncated pyramids with a piezoresistive composite layer sandwiched in-between. The responses of two different piezocomposite materials, a Multiwalled Carbon Nanotube (MWCNT)-elastomer composite and a Quantum Tunneling Composite (QTC), have been characterised as a function of applied pressure and effective contact area. The MWCNT piezoresistive composite-based sensor was able to detect pressures as low as 200 kPa. The QTC-based sensor was capable of detecting pressures as low as 50 kPa depending on the contact area of the bottom electrode. Such sensors could find useful applications requiring the detection of small compressive loads such as those encountered in haptic sensing or robotics.

  5. Large dynamic range pressure sensor based on two semicircle-holes microstructured fiber.

    Science.gov (United States)

    Liu, Zhengyong; Htein, Lin; Lee, Kang-Kuen; Lau, Kin-Tak; Tam, Hwa-Yaw

    2018-01-08

    This paper presents a sensitive, large dynamic range pressure sensor based on a novel birefringent microstructured optical fiber (MOF) deployed in a Sagnac interferometer configuration. The MOF has two large semicircular holes in the cladding and a rectangular strut with a germanium-doped core in the center. The fiber structure permits surrounding pressure to induce a large effective index difference between the two polarized modes. The calculated and measured group birefringence of the fiber are 1.49 × 10⁻⁴ and 1.23 × 10⁻⁴, respectively, at a wavelength of 1550 nm. Experimental results showed that the pressure sensitivity of the sensor varied from 45,000 pm/MPa to 50,000 pm/MPa, and that a minimum detectable pressure of 80 Pa and a dynamic range better than 116 dB could be achieved with the novel fiber sensor. The proposed sensor could be used in harsh environments and is an ideal candidate for downhole applications where high pressure measurement at elevated temperatures up to 250 °C is needed.

  6. Assessment and Calibration of a RGB-D Camera (Kinect v2 Sensor) Towards a Potential Use for Close-Range 3D Modeling

    Directory of Open Access Journals (Sweden)

    Elise Lachat

    2015-10-01

    Full Text Available In the last decade, RGB-D cameras - also called range imaging cameras - have undergone constant evolution. Because of their limited cost and their ability to measure distances at a high frame rate, such sensors are especially appreciated for applications in robotics and computer vision. The Kinect v1 (released by Microsoft in November 2010) promoted the use of RGB-D cameras, and a second version of the sensor arrived on the market in July 2014. Since it is possible to obtain point clouds of an observed scene at high frequency, one could imagine applying this type of sensor to meet the need for 3D acquisition. However, due to the technology involved, some questions have to be considered, such as the suitability and accuracy of RGB-D cameras for close-range 3D modeling. In this respect, the quality of the acquired data is a major concern. In this paper, the use of the recent Kinect v2 sensor to reconstruct small objects in three dimensions has been investigated. To achieve this goal, a survey of the sensor characteristics as well as a calibration approach are presented. After an accuracy assessment of the produced models, the benefits and drawbacks of the Kinect v2 compared to the first version of the sensor, and then to photogrammetry, are discussed.

  7. Radio frequency (RF) time-of-flight ranging for wireless sensor networks

    International Nuclear Information System (INIS)

    Thorbjornsen, B; White, N M; Brown, A D; Reeve, J S

    2010-01-01

    Position information of nodes within wireless sensor networks (WSNs) is often required in order to make use of the data recorded by the sensors themselves. On deployment the nodes normally have no prior knowledge of their position, and thus a localization mechanism is required to determine it. In this paper, we describe a method to determine the point-to-point range between sensor nodes as part of the localization process. A two-way time-of-flight (TOF) ranging scheme is presented using narrow-band RF. The frequency difference between the transceivers involved in the point-to-point measurement is used to obtain a sub-clock TOF phase offset measurement, in order to achieve high resolution TOF measurements. The ranging algorithm has been developed and prototyped on a TI CC2430 development kit with no additional hardware required. Performance results have been obtained for line-of-sight (LOS), non-line-of-sight (NLOS) and indoor conditions. Accuracy is typically better than 7.0 m RMS for the LOS condition over 250.0 m and 15.8 m RMS for the NLOS condition over 120.0 m using a 100-sample average. Indoor accuracy is measured at 1.7 m RMS using a 1000-sample average over 8.0 m. Ranging error is linear and does not increase with transmitter–receiver distance. Our TOF ranging scheme demonstrates a novel system whose resolution and accuracy are time dependent, in comparison with alternative frequency-dependent methods using narrow-band RF.
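
The basic two-way TOF range computation underlying this scheme can be written in a few lines: subtract the responder's known turnaround delay from the measured round trip, halve it, and scale by the speed of light. The sub-clock phase-offset term is the paper's refinement; here it is just an optional correction input, and the function signature is an illustrative assumption.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def two_way_tof_range(t_total_s, t_turnaround_s, phase_offset_s=0.0):
    """Estimate point-to-point range from a two-way TOF measurement.

    t_total_s      -- full round-trip time measured at the initiating node
    t_turnaround_s -- responder's known processing (turnaround) delay
    phase_offset_s -- optional sub-clock correction derived from the
                      transceivers' frequency difference (as in the paper)
    """
    tof_one_way = (t_total_s - t_turnaround_s) / 2.0 + phase_offset_s
    return C * tof_one_way
```

With clock periods of tens of nanoseconds, each unresolved clock tick corresponds to several metres of range, which is why the sub-clock phase offset is essential for useful resolution.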

  8. Low-Power Smart Imagers for Vision-Enabled Sensor Networks

    CERN Document Server

    Fernández-Berni, Jorge; Rodríguez-Vázquez, Ángel

    2012-01-01

    This book presents a comprehensive, systematic approach to the development of vision system architectures that employ sensory-processing concurrency and parallel processing to meet the autonomy challenges posed by a variety of safety and surveillance applications. Coverage includes a thorough analysis of resistive diffusion networks embedded within an image sensor array. This analysis supports a systematic approach to the design of spatial image filters and their implementation as vision chips in CMOS technology. The book also addresses system-level considerations pertaining to the embedding of these vision chips into vision-enabled wireless sensor networks. Describes a system-level approach for designing vision devices and embedding them into vision-enabled, wireless sensor networks; Surveys state-of-the-art, vision-enabled WSN nodes; Includes details of specifications and challenges of vision-enabled WSNs; Explains architectures for low-energy CMOS vision chips with embedded, programmable spatial f...

  9. A multimodal image sensor system for identifying water stress in grapevines

    Science.gov (United States)

    Zhao, Yong; Zhang, Qin; Li, Minzan; Shao, Yongni; Zhou, Jianfeng; Sun, Hong

    2012-11-01

    Water stress is one of the most common limitations on fruit growth, as water is the most limiting resource for crop growth. In grapevines, as in other fruit crops, fruit quality benefits from a certain level of water deficit, which helps balance vegetative and reproductive growth and the flow of carbohydrates to reproductive structures. In this paper, a multi-modal sensor system was designed to measure the reflectance signature of grape plant surfaces and identify different water stress levels. The system was equipped with one 3CCD camera (three channels: R, G, and IR), and can capture and analyze the grape canopy from its reflectance features to identify different water stress levels. The core technology of this multi-modal sensor system could further be used in a decision support system that combines multi-modal sensory data to improve plant stress detection and identify the causes of stress. The images were taken by the multi-modal sensor, which outputs images in the near-infrared, green, and red spectral bands. Based on analysis of the acquired images, color features based on color space and reflectance features based on image processing were calculated. The results showed that these parameters have potential as water stress indicators. More experiments and analysis are needed to validate this conclusion.

  10. Heterodyne range imaging as an alternative to photogrammetry

    Science.gov (United States)

    Dorrington, Adrian; Cree, Michael; Carnegie, Dale; Payne, Andrew; Conroy, Richard

    2007-01-01

    Solid-state full-field range imaging technology, capable of determining the distance to objects in a scene simultaneously for every pixel in an image, has recently achieved sub-millimeter distance measurement precision. With this level of precision, it is becoming practical to use this technology for high precision three-dimensional metrology applications. Compared to photogrammetry, range imaging has the advantages of requiring only one viewing angle, a relatively short measurement time, and simple, fast data processing. In this paper we first review the range imaging technology, then describe an experiment comparing photogrammetric and range imaging measurements of a calibration block with attached retro-reflective targets. The results show that the range imaging approach exhibits errors of approximately 0.5 mm in-plane and almost 5 mm out-of-plane; however, these errors appear to be mostly systematic. We then proceed to examine the physical nature and characteristics of the image ranging technology and discuss the possible causes of these systematic errors. Also discussed is the potential for further system characterization and calibration to compensate for the range determination and other errors, which could possibly lead to three-dimensional measurement precision approaching that of photogrammetry.

  11. Image sensor system with bio-inspired efficient coding and adaptation.

    Science.gov (United States)

    Okuno, Hirotsugu; Yagi, Tetsuya

    2012-08-01

    We designed and implemented an image sensor system equipped with three bio-inspired coding and adaptation strategies: logarithmic transform, local average subtraction, and feedback gain control. The system comprises a field-programmable gate array (FPGA), a resistive network, and active pixel sensors (APS), whose light intensity-voltage characteristics are controllable. The system employs multiple time-varying reset voltage signals for APS in order to realize multiple logarithmic intensity-voltage characteristics, which are controlled so that the entropy of the output image is maximized. The system also employs local average subtraction and gain control in order to obtain images with an appropriate contrast. The local average is calculated by the resistive network instantaneously. The designed system was successfully used to obtain appropriate images of objects that were subjected to large changes in illumination.
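
The three coding strategies named in this abstract (logarithmic transform, local average subtraction, and gain control) compose naturally, and a per-pixel software sketch is shown below. This is a hedged functional analogue, not the FPGA/resistive-network hardware the paper describes; the epsilon guard and function names are assumptions.

```python
import math

def local_average(img, y, x, r=1):
    """Mean over a (2r+1) x (2r+1) neighbourhood with clamped edges
    (the paper computes this instantaneously with a resistive network)."""
    h, w = len(img), len(img[0])
    vals = [img[min(max(y + dy, 0), h - 1)][min(max(x + dx, 0), w - 1)]
            for dy in range(-r, r + 1) for dx in range(-r, r + 1)]
    return sum(vals) / len(vals)

def encode_pixel(intensity, local_avg, gain, eps=1e-6):
    """Bio-inspired coding: logarithmic compression, then subtraction of
    the (log-domain) local average, then feedback gain control."""
    return gain * (math.log(intensity + eps) - math.log(local_avg + eps))
```

In the log domain the subtraction cancels multiplicative illumination changes, so a uniformly lit patch encodes to zero regardless of its absolute brightness.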

  12. CMOS image sensors: State-of-the-art

    Science.gov (United States)

    Theuwissen, Albert J. P.

    2008-09-01

    This paper gives an overview of the state-of-the-art of CMOS image sensors. The main focus is on the shrinkage of the pixels: what is the effect on the performance characteristics of the imagers and on the various physical parameters of the camera? How is the CMOS pixel architecture optimized to cope with the negative performance effects of the ever-shrinking pixel size? On the other hand, the smaller dimensions in CMOS technology allow further integration at the column level and even at the pixel level. This will make CMOS imagers even smarter than they already are.

  13. High-speed particle tracking in microscopy using SPAD image sensors

    Science.gov (United States)

    Gyongy, Istvan; Davies, Amy; Miguelez Crespo, Allende; Green, Andrew; Dutton, Neale A. W.; Duncan, Rory R.; Rickman, Colin; Henderson, Robert K.; Dalgarno, Paul A.

    2018-02-01

    Single photon avalanche diodes (SPADs) are used in a wide range of applications, from fluorescence lifetime imaging microscopy (FLIM) to time-of-flight (ToF) 3D imaging. SPAD arrays are becoming increasingly established, combining the unique properties of SPADs with widefield camera configurations. Traditionally, the photosensitive area (fill factor) of SPAD arrays has been limited by the in-pixel digital electronics. However, recent designs have demonstrated that by replacing the complex digital pixel logic with simple binary pixels and external frame summation, the fill factor can be increased considerably. A significant advantage of such binary SPAD arrays is the high frame rates offered by the sensors (>100kFPS), which opens up new possibilities for capturing ultra-fast temporal dynamics in, for example, life science cellular imaging. In this work we consider the use of novel binary SPAD arrays in high-speed particle tracking in microscopy. We demonstrate the tracking of fluorescent microspheres undergoing Brownian motion, and in intra-cellular vesicle dynamics, at high frame rates. We thereby show how binary SPAD arrays can offer an important advance in live cell imaging in such fields as intercellular communication, cell trafficking and cell signaling.

  14. Passive Wireless Temperature Sensors with Enhanced Sensitivity and Range, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal describes the development of passive surface acoustic wave (SAW) temperature sensors with enhanced sensitivity and detection range for NASA application...

  15. Using polynomials to simplify fixed pattern noise and photometric correction of logarithmic CMOS image sensors.

    Science.gov (United States)

    Li, Jing; Mahmoodi, Alireza; Joseph, Dileepan

    2015-10-16

    An important class of complementary metal-oxide-semiconductor (CMOS) image sensors are those where pixel responses are monotonic nonlinear functions of light stimuli. This class includes various logarithmic architectures, which are easily capable of wide dynamic range imaging, at video rates, but which are vulnerable to image quality issues. To minimize fixed pattern noise (FPN) and maximize photometric accuracy, pixel responses must be calibrated and corrected due to mismatch and process variation during fabrication. Unlike literature approaches, which employ circuit-based models of varying complexity, this paper introduces a novel approach based on low-degree polynomials. Although each pixel may have a highly nonlinear response, an approximately-linear FPN calibration is possible by exploiting the monotonic nature of imaging. Moreover, FPN correction requires only arithmetic, and an optimal fixed-point implementation is readily derived, subject to a user-specified number of bits per pixel. Using a monotonic spline, involving cubic polynomials, photometric calibration is also possible without a circuit-based model, and fixed-point photometric correction requires only a look-up table. The approach is experimentally validated with a logarithmic CMOS image sensor and is compared to a leading approach from the literature. The novel approach proves effective and efficient.
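
The key idea in this abstract, that a monotonic nonlinear pixel response still admits an approximately linear FPN calibration against a reference response, can be sketched with an ordinary least-squares fit per pixel. This is an illustration of that idea under stated assumptions (a shared reference response and a plain linear model), not the paper's exact calibration pipeline.

```python
def fit_linear(xs, ys):
    """Ordinary least-squares fit ys ≈ a * xs + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

def calibrate_pixel(pixel_resp, ref_resp):
    """FPN calibration: map one pixel's (monotonic, possibly nonlinear)
    response onto the reference response with a degree-1 polynomial."""
    return fit_linear(pixel_resp, ref_resp)

def correct(sample, a, b):
    """FPN correction needs only arithmetic, as the paper emphasises."""
    return a * sample + b
```

Because both responses are monotonic in the same stimulus, their composition is close to linear, which is why such a low-degree polynomial suffices for FPN correction even though each response alone is highly nonlinear.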

  16. Evaluation of the AN/SAY-1 Thermal Imaging Sensor System

    National Research Council Canada - National Science Library

    Smith, John G; Middlebrook, Christopher T

    2002-01-01

    The AN/SAY-1 Thermal Imaging Sensor System "TISS" was developed to provide surface ships with a day/night imaging capability to detect low radar reflective, small cross-sectional area targets such as floating mines...

  17. 77 FR 33488 - Certain CMOS Image Sensors and Products Containing Same; Institution of Investigation Pursuant to...

    Science.gov (United States)

    2012-06-06

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-846] Certain CMOS Image Sensors and... image sensors and products containing same by reason of infringement of certain claims of U.S. Patent No... image sensors and products containing same that infringe one or more of claims 1 and 2 of the '126...

  18. BIOME: An Ecosystem Remote Sensor Based on Imaging Interferometry

    Science.gov (United States)

    Peterson, David L.; Hammer, Philip; Smith, William H.; Lawless, James G. (Technical Monitor)

    1994-01-01

    Until recent times, optical remote sensing of ecosystem properties from space has been limited to broad band multispectral scanners such as Landsat and AVHRR. While these sensor data can be used to derive important information about ecosystem parameters, they are very limited for measuring key biogeochemical cycling parameters such as the chemical content of plant canopies. Such parameters, for example the lignin and nitrogen contents, are potentially amenable to measurements by very high spectral resolution instruments using a spectroscopic approach. Airborne sensors based on grating imaging spectrometers gave the first promise of such potential but the recent decision not to deploy the space version has left the community without many alternatives. In the past few years, advancements in high performance deep well digital sensor arrays coupled with a patented design for a two-beam interferometer has produced an entirely new design for acquiring imaging spectroscopic data at the signal to noise levels necessary for quantitatively estimating chemical composition (1000:1 at 2 microns). This design has been assembled as a laboratory instrument and the principles demonstrated for acquiring remote scenes. An airborne instrument is in production and spaceborne sensors being proposed. The instrument is extremely promising because of its low cost, lower power requirements, very low weight, simplicity (no moving parts), and high performance. For these reasons, we have called it the first instrument optimized for ecosystem studies as part of a Biological Imaging and Observation Mission to Earth (BIOME).

  19. Low-Power Low-Noise CMOS Imager Design : In Micro-Digital Sun Sensor Application

    NARCIS (Netherlands)

    Xie, N.

    2012-01-01

    A digital sun sensor is superior to an analog sun sensor in terms of resolution, albedo immunity, and integration. The proposed Micro-Digital Sun Sensor (µDSS) is an autonomous digital sun sensor implemented by means of a CMOS image sensor named APS+. The µDSS is designed

  20. Six-axis force–torque sensor with a large range for biomechanical applications

    International Nuclear Information System (INIS)

    Brookhuis, R A; Droogendijk, H; De Boer, M J; Sanders, R G P; Lammerink, T S J; Wiegerink, R J; Krijnen, G J M (MESA+ Institute for Nanotechnology, University of Twente, Enschede, Netherlands)

    2014-01-01

    A silicon six-axis force–torque sensor is designed and realized to be used for measurement of the power transfer between the human body and the environment. Capacitive read-out is used to detect all axial force components and all torque components simultaneously. Small electrode gaps in combination with mechanical amplification by the sensor structure result in a high sensitivity. The miniature sensor has a wide force range of up to 50 N in normal direction, 10 N in shear direction and 25 N mm of maximum torque around each axis. (paper)

  1. Soft sensor design by multivariate fusion of image features and process measurements

    DEFF Research Database (Denmark)

    Lin, Bao; Jørgensen, Sten Bay

    2011-01-01

    This paper presents a multivariate data fusion procedure for design of dynamic soft sensors where suitably selected image features are combined with traditional process measurements to enhance the performance of data-driven soft sensors. A key issue of fusing multiple sensor data, i.e. to determine...... with a multivariate analysis technique from RGB pictures. The color information is also transformed to hue, saturation and intensity components. Both sets of image features are combined with traditional process measurements to obtain an inferential model by partial least squares (PLS) regression. A dynamic PLS model...... oxides (NOx) emission of cement kilns. On-site tests demonstrate improved performance over soft sensors based on conventional process measurements only....

  2. Imaging properties of small-pixel spectroscopic x-ray detectors based on cadmium telluride sensors

    International Nuclear Information System (INIS)

    Koenig, Thomas; Schulze, Julia; Zuber, Marcus; Rink, Kristian; Oelfke, Uwe; Butzer, Jochen; Hamann, Elias; Cecilia, Angelica; Zwerger, Andreas; Fauler, Alex; Fiederle, Michael

    2012-01-01

    Spectroscopic x-ray imaging by means of photon counting detectors has received growing interest during the past years. Critical to the image quality of such devices is their pixel pitch and the sensor material employed. This paper describes the imaging properties of Medipix2 MXR multi-chip assemblies bump bonded to 1 mm thick CdTe sensors. Two systems were investigated with pixel pitches of 110 and 165 μm, which are in the order of the mean free path lengths of the characteristic x-rays produced in their sensors. Peak widths were found to be almost constant across the energy range of 10 to 60 keV, with values of 2.3 and 2.2 keV (FWHM) for the two pixel pitches. The average number of pixels responding to a single incoming photon are about 1.85 and 1.45 at 60 keV, amounting to detective quantum efficiencies of 0.77 and 0.84 at a spatial frequency of zero. Energy selective CT acquisitions are presented, and the two pixel pitches' abilities to discriminate between iodine and gadolinium contrast agents are examined. It is shown that the choice of the pixel pitch translates into a minimum contrast agent concentration for which material discrimination is still possible. We finally investigate saturation effects at high x-ray fluxes and conclude with the finding that higher maximum count rates come at the cost of a reduced energy resolution. (paper)

  3. Broadband image sensor array based on graphene-CMOS integration

    Science.gov (United States)

    Goossens, Stijn; Navickaite, Gabriele; Monasterio, Carles; Gupta, Shuchi; Piqueras, Juan José; Pérez, Raúl; Burwell, Gregory; Nikitskiy, Ivan; Lasanta, Tania; Galán, Teresa; Puma, Eric; Centeno, Alba; Pesquera, Amaia; Zurutuza, Amaia; Konstantatos, Gerasimos; Koppens, Frank

    2017-06-01

    Integrated circuits based on complementary metal-oxide-semiconductors (CMOS) are at the heart of the technological revolution of the past 40 years, enabling compact and low-cost microelectronic circuits and imaging systems. However, the diversification of this platform into applications other than microcircuits and visible-light cameras has been impeded by the difficulty of combining semiconductors other than silicon with CMOS. Here, we report the monolithic integration of a CMOS integrated circuit with graphene, operating as a high-mobility phototransistor. We demonstrate a high-resolution, broadband image sensor and operate it as a digital camera that is sensitive to ultraviolet, visible and infrared light (300-2,000 nm). The demonstrated graphene-CMOS integration is pivotal for incorporating 2D materials into next-generation microelectronics, sensor arrays, low-power integrated photonics and CMOS imaging systems covering visible, infrared and terahertz frequencies.

  4. Extended Special Sensor Microwave Imager (SSM/I) Sensor Data Record (SDR) in netCDF

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Special Sensor Microwave Imager (SSM/I) is a seven-channel linearly polarized passive microwave radiometer that operates at frequencies of 19.36 (vertically and...

  5. Short-Range Noncontact Sensors for Healthcare and Other Emerging Applications: A Review

    Directory of Open Access Journals (Sweden)

    Changzhan Gu

    2016-07-01

    Full Text Available Short-range noncontact sensors are capable of remotely detecting the precise movements of subjects or wirelessly estimating the distance from the sensor to the subject. They find wide application in our daily lives, such as noncontact vital sign detection of heartbeat and respiration, sleep monitoring, occupancy sensing, and gesture sensing. In recent years, short-range noncontact sensors have been attracting growing effort from both academia and industry due to their vast applications. Compared to other radar architectures such as pulse radar and frequency-modulated continuous-wave (FMCW) radar, Doppler radar is gaining more popularity in terms of system integration and low-power operation. This paper reviews recent technical advances in Doppler radars for healthcare applications, including system hardware improvement, digital signal processing, and chip integration. This paper also discusses hybrid FMCW-interferometry radars as well as emerging applications and future trends.

  6. A CMOS image sensor with row and column profiling means

    NARCIS (Netherlands)

    Xie, N.; Theuwissen, A.J.P.; Wang, X.; Leijtens, J.A.P.; Hakkesteegt, H.; Jansen, H.

    2008-01-01

    This paper describes the implementation and first measurement results of a new way of obtaining row and column profile data from a CMOS image sensor, which is developed for a micro-Digital Sun Sensor (μDSS). The basic profiling action is achieved by pixels with p-type MOS transistors which realize

  7. High dynamic range image acquisition based on multiplex cameras

    Science.gov (United States)

    Zeng, Hairui; Sun, Huayan; Zhang, Tinghua

    2018-03-01

    High dynamic range imaging is an important technology for photoelectric information acquisition, providing higher dynamic range and more image detail, and it can better reflect the real environment, light and color information. Currently, methods of high dynamic range image synthesis based on differently exposed image sequences cannot adapt to dynamic scenes: they fail to overcome the effects of moving targets, resulting in ghosting. Therefore, a new high dynamic range image acquisition method based on a multiplex camera system was proposed. Firstly, differently exposed image sequences were captured with the camera array, the deviation between images was estimated using derivative optical flow based on color gradients, and the images were aligned. Then, a high dynamic range image fusion weighting function was established by combining the inverse camera response function with the deviation between images, and was applied to generate a high dynamic range image. The experiments show that the proposed method can effectively obtain high dynamic range images in dynamic scenes, and achieves good results.
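
    The core of such exposure fusion can be sketched as follows. This is a minimal illustration, not the paper's algorithm: it assumes a simple gamma camera response (real pipelines estimate the inverse response function from the images) and a hat-shaped weight that trusts mid-tone pixels; all names are hypothetical.

```python
import numpy as np

def merge_hdr(images, exposure_times, gamma=2.2):
    """Merge aligned LDR exposures into one radiance map.

    Assumes a simple gamma camera response; real pipelines estimate the
    inverse camera response function from the image set itself.
    """
    eps = 1e-6
    radiance_sum = np.zeros_like(images[0], dtype=float)
    weight_sum = np.zeros_like(images[0], dtype=float)
    for img, t in zip(images, exposure_times):
        lin = (img / 255.0) ** gamma              # invert the assumed response
        w = 1.0 - np.abs(img / 255.0 - 0.5) * 2   # hat weight: trust mid-tones
        radiance_sum += w * lin / t               # divide out exposure time
        weight_sum += w
    return radiance_sum / (weight_sum + eps)

# two synthetic exposures of the same scene
scene = np.linspace(0.1, 0.9, 8)
imgs = [np.clip((scene * t) ** (1 / 2.2) * 255, 0, 255) for t in (0.5, 1.0)]
hdr = merge_hdr(imgs, [0.5, 1.0])
```

    With noise-free synthetic exposures the merged radiance map reproduces the scene; on real data the weighting suppresses clipped and noisy pixels from each exposure.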

  8. Optical Inspection In Hostile Industrial Environments: Single-Sensor VS. Imaging Methods

    Science.gov (United States)

    Cielo, P.; Dufour, M.; Sokalski, A.

    1988-11-01

    On-line and unsupervised industrial inspection for quality control and process monitoring is increasingly required in the modern automated factory. Optical techniques are particularly well suited to industrial inspection in hostile environments because of their noncontact nature, fast response time and imaging capabilities. Optical sensors can be used for remote inspection of high temperature products or otherwise inaccessible parts, provided they are in a line-of-sight relation with the sensor. Moreover, optical sensors are much easier to adapt to a variety of part shapes, positions or orientations and conveyor speeds as compared to contact-based sensors. This is an important requirement in a flexible automation environment. A number of choices are possible in the design of optical inspection systems. General-purpose two-dimensional (2-D) or three-dimensional (3-D) imaging techniques have advanced very rapidly in recent years thanks to a substantial research effort as well as to the availability of increasingly powerful and affordable hardware and software. Imaging can be realized using 2-D arrays or simpler one-dimensional (1-D) line-array detectors. Alternatively, dedicated single-spot sensors require a smaller amount of data processing and often lead to robust sensors which are particularly appropriate to on-line operation in hostile industrial environments. Many specialists now feel that dedicated sensors or clusters of sensors are often more effective for specific industrial automation and control tasks, at least in the short run. This paper will discuss optomechanical and electro-optical choices with reference to the design of a number of on-line inspection sensors which have been recently developed at our institute. Case studies will include real-time surface roughness evaluation on polymer cables extruded at high speed, surface characterization of hot-rolled or galvanized-steel sheets, temperature evaluation and pinhole detection in aluminum foil, multi

  9. SAW passive wireless sensor-RFID tags with enhanced range, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal describes the development of passive wireless surface acoustic wave (SAW) RFID sensor-tags with enhanced range for remote monitoring of large groups of...

  10. AROSICS: An Automated and Robust Open-Source Image Co-Registration Software for Multi-Sensor Satellite Data

    Directory of Open Access Journals (Sweden)

    Daniel Scheffler

    2017-07-01

    Full Text Available Geospatial co-registration is a mandatory prerequisite when dealing with remote sensing data. Inter- or intra-sensoral misregistration will negatively affect any subsequent image analysis, specifically when processing multi-sensoral or multi-temporal data. In recent decades, many algorithms have been developed to enable manual, semi- or fully automatic displacement correction. Especially in the context of big data processing and the development of automated processing chains that aim to be applicable to different remote sensing systems, there is a strong need for efficient, accurate and generally usable co-registration. Here, we present AROSICS (Automated and Robust Open-Source Image Co-Registration Software), a Python-based open-source software including an easy-to-use user interface for automatic detection and correction of sub-pixel misalignments between various remote sensing datasets. It is independent of spatial or spectral characteristics and robust against high degrees of cloud coverage and spectral and temporal land cover dynamics. The co-registration is based on phase correlation for sub-pixel shift estimation in the frequency domain, utilizing the Fourier shift theorem in a moving-window manner. A dense grid of spatial shift vectors can be created and automatically filtered by combining various validation and quality estimation metrics. Additionally, the software supports the masking of, e.g., clouds and cloud shadows to exclude such areas from spatial shift detection. The software has been tested on more than 9000 satellite images acquired by different sensors. The results are evaluated exemplarily for two inter-sensoral and two intra-sensoral use cases and show registration results in the sub-pixel range, with root mean square errors around 0.3 pixels or better.
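
    The phase-correlation idea underlying such co-registration can be sketched in a few lines: the normalized cross-power spectrum of two shifted images is a pure phase ramp (Fourier shift theorem), whose inverse transform peaks at the displacement. This sketch recovers only the integer shift; AROSICS refines it to sub-pixel accuracy, which is omitted here.

```python
import numpy as np

def phase_correlation_shift(ref, moved):
    """Estimate the integer (row, col) shift between two images from the
    normalized cross-power spectrum."""
    F_ref = np.fft.fft2(ref)
    F_mov = np.fft.fft2(moved)
    cross = F_mov * np.conj(F_ref)
    cross /= np.abs(cross) + 1e-12           # keep only the phase difference
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # peaks past the midpoint wrap around to negative shifts
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

rng = np.random.default_rng(0)
ref = rng.random((64, 64))
moved = np.roll(ref, shift=(3, -5), axis=(0, 1))
dy, dx = phase_correlation_shift(ref, moved)
```

    Because only the spectral phase is used, the estimate is largely insensitive to global brightness differences between the two acquisitions, which is what makes the approach attractive for multi-sensor data.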

  11. Short-Range Sensor for Underwater Robot Navigation using Line-lasers and Vision

    DEFF Research Database (Denmark)

    Hansen, Peter Nicholas; Nielsen, Mikkel Cornelius; Christensen, David Johan

    2015-01-01

    This paper investigates a minimalistic laser-based range sensor, used for underwater inspection by Autonomous Underwater Vehicles (AUVs). This range detection system comprises two lasers projecting vertical lines, parallel to a camera’s viewing axis, into the environment. Using both lasers

  12. Fast regional readout CMOS Image Sensor for dynamic MLC tracking

    Science.gov (United States)

    Zin, H.; Harris, E.; Osmond, J.; Evans, P.

    2014-03-01

    Advanced radiotherapy techniques such as volumetric modulated arc therapy (VMAT) require verification of the complex beam delivery, including tracking of multileaf collimators (MLC) and monitoring of the dose rate. This work explores the feasibility of a prototype complementary metal-oxide-semiconductor image sensor (CIS) for tracking these complex treatments by utilising fast, region-of-interest (ROI) readout functionality. An automatic edge tracking algorithm was used to locate the MLC leaf edges moving at various speeds (from a moving triangle field shape) and imaged at various sensor frame rates. The CIS demonstrates successful edge detection of the dynamic MLC motion within an accuracy of 1.0 mm. This demonstrates the feasibility of the sensor to verify treatment delivery involving dynamic MLC at up to ~400 frames per second (equivalent to the linac pulse rate), which is superior to current techniques such as electronic portal imaging devices (EPID). The CIS provides the basis for an essential real-time verification tool, useful for assessing accurate delivery of complex high-energy radiation to the tumour and ultimately achieving better cure rates for cancer patients.
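
    The abstract does not specify the edge tracking algorithm, but a common way to localize a field edge with sub-pixel precision is to find the 50%-of-maximum crossing in a pixel profile and refine it by linear interpolation. The sketch below illustrates that generic approach on a synthetic penumbra; it is not the paper's implementation.

```python
def leaf_edge_position(profile):
    """Locate a radiation field edge in a 1-D pixel profile as the
    50%-of-maximum crossing, refined by linear interpolation."""
    half = max(profile) / 2.0
    for i in range(1, len(profile)):
        lo, hi = profile[i - 1], profile[i]
        if lo < half <= hi:                  # rising edge crosses 50% here
            return (i - 1) + (half - lo) / (hi - lo)
    raise ValueError("no rising edge found")

# synthetic penumbra: signal ramps from 0 to 100 between pixels 4 and 8
profile = [0, 0, 0, 0, 0, 25, 50, 75, 100, 100]
edge = leaf_edge_position(profile)   # crossing lands at pixel 6.0
```

    Running such a detector on each row of an ROI readout, frame by frame, yields a leaf-position trace over time that can be compared against the planned MLC trajectory.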

  13. Fast regional readout CMOS image sensor for dynamic MLC tracking

    International Nuclear Information System (INIS)

    Zin, H; Harris, E; Osmond, J; Evans, P

    2014-01-01

    Advanced radiotherapy techniques such as volumetric modulated arc therapy (VMAT) require verification of the complex beam delivery, including tracking of multileaf collimators (MLC) and monitoring of the dose rate. This work explores the feasibility of a prototype complementary metal-oxide-semiconductor image sensor (CIS) for tracking these complex treatments by utilising fast, region-of-interest (ROI) readout functionality. An automatic edge tracking algorithm was used to locate the MLC leaf edges moving at various speeds (from a moving triangle field shape) and imaged at various sensor frame rates. The CIS demonstrates successful edge detection of the dynamic MLC motion within an accuracy of 1.0 mm. This demonstrates the feasibility of the sensor to verify treatment delivery involving dynamic MLC at up to ∼400 frames per second (equivalent to the linac pulse rate), which is superior to current techniques such as electronic portal imaging devices (EPID). The CIS provides the basis for an essential real-time verification tool, useful for assessing accurate delivery of complex high-energy radiation to the tumour and ultimately achieving better cure rates for cancer patients.

  14. Ultra-high resolution coded wavefront sensor

    KAUST Repository

    Wang, Congli; Dun, Xiong; Fu, Qiang; Heidrich, Wolfgang

    2017-01-01

    Wavefront sensors and more general phase retrieval methods have recently attracted a lot of attention in a host of application domains, ranging from astronomy to scientific imaging and microscopy. In this paper, we introduce a new class of sensor

  15. Quantitative Analysis of Range Image Patches by NEB Method

    Directory of Open Access Journals (Sweden)

    Wang Wen

    2017-01-01

    Full Text Available In this paper we analyze sampled high-dimensional data from a range image database with the NEB method. We select a large random sample of log-valued, high-contrast, normalized 8×8 range image patches from the Brown database. We construct a density estimator and establish 1-dimensional cell complexes from the range image patch data. We find topological properties of 8×8 range image patches and show that there exist two types of subsets of 8×8 range image patches modelled as a circle.

  16. High Resolution and Large Dynamic Range Resonant Pressure Sensor Based on Q-Factor Measurement

    Science.gov (United States)

    Gutierrez, Roman C. (Inventor); Stell, Christopher B. (Inventor); Tang, Tony K. (Inventor); Vorperian, Vatche (Inventor); Wilcox, Jaroslava (Inventor); Shcheglov, Kirill (Inventor); Kaiser, William J. (Inventor)

    2000-01-01

    A pressure sensor has a high degree of accuracy over a wide range of pressures. Using a pressure sensor relying upon resonant oscillations to determine pressure, a driving circuit drives such a pressure sensor at resonance and tracks resonant frequency and amplitude shifts with changes in pressure. Pressure changes affect the Q-factor of the resonating portion of the pressure sensor. Such Q-factor changes are detected by the driving/sensing circuit, which in turn tracks the changes in resonant frequency to maintain the pressure sensor at resonance. Changes in the Q-factor are reflected in changes of amplitude of the resonating pressure sensor. In response, upon sensing the changes in amplitude, the driving circuit changes the force or strength of the electrostatic driving signal to maintain the resonator at constant amplitude. The amplitude of the driving signal becomes a direct measure of the changes in pressure, as the operating characteristics of the resonator give rise to a linear response curve for the amplitude of the driving signal. Pressure change resolution is on the order of 10⁻⁶ torr over a range spanning from 7,600 torr to 10⁻⁶ torr. No temperature compensation for the pressure sensor of the present invention is foreseen. Power requirements for the pressure sensor are generally minimal due to the low-loss mechanical design of the resonating pressure sensor and the simple control electronics.

  17. VLC-based indoor location awareness using LED light and image sensors

    Science.gov (United States)

    Lee, Seok-Ju; Yoo, Jong-Ho; Jung, Sung-Yoon

    2012-11-01

    Recently, indoor LED lighting can be considered for constructing green infrastructure with energy saving, additionally providing LED-IT convergence services such as visible light communication (VLC) based location awareness and navigation. For example, in the case of a large shopping mall, location awareness for navigating to a destination is a very important issue. However, conventional GPS-based navigation does not work indoors. Alternative WLAN-based location services suffer from low positioning accuracy; for example, it is difficult to estimate height exactly, and if the height error is greater than the spacing between floors, it may cause a serious problem. Therefore, conventional navigation is inappropriate for indoor use. An alternative solution for indoor navigation is a VLC-based location awareness scheme. Because indoor LED infrastructure will certainly be installed to provide lighting, indoor LED lighting combined with VLC technology has the potential to provide relatively high position estimation accuracy. In this paper, we present a new VLC-based positioning system using visible LED lights and image sensors. Our system uses the location of the image sensor lens and the location of the reception plane. By using two or more image sensors, we can determine the transmitter position with less than 1 m of position error. Through simulation, we verify the validity of the proposed VLC-based positioning system using visible LED light and image sensors.
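
    The geometric core of positioning an LED seen by two or more image sensors is intersecting the back-projected rays from each lens. A standard least-squares formulation finds the point minimizing the summed squared distance to all rays; the sketch below shows it with ideal, noise-free directions (the camera geometry and values are hypothetical, not from the paper).

```python
import numpy as np

def closest_point_to_rays(origins, directions):
    """Least-squares point nearest a set of 3-D rays: for each ray,
    project onto the plane perpendicular to its direction and solve
    the resulting normal equations."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)      # projector onto plane ⟂ ray
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

led = np.array([1.0, 2.0, 3.0])             # true transmitter position (m)
lens_positions = [np.array([0.0, 0.0, 0.0]), np.array([4.0, 0.0, 0.0])]
rays = [led - o for o in lens_positions]     # ideal, noise-free directions
est = closest_point_to_rays(lens_positions, rays)
```

    With noisy pixel measurements the same solve returns the point closest to all rays in the least-squares sense, which is where the sub-metre accuracy claim comes from when baselines between sensors are adequate.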

  18. Coded aperture detector: an image sensor with sub 20-nm pixel resolution.

    Science.gov (United States)

    Miyakawa, Ryan; Mayer, Rafael; Wojdyla, Antoine; Vannier, Nicolas; Lesser, Ian; Aron-Dine, Shifrah; Naulleau, Patrick

    2014-08-11

    We describe the coded aperture detector, a novel image sensor based on uniformly redundant arrays (URAs) with customizable pixel size, resolution, and operating photon energy regime. In this sensor, a coded aperture is scanned laterally at the image plane of an optical system, and the transmitted intensity is measured by a photodiode. The image intensity is then digitally reconstructed using a simple convolution. We present results from a proof-of-principle optical prototype, demonstrating high-fidelity image sensing comparable to a CCD. A 20-nm half-pitch URA fabricated by the Center for X-ray Optics (CXRO) nano-fabrication laboratory is presented that is suitable for high-resolution image sensing at EUV and soft X-ray wavelengths.
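
    The measurement model is a circular convolution of the scene with the scanned mask, followed by a digital reconstruction. The paper decodes with a simple correlation against the URA's decoding array; the 1-D sketch below instead uses a quadratic-residue mask (a 1-D relative of the URA, whose DFT provably has no zeros) and an FFT-domain inverse filter, to show the encode/decode round trip without reproducing the full 2-D URA construction.

```python
import numpy as np

# Quadratic-residue mask of prime length p: open (1) at residue positions.
p = 31
qr = {(i * i) % p for i in range(1, p)}        # the 15 quadratic residues mod p
mask = np.array([1.0 if i in qr else 0.0 for i in range(p)])

rng = np.random.default_rng(0)
scene = rng.random(p)                          # unknown 1-D intensity profile

# detector output: scene circularly convolved with the mask
measured = np.real(np.fft.ifft(np.fft.fft(scene) * np.fft.fft(mask)))

# digital reconstruction by inverse filtering (URA decoding is a correlation)
recovered = np.real(np.fft.ifft(np.fft.fft(measured) / np.fft.fft(mask)))
```

    The appeal of URA-family masks is exactly this invertibility: roughly half the mask is open (high throughput compared with a scanned pinhole), yet the pattern's flat spectrum keeps the decoding well conditioned.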

  19. The Design of a Single-Bit CMOS Image Sensor for Iris Recognition Applications

    Directory of Open Access Journals (Sweden)

    Keunyeol Park

    2018-02-01

    Full Text Available This paper presents a single-bit CMOS image sensor (CIS) that uses a data processing technique with an edge detection block for simple iris segmentation. In order to recognize the iris image, the image sensor conventionally captures high-resolution image data in digital code, extracts the iris data, and then compares it with a reference image through a recognition algorithm. However, in this case, the frame rate decreases by the time required for digital signal conversion of multi-bit digital data through the analog-to-digital converter (ADC) in the CIS. In order to reduce the overall processing time as well as the power consumption, we propose a data processing technique with an exclusive OR (XOR) logic gate to obtain single-bit and edge detection image data instead of multi-bit image data through the ADC. In addition, we propose a logarithmic counter to efficiently measure single-bit image data that can be applied to the iris recognition algorithm. The effective area of the proposed single-bit image sensor (174 × 144 pixels) is 2.84 mm² with a 0.18 μm 1-poly 4-metal CMOS image sensor process. The power consumption of the proposed single-bit CIS is 2.8 mW with a 3.3 V supply voltage and a maximum frame rate of 520 frames/s. The error rate of the ADC is 0.24 least significant bit (LSB) on an 8-bit ADC basis at a 50 MHz sampling frequency.

  20. The Design of a Single-Bit CMOS Image Sensor for Iris Recognition Applications.

    Science.gov (United States)

    Park, Keunyeol; Song, Minkyu; Kim, Soo Youn

    2018-02-24

    This paper presents a single-bit CMOS image sensor (CIS) that uses a data processing technique with an edge detection block for simple iris segmentation. In order to recognize the iris image, the image sensor conventionally captures high-resolution image data in digital code, extracts the iris data, and then compares it with a reference image through a recognition algorithm. However, in this case, the frame rate decreases by the time required for digital signal conversion of multi-bit digital data through the analog-to-digital converter (ADC) in the CIS. In order to reduce the overall processing time as well as the power consumption, we propose a data processing technique with an exclusive OR (XOR) logic gate to obtain single-bit and edge detection image data instead of multi-bit image data through the ADC. In addition, we propose a logarithmic counter to efficiently measure single-bit image data that can be applied to the iris recognition algorithm. The effective area of the proposed single-bit image sensor (174 × 144 pixels) is 2.84 mm² with a 0.18 μm 1-poly 4-metal CMOS image sensor process. The power consumption of the proposed single-bit CIS is 2.8 mW with a 3.3 V supply voltage and a maximum frame rate of 520 frames/s. The error rate of the ADC is 0.24 least significant bit (LSB) on an 8-bit ADC basis at a 50 MHz sampling frequency.
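
    The XOR edge-detection idea can be illustrated in software: on a binarized scanline, XOR-ing each pixel with its neighbour yields 1 exactly at transitions. This is only a behavioural sketch of the principle, not the chip's actual gate-level implementation.

```python
def xor_edges(bits):
    """Edge map of a binarized scanline: 1 wherever adjacent pixels differ."""
    return [bits[i] ^ bits[i + 1] for i in range(len(bits) - 1)]

# a binary row with transitions at positions 1->2, 4->5 and 6->7
row = [0, 0, 1, 1, 1, 0, 0, 1]
edges = xor_edges(row)
```

    Because the output is already single-bit, no multi-bit ADC conversion is needed on the edge path, which is the source of the frame-rate and power savings the abstract describes.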

  1. Crop status sensing system by multi-spectral imaging sensor, 1: Image processing and paddy field sensing

    International Nuclear Information System (INIS)

    Ishii, K.; Sugiura, R.; Fukagawa, T.; Noguchi, N.; Shibata, Y.

    2006-01-01

    The objective of the study is to construct a sensing system for precision farming. A Multi-Spectral Imaging Sensor (MSIS), which can obtain three images (G, R and NIR) simultaneously, was used for detecting the growth status of plants. The sensor was mounted on an unmanned helicopter. An image processing method for acquiring information on crop status with high accuracy was developed. Crop parameters that were measured include SPAD, leaf height, and stem number. Both a direct-seeding variety and a transplant variety of paddy rice were adopted in the research. The result of a field test showed that the crop status of both varieties could be detected with sufficient accuracy to apply to precision farming
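
    The abstract does not state which index the R and NIR bands feed; a standard quantity computable from exactly these bands, and widely used as a proxy for crop vigour, is the NDVI. The snippet below shows that index only as an illustration of what such G/R/NIR imagery supports; the reflectance values are hypothetical.

```python
def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index from NIR and R reflectances.
    (Shown as a standard index computable from the MSIS bands, not as the
    paper's own growth-status estimator.)"""
    return (nir - red) / (nir + red + eps)

# illustrative reflectance samples: healthy canopy vs bare soil
canopy = ndvi(0.50, 0.08)   # strong NIR, low red -> high NDVI
soil = ndvi(0.30, 0.25)     # similar NIR and red -> NDVI near zero
```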

  2. Accuracy of Shack-Hartmann wavefront sensor using a coherent wound fibre image bundle

    Science.gov (United States)

    Zheng, Jessica R.; Goodwin, Michael; Lawrence, Jon

    2018-03-01

    Shack-Hartmann wavefront sensors using wound fibre image bundles are desired for multi-object adaptive optics systems, where they provide a large multiplex and can be positioned by Starbugs. The use of a large-sized wound fibre image bundle provides the flexibility to use more sub-apertures per wavefront sensor for ELTs. These compact wavefront sensors take advantage of large focal surfaces such as that of the Giant Magellan Telescope. The focus of this paper is to study the effect of wound fibre image bundle structure defects on the centroid measurement accuracy of a Shack-Hartmann wavefront sensor. We use the first-moment centroid method to estimate the centroid of a focused Gaussian beam sampled by a simulated bundle. Spot estimation accuracy with a wound fibre image bundle, and the impact of its structure on wavefront measurement accuracy statistics, are addressed. Our results show that when the measurement signal-to-noise ratio is high, the centroid measurement accuracy is dominated by the wound fibre image bundle structure, e.g. tile angle and gap spacing. For measurements with low signal-to-noise ratio, accuracy is influenced by the read noise of the detector instead of the wound fibre image bundle structure defects. We demonstrate this both in simulation and experimentally. We provide a statistical model of the centroid and wavefront error of a wound fibre image bundle found through experiment.
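
    The first-moment centroid used per sub-aperture is just an intensity-weighted centre of mass. A minimal sketch on an ideal (bundle-free, noise-free) Gaussian spot:

```python
import numpy as np

def first_moment_centroid(img):
    """Intensity-weighted centre of mass of a spot image, returning (cx, cy)."""
    y, x = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    total = img.sum()
    return (x * img).sum() / total, (y * img).sum() / total

# Gaussian spot centred at (x, y) = (12.3, 8.7) on a 24x24 grid, sigma = 2 px
y, x = np.mgrid[0:24, 0:24]
spot = np.exp(-(((x - 12.3) ** 2) + ((y - 8.7) ** 2)) / (2 * 2.0 ** 2))
cx, cy = first_moment_centroid(spot)
```

    On this ideal spot the estimate matches the true centre to well under a hundredth of a pixel; the paper's point is how bundle structure (tile angle, gap spacing) and read noise perturb exactly this estimator.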

  3. Image interpolation and denoising for division of focal plane sensors using Gaussian processes.

    Science.gov (United States)

    Gilboa, Elad; Cunningham, John P; Nehorai, Arye; Gruev, Viktor

    2014-06-16

    Image interpolation and denoising are important techniques in image processing. These methods are inherent to digital image acquisition, as most digital cameras are composed of a 2D grid of heterogeneous imaging sensors. Current polarization imagers employ four different pixelated polarization filters, commonly referred to as division of focal plane polarization sensors. The sensors capture only partial information of the true scene, leading to a loss of spatial resolution as well as inaccuracy of the captured polarization information. Interpolation is a standard technique to recover the missing information and increase the accuracy of the captured polarization information. Here we focus specifically on Gaussian process regression as a way to perform statistical image interpolation, where estimates of sensor noise are used to improve the accuracy of the estimated pixel information. We further exploit the inherent grid structure of this data to create a fast exact algorithm that operates in O(N^(3/2)) (vs. the naive O(N³)), thus making the Gaussian process method computationally tractable for image data. This modeling advance and the enabling computational advance combine to produce significant improvements over previously published interpolation methods for polarimeters, which is most pronounced in cases of low signal-to-noise ratio (SNR). We provide the comprehensive mathematical model as well as experimental results of the GP interpolation performance for division of focal plane polarimeters.
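
    The statistical interpolation idea is standard Gaussian-process regression: condition a GP prior (here an RBF kernel plus a noise term that plays the role of the sensor-noise estimate) on observed pixels and predict the missing ones. The 1-D sketch below shows only this basic machinery; the paper's O(N^(3/2)) structure-exploiting solver is not reproduced.

```python
import numpy as np

def gp_interpolate(x_train, y_train, x_test, length=1.0, noise=0.1):
    """GP regression posterior mean with an RBF kernel and i.i.d. noise."""
    def k(a, b):
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)
    K = k(x_train, x_train) + noise ** 2 * np.eye(len(x_train))
    alpha = np.linalg.solve(K, y_train)       # the naive O(N^3) solve
    return k(x_test, x_train) @ alpha

x = np.linspace(0, 2 * np.pi, 20)             # observed sample positions
y = np.sin(x)                                 # observed (noise-free) values
x_new = np.array([1.0, 2.5, 4.0])             # positions to interpolate
y_new = gp_interpolate(x, y, x_new, length=1.0, noise=0.05)
```

    The `np.linalg.solve` call is the cubic-cost step; the paper's contribution is replacing it with a solver that exploits the pixel grid structure.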

  4. Efficient demodulation scheme for rolling-shutter-patterning of CMOS image sensor based visible light communications.

    Science.gov (United States)

    Chen, Chia-Wei; Chow, Chi-Wai; Liu, Yang; Yeh, Chien-Hung

    2017-10-02

    Recently, even low-end mobile phones are equipped with a high-resolution complementary metal-oxide-semiconductor (CMOS) image sensor. This motivates using a CMOS image sensor for visible light communication (VLC). Here we propose and demonstrate an efficient demodulation scheme to synchronize and demodulate the rolling shutter pattern in image sensor based VLC. The implementation algorithm is discussed. The bit-error-rate (BER) performance and processing latency are evaluated and compared with other thresholding schemes.
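
    In rolling-shutter VLC, each sensor row is exposed at a slightly different time, so an on-off-keyed LED appears as bright/dark stripes across the frame. A basic demodulator averages each row, thresholds, and samples one bit per stripe; the sketch below shows that baseline (the paper's scheme adds synchronization and a more efficient thresholding step on top of this idea).

```python
import numpy as np

def demodulate_rolling_shutter(frame, rows_per_bit):
    """Recover OOK bits from a rolling-shutter frame: average each row,
    threshold at the midpoint, then sample the centre row of each stripe."""
    row_means = frame.mean(axis=1)
    thresh = (row_means.max() + row_means.min()) / 2
    binary = row_means > thresh
    centers = np.arange(rows_per_bit // 2, len(binary), rows_per_bit)
    return [int(binary[c]) for c in centers]

# synthetic 64-row, 16-column frame: each bit occupies 8 consecutive rows
bits = [1, 0, 1, 1, 0, 0, 1, 0]
rows_per_bit = 8
frame = np.repeat(np.array(bits, float) * 200 + 20, rows_per_bit)[:, None]
frame = np.tile(frame, (1, 16))
decoded = demodulate_rolling_shutter(frame, rows_per_bit)
```

    On real captures the stripe period must first be estimated and the frame boundary synchronized, which is where the proposed scheme's processing-latency advantage over naive per-row thresholding comes in.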

  5. Atmospheric turbulence and sensor system effects on biometric algorithm performance

    Science.gov (United States)

    Espinola, Richard L.; Leonard, Kevin R.; Byrd, Kenneth A.; Potvin, Guy

    2015-05-01

    Biometric technologies composed of electro-optical/infrared (EO/IR) sensor systems and advanced matching algorithms are being used in various force protection/security and tactical surveillance applications. To date, most of these sensor systems have been widely used in controlled conditions with varying success (e.g., short range, uniform illumination, cooperative subjects). However the limiting conditions of such systems have yet to be fully studied for long range applications and degraded imaging environments. Biometric technologies used for long range applications will invariably suffer from the effects of atmospheric turbulence degradation. Atmospheric turbulence causes blur, distortion and intensity fluctuations that can severely degrade image quality of electro-optic and thermal imaging systems and, for the case of biometrics technology, translate to poor matching algorithm performance. In this paper, we evaluate the effects of atmospheric turbulence and sensor resolution on biometric matching algorithm performance. We use a subset of the Facial Recognition Technology (FERET) database and a commercial algorithm to analyze facial recognition performance on turbulence degraded facial images. The goal of this work is to understand the feasibility of long-range facial recognition in degraded imaging conditions, and the utility of camera parameter trade studies to enable the design of the next generation biometrics sensor systems.

  6. Robust image registration for multiple exposure high dynamic range image synthesis

    Science.gov (United States)

    Yao, Susu

    2011-03-01

    Image registration is an important preprocessing technique in high dynamic range (HDR) image synthesis. This paper proposes a robust image registration method for aligning a group of low dynamic range (LDR) images that are captured with different exposure times. Illumination change and photometric distortion between two images would result in inaccurate registration. We propose to transform intensity image data into phase congruency to eliminate the effect of changes in image brightness, and to use phase cross correlation in the Fourier transform domain to perform image registration. Considering the presence of non-overlapped regions due to photometric distortion, evolutionary programming is applied to search for accurate translation parameters, so that registration accuracy at the level of a hundredth of a pixel can be achieved. The proposed algorithm works well for under- and over-exposed image registration. It has been applied to align LDR images for synthesizing high quality HDR images.

  7. A 10-bit column-parallel cyclic ADC for high-speed CMOS image sensors

    International Nuclear Information System (INIS)

    Han Ye; Li Quanliang; Shi Cong; Wu Nanjian

    2013-01-01

    This paper presents a high-speed column-parallel cyclic analog-to-digital converter (ADC) for a CMOS image sensor. A correlated double sampling (CDS) circuit is integrated in the ADC, which avoids a stand-alone CDS circuit block. An offset cancellation technique is also introduced, which reduces the column fixed-pattern noise (FPN) effectively. A single-channel ADC with an area of less than 0.02 mm² was implemented in a 0.13 μm CMOS image sensor process. The resolution of the proposed ADC is 10-bit, and the conversion rate is 1.6 MS/s. The measured differential nonlinearity and integral nonlinearity, together with CDS, are 0.89 LSB and 6.2 LSB, respectively. The power consumption from a 3.3 V supply is only 0.66 mW. An array of 48 10-bit column-parallel cyclic ADCs was integrated into an array of CMOS image sensor pixels. The measured results indicate that the ADC circuit is suitable for high-speed CMOS image sensors. (semiconductor integrated circuits)
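
    A cyclic (algorithmic) ADC resolves one bit per clock by comparing the residue with half the reference, subtracting, and doubling, reusing the same stage for all bits. The behavioural model below shows the ideal 1-bit-per-cycle version; practical designs like the one described typically use 1.5-bit stages for comparator-offset robustness.

```python
def cyclic_adc(v_in, v_ref=3.3, bits=10):
    """Ideal behavioural model of a 1-bit-per-cycle cyclic ADC:
    each cycle extracts one bit of v_in / v_ref, MSB first."""
    code = 0
    v = v_in
    for _ in range(bits):
        bit = 1 if v >= v_ref / 2 else 0   # comparator decision
        code = (code << 1) | bit
        v = 2 * v - bit * v_ref            # residue amplification by 2
    return code

# 1.0 V into a 3.3 V, 10-bit converter: ideal code is floor(1.0/3.3 * 1024)
code = cyclic_adc(1.0, v_ref=3.3, bits=10)
```

    Because the hardware is one multiply-by-two stage plus a comparator reused for 10 cycles, the converter stays small enough to fit one per column, which is what enables the column-parallel readout described in the abstract.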

  8. Stereo Vision-Based High Dynamic Range Imaging Using Differently-Exposed Image Pair

    Directory of Open Access Journals (Sweden)

    Won-Jae Park

    2017-06-01

    Full Text Available In this paper, a high dynamic range (HDR) imaging method based on a stereo vision system is presented. The proposed method uses differently exposed low dynamic range (LDR) images captured from a stereo camera. The stereo LDR images are first converted to initial stereo HDR images using the inverse camera response function estimated from the LDR images. However, due to the limited dynamic range of the stereo LDR camera, the radiance values in under/over-exposed regions of the initial main-view (MV) HDR image can be lost. To restore these radiance values, the proposed stereo matching and hole-filling algorithms are applied to the stereo HDR images. Specifically, the auxiliary-view (AV) HDR image is warped using the disparity estimated between the initial stereo HDR images, and effective hole-filling is then applied to the warped AV HDR image. To reconstruct the final MV HDR image, the warped and hole-filled AV HDR image is fused with the initial MV HDR image using a weight map. The experimental results demonstrate objectively and subjectively that the proposed stereo HDR imaging method provides better performance compared to the conventional method.

  9. Performance of a novel wafer scale CMOS active pixel sensor for bio-medical imaging

    International Nuclear Information System (INIS)

    Esposito, M; Evans, P M; Wells, K; Anaxagoras, T; Konstantinidis, A C; Zheng, Y; Speller, R D; Allinson, N M

    2014-01-01

  10. Performance of a novel wafer scale CMOS active pixel sensor for bio-medical imaging.

    Science.gov (United States)

    Esposito, M; Anaxagoras, T; Konstantinidis, A C; Zheng, Y; Speller, R D; Evans, P M; Allinson, N M; Wells, K

    2014-07-07

    Recently CMOS active pixel sensors (APSs) have become a valuable alternative to amorphous silicon and selenium flat panel imagers (FPIs) in bio-medical imaging applications. CMOS APSs can now be scaled up to the standard 20 cm diameter wafer size by means of a reticle stitching block process. However, despite wafer scale CMOS APSs being monolithic, sources of non-uniformity of response and regional variations can persist, representing a significant challenge for wafer scale sensor response. Non-uniformity of stitched sensors can arise from a number of factors related to the manufacturing process, including variation in amplification, variation between readout components, wafer defects and process variations across the wafer. This paper reports on an investigation into the spatial non-uniformity and regional variations of a wafer scale stitched CMOS APS. For the first time a per-pixel analysis of the electro-optical performance of a wafer scale CMOS APS is presented, to address inhomogeneity issues arising from the stitching techniques used to manufacture wafer scale sensors. A complete model of the signal generation in the pixel array is provided and proves capable of accounting for noise and gain variations across the pixel array. This novel analysis allows readout noise and conversion gain to be evaluated at pixel level, stitching block level and in regions of interest, resulting in a coefficient of variation ⩽1.9%. The uniformity of the image quality performance has been further investigated in a typical x-ray application, i.e. mammography, showing a uniformity in terms of CNR among the highest when compared with mammography detectors commonly used in clinical practice. Finally, in order to compare the detection capability of this novel APS with the technology currently in use (i.e. FPIs), a theoretical evaluation of the detective quantum efficiency (DQE) at zero frequency has been performed, resulting in a higher DQE for this

  11. Unsynchronized scanning with a low-cost laser range finder for real-time range imaging

    Science.gov (United States)

    Hatipoglu, Isa; Nakhmani, Arie

    2017-06-01

    Range imaging plays an essential role in many fields: 3D modeling, robotics, heritage, agriculture, forestry, and reverse engineering. One of the most popular range-measuring technologies is the laser scanner, thanks to its several advantages: long range, high precision, real-time measurement capability, and no dependence on lighting conditions. However, laser scanners are very costly, which prevents their widespread use. Thanks to the latest developments in technology, low-cost, reliable, fast, and light-weight 1D laser range finders (LRFs) are now available. A low-cost 1D LRF combined with a scanning mechanism, which steers the laser beam to cover additional dimensions, can capture a depth map. In this work, we present unsynchronized scanning with a low-cost LRF to decrease the scanning period and reduce the vibrations caused by the stop-scan cycle of synchronized scanning. Moreover, we developed an algorithm for the alignment of unsynchronized raw data and propose a range image post-processing framework. The proposed technique yields a range imaging system for a fraction of the price of its counterparts. The results prove that the proposed method can fulfill the need for low-cost laser scanning of static environments; the most significant limitation is the scanning period, which is about 2 minutes for 55,000 range points (a 250x220 image), compared with around 4 minutes for synchronized scanning of the same image. Once faster, longer-range, and narrower-beam LRFs are available, the methods proposed in this work can produce better results.
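
    The alignment step for the unsynchronized data is not spelled out in the abstract; a minimal sketch of one plausible approach, in Python with NumPy, is to interpolate the beam-steering angle track at each range sample's timestamp. The function name and data layout below are illustrative, not taken from the paper:

```python
import numpy as np

def align_unsynchronized(range_t, ranges, angle_t, angles):
    """Assign a scan angle to each range sample by interpolating the
    (timestamp, angle) track of the steering mechanism at the range
    sample's own timestamp -- no hardware sync pulse required."""
    interp_angles = np.interp(range_t, angle_t, angles)
    return np.column_stack([interp_angles, ranges])

# A range reading taken halfway through a 0-90 degree sweep lands at 45 degrees.
pairs = align_unsynchronized(np.array([0.5]), np.array([2.0]),
                             np.array([0.0, 1.0]), np.array([0.0, 90.0]))
```

    Because the LRF and the steering mechanism free-run, each sample is simply stamped on arrival and matched to an angle after the fact, which is what removes the stop-scan synchronization.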

  12. STUDY ON SHADOW EFFECTS OF VARIOUS FEATURES ON CLOSE RANGE THERMAL IMAGES

    Directory of Open Access Journals (Sweden)

    C. L. Liao

    2012-07-01

    Thermal infrared data have become more popular in remote sensing investigations because they can be acquired both day and night. Temperature changes show characteristic behaviour in the natural environment, so thermal infrared images can be used for monitoring volcanic landforms, urban development, and disaster prevention. A thermal shadow is formed by radiation reflected from surrounding objects. Because of the poor spatial resolution of thermal infrared images from satellite sensors, shadow effects have usually been ignored. This research focuses on the shadow effects of various features, including metals and nonmetallic materials. An area-based thermal sensor, the FLIR T360, was selected to acquire thermal images. Various features with different emissivities were chosen as reflective surfaces to obtain thermal shadows at normal atmospheric temperature. The experiments found that shadow effects depend on the distance between sensor and feature, the depression angle, the object temperature, and the emissivity of the reflective surface. These causes of shadow effects were varied in the experiment to analyze the resulting variance in the thermal infrared images. The results show that shadow effects differ considerably between metals and nonmetallic materials. Future work will produce a mathematical model describing the shadow effects of different features.

  13. Retina-like sensor image coordinates transformation and display

    Science.gov (United States)

    Cao, Fengmei; Cao, Nan; Bai, Tingzhu; Song, Shengyu

    2015-03-01

    For a new kind of retina-like sensor camera, image acquisition, coordinate transformation and interpolation need to be realized. Both the coordinate transformation and the interpolation are computed in polar coordinates owing to the sensor's particular pixel distribution. The image interpolation is based on sub-pixel interpolation, with the relative weights obtained in polar coordinates. The hardware platform is composed of the retina-like sensor camera, an image grabber and a PC. Combining the MIL and OpenCV libraries, the software was written in VC++ in VS 2010. Experimental results show that the system realizes real-time image acquisition, coordinate transformation and interpolation.

  14. Measuring the Contractile Response of Isolated Tissue Using an Image Sensor

    Directory of Open Access Journals (Sweden)

    David Díaz-Martín

    2015-04-01

    Isometric and isotonic transducers have traditionally been used to study the contractile/relaxation effects of drugs on isolated tissues. However, these mechanical sensors are expensive and delicate, and they carry certain disadvantages when performing experiments in the laboratory. In this paper, a method that uses an image sensor to measure the contractile effect of drugs on blood vessel rings and other luminal organs is presented. The new method is based on an image-processing algorithm, and it provides a fast, easy and inexpensive way to analyze the effects of such drugs. In our tests, we obtained dose-response curves from rat aorta rings that are equivalent to those achieved with classical mechanical sensors.

  15. White-light full-field OCT resolution improvement by image sensor colour balance adjustment: numerical simulation

    International Nuclear Information System (INIS)

    Kalyanov, A L; Lychagov, V V; Ryabukho, V P; Smirnov, I V

    2012-01-01

    The possibility of improving the resolution of white-light full-field optical coherence tomography (OCT) by tuning the image sensor colour balance is shown numerically. We calculated the full-width at half-maximum (FWHM) of a coherence pulse registered by a silicon colour image sensor under various colour balance settings. The calculations were made for both halogen lamp and white LED sources. The results show that the interference pulse width can be reduced by a proper choice of colour balance coefficients. The reduction is up to 18% compared with a colour image sensor at regular settings, and up to 20% compared with a monochrome sensor.

  16. Extracellular Bio-imaging of Acetylcholine-stimulated PC12 Cells Using a Calcium and Potassium Multi-ion Image Sensor.

    Science.gov (United States)

    Matsuba, Sota; Kato, Ryo; Okumura, Koichi; Sawada, Kazuaki; Hattori, Toshiaki

    2018-01-01

    In biochemistry, Ca2+ and K+ play essential roles in controlling signal transduction. Much interest has focused on ion imaging, which facilitates understanding of ion flux dynamics. In this paper, we report a calcium and potassium multi-ion image sensor and its application to living cells (PC12). The multi-ion sensor has two ion-selective plasticized poly(vinyl chloride) membranes containing ionophores; each region on the sensor responds only to the corresponding ion. The multi-ion sensor has many advantages, including not only label-free, real-time measurement but also simultaneous detection of Ca2+ and K+. Cultured PC12 cells treated with nerve growth factor were prepared, and a practical observation of the cells was conducted with the sensor. After the PC12 cells were stimulated by acetylcholine, only the extracellular Ca2+ concentration increased, while there was no increase in the extracellular K+ concentration. Through this practical observation, we demonstrated that the sensor is helpful for analyzing cell events involving changing Ca2+ and/or K+ concentrations.

  17. A novel method of range measuring for a mobile robot based on multi-sensor information fusion

    International Nuclear Information System (INIS)

    Zhang Yi; Luo Yuan; Wang Jifeng

    2005-01-01

    Traditional range measurement for a mobile robot is based on sonar sensors. Because working environments differ, it is very difficult to obtain high precision using any single method of range measurement. A hybrid sonar sensor and laser scanner method is therefore put forward to overcome these shortcomings. A novel fusion model is proposed based on the basic theory and methods of information fusion, and an optimal measurement result is obtained by fusing the information from the different sensors. After a large number of experiments and performance analysis, it can be concluded that the laser scanner and sonar sensor method with multi-sensor information fusion has higher precision than the sonar-only method, and this holds across different environments.
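
    The abstract does not specify the fusion model; a common baseline for combining independent sonar and laser range readings is inverse-variance (maximum-likelihood) weighting. The sketch below is illustrative only, with hypothetical noise figures:

```python
import numpy as np

def fuse_ranges(measurements, variances):
    """Fuse independent range measurements by inverse-variance weighting.

    measurements : list of range readings (m) from different sensors
    variances    : list of the corresponding sensor noise variances (m^2)
    Returns the fused estimate and its (reduced) variance.
    """
    w = 1.0 / np.asarray(variances, dtype=float)
    fused = np.sum(w * np.asarray(measurements, dtype=float)) / np.sum(w)
    fused_var = 1.0 / np.sum(w)
    return fused, fused_var

# Sonar: 2.10 m with sigma = 0.05 m; laser: 2.02 m with sigma = 0.01 m (assumed values)
r, v = fuse_ranges([2.10, 2.02], [0.05**2, 0.01**2])
```

    The fused estimate is pulled toward the more precise laser reading, and its variance is smaller than either sensor's alone, which is the basic benefit the abstract reports for the fused system.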

  18. A new capacitive long-range displacement nanometer sensor with differential sensing structure based on time-grating

    Science.gov (United States)

    Yu, Zhicheng; Peng, Kai; Liu, Xiaokang; Pu, Hongji; Chen, Ziran

    2018-05-01

    High-precision displacement sensors, which can measure large displacements with nanometer resolution, are key components in many ultra-precision fabrication machines. In this paper, a new capacitive nanometer displacement sensor with a differential sensing structure is proposed for long-range linear displacement measurement based on an approach denoted time-grating. Analytical models established using electric field coupling theory and an area integral method indicate that common-mode interference will result in a first-harmonic error in the measurement results. To reduce the common-mode interference, the proposed sensor design employs a differential sensing structure, which adopts a second group of induction electrodes spatially separated from the first group by a half-pitch length. Experimental results based on a prototype sensor demonstrate that the measurement accuracy and stability of the sensor are substantially improved after adopting the differential sensing structure. Finally, the prototype sensor achieves a measurement accuracy of ±200 nm over its full 200 mm measurement range.

  19. High Dynamic Range Imaging Using Multiple Exposures

    Science.gov (United States)

    Hou, Xinglin; Luo, Haibo; Zhou, Peipei; Zhou, Wei

    2017-06-01

    It is challenging to capture a high-dynamic-range (HDR) scene using a low-dynamic-range (LDR) camera. This paper presents an approach for improving the dynamic range of cameras by using multiple exposures of the same scene taken under different exposure times. First, the camera response function (CRF) is recovered by solving a high-order polynomial in which only the ratios of the exposures are used. Then, the HDR radiance image is reconstructed by a weighted summation of the individual radiance maps. After that, a novel local tone mapping (TM) operator is proposed for display of the HDR radiance image. By solving the high-order polynomial, the CRF can be recovered quickly and easily. Taking local image features and the characteristics of histogram statistics into consideration, the proposed TM operator preserves local details efficiently. Experimental results demonstrate the effectiveness of our method; by comparison, it outperforms other methods in terms of imaging quality.
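
    As a hedged illustration of the reconstruction step, the following NumPy sketch merges CRF-corrected (linear) exposures into a radiance map by weighted summation, using a simple hat weighting that trusts mid-range pixels most; the paper's actual CRF recovery and weighting scheme are not reproduced here:

```python
import numpy as np

def merge_hdr(images, exposure_times):
    """Merge LDR exposures (float arrays in [0, 1]) into an HDR radiance map.

    Assumes a linear (already CRF-corrected) response; each pixel's radiance
    estimate is a weighted average of (pixel_value / exposure_time).
    """
    eps = 1e-6
    num = np.zeros_like(images[0], dtype=float)
    den = np.zeros_like(images[0], dtype=float)
    for img, t in zip(images, exposure_times):
        w = 1.0 - np.abs(2.0 * img - 1.0)   # hat weight: 0 at 0 or 1, peak at 0.5
        num += w * img / t
        den += w
    return num / np.maximum(den, eps)

# Two exposures of the same (constant-radiance) scene: halving the exposure
# time halves the recorded value, and both vote for the same radiance of 0.5.
radiance = merge_hdr([np.array([[0.5]]), np.array([[0.25]])], [1.0, 0.5])
```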

  20. Optical Imaging Sensors and Systems for Homeland Security Applications

    CERN Document Server

    Javidi, Bahram

    2006-01-01

    Optical and photonic systems and devices have significant potential for homeland security. Optical Imaging Sensors and Systems for Homeland Security Applications presents original and significant technical contributions from leaders of industry, government, and academia in the field of optical and photonic sensors, systems and devices for detection, identification, prevention, sensing, security, verification and anti-counterfeiting. The chapters present recent and technically significant results, with ample illustrations, figures, and key references. This book is intended for engineers and scientists in the relevant fields, graduate students, industry managers, university professors, government managers, and policy makers. Advanced Sciences and Technologies for Security Applications focuses on research monographs in the areas of -Recognition and identification (including optical imaging, biometrics, authentication, verification, and smart surveillance systems) -Biological and chemical threat detection (including bios...

  1. Laser Doppler perfusion imaging with a complementary metal oxide semiconductor image sensor

    NARCIS (Netherlands)

    Serov, Alexander; Steenbergen, Wiendelt; de Mul, F.F.M.

    2002-01-01

    We utilized a complementary metal oxide semiconductor video camera for fast flow imaging with the laser Doppler technique. A single sensor is used for both observation of the area of interest and measurements of the interference signal caused by dynamic light scattering from moving particles inside

  2. Real-time biochemical sensor based on Raman scattering with CMOS contact imaging.

    Science.gov (United States)

    Muyun Cao; Yuhua Li; Yadid-Pecht, Orly

    2015-08-01

    This work presents a biochemical sensor based on Raman scattering with complementary metal-oxide-semiconductor (CMOS) contact imaging. This biochemical optical sensor is designed for detecting the concentration of solutions. The system is built with a laser diode, an optical filter, a sample holder and a commercial CMOS sensor, and its output is analyzed by an image-processing program. The system provides instant measurements with a resolution of 0.2 to 0.4 mol. This low-cost, easy-to-operate, small-scale system is useful in chemical, biomedical and environmental labs for quantitative biochemical concentration detection, with results comparable to those of a high-cost commercial spectrometer.

  3. Decoding mobile-phone image sensor rolling shutter effect for visible light communications

    Science.gov (United States)

    Liu, Yang

    2016-01-01

    Optical wireless communication (OWC) using visible light, also known as visible light communication (VLC), has attracted significant attention recently. While traditional OWC and VLC receivers (Rxs) are based on PIN photodiodes or avalanche photodiodes, deploying the complementary metal-oxide-semiconductor (CMOS) image sensor as the VLC Rx is attractive, since nowadays nearly every person has a smart phone with an embedded CMOS image sensor. However, using the CMOS image sensor as the VLC Rx is challenging. In this work, we propose and demonstrate two simple contrast ratio (CR) enhancement schemes to improve the contrast of the rolling shutter pattern, and we describe their processing algorithms in turn. The experimental results show that both proposed CR enhancement schemes can significantly mitigate the high-intensity fluctuations of the rolling shutter pattern and improve the bit-error-rate performance.
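
    To make the rolling-shutter principle concrete, here is a minimal NumPy sketch (not the authors' algorithm) that recovers an on-off-keyed bit stream from the horizontal bands of a single frame by averaging each row and thresholding:

```python
import numpy as np

def decode_rolling_shutter(frame, rows_per_bit):
    """Recover an OOK bit stream from rolling-shutter stripes.

    frame        : 2D grayscale frame; the LED modulation appears as
                   horizontal bright/dark bands (one band per bit period)
    rows_per_bit : number of sensor rows exposed during one bit period
    """
    profile = frame.mean(axis=1)            # average each row -> 1D intensity profile
    thresh = 0.5 * (profile.max() + profile.min())
    n_bits = len(profile) // rows_per_bit
    bits = []
    for k in range(n_bits):
        seg = profile[k * rows_per_bit:(k + 1) * rows_per_bit]
        bits.append(1 if seg.mean() > thresh else 0)
    return bits
```

    The CR enhancement schemes in the abstract address exactly the weak point of this naive decoder: when the bright/dark bands have low contrast, a single global threshold misclassifies bits.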

  4. Flexible Ferroelectric Sensors with Ultrahigh Pressure Sensitivity and Linear Response over Exceptionally Broad Pressure Range.

    Science.gov (United States)

    Lee, Youngoh; Park, Jonghwa; Cho, Soowon; Shin, Young-Eun; Lee, Hochan; Kim, Jinyoung; Myoung, Jinyoung; Cho, Seungse; Kang, Saewon; Baig, Chunggi; Ko, Hyunhyub

    2018-04-24

    Flexible pressure sensors with a high sensitivity over a broad linear range can simplify wearable sensing systems without additional signal processing for the linear output, enabling device miniaturization and low power consumption. Here, we demonstrate a flexible ferroelectric sensor with ultrahigh pressure sensitivity and linear response over an exceptionally broad pressure range based on the material and structural design of ferroelectric composites with a multilayer interlocked microdome geometry. Due to the stress concentration between interlocked microdome arrays and increased contact area in the multilayer design, the flexible ferroelectric sensors could perceive static/dynamic pressure with high sensitivity (47.7 kPa⁻¹, 1.3 Pa minimum detection). In addition, efficient stress distribution between stacked multilayers enables linear sensing over an exceptionally broad pressure range (0.0013-353 kPa) with fast response time (20 ms) and high reliability over 5000 repetitive cycles even at an extremely high pressure of 272 kPa. Our sensor can be used to monitor diverse stimuli from a low to a high pressure range including weak gas flow, acoustic sound, wrist pulse pressure, respiration, and foot pressure with a single device.

  5. High-resolution dynamic pressure sensor array based on piezo-phototronic effect tuned photoluminescence imaging.

    Science.gov (United States)

    Peng, Mingzeng; Li, Zhou; Liu, Caihong; Zheng, Qiang; Shi, Xieqing; Song, Ming; Zhang, Yang; Du, Shiyu; Zhai, Junyi; Wang, Zhong Lin

    2015-03-24

    A high-resolution dynamic tactile/pressure display is indispensable to the comprehensive perception of force/mechanical stimulation in applications such as electronic skin, biomechanical imaging/analysis, and personalized signatures. Here, we present a dynamic pressure sensor array based on pressure/strain-tuned photoluminescence imaging without the need for electricity. Each sensor is a nanopillar that consists of InGaN/GaN multiple quantum wells. Its photoluminescence intensity can be modulated dramatically and linearly by small strains (0-0.15%) owing to the piezo-phototronic effect. The sensor array has a high pixel density of 6350 dpi and an exceptionally small standard deviation of photoluminescence. High-quality tactile/pressure distributions can be recorded in real time by parallel photoluminescence imaging without any cross-talk. The sensor array can be inexpensively fabricated over large areas by semiconductor product lines. The proposed dynamic all-optical pressure imaging, with excellent resolution, high sensitivity, good uniformity, and ultrafast response time, offers a suitable route to smart sensing and micro/nano-opto-electromechanical systems.

  6. High-speed uncooled MWIR hostile fire indication sensor

    Science.gov (United States)

    Zhang, L.; Pantuso, F. P.; Jin, G.; Mazurenko, A.; Erdtmann, M.; Radhakrishnan, S.; Salerno, J.

    2011-06-01

    Hostile fire indication (HFI) systems require high-resolution sensor operation at extremely high speeds to capture hostile fire events, including rocket-propelled grenades, anti-aircraft artillery, heavy machine guns, anti-tank guided missiles and small arms. HFI must also be conducted in a waveband with large available signal and low background clutter, in particular the mid-wavelength infrared (MWIR). The shortcoming of current HFI sensors in the MWIR is that their bandwidth is not sufficient to achieve the required frame rate at high sensor resolution. Furthermore, current HFI sensors require cryogenic cooling, which adds size, weight, and power (SWAP) in aircraft-mounted applications where these factors are at a premium. Based on its uncooled photomechanical infrared imaging technology, Agiltron has developed a low-SWAP, high-speed MWIR HFI sensor that breaks the bandwidth bottleneck typical of current infrared sensors. This is made possible by using a commercial-off-the-shelf, high-performance visible imager as the readout integrated circuit and physically separating this visible imager from the MWIR-optimized photomechanical sensor chip. With this approach, we have achieved high-resolution operation of our MWIR HFI sensor at 1000 fps, which is unprecedented for an uncooled infrared sensor. We have field tested our MWIR HFI sensor for detecting all of the hostile fire events mentioned above at several test ranges under a wide range of environmental conditions. The field testing results will be presented.

  7. Comparison of the performance of intraoral X-ray sensors using objective image quality assessment.

    Science.gov (United States)

    Hellén-Halme, Kristina; Johansson, Curt; Nilsson, Mats

    2016-05-01

    The main aim of this study was to evaluate the performance of 10 individual sensors of the same make, using objective measures of key image quality parameters. A further aim was to compare 8 brands of sensors. Ten new sensors of 8 different models from 6 manufacturers (i.e., 80 sensors) were included in the study. All sensors were exposed in a standardized way using an X-ray tube voltage of 60 kVp and different exposure times. Sensor response, noise, low-contrast resolution, spatial resolution and uniformity were measured. Individual differences between sensors of the same brand were surprisingly large in some cases. There were clear differences in the characteristics of the different brands of sensors. The largest variations were found for individual sensor response for some of the brands studied. Also, noise level and low contrast resolution showed large variations between brands. Sensors, even of the same brand, vary significantly in their quality. It is thus valuable to establish action levels for the acceptance of newly delivered sensors and to use objective image quality control for commissioning purposes and periodic checks to ensure high performance of individual digital sensors. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. An Approach for Unsupervised Change Detection in Multitemporal VHR Images Acquired by Different Multispectral Sensors

    Directory of Open Access Journals (Sweden)

    Yady Tatiana Solano-Correa

    2018-03-01

    This paper proposes an approach for the detection of changes in multitemporal Very High Resolution (VHR) optical images acquired by different multispectral sensors. The proposed approach, which is inspired by a recent framework developed to support the design of change-detection systems for single-sensor VHR remote sensing images, addresses and integrates a strategy to deal effectively with multisensor information, i.e., to perform change detection between VHR images acquired by different multispectral sensors on two dates. This is achieved by defining procedures for the homogenization of radiometric, spectral and geometric image properties. These procedures map the images into a common feature space where the information acquired by different multispectral sensors becomes comparable across time. Although the approach is general, here we optimize it for the detection of changes in vegetation and urban areas by employing features based on linear transformations (Tasseled Caps and Orthogonal Equations), which are shown to be effective for representing the multisensor information in a homogeneous physical way irrespective of the considered sensor. Experiments on multitemporal images acquired by different VHR satellite systems (i.e., QuickBird, WorldView-2 and GeoEye-1) confirm the effectiveness of the proposed approach.

  9. A low-power CMOS integrated sensor for CO2 detection in the percentage range

    NARCIS (Netherlands)

    Humbert, A.; Tuerlings, B.J.; Hoofman, R.J.O.M.; Tan, Z.; Gravesteijn, D.J.; Pertijs, M.A.P.; Bastiaansen, C.W.M.; Soccol, D.

    2013-01-01

    Within the Catrene project PASTEUR, a low-cost, low-power capacitive carbon dioxide sensor has been developed for tracking CO2 concentration in the percentage range. This paper describes this sensor, which operates at room temperature where it exhibits short response times as well as reversible

  10. Study on polarized optical flow algorithm for imaging bionic polarization navigation micro sensor

    Science.gov (United States)

    Guan, Le; Liu, Sheng; Li, Shi-qi; Lin, Wei; Zhai, Li-yuan; Chu, Jin-kui

    2018-05-01

    At present, both point-source and imaging polarization navigation devices can output only angle information, meaning that the velocity of the carrier cannot be extracted directly from the polarization field pattern. Optical flow is an image-based method for calculating the velocity of pixel movement in an image. However, for ordinary optical flow, differences in pixel values, and hence calculation accuracy, are reduced in weak light. Polarization imaging technology can improve both the detection accuracy and the recognition probability of a target, because it acquires extra multi-dimensional polarization information about the target's radiation or reflection. In this paper, combining the polarization imaging technique with the traditional optical flow algorithm, a polarized optical flow algorithm is proposed, and it is verified that the algorithm adapts well to weak light and can broaden the application range of polarization navigation sensors. This research lays the foundation for future day-and-night, all-weather polarization navigation applications.
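
    For illustration, the core of a gradient-based (Lucas-Kanade-style) flow estimate over a single window can be sketched as follows. Applying it to polarization-derived images rather than raw intensity is the idea of the paper, but the code below is a generic textbook formulation, not the authors' implementation:

```python
import numpy as np

def lucas_kanade_window(I0, I1):
    """Estimate one (u, v) translation between two small image windows.

    Solves the least-squares optical-flow system  A [u v]^T = b  built from
    the spatial gradients of I0 and the temporal difference I1 - I0.
    """
    Ix = np.gradient(I0, axis=1)
    Iy = np.gradient(I0, axis=0)
    It = I1 - I0
    A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    return np.linalg.solve(A, b)   # (u, v) in pixels

# A quadratic test pattern shifted by one pixel in x is recovered as u ~ 1, v ~ 0.
y, x = np.mgrid[0:21, 0:21].astype(float)
uv = lucas_kanade_window(0.5 * (x**2 + y**2), 0.5 * ((x - 1)**2 + y**2))
```

    The paper's observation is that in weak light the intensity gradients Ix, Iy collapse, whereas polarization-derived images retain contrast, keeping the system A well conditioned.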

  11. Evaluation of onboard hyperspectral-image compression techniques for a parallel push-broom sensor

    Energy Technology Data Exchange (ETDEWEB)

    Briles, S.

    1996-04-01

    A single hyperspectral imaging sensor can produce frames with spatially continuous rows of differing, but adjacent, spectral wavelengths. If the frame sample rate of the sensor is such that each subsequent hyperspectral frame is spatially shifted by one row, then the sensor can be thought of as a parallel (in wavelength) push-broom sensor. An examination of data compression techniques for such a sensor is presented. The compression techniques are intended to be implemented onboard a space-based platform and to have implementation speeds that match the data rate of the sensor. The data partitions examined extend from operating on a single hyperspectral frame to operating on a data cube comprising the two spatial axes and the spectral axis. The compression algorithms investigated utilize JPEG-based image compression, wavelet-based compression and differential pulse code modulation. Algorithm performance is quantified in terms of root-mean-squared error and root-mean-squared correlation coefficient error. Implementation issues are considered in algorithm development.
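
    Of the three families examined, differential pulse code modulation is the simplest to sketch. The following illustrative example (not the report's implementation) applies row-wise DPCM, exploiting the push-broom property that adjacent rows are highly correlated, and uses an RMSE check of the kind mentioned above to confirm lossless round-tripping:

```python
import numpy as np

def dpcm_encode(frame):
    """Row-wise DPCM for a push-broom frame: keep the first row, then store
    each subsequent row as its difference from the previous row (the predictor)."""
    frame = frame.astype(np.int32)
    residual = np.empty_like(frame)
    residual[0] = frame[0]
    residual[1:] = frame[1:] - frame[:-1]
    return residual

def dpcm_decode(residual):
    # Cumulative sum down the rows inverts the differencing exactly.
    return np.cumsum(residual, axis=0)

def rmse(a, b):
    return float(np.sqrt(np.mean((a.astype(float) - b.astype(float)) ** 2)))
```

    The small-magnitude residuals are what a downstream entropy coder would compress; on its own this stage is lossless, so the reconstruction RMSE is exactly zero.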

  12. Robust Automated Image Co-Registration of Optical Multi-Sensor Time Series Data: Database Generation for Multi-Temporal Landslide Detection

    Directory of Open Access Journals (Sweden)

    Robert Behling

    2014-03-01

    Reliable multi-temporal landslide detection over longer periods of time requires multi-sensor time series data characterized by high internal geometric stability, as well as high relative and absolute accuracy. For this purpose, a new methodology for fully automated co-registration has been developed, allowing efficient and robust spatial alignment of standard orthorectified data products originating from a multitude of optical satellite remote sensing datasets of varying spatial resolution. The correlation-based co-registration uses worldwide available terrain-corrected Landsat Level 1T time series data as the spatial reference, ensuring global applicability. The developed approach has been applied to a multi-sensor time series of 592 remote sensing datasets covering an approximately 12,000 km² area in Southern Kyrgyzstan (Central Asia) strongly affected by landslides. The database contains images acquired during the last 26 years by Landsat ETM, ASTER, SPOT and RapidEye sensors. Analysis of the spatial shifts obtained from co-registration has revealed sensor-specific alignments ranging between 5 m and more than 400 m. Overall accuracy assessment of these alignments has resulted in a high relative image-to-image accuracy of 17 m (RMSE) and a high absolute accuracy of 23 m (RMSE) for the whole co-registered database, making it suitable for multi-temporal landslide detection at a regional scale in Southern Kyrgyzstan.

  13. Honeywell's Compact, Wide-angle Uv-visible Imaging Sensor

    Science.gov (United States)

    Pledger, D.; Billing-Ross, J.

    1993-01-01

    Honeywell is currently developing the Earth Reference Attitude Determination System (ERADS). ERADS determines attitude by imaging the entire Earth's limb and a ring of the adjacent star field in the 2800-3000 Å band of the ultraviolet. This is achieved through the use of a highly nonconventional optical system, an intensifier tube, and a mega-element CCD array. The optics image a 30 degree region in the center of the field and an outer region typically from 128 to 148 degrees, which can be adjusted up to 180 degrees. Because of the design employed, the illumination at the outer edge of the field is only some 15 percent below that at the center, in contrast to the drastic rolloffs encountered in conventional wide-angle sensors. The outer diameter of the sensor is only 3 in; the volume and weight of the entire system, including processor, are 1000 cc and 6 kg, respectively.

  14. UTOFIA: an underwater time-of-flight image acquisition system

    Science.gov (United States)

    Driewer, Adrian; Abrosimov, Igor; Alexander, Jonathan; Benger, Marc; O'Farrell, Marion; Haugholt, Karl Henrik; Softley, Chris; Thielemann, Jens T.; Thorstensen, Jostein; Yates, Chris

    2017-10-01

    This article describes the development of a newly designed time-of-flight (ToF) image sensor for underwater applications. The sensor is developed as part of the project UTOFIA (underwater time-of-flight image acquisition), funded by the EU within the Horizon 2020 framework. This project aims to develop a camera based on range gating that extends the visible range by a factor of 2 to 3 compared with conventional cameras and delivers real-time range information by means of a 3D video stream. The principle of underwater range gating as well as the concept of the image sensor are presented. Based on measurements on a test image sensor, the pixel structure that best suits the requirements has been selected. An extensive underwater characterization demonstrates the capability of distance measurement in turbid environments.
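
    The range-gating principle reduces to round-trip timing at the speed of light in water. A small illustrative helper follows; the constants and function names are assumptions for the sketch, not values from the paper:

```python
# Round-trip geometry for a range-gated underwater camera (illustrative).
C_VACUUM = 299_792_458.0   # speed of light in vacuum, m/s
N_WATER = 1.33             # assumed refractive index of water

def gate_delay_to_distance(delay_s, n=N_WATER):
    """Distance to a target whose echo arrives after `delay_s` seconds:
    light travels out and back at c/n, so d = (c/n) * t / 2."""
    return (C_VACUUM / n) * delay_s / 2.0

def distance_to_gate_delay(d_m, n=N_WATER):
    """Gate delay at which the echo from a target at distance d_m arrives."""
    return 2.0 * d_m * n / C_VACUUM
```

    Opening the sensor's gate only around the delay of interest rejects backscatter from the turbid water in front of the target, which is how range gating extends the visible range.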

  15. The Multidimensional Integrated Intelligent Imaging project (MI-3)

    International Nuclear Information System (INIS)

    Allinson, N.; Anaxagoras, T.; Aveyard, J.; Arvanitis, C.; Bates, R.; Blue, A.; Bohndiek, S.; Cabello, J.; Chen, L.; Chen, S.; Clark, A.; Clayton, C.; Cook, E.; Cossins, A.; Crooks, J.; El-Gomati, M.; Evans, P.M.; Faruqi, W.; French, M.; Gow, J.

    2009-01-01

    MI-3 is a consortium of 11 universities and research laboratories whose mission is to develop complementary metal-oxide semiconductor (CMOS) active pixel sensors (APS) and to apply these sensors to a range of imaging challenges. A range of sensors has been developed: On-Pixel Intelligent CMOS (OPIC)-designed for in-pixel intelligence; FPN-designed to develop novel techniques for reducing fixed pattern noise; HDR-designed to develop novel techniques for increasing dynamic range; Vanilla/PEAPS-with digital and analogue modes and regions of interest, which has also been back-thinned; Large Area Sensor (LAS)-a novel, stitched LAS; and eLeNA-which develops a range of low noise pixels. Applications being developed include autoradiography, a gamma camera system, radiotherapy verification, tissue diffraction imaging, X-ray phase-contrast imaging, DNA sequencing and electron microscopy.

  16. The Multidimensional Integrated Intelligent Imaging project (MI-3)

    Energy Technology Data Exchange (ETDEWEB)

    Allinson, N.; Anaxagoras, T. [Vision and Information Engineering, University of Sheffield (United Kingdom); Aveyard, J. [Laboratory for Environmental Gene Regulation, University of Liverpool (United Kingdom); Arvanitis, C. [Radiation Physics, University College, London (United Kingdom); Bates, R.; Blue, A. [Experimental Particle Physics, University of Glasgow (United Kingdom); Bohndiek, S. [Radiation Physics, University College, London (United Kingdom); Cabello, J. [Centre for Vision, Speech and Signal Processing, University of Surrey, Guildford (United Kingdom); Chen, L. [Electron Optics, Applied Electromagnetics and Electron Optics, University of York (United Kingdom); Chen, S. [MRC Laboratory for Molecular Biology, Cambridge (United Kingdom); Clark, A. [STFC Rutherford Appleton Laboratories (United Kingdom); Clayton, C. [Vision and Information Engineering, University of Sheffield (United Kingdom); Cook, E. [Radiation Physics, University College, London (United Kingdom); Cossins, A. [Laboratory for Environmental Gene Regulation, University of Liverpool (United Kingdom); Crooks, J. [STFC Rutherford Appleton Laboratories (United Kingdom); El-Gomati, M. [Electron Optics, Applied Electromagnetics and Electron Optics, University of York (United Kingdom); Evans, P.M. [Institute of Cancer Research, Sutton, Surrey SM2 5PT (United Kingdom)], E-mail: phil.evans@icr.ac.uk; Faruqi, W. [MRC Laboratory for Molecular Biology, Cambridge (United Kingdom); French, M. [STFC Rutherford Appleton Laboratories (United Kingdom); Gow, J. [Imaging for Space and Terrestrial Applications, Brunel University, London (United Kingdom)] (and others)

    2009-06-01

    MI-3 is a consortium of 11 universities and research laboratories whose mission is to develop complementary metal-oxide semiconductor (CMOS) active pixel sensors (APS) and to apply these sensors to a range of imaging challenges. A range of sensors has been developed: On-Pixel Intelligent CMOS (OPIC)-designed for in-pixel intelligence; FPN-designed to develop novel techniques for reducing fixed pattern noise; HDR-designed to develop novel techniques for increasing dynamic range; Vanilla/PEAPS-with digital and analogue modes and regions of interest, which has also been back-thinned; Large Area Sensor (LAS)-a novel, stitched LAS; and eLeNA-which develops a range of low noise pixels. Applications being developed include autoradiography, a gamma camera system, radiotherapy verification, tissue diffraction imaging, X-ray phase-contrast imaging, DNA sequencing and electron microscopy.

  17. Operational calibration and validation of landsat data continuity mission (LDCM) sensors using the image assessment system (IAS)

    Science.gov (United States)

    Micijevic, Esad; Morfitt, Ron

    2010-01-01

    Systematic characterization and calibration of the Landsat sensors and the assessment of image data quality are performed using the Image Assessment System (IAS). The IAS was first introduced as an element of the Landsat 7 (L7) Enhanced Thematic Mapper Plus (ETM+) ground segment and was recently extended to the Landsat 4 (L4) and 5 (L5) Thematic Mappers (TM) and the Multispectral Scanner (MSS) instruments on board the Landsat 1-5 satellites. In preparation for the Landsat Data Continuity Mission (LDCM), the IAS was developed for the Earth Observer 1 (EO-1) Advanced Land Imager (ALI) with a capability to assess pushbroom sensors. This paper describes the LDCM version of the IAS and how it relates to the unique calibration and validation attributes of its on-board imaging sensors. The LDCM IAS will have to handle a significantly larger number of detectors, and an associated larger database, than previous IAS versions. An additional challenge is that the LDCM IAS must handle data from two sensors, as LDCM products will combine the Operational Land Imager (OLI) and Thermal Infrared Sensor (TIRS) spectral bands.

  18. Thresholded Range Aggregation in Sensor Networks

    DEFF Research Database (Denmark)

    Yiu, Man Lung; Lin, Zhifeng; Mamoulis, Nikos

    2010-01-01

    ...status in each local region. In order to process the (snapshot) TRA query, we develop energy-efficient protocols based on appropriate operators and filters in sensor nodes. The design of these operators and filters is non-trivial, due to the fact that each sensor measurement influences the actual results...

  19. Integrated arrays of air-dielectric graphene transistors as transparent active-matrix pressure sensors for wide pressure ranges.

    Science.gov (United States)

    Shin, Sung-Ho; Ji, Sangyoon; Choi, Seiho; Pyo, Kyoung-Hee; Wan An, Byeong; Park, Jihun; Kim, Joohee; Kim, Ju-Young; Lee, Ki-Suk; Kwon, Soon-Yong; Heo, Jaeyeong; Park, Byong-Guk; Park, Jang-Ung

    2017-03-31

    Integrated electronic circuitries with pressure sensors have been extensively researched as a key component for emerging electronics applications such as electronic skins and health-monitoring devices. Although existing pressure sensors display high sensitivities, they can only be used for specific purposes due to the narrow range of detectable pressure (under tens of kPa) and the difficulty of forming highly integrated arrays. However, it is essential to develop tactile pressure sensors with a wide pressure range in order to use them in diverse application areas including medical diagnosis, robotics and automotive electronics. Here we report an unconventional approach for fabricating fully integrated active-matrix arrays of pressure-sensitive graphene transistors with air-dielectric layers simply formed by folding two opposing panels. This approach realizes a wide tactile pressure sensing range from 250 Pa to ∼3 MPa. Additionally, the fabrication of pressure sensor arrays and transparent pressure sensors is demonstrated, suggesting their substantial promise for next-generation electronics.
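The active-matrix readout pattern underlying such arrays is simple to sketch: rows are selected one at a time through the transistor gates, and all columns of the selected row are sampled in parallel, so an N × M array needs N select cycles instead of N × M dedicated wire pairs. A minimal sketch with an invented 4 × 4 sensor (the pressure pattern is purely illustrative, not from the paper):

```python
import numpy as np

def read_active_matrix(sample_row, n_rows):
    """Scan the array row by row; sample_row(i) returns that row's column readings."""
    return np.vstack([sample_row(i) for i in range(n_rows)])

# Hypothetical 4x4 sensor whose pixel (i, j) reports pressure i + j
# (arbitrary units) when its row is selected.
frame = read_active_matrix(lambda i: np.array([i + j for j in range(4)]), 4)
```

In a real device `sample_row` would drive one gate line high and digitize the column currents; the scan loop itself is unchanged.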

  20. Survey on Ranging Sensors and Cooperative Techniques for Relative Positioning of Vehicles

    Directory of Open Access Journals (Sweden)

    Fabian de Ponte Müller

    2017-01-01

    Full Text Available Future driver assistance systems will rely on accurate, reliable and continuous knowledge of the position of other road participants, including pedestrians, bicycles and other vehicles. The usual approach to meet this requirement is to use on-board ranging sensors inside the vehicle. Radar, laser scanners or vision-based systems can detect objects in their line of sight. In contrast to these non-cooperative ranging sensors, cooperative approaches follow a strategy in which other road participants actively support the estimation of the relative position. The limitations of on-board ranging sensors, namely their restricted detection range and angle of view and their susceptibility to blockage, can be addressed by a cooperative approach based on vehicle-to-vehicle communication. The fusion of both cooperative and non-cooperative strategies seems to offer the largest benefits regarding accuracy, availability and robustness. This survey offers the reader a comprehensive review of different techniques for vehicle relative positioning. The reader will learn the important performance indicators for relative positioning of vehicles, the different technologies that are both commercially available and currently under research, their expected performance and their intrinsic limitations. Moreover, the latest research in the area of vision-based systems for vehicle detection, as well as the latest work on GNSS-based vehicle localization and vehicular communication for relative positioning of vehicles, is reviewed. The survey also covers research on the fusion of cooperative and non-cooperative approaches to increase reliability and availability.
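The benefit of fusing a cooperative and a non-cooperative range estimate can be illustrated with the standard variance-weighted average, which is the scalar core of the Kalman-style fusion such surveys discuss. The numbers below are invented for illustration (a low-noise radar-like range and a noisier GNSS-difference range), not results from the survey:

```python
def fuse(range_a, var_a, range_b, var_b):
    """Variance-weighted fusion of two independent range estimates.
    The fused variance is never larger than either input variance."""
    w = var_b / (var_a + var_b)                  # weight on estimate a
    fused = w * range_a + (1.0 - w) * range_b
    fused_var = (var_a * var_b) / (var_a + var_b)
    return fused, fused_var

# Radar-like measurement (25.0 m, var 0.04 m^2) fused with a
# GNSS-difference estimate (27.0 m, var 1.0 m^2) -- illustrative values.
d, v = fuse(25.0, 0.04, 27.0, 1.0)
```

The fused estimate stays close to the more precise sensor while the cooperative link keeps availability when the line of sight is blocked (in which case only the cooperative term remains).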

  1. Image sensor for testing refractive error of eyes

    Science.gov (United States)

    Li, Xiangning; Chen, Jiabi; Xu, Longyun

    2000-05-01

    It is difficult to detect ametropia and anisometropia in children. An image sensor for testing the refractive error of the eyes does not require the cooperation of children and can be used for general surveys of ametropia and anisometropia in children. In our study, photographs are recorded by a CCD element in a digital form that can be directly processed by a computer. In order to process the image accurately by digital techniques, a formula accounting for the effect of an extended light source and the size of the lens aperture has been deduced, which is more reliable in practice. Computer simulation of the image sensing was performed to verify the accuracy of the results.

  2. CCD image sensor induced error in PIV applications

    Science.gov (United States)

    Legrand, M.; Nogueira, J.; Vargas, A. A.; Ventas, R.; Rodríguez-Hidalgo, M. C.

    2014-06-01

    The readout procedure of charge-coupled device (CCD) cameras is known to generate some image degradation in different scientific imaging fields, especially in astrophysics. In the particular field of particle image velocimetry (PIV), widely extended in the scientific community, the readout procedure of the interline CCD sensor induces a bias in the registered position of particle images. This work proposes simple procedures to predict the magnitude of the associated measurement error. Generally, there are differences in the position bias for the different images of a certain particle at each PIV frame. This leads to a substantial bias error in the PIV velocity measurement (∼0.1 pixels), the order of magnitude that other typical PIV errors, such as peak-locking, may reach. Based on modern CCD technology and architecture, this work offers a description of the readout phenomenon and proposes a model for the CCD readout bias error magnitude. This bias, in turn, generates a velocity measurement bias error when there is an illumination difference between two successive PIV exposures. The model predictions match the experiments performed with two 12-bit-depth interline CCD cameras (MegaPlus ES 4.0/E incorporating the Kodak KAI-4000M CCD sensor with 4 megapixels). For different cameras, only two constant values are needed to fit the proposed calibration model and predict the error from the readout procedure. Tests by different researchers using different cameras would allow verification of the model, which can be used to optimize acquisition setups. Simple procedures to obtain these two calibration values are also described.

  3. CCD image sensor induced error in PIV applications

    International Nuclear Information System (INIS)

    Legrand, M; Nogueira, J; Vargas, A A; Ventas, R; Rodríguez-Hidalgo, M C

    2014-01-01

    The readout procedure of charge-coupled device (CCD) cameras is known to generate some image degradation in different scientific imaging fields, especially in astrophysics. In the particular field of particle image velocimetry (PIV), widely extended in the scientific community, the readout procedure of the interline CCD sensor induces a bias in the registered position of particle images. This work proposes simple procedures to predict the magnitude of the associated measurement error. Generally, there are differences in the position bias for the different images of a certain particle at each PIV frame. This leads to a substantial bias error in the PIV velocity measurement (∼0.1 pixels), the order of magnitude that other typical PIV errors, such as peak-locking, may reach. Based on modern CCD technology and architecture, this work offers a description of the readout phenomenon and proposes a model for the CCD readout bias error magnitude. This bias, in turn, generates a velocity measurement bias error when there is an illumination difference between two successive PIV exposures. The model predictions match the experiments performed with two 12-bit-depth interline CCD cameras (MegaPlus ES 4.0/E incorporating the Kodak KAI-4000M CCD sensor with 4 megapixels). For different cameras, only two constant values are needed to fit the proposed calibration model and predict the error from the readout procedure. Tests by different researchers using different cameras would allow verification of the model, which can be used to optimize acquisition setups. Simple procedures to obtain these two calibration values are also described. (paper)

  4. Image interpolation used in three-dimensional range data compression.

    Science.gov (United States)

    Zhang, Shaoze; Zhang, Jianqi; Huang, Xi; Liu, Delian

    2016-05-20

    Advances in the field of three-dimensional (3D) scanning have made the acquisition of 3D range data increasingly easy. However, with the large size of 3D range data comes the challenge of storing and transmitting it. To address this challenge, this paper presents a framework to further compress 3D range data using image interpolation. We first use a virtual fringe-projection system to store the 3D range data as images, and then apply an interpolation algorithm to the images to reduce their resolution and thereby the data size. When the 3D range data are needed, the low-resolution image is scaled up to its original resolution by applying the interpolation algorithm, the scaled-up image is decoded, and the 3D range data are recovered according to the decoded result. Experimental results show that the proposed method can further reduce the data size while maintaining a low error rate.
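The encode/decode loop described above (downsample to store, interpolate back to recover) can be sketched with plain bilinear interpolation on a synthetic smooth range map; this is a generic illustration of the interpolation step, not the paper's fringe-projection encoding:

```python
import numpy as np

def bilinear_resize(img, out_h, out_w):
    """Minimal bilinear interpolation (no external imaging libraries)."""
    h, w = img.shape
    ys = np.linspace(0, h - 1, out_h)
    xs = np.linspace(0, w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    fy = (ys - y0)[:, None]
    fx = (xs - x0)[None, :]
    top = img[np.ix_(y0, x0)] * (1 - fx) + img[np.ix_(y0, x1)] * fx
    bot = img[np.ix_(y1, x0)] * (1 - fx) + img[np.ix_(y1, x1)] * fx
    return top * (1 - fy) + bot * fy

# "Encode": halve the resolution per axis (4x fewer samples to store);
# "decode": scale back up and measure the residual error.
x = np.linspace(0, np.pi, 64)
depth = np.sin(x)[:, None] * np.sin(x)[None, :]   # smooth synthetic range map
small = bilinear_resize(depth, 32, 32)
restored = bilinear_resize(small, 64, 64)
rms = np.sqrt(np.mean((depth - restored) ** 2))
```

For smooth surfaces the residual is small, which is why interpolation-based resolution reduction is an effective second compression stage; sharp depth discontinuities are where the error concentrates.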

  5. Integration of piezo-capacitive and piezo-electric nanoweb based pressure sensors for imaging of static and dynamic pressure distribution.

    Science.gov (United States)

    Jeong, Y J; Oh, T I; Woo, E J; Kim, K J

    2017-07-01

    Recently, highly flexible and soft pressure-distribution imaging sensors have been in great demand for tactile sensing, gait analysis, ubiquitous life-care based on activity recognition, and therapeutics. In this study, we integrate piezo-capacitive and piezo-electric nanowebs with conductive fabric sheets for detecting static and dynamic pressure distributions over a large sensing area. Electrical impedance tomography (EIT) and electric source imaging are applied to reconstruct pressure distribution images from current-voltage data measured on the boundary of the hybrid fabric sensor. We evaluated the piezo-capacitive nanoweb sensor, the piezo-electric nanoweb sensor, and the hybrid fabric sensor. The results show the feasibility of static and dynamic pressure distribution imaging from the boundary measurements of the fabric sensors.

  6. Infrared Range Sensor Array for 3D Sensing in Robotic Applications

    Directory of Open Access Journals (Sweden)

    Yongtae Do

    2013-04-01

    Full Text Available This paper presents the design and testing of multiple infrared range detectors arranged in a two-dimensional (2D) array. The proposed system can collect sparse three-dimensional (3D) data of objects and surroundings for robotics applications. Three kinds of tasks are considered for the system: detecting obstacles that lie ahead of a mobile robot, sensing the ground profile for the safe navigation of a mobile robot, and sensing the shape and position of an object on a conveyor belt for pickup by a robot manipulator. The developed system is potentially a simple alternative to high-resolution (and expensive) 3D sensing systems, such as stereo cameras or laser scanners. In addition, the system can provide shape information about target objects and surroundings that cannot be obtained using simple ultrasonic sensors. Laboratory prototypes of the system were built with nine infrared range sensors arranged in a 3×3 array, and test results confirmed the validity of the system.
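Turning the nine range readings into sparse 3D data is a matter of known geometry: each sensor reports a range r along its fixed unit ray direction d, giving the point p = r·d in the sensor frame. A sketch under an assumed geometry (a 3×3 grid of rays at ±10° elevation/azimuth offsets; the paper does not specify these angles):

```python
import numpy as np

def ray_directions(angles_deg):
    """Unit rays for a 3x3 grid of (elevation, azimuth) offsets in degrees."""
    dirs = []
    for el in angles_deg:            # rows: elevation offsets
        for az in angles_deg:        # cols: azimuth offsets
            e, a = np.radians(el), np.radians(az)
            d = np.array([np.cos(e) * np.sin(a),   # x: right
                          np.sin(e),               # y: up
                          np.cos(e) * np.cos(a)])  # z: forward
            dirs.append(d / np.linalg.norm(d))
    return np.array(dirs)            # shape (9, 3)

def to_points(ranges, dirs):
    """Sparse 3D point cloud from the nine range readings."""
    return ranges[:, None] * dirs

dirs = ray_directions([-10.0, 0.0, 10.0])
ranges = np.full(9, 0.5)             # all sensors report 0.5 m (illustrative)
cloud = to_points(ranges, dirs)
```

A flat obstacle ahead yields nine nearly coplanar points; a drop-off in the ground profile shows up as missing or much longer ranges on the downward-pointing rays.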

  7. A new range-free localisation in wireless sensor networks using support vector machine

    Science.gov (United States)

    Wang, Zengfeng; Zhang, Hao; Lu, Tingting; Sun, Yujuan; Liu, Xing

    2018-02-01

    Location information of sensor nodes is of vital importance for most applications in wireless sensor networks (WSNs). This paper proposes a new range-free localisation algorithm using support vector machine (SVM) and polar coordinate system (PCS), LSVM-PCS. In LSVM-PCS, two sets of classes are first constructed based on sensor nodes' polar coordinates. Using the boundaries of the defined classes, the operation region of WSN field is partitioned into a finite number of polar grids. Each sensor node can be localised into one of the polar grids by executing two localisation algorithms that are developed on the basis of SVM classification. The centre of the resident polar grid is then estimated as the location of the sensor node. In addition, a two-hop mass-spring optimisation (THMSO) is also proposed to further improve the localisation accuracy of LSVM-PCS. In THMSO, both neighbourhood information and non-neighbourhood information are used to refine the sensor node location. The results obtained verify that the proposed algorithm provides a significant improvement over existing localisation methods.
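The two-classifier idea (one classifier for the polar ring, one for the sector, with the node placed at the center of its predicted polar grid cell) can be sketched with scikit-learn's `SVC`. Everything below is an assumed toy setup, not the paper's LSVM-PCS: distances to four anchors stand in for the connectivity features, and the grid dimensions are invented:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
R_MAX, N_RINGS, N_SECTORS = 10.0, 5, 8
anchors = np.array([[0, 0], [10, 0], [0, 10], [10, 10]], dtype=float)

# Synthetic nodes uniform over a disk, labeled by their polar ring and sector.
n = 800
r = R_MAX * np.sqrt(rng.random(n))
th = 2 * np.pi * rng.random(n)
pts = np.c_[r * np.cos(th), r * np.sin(th)]
feats = np.linalg.norm(pts[:, None, :] - anchors[None, :, :], axis=2)
ring = np.minimum((r / R_MAX * N_RINGS).astype(int), N_RINGS - 1)
sector = (th / (2 * np.pi) * N_SECTORS).astype(int) % N_SECTORS

ring_svm = SVC().fit(feats[:600], ring[:600])
sector_svm = SVC().fit(feats[:600], sector[:600])

def localise(f):
    """Place the node at the center of its predicted polar grid cell."""
    i, j = ring_svm.predict([f])[0], sector_svm.predict([f])[0]
    rc = (i + 0.5) * R_MAX / N_RINGS
    tc = (j + 0.5) * 2 * np.pi / N_SECTORS
    return np.array([rc * np.cos(tc), rc * np.sin(tc)])

err = np.mean([np.linalg.norm(localise(f) - p)
               for f, p in zip(feats[600:], pts[600:])])
```

The residual error is bounded by the polar cell size, which is what a refinement step such as the paper's THMSO is there to reduce.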

  8. Characteristics of different frequency ranges in scanning electron microscope images

    International Nuclear Information System (INIS)

    Sim, K. S.; Nia, M. E.; Tan, T. L.; Tso, C. P.; Ee, C. S.

    2015-01-01

    We demonstrate a new approach to characterize the frequency range in general scanning electron microscope (SEM) images. First, pure frequency images are generated from low frequency to high frequency, and then, the magnification of each type of frequency image is implemented. By comparing the edge percentage of the SEM image to the self-generated frequency images, we can define the frequency ranges of the SEM images. Characterization of frequency ranges of SEM images benefits further processing and analysis of those SEM images, such as in noise filtering and contrast enhancement.

  9. Characteristics of different frequency ranges in scanning electron microscope images

    Energy Technology Data Exchange (ETDEWEB)

    Sim, K. S., E-mail: kssim@mmu.edu.my; Nia, M. E.; Tan, T. L.; Tso, C. P.; Ee, C. S. [Faculty of Engineering and Technology, Multimedia University, 75450 Melaka (Malaysia)

    2015-07-22

    We demonstrate a new approach to characterize the frequency range in general scanning electron microscope (SEM) images. First, pure frequency images are generated from low frequency to high frequency, and then, the magnification of each type of frequency image is implemented. By comparing the edge percentage of the SEM image to the self-generated frequency images, we can define the frequency ranges of the SEM images. Characterization of frequency ranges of SEM images benefits further processing and analysis of those SEM images, such as in noise filtering and contrast enhancement.
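The comparison idea in this abstract can be sketched numerically: generate pure spatial-frequency images, measure each one's "edge percentage" (fraction of pixels whose gradient magnitude exceeds a threshold), and bracket a test image between the references. The frequencies and threshold below are illustrative choices, not the paper's calibrated values:

```python
import numpy as np

def edge_percentage(img, thresh=0.2):
    """Fraction of pixels whose gradient magnitude exceeds a fixed threshold."""
    gy, gx = np.gradient(img.astype(float))
    return np.mean(np.hypot(gx, gy) > thresh)

def pure_frequency_image(cycles, size=128):
    """Horizontal sinusoid with the given number of cycles across the image."""
    x = np.linspace(0, 2 * np.pi * cycles, size)
    return np.sin(x)[None, :].repeat(size, axis=0)

# Higher spatial frequency -> steeper gradients -> larger edge percentage,
# so the reference values increase monotonically with frequency.
refs = {c: edge_percentage(pure_frequency_image(c)) for c in (2, 8, 32)}
```

An SEM image's edge percentage can then be located between two reference values to assign it a frequency range, e.g. for choosing a noise filter bandwidth.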

  10. Distributed Algorithm for Voronoi Partition of Wireless Sensor Networks with a Limited Sensing Range.

    Science.gov (United States)

    He, Chenlong; Feng, Zuren; Ren, Zhigang

    2018-02-03

    For Wireless Sensor Networks (WSNs), the Voronoi partition of a region is a challenging problem owing to the limited sensing ability of each sensor and the distributed organization of the network. In this paper, an algorithm is proposed for each sensor having a limited sensing range to compute its limited Voronoi cell autonomously, so that the limited Voronoi partition of the entire WSN is generated in a distributed manner. Inspired by Graham's Scan (GS) algorithm used to compute the convex hull of a point set, the limited Voronoi cell of each sensor is obtained by sequentially scanning two consecutive bisectors between the sensor and its neighbors. The proposed algorithm, called the Boundary Scan (BS) algorithm, has a lower computational complexity than the existing Range-Constrained Voronoi Cell (RCVC) algorithm and reaches the lower bound of the computational complexity of algorithms for this kind of problem. Moreover, it also improves the time efficiency of a key step in the Adjust-Sensing-Radius (ASR) algorithm used to compute the exact Voronoi cell. Extensive numerical simulations demonstrate the correctness and effectiveness of the BS algorithm. The distributed realization of the BS algorithm, combined with a localization algorithm in WSNs, demonstrates its suitability for WSNs.
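The geometric object being computed can be sketched directly (this is the definition of the limited Voronoi cell, not the paper's BS algorithm): a sensor's cell is its sensing disk clipped by the perpendicular-bisector half-plane toward each neighbor. Below the disk is approximated by a polygon and clipped with Sutherland-Hodgman:

```python
import numpy as np

def clip_halfplane(poly, a, b):
    """Keep the part of polygon `poly` satisfying a . p <= b (Sutherland-Hodgman)."""
    out = []
    for i in range(len(poly)):
        p, q = poly[i], poly[(i + 1) % len(poly)]
        pin, qin = np.dot(a, p) <= b, np.dot(a, q) <= b
        if pin:
            out.append(p)
        if pin != qin:                       # edge crosses the boundary line
            t = (b - np.dot(a, p)) / np.dot(a, q - p)
            out.append(p + t * (q - p))
    return out

def limited_voronoi_cell(s, neighbors, radius, n_arc=64):
    """Sensing disk of sensor s, clipped by the bisector toward each neighbor."""
    th = np.linspace(0, 2 * np.pi, n_arc, endpoint=False)
    poly = [s + radius * np.array([np.cos(t), np.sin(t)]) for t in th]
    for nb in neighbors:
        d = nb - s                           # bisector: d . p <= d . midpoint
        poly = clip_halfplane(poly, d, np.dot(d, (s + nb) / 2.0))
    return np.array(poly)

s = np.array([0.0, 0.0])
cell = limited_voronoi_cell(s, [np.array([2.0, 0.0])], radius=2.0)
```

Clipping by each bisector in turn is O(n) per neighbor on the polygon size; the paper's contribution is an ordering of the bisector scans that lowers the overall complexity.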

  11. Analysis on the Effect of Sensor Views in Image Reconstruction Produced by Optical Tomography System Using Charge-Coupled Device.

    Science.gov (United States)

    Jamaludin, Juliza; Rahim, Ruzairi Abdul; Fazul Rahiman, Mohd Hafiz; Mohd Rohani, Jemmy

    2018-04-01

    Optical tomography (OPT) is a method to capture a cross-sectional image based on data obtained by sensors distributed around the periphery of the analyzed system. The system is based on measuring the final attenuation or absorption of light after it crosses the measured objects. The number of sensor views affects the results of image reconstruction: a higher number of sensor views per projection gives higher image quality. This research presents an application of a charge-coupled device linear sensor and a laser diode in an OPT system. Experiments in detecting solid and transparent objects in crystal-clear water were conducted. Two numbers of sensor views, 160 and 320, are evaluated for reconstructing the images. The image reconstruction algorithm used was the filtered linear back-projection algorithm. Analysis comparing the simulated and experimental image results shows that 320 views give a smaller area error than 160 views, suggesting that a higher number of views results in higher-resolution image reconstruction.
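Linear back projection, the core of the reconstruction used above, smears each measured ray-sum back along its ray and normalises by how many rays cross each pixel. A deliberately tiny sketch with only two orthogonal parallel-ray projections (a toy geometry, far coarser than the paper's 160/320 views, and without the filtering step):

```python
import numpy as np

N = 16
phantom = np.zeros((N, N))
phantom[6:10, 6:10] = 1.0          # opaque block to be located

# Two orthogonal projections: column sums ("vertical rays") and row sums.
proj_cols = phantom.sum(axis=0)
proj_rows = phantom.sum(axis=1)

# Back projection: smear each measurement along its ray, then normalise.
recon = np.zeros((N, N))
recon += proj_cols[None, :]        # smear column sums down each column
recon += proj_rows[:, None]        # smear row sums across each row
recon /= 2.0                       # each pixel was crossed by two rays
recon /= recon.max()               # normalise for display
```

The block's location is recovered but blurred along the ray paths, which is exactly why more views (and filtering) sharpen the reconstruction, consistent with the 320-view result in the abstract.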

  12. Temporal Noise Analysis of Charge-Domain Sampling Readout Circuits for CMOS Image Sensors

    Directory of Open Access Journals (Sweden)

    Xiaoliang Ge

    2018-02-01

    Full Text Available This paper presents a temporal noise analysis of charge-domain sampling readout circuits for Complementary Metal-Oxide Semiconductor (CMOS) image sensors. In order to address the trade-off between low input-referred noise and high dynamic range, a Gm-cell-based pixel together with a charge-domain correlated double sampling (CDS) technique has been proposed to provide a way to efficiently embed a tunable conversion gain along the readout path. Such a readout topology, however, operates with non-stationary large-signal behavior, and the statistical properties of its temporal noise are a function of time. Conventional noise analysis methods for CMOS image sensors are based on steady-state signal models, and therefore cannot be readily applied to Gm-cell-based pixels. In this paper, we develop analysis models for both thermal noise and flicker noise in Gm-cell-based pixels by employing the time-domain linear analysis approach and non-stationary noise analysis theory, which help to quantitatively evaluate the temporal noise characteristics of Gm-cell-based pixels. Both models were numerically computed in MATLAB using design parameters of a prototype chip, and compared with simulation and experimental results. The good agreement between the theoretical and measured results verifies the effectiveness of the proposed noise analysis models.

  13. Origin of high photoconductive gain in fully transparent heterojunction nanocrystalline oxide image sensors and interconnects.

    Science.gov (United States)

    Jeon, Sanghun; Song, Ihun; Lee, Sungsik; Ryu, Byungki; Ahn, Seung-Eon; Lee, Eunha; Kim, Young; Nathan, Arokia; Robertson, John; Chung, U-In

    2014-11-05

    A technique for invisible image capture using a photosensor array based on transparent conducting oxide semiconductor thin-film transistors and transparent interconnection technologies is presented. A transparent conducting layer is employed for the sensor electrodes as well as for interconnection in the array, providing about 80% transmittance at visible-light wavelengths. The phototransistor is a Hf-In-Zn-O/In-Zn-O heterostructure yielding high quantum efficiency in the visible range.

  14. Particle detection and classification using commercial off the shelf CMOS image sensors

    Energy Technology Data Exchange (ETDEWEB)

    Pérez, Martín [Instituto Balseiro, Av. Bustillo 9500, Bariloche, 8400 (Argentina); Comisión Nacional de Energía Atómica (CNEA), Centro Atómico Bariloche, Av. Bustillo 9500, Bariloche 8400 (Argentina); Consejo Nacional de Investigaciones Científicas y Técnicas, Centro Atómico Bariloche, Av. Bustillo 9500, 8400 Bariloche (Argentina); Lipovetzky, Jose, E-mail: lipo@cab.cnea.gov.ar [Instituto Balseiro, Av. Bustillo 9500, Bariloche, 8400 (Argentina); Comisión Nacional de Energía Atómica (CNEA), Centro Atómico Bariloche, Av. Bustillo 9500, Bariloche 8400 (Argentina); Consejo Nacional de Investigaciones Científicas y Técnicas, Centro Atómico Bariloche, Av. Bustillo 9500, 8400 Bariloche (Argentina); Sofo Haro, Miguel; Sidelnik, Iván; Blostein, Juan Jerónimo; Alcalde Bessia, Fabricio; Berisso, Mariano Gómez [Instituto Balseiro, Av. Bustillo 9500, Bariloche, 8400 (Argentina); Consejo Nacional de Investigaciones Científicas y Técnicas, Centro Atómico Bariloche, Av. Bustillo 9500, 8400 Bariloche (Argentina)

    2016-08-11

    In this paper we analyse the response of two different commercial off-the-shelf CMOS image sensors as particle detectors. The sensors were irradiated using X-ray photons, gamma photons, beta particles and alpha particles from diverse sources. The amount of charge produced by different particles and the size of the spot registered on the sensor are compared and analysed by an algorithm that classifies them. For a known incident energy spectrum, the employed sensors provide a dose resolution below one microgray, showing their potential in radioprotection, area monitoring and medical applications.
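The classification step described above can be sketched as connected-component analysis: particles are separated by the size and total charge of the pixel cluster they leave on the sensor. The thresholds and category rules below are invented for illustration (real values come from source calibration), but the cluster-feature approach matches the abstract:

```python
import numpy as np
from scipy import ndimage

def classify_spots(frame, noise_floor=50):
    """Label pixel clusters above the noise floor and classify by size/charge."""
    mask = frame > noise_floor
    labels, n = ndimage.label(mask)                  # connected clusters
    spots = []
    for i in range(1, n + 1):
        pix = frame[labels == i]
        size, charge = pix.size, pix.sum()
        if size <= 2:
            kind = "x/gamma photon"                  # small, point-like spot
        elif charge > 5000:
            kind = "alpha"                           # large, heavily ionising blob
        else:
            kind = "beta"                            # extended track
        spots.append((size, charge, kind))
    return spots

# Synthetic frame: one point-like hit and one 16-pixel blob of total charge 6400.
frame = np.zeros((32, 32))
frame[4, 4] = 300
frame[10:14, 10:14] = 400
spots = classify_spots(frame)
```

Summing the charge of all classified clusters over a known spectrum is what turns such a sensor into a dosimeter.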

  15. A widefield fluorescence microscope with a linear image sensor for image cytometry of biospecimens: Considerations for image quality optimization

    Energy Technology Data Exchange (ETDEWEB)

    Hutcheson, Joshua A.; Majid, Aneeka A.; Powless, Amy J.; Muldoon, Timothy J., E-mail: tmuldoon@uark.edu [Department of Biomedical Engineering, University of Arkansas, 120 Engineering Hall, Fayetteville, Arkansas 72701 (United States)

    2015-09-15

    Linear image sensors have been widely used in numerous research and industry applications to provide continuous imaging of moving objects. Here, we present a widefield fluorescence microscope with a linear image sensor used to image translating objects for image cytometry. First, a calibration curve was characterized for a custom microfluidic chamber over a span of volumetric pump rates. Image data were also acquired using 15 μm fluorescent polystyrene spheres on a slide with a motorized translation stage in order to match the linear translation speed with the line exposure period and thereby preserve the image aspect ratio. Aspect ratios were then calculated after imaging to ensure quality control of the image data. Fluorescent beads were imaged in suspension flowing through the microfluidic chamber, pumped by a mechanical syringe pump at 16 μl min⁻¹ with a line exposure period of 150 μs. The line period was selected to acquire images of fluorescent beads with a 40 dB signal-to-background ratio. A motorized translation stage was then used to transport conventional glass slides of stained cellular biospecimens. Whole blood collected from healthy volunteers was stained with 0.02% (w/v) proflavine hemisulfate and imaged to highlight leukocyte morphology with a 1.56 mm × 1.28 mm field of view (1540 ms total acquisition time). Oral squamous cells were also collected from healthy volunteers and stained with 0.01% (w/v) proflavine hemisulfate to demonstrate quantifiable subcellular features and an average nuclear-to-cytoplasmic ratio of 0.03 (n = 75), with a resolution of 0.31 μm pixel⁻¹.
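The aspect-ratio matching described above reduces to one constraint: the object must advance by exactly one pixel footprint (pixel size divided by magnification) per line period. A small sketch; the pixel size and magnification below are assumed values, only the 150 μs line period echoes the abstract:

```python
def required_speed(pixel_size_um, magnification, line_period_s):
    """Object-plane speed (um/s) that preserves a 1:1 aspect ratio."""
    footprint = pixel_size_um / magnification     # pixel size at the object plane
    return footprint / line_period_s

def aspect_ratio(speed_um_s, pixel_size_um, magnification, line_period_s):
    """Ratio of required to actual speed; > 1 means the image is stretched."""
    return required_speed(pixel_size_um, magnification, line_period_s) / speed_um_s

# Assumed 7 um sensor pixels behind a 10x objective, 150 us line period.
v = required_speed(pixel_size_um=7.0, magnification=10.0, line_period_s=150e-6)
```

Computing the aspect ratio of imaged calibration beads after acquisition, as the authors do, checks this constraint against the actual stage or flow speed.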

  16. Multispectral Imaging in Cultural Heritage Conservation

    Science.gov (United States)

    Del Pozo, S.; Rodríguez-Gonzálvez, P.; Sánchez-Aparicio, L. J.; Muñoz-Nieto, A.; Hernández-López, D.; Felipe-García, B.; González-Aguilera, D.

    2017-08-01

    This paper sums up the main contribution derived from the thesis entitled "Multispectral imaging for the analysis of materials and pathologies in civil engineering, constructions and natural spaces" awarded by CIPA-ICOMOS for its connection with the preservation of Cultural Heritage. This thesis is framed within close-range remote sensing approaches by the fusion of sensors operating in the optical domain (visible to shortwave infrared spectrum). In the field of heritage preservation, multispectral imaging is a suitable technique due to its non-destructive nature and its versatility. It combines imaging and spectroscopy to analyse materials and land covers and enables the use of a variety of different geomatic sensors for this purpose. These sensors collect both spatial and spectral information for a given scenario and a specific spectral range, so that their smaller storage units save the spectral properties of the radiation reflected by the surface of interest. The main goal of this research work is to characterise different construction materials as well as the main pathologies of Cultural Heritage elements by combining active and passive sensors recording data in different ranges. Conclusions about the suitability of each type of sensor and spectral range are drawn in relation to each particular case study and damage. It should be emphasised that results are not limited to images, since 3D intensity data from laser scanners can be integrated with 2D data from passive sensors obtaining high quality products due to the added value that metric brings to multispectral images.

  17. MULTISPECTRAL IMAGING IN CULTURAL HERITAGE CONSERVATION

    Directory of Open Access Journals (Sweden)

    S. Del Pozo

    2017-08-01

    Full Text Available This paper sums up the main contribution derived from the thesis entitled "Multispectral imaging for the analysis of materials and pathologies in civil engineering, constructions and natural spaces" awarded by CIPA-ICOMOS for its connection with the preservation of Cultural Heritage. This thesis is framed within close-range remote sensing approaches by the fusion of sensors operating in the optical domain (visible to shortwave infrared spectrum). In the field of heritage preservation, multispectral imaging is a suitable technique due to its non-destructive nature and its versatility. It combines imaging and spectroscopy to analyse materials and land covers and enables the use of a variety of different geomatic sensors for this purpose. These sensors collect both spatial and spectral information for a given scenario and a specific spectral range, so that their smaller storage units save the spectral properties of the radiation reflected by the surface of interest. The main goal of this research work is to characterise different construction materials as well as the main pathologies of Cultural Heritage elements by combining active and passive sensors recording data in different ranges. Conclusions about the suitability of each type of sensor and spectral range are drawn in relation to each particular case study and damage. It should be emphasised that results are not limited to images, since 3D intensity data from laser scanners can be integrated with 2D data from passive sensors obtaining high quality products due to the added value that metric brings to multispectral images.

  18. Nanoimprinted distributed feedback dye laser sensor for real-time imaging of small molecule diffusion

    DEFF Research Database (Denmark)

    Vannahme, Christoph; Dufva, Martin; Kristensen, Anders

    2014-01-01

    Label-free imaging is a promising tool for the study of biological processes such as cell adhesion and small-molecule signaling. In order to image in two spatial dimensions, current solutions require motorized stages, which results in low imaging frame rates. Here, a highly sensitive distributed feedback (DFB) dye laser sensor for real-time label-free imaging without any moving parts, enabling a frame rate of 12 Hz, is presented. The presence of molecules on the laser surface results in a wavelength shift, which is used as the sensor signal. The unique DFB laser structure comprises several areas...

  19. PCA-based spatially adaptive denoising of CFA images for single-sensor digital cameras.

    Science.gov (United States)

    Zheng, Lei; Lukac, Rastislav; Wu, Xiaolin; Zhang, David

    2009-04-01

    Single-sensor digital color cameras use a process called color demosaicking to produce full color images from the data captured by a color filter array (CFA). The quality of demosaicked images is degraded by the sensor noise introduced during the image acquisition process. The conventional solution to combating CFA sensor noise is demosaicking first, followed by a separate denoising process. This strategy generates many noise-caused color artifacts in the demosaicking step, which are hard to remove in the subsequent denoising step. Few denoising schemes that work directly on the CFA images have been presented because of the difficulties arising from the red, green and blue interlaced mosaic pattern, yet a well-designed "denoising first and demosaicking later" scheme can have advantages such as fewer noise-caused color artifacts and cost-effective implementation. This paper presents a principal component analysis (PCA)-based spatially adaptive denoising algorithm, which works directly on the CFA data using a supporting window to analyze the local image statistics. By exploiting the spatial and spectral correlations existing in the CFA image, the proposed method can effectively suppress noise while preserving color edges and details. Experiments using both simulated and real CFA images indicate that the proposed scheme outperforms many existing approaches, including sophisticated demosaicking and denoising schemes, in terms of both objective measurement and visual evaluation.
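
    The PCA-domain idea can be sketched in a few lines. The following is a minimal illustration, not the published algorithm: it denoises a grayscale image by projecting fixed-size patches onto their leading principal components, whereas the paper works directly on raw CFA data and additionally exploits spectral correlations. Function and parameter names are mine.

```python
import numpy as np

def pca_denoise_patches(noisy, patch=4, keep=4):
    """Denoise a grayscale image via PCA on non-overlapping patches.

    Patches are projected onto their `keep` leading principal
    components; discarding the rest suppresses noise, which is spread
    evenly over all components, while retaining structured signal.
    """
    h, w = noisy.shape
    h, w = h - h % patch, w - w % patch
    blocks = (noisy[:h, :w]
              .reshape(h // patch, patch, w // patch, patch)
              .swapaxes(1, 2)
              .reshape(-1, patch * patch))
    mean = blocks.mean(axis=0)
    centered = blocks - mean
    cov = centered.T @ centered / len(centered)
    _, eigvecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
    basis = eigvecs[:, -keep:]            # leading principal components
    denoised = (centered @ basis) @ basis.T + mean
    return (denoised.reshape(h // patch, w // patch, patch, patch)
            .swapaxes(1, 2)
            .reshape(h, w))
```

    The published method also adapts the analysis window spatially; this sketch uses a fixed global patch basis for brevity.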

  20. Class Energy Image Analysis for Video Sensor-Based Gait Recognition: A Review

    Directory of Open Access Journals (Sweden)

    Zhuowen Lv

    2015-01-01

    Full Text Available Gait is a unique biometric feature that is perceptible at larger distances, and the gait representation approach plays a key role in a video sensor-based gait recognition system. Class Energy Image is one of the most important appearance-based gait representation methods and has received much attention. In this paper, we reviewed the expressions and meanings of various Class Energy Image approaches, and analyzed the information contained in Class Energy Images. Furthermore, the effectiveness and robustness of these approaches were compared on the benchmark gait databases. We outlined the research challenges and provided promising future directions for the field. To the best of our knowledge, this is the first review that focuses on Class Energy Image. It can provide a useful reference in the literature of video sensor-based gait representation approaches.

  1. Sensitivity Range Analysis of Infrared (IR) Transmitter and Receiver Sensor to Detect Sample Position in Automatic Sample Changer

    International Nuclear Information System (INIS)

    Syirrazie Che Soh; Nolida Yussup; Nur Aira Abdul Rahman; Maslina Ibrahim

    2016-01-01

    The sensitivity range of the IR transmitter and receiver sensor influences the effectiveness of the sensor in detecting the position of a sample. The purpose of this analysis is to determine a suitable design and specification for the electronic driver of the sensor so as to achieve the sensitivity range required for the operation. The activities related to this analysis cover the electronic design concept and specification, calibration of the design specification, and evaluation of the design specification for the required application. (author)

  2. A Sensitive Dynamic and Active Pixel Vision Sensor for Color or Neural Imaging Applications.

    Science.gov (United States)

    Moeys, Diederik Paul; Corradi, Federico; Li, Chenghan; Bamford, Simeon A; Longinotti, Luca; Voigt, Fabian F; Berry, Stewart; Taverni, Gemma; Helmchen, Fritjof; Delbruck, Tobi

    2018-02-01

    Applications requiring detection of small visual contrast require high sensitivity. Event cameras can provide higher dynamic range (DR) and reduce data rate and latency, but most existing event cameras have limited sensitivity. This paper presents the results of a 180-nm Towerjazz CIS process vision sensor called SDAVIS192. It outputs temporal contrast dynamic vision sensor (DVS) events and conventional active pixel sensor frames. The SDAVIS192 improves on previous DAVIS sensors with higher sensitivity for temporal contrast. The temporal contrast thresholds can be set down to 1% for negative changes in logarithmic intensity (OFF events) and down to 3.5% for positive changes (ON events). The achievement is possible through the adoption of an in-pixel preamplification stage. This preamplifier reduces the effective intrascene DR of the sensor (70 dB for OFF and 50 dB for ON events), but an automated operating region control allows up to at least 110-dB DR for OFF events. A second contribution of this paper is the development of a characterization methodology for measuring DVS event detection thresholds by incorporating a measure of signal-to-noise ratio (SNR). At an average SNR of 30 dB, the DVS temporal contrast threshold fixed pattern noise is measured to be 0.3%-0.8% temporal contrast. Results comparing monochrome and RGBW color filter array DVS events are presented. The higher sensitivity of SDAVIS192 makes this sensor potentially useful for calcium imaging, as shown in a recording from cultured neurons expressing the calcium-sensitive green fluorescent protein GCaMP6f.
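
    The event-generation principle of a DVS pixel can be sketched with an idealized model: an event fires whenever the log intensity moves by more than a contrast threshold from a per-pixel reference level, with separate ON and OFF thresholds as in the sensor described. The code below is a toy model, not the pixel circuit; the default thresholds merely echo the 3.5%/1% figures above, and all names are mine.

```python
def dvs_events(log_intensity, on_th=0.035, off_th=0.01):
    """Idealized DVS event stream from a sampled log-intensity series.

    Emits (time, +1) when log intensity has risen by on_th since the
    reference level, and (time, -1) when it has fallen by off_th; the
    reference steps by one threshold per emitted event.
    """
    events = []
    ref = log_intensity[0]
    for t, v in enumerate(log_intensity[1:], start=1):
        while v - ref >= on_th:       # positive contrast -> ON events
            ref += on_th
            events.append((t, +1))
        while ref - v >= off_th:      # negative contrast -> OFF events
            ref -= off_th
            events.append((t, -1))
    return events
```

    The asymmetric thresholds mean a rise-and-fall excursion produces different ON and OFF event counts, as in the real sensor.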

  3. Sensors and sensor systems for guidance and navigation; Proceedings of the Meeting, Orlando, FL, Apr. 2, 3, 1991

    Science.gov (United States)

    Wade, Jack; Tuchman, Avi

    1991-07-01

    The present conference discusses wide field-of-view star-tracker cameras, discrete frequency vs radius reticle trackers, a sensor system for comet approach and landing, a static horizon sensor for a remote-sensing satellite, an improved ring laser gyro navigator, FM reticle trackers in the pupil plane, and the 2D encoding of images via discrete reticles. Also discussed are reduced-cost coil windings for interferometric fiber-optic gyro sensors, the ASTRO 1M space attitude-determination system, passive range-sensor refinement via texture and segmentation, a coherent launch-site atmospheric wind sounder, and a radar-optronic tracking experiment for short and medium range aerial combat. (For individual items see A93-27044 to A93-27046)

  4. Aerial Triangulation Close-range Images with Dual Quaternion

    Directory of Open Access Journals (Sweden)

    SHENG Qinghong

    2015-05-01

    Full Text Available A new method for the aerial triangulation of close-range images based on dual quaternions is presented. Dual quaternions are used to represent the spiral (screw) motion of each beam in space: the real part of the dual quaternion represents the angular elements of all the beams in the close-range network, while the real and dual parts jointly represent the line elements. Finally, an aerial triangulation adjustment model based on dual quaternions is established, and the elements of interior and exterior orientation and the object coordinates of the ground points are calculated. Real images and simulated images with large attitude angles are selected for the aerial triangulation experiments. The experimental results show that the new method for the aerial triangulation of close-range images based on dual quaternions can obtain higher accuracy.

  5. Handbook of ultra-wideband short-range sensing theory, sensors, applications

    CERN Document Server

    Sachs, Jürgen

    2013-01-01

    Ranging from the theoretical basis of UWB sensors through implementation issues to applications, this much-needed book bridges the gap between designers and practitioners working in civil engineering, biotechnology, medical engineering, robotics, mechanical engineering, safety and homeland security. From the contents: * History * Signals and systems in the time and frequency domain * Propagation of electromagnetic waves (in the frequency and time domain) * UWB principles * UWB antennas and applicators * Data processing * Applications.

  6. Self-Configuring Indoor Localization Based on Low-Cost Ultrasonic Range Sensors

    Directory of Open Access Journals (Sweden)

    Can Basaran

    2014-10-01

    Full Text Available In smart environments, target tracking is an essential service used by numerous applications, from activity recognition to personalized infotainment. Target tracking relies on sensors with known locations to estimate and keep track of the path taken by the target, and hence it is crucial to have an accurate map of such sensors. However, the need to enter their locations manually after deployment, and the expectation that they remain fixed, significantly limits the usability of target tracking. To remedy this drawback, we present a self-configuring and device-free localization protocol based on genetic algorithms that autonomously identifies the geographic topology of a network of ultrasonic range sensors, automatically detects any change in the established network structure in less than a minute, and generates a new map within seconds. The proposed protocol significantly reduces hardware and deployment costs thanks to the use of low-cost off-the-shelf sensors with no manual configuration. Experiments on two real testbeds of different sizes show that the proposed protocol achieves an error of 7.16~17.53 cm in topology mapping, while also tracking a mobile target with an average error of 11.71~18.43 cm and detecting displacements of 1.41~3.16 m in approximately 30 s.
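
    Once the sensor map is known, tracking a target from range readings reduces to multilateration. A minimal least-squares sketch follows (the paper's contribution is estimating the map itself with a genetic algorithm, which is not shown here; all names are illustrative):

```python
import numpy as np

def locate(anchors, ranges):
    """Least-squares 2-D position from ranges to known anchor positions.

    Each range gives |p - a_i|^2 = r_i^2; subtracting the first
    equation cancels the quadratic term in p, leaving a linear system
    2*(a_i - a_0) . p = r_0^2 - r_i^2 + |a_i|^2 - |a_0|^2.
    """
    a = np.asarray(anchors, dtype=float)
    r = np.asarray(ranges, dtype=float)
    A = 2.0 * (a[1:] - a[0])
    b = (r[0] ** 2 - r[1:] ** 2
         + np.sum(a[1:] ** 2, axis=1) - np.sum(a[0] ** 2))
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p
```

    With more than three anchors the same solve averages out range noise, which is how redundancy in the deployed network helps tracking accuracy.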

  7. Time-reversed lasing in the terahertz range and its preliminary study in sensor applications

    Energy Technology Data Exchange (ETDEWEB)

    Shen, Yun, E-mail: shenyunoptics@gmail.com [Department of Physics, Nanchang University, Nanchang 330031 (China); Liu, Huaqing [Department of Physics, Nanchang University, Nanchang 330031 (China); Deng, Xiaohua [Institute of Space Science and Technology, Nanchang University, Nanchang 330031 (China); Wang, Guoping [Key Laboratory of Artificial Micro- and Nano-Structures of Ministry of Education and School of Physics and Technology, Wuhan University, Wuhan 430072 (China)

    2017-02-05

    Time-reversed lasing in a uniform slab and in a grating structure is investigated in the terahertz range. The results show that both the uniform slab and the grating can support terahertz time-reversed lasing. Nevertheless, due to its tunable effective refractive index, the grating structure not only exhibits time-reversed lasing more effectively and flexibly than a uniform slab, but can also realize significant absorption over a broader operating frequency range. Furthermore, applications of terahertz time-reversed lasing to novel concentration/thickness sensors are preliminarily studied in a single-channel coherent perfect absorber system. - Highlights: • Time-reversed lasing is investigated in the terahertz range. • The grating structure exhibits time-reversed lasing more effectively and flexibly than a uniform slab. • THz time-reversed lasing for novel concentration/thickness sensors is studied.

  8. Sorting method to extend the dynamic range of the Shack-Hartmann wave-front sensor

    International Nuclear Information System (INIS)

    Lee, Junwon; Shack, Roland V.; Descour, Michael R.

    2005-01-01

    We propose a simple and powerful algorithm to extend the dynamic range of a Shack-Hartmann wave-front sensor. In a conventional Shack-Hartmann wave-front sensor the dynamic range is limited by the f-number of a lenslet, because the focal spot is required to remain in the area confined by the single lenslet. The sorting method proposed here eliminates such a limitation and extends the dynamic range by tagging each spot in a special sequence. Since the sorting method is a simple algorithm that does not change the measurement configuration, there is no requirement for extra hardware, multiple measurements, or complicated algorithms. We not only present the theory and a calculation example of the sorting method but also implement measurement of a highly aberrated wave front from non-rotationally symmetric optics
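
    The dynamic-range limitation described above follows from simple geometry: a spot displacement maps to a local wavefront slope through the lenslet focal length, and conventionally the displacement must stay within half the lenslet pitch. A small sketch of those two relations (not the sorting algorithm itself; function names are mine):

```python
import numpy as np

def wavefront_slopes(spots, refs, focal_length):
    """Local wavefront slopes from Shack-Hartmann spot displacements.

    Generic relation slope = (spot - reference) / f, applied per
    lenslet and per axis. Units of spots/refs and focal_length must
    match (e.g. micrometers).
    """
    return (np.asarray(spots, float) - np.asarray(refs, float)) / focal_length

def max_conventional_tilt(pitch, focal_length):
    """Largest slope measurable when each spot must stay inside its own
    lenslet subaperture (half the pitch of travel); the paper's sorting
    method lifts this bound by re-associating spots with lenslets."""
    return (pitch / 2.0) / focal_length
```
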

  9. Special Sensor Microwave Imager/Sounder (SSMIS) Temperature Data Record (TDR) in netCDF

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Special Sensor Microwave Imager/Sounder (SSMIS) is a series of passive microwave conically scanning imagers and sounders onboard the DMSP satellites beginning...

  10. Video-rate or high-precision: a flexible range imaging camera

    Science.gov (United States)

    Dorrington, Adrian A.; Cree, Michael J.; Carnegie, Dale A.; Payne, Andrew D.; Conroy, Richard M.; Godbaz, John P.; Jongenelen, Adrian P. P.

    2008-02-01

    A range imaging camera produces an output similar to a digital photograph, but every pixel in the image contains distance information as well as intensity. This is useful for measuring the shape, size and location of objects in a scene, hence it is well suited to certain machine vision applications. Previously we demonstrated a heterodyne range imaging system operating in a relatively high-resolution (512-by-512 pixels), high-precision (0.4 mm best case) configuration, but with a slow measurement rate (one measurement every 10 s). Although this high-precision range imaging is useful for some applications, the low acquisition speed is limiting in many situations. The system's frame rate and length of acquisition are fully configurable in software, which means the measurement rate can be increased by compromising precision and image resolution. In this paper we demonstrate the flexibility of our range imaging system by showing examples of high-precision ranging at slow acquisition speeds and video-rate ranging with reduced ranging precision and image resolution. We also show that the heterodyne approach, with more than four samples per beat cycle, provides better linearity than the traditional homodyne quadrature detection approach. Finally, we comment on practical issues of frame rate and beat signal frequency selection.
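
    The core range calculation common to such indirect time-of-flight sensors can be illustrated as follows. This is a generic sketch, not the authors' code: the phase of the fundamental DFT bin of N equally spaced samples of the beat (or correlation) signal gives the time-of-flight phase shift, and range follows from d = c·φ / (4π·f_mod). Function names are assumptions.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def phase_to_range(samples, f_mod):
    """Estimate range from N equally spaced samples of one beat cycle.

    Correlating with one cycle of a complex exponential extracts the
    fundamental's phase; using more than four samples per cycle (as the
    paper advocates) reduces harmonic-induced nonlinearity compared with
    four-sample homodyne quadrature detection.
    """
    n = len(samples)
    k = np.arange(n)
    bin1 = np.sum(samples * np.exp(-2j * np.pi * k / n))  # fundamental DFT bin
    phi = np.angle(bin1) % (2 * np.pi)                    # phase shift in [0, 2*pi)
    return C * phi / (4 * np.pi * f_mod)                  # round-trip relation
```

    Note the unambiguous range is c / (2·f_mod), e.g. 5 m at 30 MHz modulation; longer distances wrap around.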

  11. Target recognition of log-polar ladar range images using moment invariants

    Science.gov (United States)

    Xia, Wenze; Han, Shaokun; Cao, Jie; Yu, Haoyong

    2017-01-01

    The ladar range image has received considerable attention in the automatic target recognition field. However, previous research does not cover target recognition using log-polar ladar range images. Therefore, we construct a target recognition system based on log-polar ladar range images in this paper. In this system, combined moment invariants and a backpropagation neural network are selected as the shape descriptor and shape classifier, respectively. In order to fully analyze the effect of the log-polar sampling pattern on the recognition result, several comparative experiments based on simulated and real range images are carried out. Eventually, several important conclusions are drawn: (i) if combined moments are computed directly from log-polar range images, the translation, rotation and scaling invariance of the combined moments is lost; (ii) when the object is located in the center of the field of view, the recognition rate of log-polar range images is less sensitive to changes of the field of view; (iii) as the object position changes from the center to the edge of the field of view, the recognition performance of log-polar range images declines dramatically; (iv) log-polar range images have better noise robustness than Cartesian range images. Finally, we suggest that in real applications it is better to divide the field of view into a recognition area and a searching area.
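
    As a hedged illustration of the moment-invariant descriptors mentioned above: the paper uses combined moment invariants, while the sketch below computes only the first two classical Hu invariants from scale-normalized central moments of an intensity/range image. Names are mine.

```python
import numpy as np

def hu_moments(img):
    """First two Hu moment invariants of a 2-D image.

    Built from central moments (translation invariant) normalized by
    m00 powers (scale invariant); h1 and h2 are also rotation
    invariant. Computed in Cartesian coordinates — per conclusion (i)
    above, applying this directly to log-polar images breaks the
    invariances.
    """
    y, x = np.mgrid[:img.shape[0], :img.shape[1]].astype(float)
    m00 = img.sum()
    xc, yc = (x * img).sum() / m00, (y * img).sum() / m00

    def mu(p, q):                       # central moment
        return ((x - xc) ** p * (y - yc) ** q * img).sum()

    def eta(p, q):                      # scale-normalized central moment
        return mu(p, q) / m00 ** (1 + (p + q) / 2)

    h1 = eta(2, 0) + eta(0, 2)
    h2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4 * eta(1, 1) ** 2
    return h1, h2
```

    Such invariant vectors would then feed the backpropagation classifier described in the abstract.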

  12. CMOS image sensor for detection of interferon gamma protein interaction as a point-of-care approach.

    Science.gov (United States)

    Marimuthu, Mohana; Kandasamy, Karthikeyan; Ahn, Chang Geun; Sung, Gun Yong; Kim, Min-Gon; Kim, Sanghyo

    2011-09-01

    Complementary metal oxide semiconductor (CMOS)-based image sensors have received increased attention owing to the possibility of incorporating them into portable diagnostic devices. The present research examined the efficiency and sensitivity of a CMOS image sensor for the detection of antigen-antibody interactions involving interferon gamma protein without the aid of expensive instruments. The highest detection sensitivity, of about 1 fg/ml primary antibody, was achieved simply by a transmission mechanism. When photons are prevented from hitting the sensor surface, the digital output is reduced; the number of photons hitting the sensor surface is approximately proportional to the digital number. Nanoscale variation in substrate thickness after protein binding can thus be detected with high sensitivity by the CMOS image sensor. Therefore, this technique can be easily applied to smartphones or any clinical diagnostic device for the detection of several biological entities, with high impact on the development of point-of-care applications.

  13. High-dynamic-range imaging for cloud segmentation

    Science.gov (United States)

    Dev, Soumyabrata; Savoy, Florian M.; Lee, Yee Hui; Winkler, Stefan

    2018-04-01

    Sky-cloud images obtained from ground-based sky cameras are usually captured using a fisheye lens with a wide field of view. However, the sky exhibits a large dynamic range in terms of luminance, more than a conventional camera can capture. It is thus difficult to capture the details of an entire scene with a regular camera in a single shot. In most cases, the circumsolar region is overexposed, and the regions near the horizon are underexposed. This renders cloud segmentation for such images difficult. In this paper, we propose HDRCloudSeg - an effective method for cloud segmentation using high-dynamic-range (HDR) imaging based on multi-exposure fusion. We describe the HDR image generation process and release a new database to the community for benchmarking. Our proposed approach is the first using HDR radiance maps for cloud segmentation and achieves very good results.
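
    Multi-exposure HDR generation of the kind described can be sketched as follows, under the simplifying assumption of a linear camera response (the paper's radiance-map pipeline is more involved, and a real camera needs response calibration); all names are illustrative:

```python
import numpy as np

def fuse_exposures(images, times):
    """Fuse differently exposed images into an HDR radiance map.

    Assuming a linear response and pixel values normalized to [0, 1],
    each pixel's radiance estimate is value / exposure_time, averaged
    with a 'hat' weight that downweights under- and over-exposed
    pixels (weight 0 at values 0 and 1, peak at mid-exposure).
    """
    images = [np.asarray(im, dtype=float) for im in images]
    num = np.zeros_like(images[0])
    den = np.zeros_like(images[0])
    for im, t in zip(images, times):
        w = 1.0 - np.abs(2.0 * im - 1.0)   # hat weighting function
        num += w * im / t
        den += w
    return num / np.maximum(den, 1e-9)
```

    In the sky-camera setting this lets the circumsolar region be recovered from short exposures and the horizon from long ones before segmentation.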

  14. The enhanced cyan fluorescent protein: a sensitive pH sensor for fluorescence lifetime imaging.

    Science.gov (United States)

    Poëa-Guyon, Sandrine; Pasquier, Hélène; Mérola, Fabienne; Morel, Nicolas; Erard, Marie

    2013-05-01

    pH is an important parameter that affects many functions of live cells, from protein structure or function to several crucial steps of their metabolism. Genetically encoded pH sensors based on pH-sensitive fluorescent proteins have been developed and used to monitor the pH of intracellular compartments. The quantitative analysis of pH variations can be performed either by ratiometric or fluorescence lifetime detection. However, most available genetically encoded pH sensors are based on green and yellow fluorescent proteins and are not compatible with multicolor approaches. Taking advantage of the strong pH sensitivity of enhanced cyan fluorescent protein (ECFP), we demonstrate here its suitability as a sensitive pH sensor using fluorescence lifetime imaging. The intracellular ECFP lifetime undergoes large changes (32 %) in the pH 5 to pH 7 range, which allows accurate pH measurements to better than 0.2 pH units. By fusion of ECFP with the granular chromogranin A, we successfully measured the pH in secretory granules of PC12 cells, and we performed a kinetic analysis of intragranular pH variations in living cells exposed to ammonium chloride.
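
    In practice, converting a measured fluorescence lifetime to pH is a calibration-curve lookup. A minimal sketch follows; the calibration values in the test are invented for illustration and are not taken from the paper:

```python
import numpy as np

def ph_from_lifetime(tau, cal_ph, cal_tau):
    """Convert a measured lifetime to pH via a calibration table.

    Interpolates the measured lifetime on a monotonic (pH, lifetime)
    calibration recorded beforehand. A large relative lifetime change
    over the working range (~32% for ECFP over pH 5-7, per the
    abstract) is what makes ~0.2 pH-unit resolution plausible.
    """
    cal_ph = np.asarray(cal_ph, dtype=float)
    cal_tau = np.asarray(cal_tau, dtype=float)
    order = np.argsort(cal_tau)          # np.interp needs increasing x
    return float(np.interp(tau, cal_tau[order], cal_ph[order]))
```
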

  15. The Dynamic Photometric Stereo Method Using a Multi-Tap CMOS Image Sensor.

    Science.gov (United States)

    Yoda, Takuya; Nagahara, Hajime; Taniguchi, Rin-Ichiro; Kagawa, Keiichiro; Yasutomi, Keita; Kawahito, Shoji

    2018-03-05

    The photometric stereo method enables estimation of surface normals from images that have been captured using different but known lighting directions. The classical photometric stereo method requires at least three images to determine the normals in a given scene. However, this method cannot be applied to dynamic scenes because it assumes that the scene remains static while the required images are captured. In this work, we present a dynamic photometric stereo method for estimating the surface normals of a dynamic scene. We use a multi-tap complementary metal-oxide-semiconductor (CMOS) image sensor to capture the input images required for the proposed photometric stereo method. This image sensor can divide the electrons from a single pixel's photodiode among the different exposure taps, and can thus capture multiple images under different lighting conditions with almost identical timing. We implemented a camera-lighting system and created a software application to enable estimation of the normal map in real time. We also evaluated the accuracy of the estimated surface normals and demonstrated that our proposed method can estimate the surface normals of dynamic scenes.
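
    The classical estimation step that underlies the method can be sketched as a per-pixel least-squares solve. This is the textbook Lambertian formulation, not the authors' multi-tap implementation:

```python
import numpy as np

def photometric_stereo(intensities, lights):
    """Surface normals and albedo from >= 3 images under known lights.

    Lambertian model: per pixel, I = L @ g with g = albedo * n, where L
    stacks the unit lighting directions row-wise. Solving in the
    least-squares sense gives g; its norm is the albedo and its
    direction the surface normal.
    """
    L = np.asarray(lights, dtype=float)               # shape (k, 3)
    I = np.stack([im.ravel() for im in intensities])  # shape (k, npix)
    g, *_ = np.linalg.lstsq(L, I, rcond=None)         # shape (3, npix)
    albedo = np.linalg.norm(g, axis=0)
    normals = g / np.maximum(albedo, 1e-12)
    shape = intensities[0].shape
    return normals.reshape(3, *shape), albedo.reshape(shape)
```

    The multi-tap sensor's contribution is supplying the k input images with almost identical timing, so the static-scene assumption holds per frame.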

  16. Change Detection with GRASS GIS – Comparison of images taken by different sensors

    Directory of Open Access Journals (Sweden)

    Michael Fuchs

    2009-04-01

    Full Text Available Images from American military reconnaissance satellites of the Sixties (CORONA), in combination with modern sensors (SPOT, QuickBird), were used for the detection of changes in land use. The pilot area was located about 40 km northwest of Yemen’s capital Sana’a and covered approximately 100 km². To produce comparable layers from images of distinctly different sources, the moving window technique was applied, using the diversity parameter. The resulting difference layers reveal plausible and interpretable change patterns, particularly in areas where urban sprawl occurs. The comparison of CORONA images with images taken by modern sensors proved to be an additional tool to visualize and quantify major changes in land use. The results should serve as additional basic data, e.g. in regional planning. The computation sequence was executed in GRASS GIS.
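
    The moving-window diversity measure can be sketched as follows. This is a plain re-implementation of the idea (GRASS provides it as a neighborhood statistic of r.neighbors): it counts distinct raster values inside each window, so that heterogeneous areas such as urban sprawl score high regardless of which sensor produced the underlying classification. Names are mine.

```python
import numpy as np

def window_diversity(raster, radius=1):
    """Number of distinct values in a moving window around each cell.

    Applied to quantized/classified rasters from different sensors,
    the resulting texture layers are comparable even when the original
    radiometry is not; differencing two such layers highlights change.
    """
    h, w = raster.shape
    out = np.zeros((h, w), dtype=int)
    for i in range(h):
        for j in range(w):
            win = raster[max(0, i - radius):i + radius + 1,
                         max(0, j - radius):j + radius + 1]
            out[i, j] = len(np.unique(win))
    return out
```
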

  17. Visible Wavelength Color Filters Using Dielectric Subwavelength Gratings for Backside-Illuminated CMOS Image Sensor Technologies.

    Science.gov (United States)

    Horie, Yu; Han, Seunghoon; Lee, Jeong-Yub; Kim, Jaekwan; Kim, Yongsung; Arbabi, Amir; Shin, Changgyun; Shi, Lilong; Arbabi, Ehsan; Kamali, Seyedeh Mahsa; Lee, Hong-Seok; Hwang, Sungwoo; Faraon, Andrei

    2017-05-10

    We report transmissive color filters based on subwavelength dielectric gratings that can replace the conventional dye-based color filters used in backside-illuminated CMOS image sensor (BSI CIS) technologies. The filters are patterned in an 80 nm-thick polysilicon film on a 115 nm-thick SiO2 spacer layer. They are optimized for operating at the primary RGB colors, exhibit peak transmittance of 60-80%, and have an almost insensitive response over a ±20° angular range. This technology enables shrinking of pixel sizes down to near a micrometer.

  18. Modeling of Potential Distribution of Electrical Capacitance Tomography Sensor for Multiphase Flow Image

    Directory of Open Access Journals (Sweden)

    S. Sathiyamoorthy

    2007-09-01

    Full Text Available Electrical Capacitance Tomography (ECT) was used to develop images of various multiphase gas-liquid-solid flows in a closed pipe. The principal difficulties in obtaining real-time images from an ECT sensor are that the relationship between the permittivity distribution and the measured capacitances is nonlinear, and that the electric field is distorted by the material present and is also sensitive to measurement errors and noise. This work presents a detailed description of the method employed for image reconstruction from the capacitance measurements. A discretization and iterative algorithm is developed to improve the predictions with minimum error. The author analyzed an eight-electrode square-sensor ECT system with two-phase water-gas and solid-gas flows.

  19. Ultra-fast Sensor for Single-photon Detection in a Wide Range of the Electromagnetic Spectrum

    Directory of Open Access Journals (Sweden)

    Astghik KUZANYAN

    2016-12-01

    Full Text Available The results of computer simulations of the heat distribution processes taking place after absorption of single photons of 1 eV-1 keV energy in the three-layer sensor of a thermoelectric detector are analyzed. Different geometries of the sensor with a tungsten absorber, a thermoelectric layer of cerium hexaboride and a tungsten heat sink are considered. It is shown that by changing the sizes of the sensor layers it is possible to obtain transducers for the registration of photons within a given spectral range with the required energy resolution and count rate. It is concluded that, compared to a single-layer sensor, the three-layer sensor has a number of advantages and demonstrates characteristics that make it possible to consider the thermoelectric detector a real alternative to superconducting single-photon detectors.

  20. ISAR imaging using the instantaneous range instantaneous Doppler method

    CSIR Research Space (South Africa)

    Wazna, TM

    2015-10-01

    Full Text Available In Inverse Synthetic Aperture Radar (ISAR) imaging, the Range Instantaneous Doppler (RID) method is used to compensate for the nonuniform rotational motion of the target that degrades the Doppler resolution of the ISAR image. The Instantaneous Range...

  1. Computed Tomography Image Origin Identification Based on Original Sensor Pattern Noise and 3-D Image Reconstruction Algorithm Footprints.

    Science.gov (United States)

    Duan, Yuping; Bouslimi, Dalel; Yang, Guanyu; Shu, Huazhong; Coatrieux, Gouenou

    2017-07-01

    In this paper, we focus on the "blind" identification of the computed tomography (CT) scanner that has produced a CT image. To do so, we propose a set of noise features derived from the image chain acquisition and which can be used as CT-scanner footprint. Basically, we propose two approaches. The first one aims at identifying a CT scanner based on an original sensor pattern noise (OSPN) that is intrinsic to the X-ray detectors. The second one identifies an acquisition system based on the way this noise is modified by its three-dimensional (3-D) image reconstruction algorithm. As these reconstruction algorithms are manufacturer dependent and kept secret, our features are used as input to train a support vector machine (SVM) based classifier to discriminate acquisition systems. Experiments conducted on images issued from 15 different CT-scanner models of 4 distinct manufacturers demonstrate that our system identifies the origin of one CT image with a detection rate of at least 94% and that it achieves better performance than sensor pattern noise (SPN) based strategy proposed for general public camera devices.
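
    The sensor-pattern-noise idea can be sketched as follows: a device fingerprint is the average noise residual (image minus a denoised version of itself) over several images from that device, and attribution compares a test image's residual against the fingerprint by normalized correlation. This sketch substitutes a simple box filter for the wavelet denoiser typically used in SPN work, and omits the paper's reconstruction-footprint features and SVM stage; all names are mine.

```python
import numpy as np

def noise_residual(img, k=3):
    """High-frequency residual used as a sensor-noise fingerprint:
    the image minus a k-by-k box-filtered copy of itself."""
    pad = k // 2
    padded = np.pad(np.asarray(img, dtype=float), pad, mode="edge")
    smooth = sum(padded[i:i + img.shape[0], j:j + img.shape[1]]
                 for i in range(k) for j in range(k)) / (k * k)
    return img - smooth

def fingerprint_corr(residual, fingerprint):
    """Normalized cross-correlation between a residual and a fingerprint."""
    a = residual - residual.mean()
    b = fingerprint - fingerprint.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))
```

    Images from the fingerprinted device should correlate noticeably higher than images from any other device, which is the basis of the identification.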

  2. A CMOS Image Sensor With In-Pixel Buried-Channel Source Follower and Optimized Row Selector

    NARCIS (Netherlands)

    Chen, Y.; Wang, X.; Mierop, A.J.; Theuwissen, A.J.P.

    2009-01-01

    This paper presents a CMOS image sensor with pinned-photodiode 4T active pixels which use in-pixel buried-channel source followers (SFs) and optimized row selectors. The test sensor has been fabricated in a 0.18-μm CMOS process. The sensor characterization was carried out successfully, and the

  3. A wide range and highly sensitive optical fiber pH sensor using polyacrylamide hydrogel

    Science.gov (United States)

    Pathak, Akhilesh Kumar; Singh, Vinod Kumar

    2017-12-01

    In the present study we report the fabrication and characterization of a no-core fiber sensor (NCFS) using a smart hydrogel coating for pH measurement. The no-core fiber (NCF) is stubbed between two single-mode fibers with SMA connectors before immobilization of the smart hydrogel. The wavelength interrogation technique is used to calculate the sensitivity of the proposed sensor. The result shows a high sensitivity of 1.94 nm/pH over a wide range of pH values varying from 3 to 10, with a good linear response. In addition to high sensitivity, the fabricated sensor provides a fast response time with good stability, repeatability and reproducibility.

  4. A low-cost, high-resolution, video-rate imaging optical radar

    Energy Technology Data Exchange (ETDEWEB)

    Sackos, J.T.; Nellums, R.O.; Lebien, S.M.; Diegert, C.F. [Sandia National Labs., Albuquerque, NM (United States); Grantham, J.W.; Monson, T. [Air Force Research Lab., Eglin AFB, FL (United States)

    1998-04-01

    Sandia National Laboratories has developed a unique type of portable low-cost range imaging optical radar (laser radar or LADAR). This innovative sensor is comprised of an active floodlight scene illuminator and an image-intensified CCD camera receiver. It is a solid-state device (no moving parts) that offers significant size, performance, reliability, and simplicity advantages over other types of 3-D imaging sensors. This unique flash LADAR is based on low-cost, commercially available hardware, and is well suited for many government and commercial uses. This paper presents an update of Sandia's development of the Scannerless Range Imager technology and applications, and discusses the progress that has been made in evolving the sensor into a compact, low-cost, high-resolution, video-rate Laser Dynamic Range Imager.

  5. A Shack-Hartmann Sensor for Single-Shot Multi-Contrast Imaging with Hard X-rays

    Directory of Open Access Journals (Sweden)

    Tomy dos Santos Rolo

    2018-05-01

    Full Text Available An array of compound refractive X-ray lenses (CRL) with 20 × 20 lenslets, a focal distance of 20 cm and a visibility of 0.93 is presented. It can be used as a Shack-Hartmann sensor for hard X-rays (SHARX) for wavefront sensing and permits true single-shot multi-contrast imaging of the dynamics of materials with a spatial resolution in the micrometer range, sensitivity to nanosized structures and temporal resolution on the microsecond scale. The object’s absorption and its induced wavefront shift can be assessed simultaneously, together with information from diffraction channels. In contrast to established Hartmann sensors, the SHARX has an increased flux efficiency through focusing of the beam rather than blocking parts of it. We investigated the spatiotemporal behavior of a cavitation bubble induced by laser pulses. Furthermore, we validated the SHARX by measuring refraction angles of a single diamond CRL, where we obtained an angular resolution better than 4 μrad.

  6. Image Alignment for Multiple Camera High Dynamic Range Microscopy.

    Science.gov (United States)

    Eastwood, Brian S; Childs, Elisabeth C

    2012-01-09

    This paper investigates the problem of image alignment for multiple camera high dynamic range (HDR) imaging. HDR imaging combines information from images taken with different exposure settings. Combining information from multiple cameras requires an alignment process that is robust to the intensity differences in the images. HDR applications that use a limited number of component images require an alignment technique that is robust to large exposure differences. We evaluate the suitability for HDR alignment of three exposure-robust techniques. We conclude that image alignment based on matching feature descriptors extracted from radiant power images from calibrated cameras yields the most accurate and robust solution. We demonstrate the use of this alignment technique in a high dynamic range video microscope that enables live specimen imaging with a greater level of detail than can be captured with a single camera.

  7. Recce NG: from Recce sensor to image intelligence (IMINT)

    Science.gov (United States)

    Larroque, Serge

    2001-12-01

    Recce NG (Reconnaissance New Generation) is presented as a complete and optimized tactical reconnaissance system. Based on a new-generation pod integrating high-resolution dual-band sensors, the system has been designed around the operational lessons learned from recent peacekeeping operations in Bosnia and Kosovo. The technical solutions retained as component modules of a full IMINT acquisition system benefit from the state of the art in the following key technologies: an advanced mission planning system for long-range stand-off manned reconnaissance and aircraft and/or pod tasking, operating sophisticated back-up software tools, high-resolution 3D geo-data and an improved, combat-proven MMI to reduce planning delays; mature dual-band sensor technology to achieve the day-and-night reconnaissance mission, including advanced automatic operational functions such as azimuth and roll tracking, with low risk in pod integration and in carrier avionics, controls and displays upgrades, to save time in operational turnover and maintenance; a high-rate imagery downlink for real-time or near-real-time transmission, fully compatible with STANAG 7085 requirements; and an advanced, combat-proven, NATO-interoperable (STANAG 7023) IMINT exploitation ground segment integrating high-value software tools for accurate location, improved radiometric image processing and an open link to C4ISR systems. The choice of an industrial prime contractor mastering all of the key products and technologies listed above, across the full system, is mandatory for successful delivery in terms of low cost, risk and schedule.

  8. A sprayable luminescent pH sensor and its use for wound imaging in vivo.

    Science.gov (United States)

    Schreml, Stephan; Meier, Robert J; Weiß, Katharina T; Cattani, Julia; Flittner, Dagmar; Gehmert, Sebastian; Wolfbeis, Otto S; Landthaler, Michael; Babilas, Philipp

    2012-12-01

    Non-invasive luminescence imaging is of great interest for studying biological parameters in wound healing, tumors and other biomedical fields. Recently, we developed the first method for 2D luminescence imaging of pH in vivo on humans, and a novel method for one-stop-shop visualization of oxygen and pH using the RGB read-out of digital cameras. Both methods make use of semitransparent sensor foils. Here, we describe a sprayable ratiometric luminescent pH sensor, which combines properties of both of these methods. A major additional advantage is that, owing to its consistency, the sensor spray is applicable to very uneven tissue surfaces. A digital RGB image of the spray on tissue is taken. The signal of the pH indicator (fluorescein isothiocyanate) is stored in the green channel (G), while that of the reference dye [ruthenium(II)-tris-(4,7-diphenyl-1,10-phenanthroline)] is stored in the red channel (R). Images are processed by ratioing the luminescence intensities (G/R) to yield pseudocolor pH maps of tissues, e.g. wounds. © 2012 John Wiley & Sons A/S.
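
    The ratiometric read-out described above can be sketched as follows. The calibration curve mapping the G/R ratio to pH (`calib`) is a hypothetical placeholder; in practice it is obtained by calibrating the indicator/reference dye pair against buffer solutions.

```python
import numpy as np

def ph_map_from_rgb(rgb, calib=lambda r: 5.0 + 2.0 * np.log10(r)):
    """Convert an RGB image of the sensor spray into a pseudocolor pH map.

    rgb   : (H, W, 3) float array; indicator signal in G, reference dye in R.
    calib : hypothetical ratio->pH calibration curve (sensor-specific in practice).
    """
    R = rgb[..., 0].astype(float)
    G = rgb[..., 1].astype(float)
    ratio = G / np.clip(R, 1e-6, None)  # ratiometric read-out cancels illumination
    return calib(ratio)

# A uniform patch with G/R == 1 maps to the calibration midpoint (pH 5 here).
img = np.ones((4, 4, 3))
ph = ph_map_from_rgb(img)
```

    Because the indicator and reference dyes share the same optical path, the G/R ratio is largely independent of spray thickness and excitation intensity, which is what makes the pseudocolor map quantitative.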

  9. Finite Element Analysis of Film Stack Architecture for Complementary Metal-Oxide-Semiconductor Image Sensors.

    Science.gov (United States)

    Wu, Kuo-Tsai; Hwang, Sheng-Jye; Lee, Huei-Huang

    2017-05-02

    Image sensors are the core components of computer, communication, and consumer electronic products. Complementary metal oxide semiconductor (CMOS) image sensors have become the mainstay of image-sensing developments, but are prone to leakage current. In this study, we simulate the CMOS image sensor (CIS) film stacking process by finite element analysis. To elucidate the relationship between the leakage current and stack architecture, we compare the simulated and measured leakage currents in the elements. Based on the analysis results, we further improve the performance by optimizing the architecture of the film stacks or changing the thin-film material. The material parameters are then corrected to improve the accuracy of the simulation results. The simulated and experimental results confirm a positive correlation between measured leakage current and stress. This trend is attributed to the structural defects induced by high stress, which generate leakage. Using this relationship, we can change the structure of the thin-film stack to reduce the leakage current and thereby improve the component life and reliability of the CIS components.

  10. Effects of Resolution, Range, and Image Contrast on Target Acquisition Performance.

    Science.gov (United States)

    Hollands, Justin G; Terhaar, Phil; Pavlovic, Nada J

    2018-05-01

    We sought to determine the joint influence of resolution, target range, and image contrast on the detection and identification of targets in simulated naturalistic scenes. Resolution requirements for target acquisition have been developed based on threshold values obtained using imaging systems, when target range was fixed, and image characteristics were determined by the system. Subsequent work has examined the influence of factors like target range and image contrast on target acquisition. We varied the resolution and contrast of static images in two experiments. Participants (soldiers) decided whether a human target was located in the scene (detection task) or whether a target was friendly or hostile (identification task). Target range was also varied (50-400 m). In Experiment 1, 30 participants saw color images with a single target exemplar. In Experiment 2, another 30 participants saw monochrome images containing different target exemplars. The effects of target range and image contrast were qualitatively different above and below 6 pixels per meter of target for both tasks in both experiments. Target detection and identification performance were a joint function of image resolution, range, and contrast for both color and monochrome images. The beneficial effects of increasing resolution for target acquisition performance are greater for closer (larger) targets.
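
    As a rough illustration of why resolution thresholds are stated in pixels per meter of target, the pinhole-camera sketch below computes that quantity from assumed sensor parameters; the 1920-pixel width and 40° field of view are illustrative, not the study's imaging conditions.

```python
import math

def pixels_per_meter(image_width_px, hfov_deg, range_m):
    """Pixels subtended per meter of target at a given range (pinhole model)."""
    scene_width_m = 2.0 * range_m * math.tan(math.radians(hfov_deg) / 2.0)
    return image_width_px / scene_width_m

# Hypothetical 1920-px sensor with a 40-degree horizontal field of view:
for r in (50, 100, 200, 400):
    ppm = pixels_per_meter(1920, 40.0, r)
```

    Pixels per meter falls inversely with range, so a fixed-resolution sensor crosses a threshold such as 6 pixels per meter simply by the target moving farther away.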

  11. Influence of range-gated intensifiers on underwater imaging system SNR

    Science.gov (United States)

    Wang, Xia; Hu, Ling; Zhi, Qiang; Chen, Zhen-yue; Jin, Wei-qi

    2013-08-01

    Range-gated technology has been a hot research field in recent years due to its highly effective elimination of backscattering. As a result, it can enhance the contrast between a target and its background and extend the working distance of the imaging system. An underwater imaging system is required to image in low-light-level conditions as well as to eliminate the backscattering effect, which means the receiver must offer a high-speed external trigger function, high resolution, high sensitivity, low noise and a high gain dynamic range. For an intensifier, the noise characteristics directly restrict the observation quality and range of the imaging system. Background noise may decrease image contrast and sharpness, even burying the signal and making it impossible to recognize the target, so it is important to investigate the noise characteristics of intensifiers. SNR is an important parameter reflecting the noise features of a system. Using an underwater laser range-gated imaging prediction model, and according to linear SNR system theory, the gated imaging noise performance of the super-second-generation and generation-III intensifiers currently on the market was analyzed theoretically. Based on the active laser underwater range-gated imaging model, the effect of gated intensifiers on the system and the relationship between system SNR and MTF were studied. Through theoretical and simulation analysis of image intensifier background noise and SNR, the different influences of super-second-generation and generation-III ICCDs on system SNR were obtained. A formula for the range-gated system SNR was put forward, and the influences of the two kinds of ICCD on the system were compared. A detailed theoretical analysis was carried out using MATLAB simulation. All the work in this paper lays a theoretical foundation for further eliminating the backscattering effect and improving

  12. A Portable Colloidal Gold Strip Sensor for Clenbuterol and Ractopamine Using Image Processing Technology

    Directory of Open Access Journals (Sweden)

    Yi Guo

    2013-01-01

    Full Text Available A portable colloidal gold strip sensor for detecting clenbuterol and ractopamine has been developed using image processing technology, together with a novel strip reader built around this imaging sensor. Colloidal gold strips for clenbuterol and ractopamine serve as the primary sensing elements through an immunological reaction. Three minutes after the target sample is applied, the color intensity of the test (T) line reflects the concentration of the analyte, e.g. clenbuterol. The reader performs several functions: automatic acquisition of the colored strip image, quantitative analysis of the control and test lines, and data storage and transfer to a computer. The system integrates image collection, pattern recognition and real-time quantitative colloidal gold measurement. In experiments, clenbuterol and ractopamine standard solutions at concentrations from 0 ppb to 10 ppb were prepared and tested; the results show a good second-order fit between concentration and color intensity (R² up to 0.99 and 0.98, respectively). In addition, spiked-recovery tests on negative samples yielded recoveries of up to 98%. Overall, this optical sensor for colloidal strip measurement is capable of determining the content of clenbuterol and ractopamine, and is likely applicable to the quantitative readout of similar colloidal gold strip assays.

  13. Ultra-high resolution coded wavefront sensor

    KAUST Repository

    Wang, Congli

    2017-06-08

    Wavefront sensors and more general phase retrieval methods have recently attracted a lot of attention in a host of application domains, ranging from astronomy to scientific imaging and microscopy. In this paper, we introduce a new class of sensor, the Coded Wavefront Sensor, which provides high spatio-temporal resolution using a simple masked sensor under white light illumination. Specifically, we demonstrate megapixel spatial resolution and phase accuracy better than 0.1 wavelengths at reconstruction rates of 50 Hz or more, thus opening up many new applications from high-resolution adaptive optics to real-time phase retrieval in microscopy.

  14. Mathematical Model and Calibration Experiment of a Large Measurement Range Flexible Joints 6-UPUR Six-Axis Force Sensor

    Directory of Open Access Journals (Sweden)

    Yanzhi Zhao

    2016-08-01

    Full Text Available Nowadays, improving the accuracy and enlarging the measuring range of six-axis force sensors for wider applications in aircraft landing, rocket thrust, and spacecraft docking testing experiments has become an urgent objective. However, it is still difficult to achieve high accuracy and a large measuring range with traditional parallel six-axis force sensors due to the influence of the gap and friction of the joints. Therefore, to overcome these limitations, this paper proposes a 6-Universal-Prismatic-Universal-Revolute (UPUR) jointed parallel mechanism with flexible joints to develop a large-measurement-range six-axis force sensor. The structural characteristics of the sensor are analyzed in comparison with a traditional parallel sensor based on the Stewart platform. The force transfer relation of the sensor is deduced, and the force Jacobian matrix is obtained using screw theory in two cases: the ideal state, and the state in which the flexibility of each flexible joint is considered. The prototype and loading calibration system are designed and developed. The K-value method and the least squares method are used to process the experimental data, and the type I and type II linearity errors are obtained. The experimental results show that the calibration error of the K-value method is more than 13.4%, while that of the least squares method is 2.67%. The experimental results prove the feasibility of the sensor and the correctness of the theoretical analysis, which are expected to be adopted in practical applications.

  15. Mathematical Model and Calibration Experiment of a Large Measurement Range Flexible Joints 6-UPUR Six-Axis Force Sensor.

    Science.gov (United States)

    Zhao, Yanzhi; Zhang, Caifeng; Zhang, Dan; Shi, Zhongpan; Zhao, Tieshi

    2016-08-11

    Nowadays, improving the accuracy and enlarging the measuring range of six-axis force sensors for wider applications in aircraft landing, rocket thrust, and spacecraft docking testing experiments has become an urgent objective. However, it is still difficult to achieve high accuracy and a large measuring range with traditional parallel six-axis force sensors due to the influence of the gap and friction of the joints. Therefore, to overcome these limitations, this paper proposes a 6-Universal-Prismatic-Universal-Revolute (UPUR) jointed parallel mechanism with flexible joints to develop a large-measurement-range six-axis force sensor. The structural characteristics of the sensor are analyzed in comparison with a traditional parallel sensor based on the Stewart platform. The force transfer relation of the sensor is deduced, and the force Jacobian matrix is obtained using screw theory in two cases: the ideal state, and the state in which the flexibility of each flexible joint is considered. The prototype and loading calibration system are designed and developed. The K-value method and the least squares method are used to process the experimental data, and the type I and type II linearity errors are obtained. The experimental results show that the calibration error of the K-value method is more than 13.4%, while that of the least squares method is 2.67%. The experimental results prove the feasibility of the sensor and the correctness of the theoretical analysis, which are expected to be adopted in practical applications.
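
    The least-squares calibration step can be sketched on synthetic data; the random matrices below are stand-ins for the real loading-calibration measurements, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for loading-calibration data: each column is one load case.
C_true = rng.normal(size=(6, 6))   # unknown sensor matrix (hypothetical)
S = rng.normal(size=(6, 50))       # raw branch readings for 50 load cases
W = C_true @ S                     # applied six-axis wrenches (known references)

# Least-squares estimate of the calibration matrix mapping readings -> wrench:
# solve S.T @ C.T ~= W.T for C.T, one column (wrench component) at a time.
C_est, *_ = np.linalg.lstsq(S.T, W.T, rcond=None)
C_est = C_est.T
```

    With more load cases than unknowns, the least-squares solution averages out measurement noise, which is why it outperforms a K-value fit built from a minimal set of loadings.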

  16. Maintained functionality of an implantable radiotelemetric blood pressure and heart rate sensor after magnetic resonance imaging in rats

    International Nuclear Information System (INIS)

    Nölte, I; Boll, H; Figueiredo, G; Groden, C; Brockmann, M A; Gorbey, S; Lemmer, B

    2011-01-01

    Radiotelemetric sensors for in vivo assessment of blood pressure and heart rate are widely used in animal research. MRI with implanted sensors is regarded as contraindicated, as transmitter malfunction and injury of the animal may result. Moreover, artefacts are expected to compromise image evaluation. In vitro, the function of a radiotelemetric sensor (TA11PA-C10, Data Sciences International) after exposure to MRI up to 9.4 T was assessed. The magnetic force of the electromagnetic field on the sensor as well as radiofrequency (RF)-induced sensor heating was analysed. Finally, MRI with an implanted sensor was performed in a rat. Imaging artefacts were analysed at 3.0 and 9.4 T ex vivo and in vivo. Transmitted 24 h blood pressure and heart rate were compared before and after MRI to verify the integrity of the telemetric sensor. The function of the sensor was not altered by MRI up to 9.4 T. The maximum force exerted on the sensor was 273 ± 50 mN. RF-induced heating was ruled out. Artefacts impeded the assessment of the abdomen and thorax in a dead rat, but not of the head and neck. MRI with implanted radiotelemetric sensors is feasible in principle. The tested sensor maintains functionality up to 9.4 T. Artefacts hampered abdominal and thoracic imaging in rats, while assessment of the head and neck is possible

  17. Highly Sensitive and Wide-Dynamic-Range Multichannel Optical-Fiber pH Sensor Based on PWM Technique.

    Science.gov (United States)

    Khan, Md Rajibur Rahaman; Kang, Shin-Won

    2016-11-09

    In this study, we propose a highly sensitive multichannel pH sensor that is based on an optical-fiber pulse width modulation (PWM) technique. According to the optical-fiber PWM method, the received sensing signal's pulse width changes when the optical-fiber pH sensing-element of the array comes into contact with pH buffer solutions. The proposed optical-fiber PWM pH-sensing system offers a linear sensing response over a wide range of pH values from 2 to 12, with a high pH-sensing ability. The sensitivity of the proposed pH sensor is 0.46 µs/pH, and the correlation coefficient R² is approximately 0.997. Additional advantages of the proposed optical-fiber PWM pH sensor include a short/fast response-time of about 8 s, good reproducibility properties with a relative standard deviation (RSD) of about 0.019, easy fabrication, low cost, small size, reusability of the optical-fiber sensing-element, and the capability of remote sensing. Finally, the performance of the proposed PWM pH sensor was compared with that of potentiometric, optical-fiber modal interferometer, and optical-fiber Fabry-Perot interferometer pH sensors with respect to dynamic range width, linearity as well as response and recovery times. We observed that the proposed sensing systems have better sensing abilities than the above-mentioned pH sensors.
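
    Inverting the reported linear response is straightforward. In the sketch below, the sensitivity (0.46 µs/pH) is taken from the abstract, while the pH-7 reference pulse width is a hypothetical calibration constant.

```python
def ph_from_pulse_width(width_us, width_at_ph7_us=10.0, sensitivity_us_per_ph=0.46):
    """Invert the sensor's linear pulse-width response (0.46 us/pH).

    width_at_ph7_us is a hypothetical reference fixed during calibration;
    the real value depends on the fiber sensing element and buffer set.
    """
    return 7.0 + (width_us - width_at_ph7_us) / sensitivity_us_per_ph
```

    A PWM read-out like this is attractive because pulse width, unlike raw intensity, is insensitive to source power drift and fiber bending losses.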

  18. Highly Sensitive and Wide-Dynamic-Range Multichannel Optical-Fiber pH Sensor Based on PWM Technique

    Science.gov (United States)

    Khan, Md. Rajibur Rahaman; Kang, Shin-Won

    2016-01-01

    In this study, we propose a highly sensitive multichannel pH sensor that is based on an optical-fiber pulse width modulation (PWM) technique. According to the optical-fiber PWM method, the received sensing signal’s pulse width changes when the optical-fiber pH sensing-element of the array comes into contact with pH buffer solutions. The proposed optical-fiber PWM pH-sensing system offers a linear sensing response over a wide range of pH values from 2 to 12, with a high pH-sensing ability. The sensitivity of the proposed pH sensor is 0.46 µs/pH, and the correlation coefficient R2 is approximately 0.997. Additional advantages of the proposed optical-fiber PWM pH sensor include a short/fast response-time of about 8 s, good reproducibility properties with a relative standard deviation (RSD) of about 0.019, easy fabrication, low cost, small size, reusability of the optical-fiber sensing-element, and the capability of remote sensing. Finally, the performance of the proposed PWM pH sensor was compared with that of potentiometric, optical-fiber modal interferometer, and optical-fiber Fabry–Perot interferometer pH sensors with respect to dynamic range width, linearity as well as response and recovery times. We observed that the proposed sensing systems have better sensing abilities than the above-mentioned pH sensors. PMID:27834865

  19. X-ray CCD image sensor with a thick depletion region

    International Nuclear Information System (INIS)

    Saito, Hirobumi; Watabe, Hiroshi.

    1984-01-01

    To develop a solid-state image sensor for high-energy X-rays above 1–2 keV, basic studies have been made on CCDs (charge-coupled devices) with a thick depletion region. A method of superimposing a high DC bias voltage on low-voltage signal pulses is newly proposed. The characteristics of both SCCD and BCCD were investigated, and their suitability as X-ray sensors compared. It was found that a depletion region 60 μm thick can be obtained with an ordinary doping density of 10²⁰ m⁻³, and that an even thicker depletion region of over 1 mm can be obtained with a doping density of about 10¹⁸ m⁻³, for which a high bias voltage above 1 kV can be applied. This suggests that CCD image sensors for 8 keV or 24 keV X-rays can be realized, since the absorption lengths of these X-rays in Si are about 60 μm and 1 mm, respectively. As for characteristics other than depletion thickness, the BCCD is preferable to the SCCD for the present purpose because of its lower noise and dark current. As for the transfer method, frame transfer is recommended. (Aoki, K.)
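
    The quoted depletion thicknesses are consistent with the standard one-sided abrupt-junction formula w = sqrt(2εV/(qN)). In the sketch below the 1 kV bias is stated in the abstract, while the roughly 280 V bias assumed for the 60 µm case is inferred, not given.

```python
import math

EPS_SI = 11.7 * 8.854e-12   # permittivity of silicon, F/m
Q = 1.602e-19               # elementary charge, C

def depletion_width_m(bias_v, doping_per_m3):
    """One-sided abrupt-junction depletion width, w = sqrt(2*eps*V/(q*N))."""
    return math.sqrt(2.0 * EPS_SI * bias_v / (Q * doping_per_m3))

w_low = depletion_width_m(280.0, 1e20)    # ~60 um at ordinary doping (assumed bias)
w_high = depletion_width_m(1000.0, 1e18)  # >1 mm at light doping and 1 kV bias
```

    The formula makes the design trade-off explicit: since w scales as sqrt(V/N), reaching millimeter-scale depletion requires reducing the doping by about two orders of magnitude as well as raising the bias into the kilovolt range.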

  20. Photoresponse analysis of the CMOS photodiodes for CMOS x-ray image sensor

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Soo; Ha, Jang Ho; Kim, Han Soo; Yeo, Sun Mok [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2012-11-15

    Although in the short term CMOS active pixel sensors (APSs) cannot compete with the conventionally used charge-coupled devices (CCDs) for high-quality scientific imaging, recent developments in CMOS APSs indicate that CMOS is approaching the performance level of CCDs in several domains. CMOS APSs thereby possess a number of advantages, such as simpler driving requirements and low-power operation. CMOS image sensors can be processed in standard CMOS technologies, and the potential for on-chip integration of analog and digital circuitry makes them more suitable for vision systems where system cost is of importance. Moreover, CMOS imagers can directly benefit from ongoing technological progress in the field of CMOS technologies. Due to these advantages, CMOS APSs are currently being investigated actively for various applications such as star trackers, navigation cameras, and X-ray imaging. In most detection systems the sensor is considered the most important component, since it determines the signal and noise levels; so in CMOS APSs the pixel is very important compared to the other functional blocks. In order to predict the performance of such an image sensor, a detailed understanding of the photocurrent generation in the photodiodes that comprise the CMOS APS is required. In this work, we developed an analytical model that calculates the photocurrent generated in the CMOS photodiodes comprising CMOS APSs. Photocurrent calculations and photoresponse simulations with respect to the wavelength of the incident photon were performed using this model for four types of photodiode that can be fabricated in a standard CMOS process. The n+/p−sub and n+/p−epi/p−sub photodiodes show better performance than the n−well/p−sub and n−well/p−epi/p−sub ones due to their wider depletion width. Comparing the n+/p−sub and n+/p−epi/p−sub photodiodes, n+/p−sub has a higher photoresponsivity at longer wavelengths because of the higher electron diffusion current.

  1. Photoresponse analysis of the CMOS photodiodes for CMOS x-ray image sensor

    International Nuclear Information System (INIS)

    Kim, Young Soo; Ha, Jang Ho; Kim, Han Soo; Yeo, Sun Mok

    2012-01-01

    Although in the short term CMOS active pixel sensors (APSs) cannot compete with the conventionally used charge-coupled devices (CCDs) for high-quality scientific imaging, recent developments in CMOS APSs indicate that CMOS is approaching the performance level of CCDs in several domains. CMOS APSs thereby possess a number of advantages, such as simpler driving requirements and low-power operation. CMOS image sensors can be processed in standard CMOS technologies, and the potential for on-chip integration of analog and digital circuitry makes them more suitable for vision systems where system cost is of importance. Moreover, CMOS imagers can directly benefit from ongoing technological progress in the field of CMOS technologies. Due to these advantages, CMOS APSs are currently being investigated actively for various applications such as star trackers, navigation cameras, and X-ray imaging. In most detection systems the sensor is considered the most important component, since it determines the signal and noise levels; so in CMOS APSs the pixel is very important compared to the other functional blocks. In order to predict the performance of such an image sensor, a detailed understanding of the photocurrent generation in the photodiodes that comprise the CMOS APS is required. In this work, we developed an analytical model that calculates the photocurrent generated in the CMOS photodiodes comprising CMOS APSs. Photocurrent calculations and photoresponse simulations with respect to the wavelength of the incident photon were performed using this model for four types of photodiode that can be fabricated in a standard CMOS process. The n+/p−sub and n+/p−epi/p−sub photodiodes show better performance than the n−well/p−sub and n−well/p−epi/p−sub ones due to their wider depletion width. Comparing the n+/p−sub and n+/p−epi/p−sub photodiodes, n+/p−sub has a higher photoresponsivity at longer wavelengths because of the higher electron diffusion current.

  2. A Wide Spectral Range Reflectance and Luminescence Imaging System

    Directory of Open Access Journals (Sweden)

    Tapani Hirvonen

    2013-10-01

    Full Text Available In this study, we introduce a wide spectral range (200–2500 nm) imaging system with a 250 μm minimum spatial resolution, which can be freely modified for a wide range of resolutions and measurement geometries. The system has been tested for reflectance and luminescence measurements, but can also be customized for transmittance measurements. This study presents the performance results of the developed system as well as examples of spectral images, and relates the system to existing systems and methods. The wide spectral range imaging system developed here is highly customizable and has great potential in many practical applications.

  3. MULTI-TEMPORAL AND MULTI-SENSOR IMAGE MATCHING BASED ON LOCAL FREQUENCY INFORMATION

    Directory of Open Access Journals (Sweden)

    X. Liu

    2012-08-01

    Full Text Available Image matching is often one of the first tasks in many photogrammetry and remote sensing applications. This paper presents an efficient approach to automated multi-temporal and multi-sensor image matching based on local frequency information. Two new independent image representations, the Local Average Phase (LAP) and the Local Weighted Amplitude (LWA), are presented to emphasize the common scene information while suppressing the non-common, illumination- and sensor-dependent information. To obtain the two representations, local frequency information is first extracted with a Log-Gabor wavelet transformation, which is similar to the processing in the human visual system; the outputs of the odd- and even-symmetric filters are then used to construct the LAP and LWA, which emphasize the phase and amplitude information, respectively. As these two representations are both derivative-free and threshold-free, they are robust to noise and can preserve as much of the image detail as possible. A new Compositional Similarity Measure (CSM) is also presented, which combines the LAP and LWA with equal weight for measuring the similarity of multi-temporal and multi-sensor images. The CSM lets the LAP and LWA compensate for each other and makes full use of the amplitude and phase of the local frequency information. In many image matching applications, the template is usually selected without consideration of its matching robustness and accuracy. To overcome this problem, a local best-matching-point detection is presented, which employs self-similarity analysis to identify the template with the highest matching robustness and accuracy. Experimental results using real and simulated images demonstrate that the presented approach is effective for matching image pairs with significant scene and illumination changes, and that it has advantages over other state-of-the-art approaches, which include: the

  4. High-Resolution Spin-on-Patterning of Perovskite Thin Films for a Multiplexed Image Sensor Array.

    Science.gov (United States)

    Lee, Woongchan; Lee, Jongha; Yun, Huiwon; Kim, Joonsoo; Park, Jinhong; Choi, Changsoon; Kim, Dong Chan; Seo, Hyunseon; Lee, Hakyong; Yu, Ji Woong; Lee, Won Bo; Kim, Dae-Hyeong

    2017-10-01

    Inorganic-organic hybrid perovskite thin films have attracted significant attention as an alternative to silicon in photon-absorbing devices, mainly because of their superb optoelectronic properties. However, high-definition patterning of perovskite thin films, which is important for the fabrication of image sensor arrays, is difficult to accomplish owing to their extreme instability in general photolithographic solvents. Here, a novel patterning process for perovskite thin films is described: the high-resolution spin-on-patterning (SoP) process. This fast and facile process is compatible with a variety of spin-coated perovskite materials and perovskite deposition techniques. The SoP process is successfully applied to develop a high-performance, ultrathin, and deformable perovskite-on-silicon multiplexed image sensor array, paving the way toward next-generation image sensor arrays. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. The Dynamic Photometric Stereo Method Using a Multi-Tap CMOS Image Sensor

    Science.gov (United States)

    Yoda, Takuya; Nagahara, Hajime; Taniguchi, Rin-ichiro; Kagawa, Keiichiro; Yasutomi, Keita; Kawahito, Shoji

    2018-01-01

    The photometric stereo method enables estimation of surface normals from images that have been captured using different but known lighting directions. The classical photometric stereo method requires at least three images to determine the normals in a given scene. However, this method cannot be applied to dynamic scenes because it assumes that the scene remains static while the required images are captured. In this work, we present a dynamic photometric stereo method for estimating the surface normals in a dynamic scene. We use a multi-tap complementary metal-oxide-semiconductor (CMOS) image sensor to capture the input images required for the proposed photometric stereo method. This image sensor can divide the electrons from a single pixel's photodiode among the taps of different exposures, and can thus capture multiple images under different lighting conditions with almost identical timing. We implemented a camera lighting system and created a software application to enable estimation of the normal map in real time. We also evaluated the accuracy of the estimated surface normals and demonstrated that our proposed method can estimate the surface normals of dynamic scenes. PMID:29510599
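
    The classical Lambertian estimation step that the multi-tap sensor feeds can be sketched per pixel as a least-squares solve; the lighting directions and intensities below are synthetic examples, not the paper's measurements.

```python
import numpy as np

def photometric_stereo_normal(intensities, light_dirs):
    """Classical (Lambertian) photometric stereo for one pixel:
    solve L @ (rho * n) = I for the albedo-scaled normal, then normalize."""
    L = np.asarray(light_dirs, dtype=float)   # (k, 3) unit lighting directions
    I = np.asarray(intensities, dtype=float)  # (k,) observed intensities
    g, *_ = np.linalg.lstsq(L, I, rcond=None)
    rho = np.linalg.norm(g)                   # recovered albedo
    return g / rho, rho

# Three lights observing a surface facing straight up (normal = +z, albedo = 1);
# Lambertian intensities are I = rho * (l . n) for each light l.
lights = [(0.0, 0.0, 1.0),
          (0.6, 0.0, 0.8),
          (0.0, 0.6, 0.8)]
obs = [1.0, 0.8, 0.8]
n, rho = photometric_stereo_normal(obs, lights)
```

    The multi-tap sensor's contribution is to supply the k intensity samples with almost identical timing, so this per-pixel solve remains valid even when the scene moves between frames of a conventional camera.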

  6. Hyperspectral Imaging Sensors and the Marine Coastal Zone

    Science.gov (United States)

    Richardson, Laurie L.

    2000-01-01

    Hyperspectral imaging sensors greatly expand the potential of remote sensing to assess, map, and monitor marine coastal zones. Each pixel in a hyperspectral image contains an entire spectrum of information. As a result, hyperspectral image data can be processed in two very different ways: by image classification techniques, to produce mapped outputs of features in the image on a regional scale; or by spectral analysis of the data embedded within each pixel. The latter is particularly useful in marine coastal zones because of the spectral complexity of the suspended as well as benthic features found in these environments. Spectral-based analysis of hyperspectral (AVIRIS) imagery was carried out to investigate a marine coastal zone of South Florida, USA. Florida Bay is a phytoplankton-rich estuary characterized by taxonomically distinct phytoplankton assemblages and extensive seagrass beds. End-member spectra were extracted from the AVIRIS image data corresponding to ground-truth sample stations and well-known field sites. Spectral libraries were constructed from the AVIRIS end-member spectra and used to classify images with the Spectral Angle Mapper (SAM) algorithm, a spectral-based approach that compares the spectrum in each pixel of an image with each spectrum in a spectral library. Using this approach, different phytoplankton assemblages containing diatoms, cyanobacteria, and green microalgae, as well as a benthic community (seagrasses), were mapped.
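
    The SAM rule can be sketched directly: each pixel spectrum is assigned to the library end-member subtending the smallest angle, which makes the classification insensitive to overall illumination scaling. The two-member library below is a toy example, not AVIRIS data.

```python
import math

def spectral_angle(pixel, reference):
    """Spectral Angle Mapper: angle (radians) between two spectra as vectors."""
    dot = sum(p * r for p, r in zip(pixel, reference))
    norm_p = math.sqrt(sum(p * p for p in pixel))
    norm_r = math.sqrt(sum(r * r for r in reference))
    return math.acos(max(-1.0, min(1.0, dot / (norm_p * norm_r))))

def classify(pixel, library):
    """Assign the library end-member with the smallest spectral angle."""
    return min(library, key=lambda name: spectral_angle(pixel, library[name]))

# Toy spectral library; the pixel is a brighter (scaled) diatom spectrum,
# so SAM still assigns it to "diatom" despite the intensity difference.
library = {"diatom": [0.1, 0.4, 0.3], "seagrass": [0.5, 0.2, 0.1]}
label = classify([0.2, 0.8, 0.6], library)
```

    This scale invariance is precisely why SAM is favored in coastal waters, where depth and turbidity modulate overall brightness without changing spectral shape.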

  7. High speed display algorithm for 3D medical images using Multi Layer Range Image

    International Nuclear Information System (INIS)

    Ban, Hideyuki; Suzuki, Ryuuichi

    1993-01-01

    We propose a high-speed algorithm that displays 3D voxel images obtained from medical imaging systems such as MRI. The algorithm converts voxel image data into 6 Multi Layer Range Image (MLRI) data sets, an augmentation of conventional range image data. To avoid calculations for invisible voxels, the algorithm selects at most 3 of the 6 MLRI data sets according to the view direction. The proposed algorithm displays 256 x 256 x 256 voxel data within 0.6 seconds on a 22 MIPS workstation without special hardware such as a graphics engine. Real-time display would be possible on a 100 MIPS class workstation with our algorithm. (author)
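
    The view-dependent selection of at most 3 of the 6 MLRI layers can be sketched as a visibility test on the six axis-aligned face normals (a hypothetical reconstruction of the selection rule; the sign convention for the view direction, pointing from the eye into the scene, is an assumption):

    ```python
    import numpy as np

    def visible_mlri_faces(view_dir):
        """Pick the (at most 3) axis-aligned MLRI layers whose outward
        normals face the viewer: dot(normal, view) < 0 for a view
        direction that points into the scene. Faces seen edge-on
        (dot == 0) contribute no visible voxels and are skipped."""
        normals = {
            "+x": (1, 0, 0), "-x": (-1, 0, 0),
            "+y": (0, 1, 0), "-y": (0, -1, 0),
            "+z": (0, 0, 1), "-z": (0, 0, -1),
        }
        v = np.asarray(view_dir, float)
        return [name for name, n in normals.items() if np.dot(n, v) < 0]

    faces = visible_mlri_faces((1, 1, -1))   # oblique view: three faces
    ```

    At most three of the six dot products can be negative for any view direction, which is what bounds the per-frame work to three layers.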

  8. Pulse Based Time-of-Flight Range Sensing.

    Science.gov (United States)

    Sarbolandi, Hamed; Plack, Markus; Kolb, Andreas

    2018-05-23

    Pulse-based Time-of-Flight (PB-ToF) cameras are an attractive alternative range imaging approach, compared to the widely commercialized Amplitude Modulated Continuous-Wave Time-of-Flight (AMCW-ToF) approach. This paper presents an in-depth evaluation of a PB-ToF camera prototype based on the Hamamatsu area sensor S11963-01CR. We evaluate different ToF-related effects, i.e., temperature drift, systematic error, depth inhomogeneity, multi-path effects, and motion artefacts. Furthermore, we evaluate the systematic error of the system in more detail, and introduce novel concepts to improve the quality of range measurements by modifying the mode of operation of the PB-ToF camera. Finally, we describe the means of measuring the gate response of the PB-ToF sensor and using this information for PB-ToF sensor simulation.
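
    The pulse-gating principle behind such cameras can be sketched with the textbook two-gate range equation (a generic illustration of PB-ToF ranging; the Hamamatsu sensor's actual gating scheme and calibration are more involved and are the subject of the paper's gate-response measurements):

    ```python
    C = 299_792_458.0  # speed of light, m/s

    def pb_tof_range(q1, q2, pulse_width_s):
        """Two-gate pulse-based ToF range estimate: gate 1 opens with
        the laser pulse, gate 2 immediately after it. The returned
        pulse straddles the gate boundary, so the charge split
        q2 / (q1 + q2) encodes the round-trip delay as a fraction
        of the pulse width."""
        ratio = q2 / (q1 + q2)
        return 0.5 * C * pulse_width_s * ratio

    # A return split 3:1 between the gates with a 100 ns pulse.
    r = pb_tof_range(3.0, 1.0, 100e-9)
    ```

    The ratio cancels target reflectivity and illumination falloff, which is why intensity-normalised gating yields range rather than brightness.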

  9. Dual-Emitting Fluorescent Metal-Organic Framework Nanocomposites as a Broad-Range pH Sensor for Fluorescence Imaging.

    Science.gov (United States)

    Chen, Haiyong; Wang, Jing; Shan, Duoliang; Chen, Jing; Zhang, Shouting; Lu, Xiaoquan

    2018-05-15

    pH plays an important role in understanding physiological/pathological processes, and abnormal pH is an indicator of many common diseases such as cancer, stroke, and Alzheimer's disease. In this work, an effective dual-emission fluorescent metal-organic framework nanocomposite probe (denoted as RB-PCN) has been constructed for sensitive and broad-range detection of pH. RB-PCN was prepared by encapsulating DBI-PEG-NH2-functionalized Fe3O4 into Zr-MOFs and then further reacting it with rhodamine B isothiocyanate (RBITC). In RB-PCN, RBITC is capable of sensing pH changes in acidic solutions, while the Zr-MOFs not only enrich the target analyte but also exhibit a fluorescence response to pH changes in alkaline solutions. Based on these structural and compositional features, RB-PCN can detect a wide range of pH changes. Importantly, such a nanoprobe can "see" intracellular pH changes by fluorescence confocal imaging as well as "measure" a wider range of pH in real samples by fluorescence spectroscopy. To the best of our knowledge, this is the first time a MOF-based dual-emitting fluorescent nanoprobe has been used for wide-range pH detection.

  10. Column-Parallel Single Slope ADC with Digital Correlated Multiple Sampling for Low Noise CMOS Image Sensors

    NARCIS (Netherlands)

    Chen, Y.; Theuwissen, A.J.P.; Chae, Y.

    2011-01-01

    This paper presents a low noise CMOS image sensor (CIS) using 10/12 bit configurable column-parallel single slope ADCs (SS-ADCs) and digital correlated multiple sampling (CMS). The sensor used is a conventional 4T active pixel with a pinned-photodiode as photon detector. The test sensor was

  11. Multiocular image sensor with on-chip beam-splitter and inner meta-micro-lens for single-main-lens stereo camera.

    Science.gov (United States)

    Koyama, Shinzo; Onozawa, Kazutoshi; Tanaka, Keisuke; Saito, Shigeru; Kourkouss, Sahim Mohamed; Kato, Yoshihisa

    2016-08-08

    We developed multiocular 1/3-inch 2.75-μm-pixel-size 2.1M-pixel image sensors by co-design of an on-chip beam-splitter and a 100-nm-width, 800-nm-depth patterned inner meta-micro-lens for single-main-lens stereo camera systems. A camera with the multiocular image sensor can capture a horizontally one-dimensional light field: the on-chip beam-splitter horizontally divides rays according to incident angle, and the inner meta-micro-lens collects the divided rays into pixels with small optical loss. Cross-talk between adjacent light field images of a fabricated binocular image sensor and of a quad-ocular image sensor is as low as 6% and 7%, respectively. By selecting two images from the one-dimensional light field images, a selectable baseline for stereo vision is realized to view close objects with a single main lens. In addition, by adding multiple light field images with different ratios, the baseline distance can be tuned within the aperture of the main lens. We suggest this electrically selectable or tunable baseline stereo vision as a way to reduce the 3D fatigue of viewers.

  12. Radiometric inter-sensor cross-calibration uncertainty using a traceable high accuracy reference hyperspectral imager

    Science.gov (United States)

    Gorroño, Javier; Banks, Andrew C.; Fox, Nigel P.; Underwood, Craig

    2017-08-01

    Optical earth observation (EO) satellite sensors generally suffer from drifts and biases relative to their pre-launch calibration, caused by launch and/or time in the space environment. This places a severe limitation on the fundamental reliability and accuracy that can be assigned to satellite-derived information, and is particularly critical for long time-base studies for climate change and for enabling interoperability and Analysis Ready Data. The proposed TRUTHS (Traceable Radiometry Underpinning Terrestrial and Helio-Studies) mission is explicitly designed to address this issue by re-calibrating itself directly to a primary standard of the international system of units (SI) in orbit, and then extending this SI traceability to other sensors through in-flight cross-calibration using a selection of Committee on Earth Observation Satellites (CEOS) recommended test sites. Where the characteristics of the sensor under test allow, this will result in a significant improvement in accuracy. This paper describes a set of tools, algorithms, and methodologies that have been developed and used to estimate the radiometric uncertainty achievable for an indicative target sensor through in-flight cross-calibration against a well-calibrated hyperspectral SI-traceable reference sensor with observational characteristics such as TRUTHS. In this study, the Multi-Spectral Imager (MSI) of Sentinel-2 and the Landsat-8 Operational Land Imager (OLI) are evaluated as examples; however, the analysis is readily translatable to larger-footprint sensors such as the Sentinel-3 Ocean and Land Colour Instrument (OLCI) and the Visible Infrared Imaging Radiometer Suite (VIIRS). This study considers the criticality of the instrumental and observational characteristics on pixel-level reflectance factors, within a defined spatial region of interest (ROI) within the target site. It quantifies the main uncertainty contributors in the spectral, spatial, and temporal domains. The resultant tool

  13. Illumination Effect of Laser Light in Foggy Objects Using an Active Imaging System

    Energy Technology Data Exchange (ETDEWEB)

    Kwon, Seong-Ouk; Park, Seung-Kyu; Ahn, Yong-Jin; Baik, Sung-Hoon; Choi, Young-Soo; Jeong, Kyung-Min [KAERI, Daejeon (Korea, Republic of); Hyogo Ion Beam Medical Center, Hyogo 679-5165 (Japan)

    2015-05-15

    Active imaging techniques usually provide improved image information compared to passive imaging techniques. Active vision is a direct visualization technique using an artificial illuminant. The range-gated imaging (RGI) technique is one of these active vision technologies. The RGI technique extracts vision information by summing time-sliced vision images. In an RGI system, objects are illuminated for an ultra-short time by a high-intensity illuminant, and the light reflected from the objects is then captured by a highly sensitive image sensor with an ultra-short exposure. Range-gated imaging is an emerging technology in the field of surveillance for security applications, especially for visualization in dark night or foggy environments. Although RGI viewing was first demonstrated in the 1960s, the technology has become more practical thanks to the rapid development of optical and sensor technologies, such as highly sensitive imaging sensors and ultra-short pulse lasers. In particular, the system can be adopted in robot vision systems by virtue of its compact configuration. During the past decades, this technology has been applied to target recognition and to harsh environments, such as fog and underwater vision, and range imaging based on range-gated imaging has also been demonstrated. Laser light with a short pulse width is usually used for a range-gated imaging system. In this paper, the illumination effect of laser light on foggy objects is studied using a range-gated imaging system. The imaging system consists of an ultra-short pulse (0.35 ns) laser and a gated imaging sensor. The experiment monitors objects in a box filled with fog, and the effects of fog particles on the range-gated imaging technique are studied. Edge blurring and range distortion are generated by fog particles.

  14. Illumination Effect of Laser Light in Foggy Objects Using an Active Imaging System

    International Nuclear Information System (INIS)

    Kwon, Seong-Ouk; Park, Seung-Kyu; Ahn, Yong-Jin; Baik, Sung-Hoon; Choi, Young-Soo; Jeong, Kyung-Min

    2015-01-01

    Active imaging techniques usually provide improved image information compared to passive imaging techniques. Active vision is a direct visualization technique using an artificial illuminant. The range-gated imaging (RGI) technique is one of these active vision technologies. The RGI technique extracts vision information by summing time-sliced vision images. In an RGI system, objects are illuminated for an ultra-short time by a high-intensity illuminant, and the light reflected from the objects is then captured by a highly sensitive image sensor with an ultra-short exposure. Range-gated imaging is an emerging technology in the field of surveillance for security applications, especially for visualization in dark night or foggy environments. Although RGI viewing was first demonstrated in the 1960s, the technology has become more practical thanks to the rapid development of optical and sensor technologies, such as highly sensitive imaging sensors and ultra-short pulse lasers. In particular, the system can be adopted in robot vision systems by virtue of its compact configuration. During the past decades, this technology has been applied to target recognition and to harsh environments, such as fog and underwater vision, and range imaging based on range-gated imaging has also been demonstrated. Laser light with a short pulse width is usually used for a range-gated imaging system. In this paper, the illumination effect of laser light on foggy objects is studied using a range-gated imaging system. The imaging system consists of an ultra-short pulse (0.35 ns) laser and a gated imaging sensor. The experiment monitors objects in a box filled with fog, and the effects of fog particles on the range-gated imaging technique are studied. Edge blurring and range distortion are generated by fog particles.

  15. Image Denoising Using Interquartile Range Filter with Local Averaging

    OpenAIRE

    Jassim, Firas Ajil

    2013-01-01

    Image denoising is one of the fundamental problems in image processing. In this paper, a novel approach to suppressing noise in an image is presented, based on the interquartile range (IQR), a statistical method for detecting outliers in a dataset. A window of size k×k is used to support the IQR filter. Each pixel outside the IQR range of its k×k window is treated as a noisy pixel. The estimate for each noisy pixel is obtained by local averaging. The essential...
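
    The filter described above can be sketched as follows (a minimal NumPy version; replacing an outlier with the average of its in-range neighbours is one reasonable reading of "local averaging", and the untreated borders are an assumption of this sketch):

    ```python
    import numpy as np

    def iqr_denoise(img, k=3):
        """Replace pixels that fall outside the interquartile range of
        their k x k neighbourhood with the local average of in-range
        neighbours. Boundary pixels are left unchanged."""
        img = np.asarray(img, float)
        out = img.copy()
        r = k // 2
        h, w = img.shape
        for y in range(r, h - r):
            for x in range(r, w - r):
                win = img[y - r:y + r + 1, x - r:x + r + 1]
                q1, q3 = np.percentile(win, [25, 75])
                if not (q1 <= img[y, x] <= q3):
                    # average only the neighbours inside the IQR
                    out[y, x] = win[(win >= q1) & (win <= q3)].mean()
        return out

    noisy = np.full((5, 5), 10.0)
    noisy[2, 2] = 255.0           # impulse ("salt") noise
    clean = iqr_denoise(noisy, k=3)
    ```

    Unlike a plain mean filter, the IQR test leaves in-range pixels untouched, so edges and smooth regions are not blurred while isolated impulses are removed.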

  16. Landsat 8 Operational Land Imager (OLI)_Thermal Infrared Sensor (TIRS) V1

    Data.gov (United States)

    National Aeronautics and Space Administration — Abstract:The Operational Land Imager (OLI) and Thermal Infrared Sensor (TIRS) are instruments onboard the Landsat 8 satellite, which was launched in February of...

  17. Luminescence imaging of water during carbon-ion irradiation for range estimation

    Energy Technology Data Exchange (ETDEWEB)

    Yamamoto, Seiichi, E-mail: s-yama@met.nagoya-u.ac.jp; Komori, Masataka; Koyama, Shuji; Morishita, Yuki; Sekihara, Eri [Radiological and Medical Laboratory Sciences, Nagoya University Graduate School of Medicine, Higashi-ku, Nagoya, Aichi 461-8673 (Japan); Akagi, Takashi; Yamashita, Tomohiro [Hygo Ion Beam Medical Center, Hyogo 679-5165 (Japan); Toshito, Toshiyuki [Department of Proton Therapy Physics, Nagoya Proton Therapy Center, Nagoya City West Medical Center, Aichi 462-8508 (Japan)

    2016-05-15

    Purpose: The authors previously reported successful luminescence imaging of water during proton irradiation and its application to range estimation. However, since the feasibility of this approach for carbon-ion irradiation remained unclear, the authors conducted luminescence imaging during carbon-ion irradiation and estimated the ranges. Methods: The authors placed a pure-water phantom on the patient couch of a carbon-ion therapy system and measured the luminescence images with a high-sensitivity, cooled charge-coupled device camera during carbon-ion irradiation. The authors also imaged three other types of phantoms (tap water, an acrylic block, and a plastic scintillator) and compared their intensities and distributions with those of the pure-water phantom. Results: The luminescence images of the pure-water phantom during carbon-ion irradiation showed clear Bragg peaks, and the carbon-ion ranges measured from the images were almost the same as those obtained by simulation. The image of the tap-water phantom showed almost the same distribution as that of the pure-water phantom. The acrylic block phantom's luminescence image produced seven times higher luminescence and a 13% shorter range than the water phantoms; the range with the acrylic phantom generally matched the calculated value. The plastic scintillator produced ∼15 000 times higher light output than water. Conclusions: Luminescence imaging during carbon-ion irradiation of water is not only possible but also a promising method for range estimation in carbon-ion therapy.
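
    Range extraction from such depth-luminescence profiles can be sketched as locating the distal falloff of the Bragg peak (the abstract does not state the estimator used; the 50% distal level and linear interpolation here are assumptions of this sketch):

    ```python
    import numpy as np

    def distal_range(depth_mm, profile, level=0.5):
        """Estimate beam range from a depth-light profile as the depth
        where intensity falls to `level` * peak on the distal (deep)
        side of the Bragg peak, with linear interpolation between
        samples."""
        depth_mm = np.asarray(depth_mm, float)
        profile = np.asarray(profile, float)
        i_peak = int(np.argmax(profile))
        target = level * profile[i_peak]
        for i in range(i_peak, len(profile) - 1):
            if profile[i] >= target > profile[i + 1]:
                # interpolate the crossing between samples i and i+1
                f = (profile[i] - target) / (profile[i] - profile[i + 1])
                return depth_mm[i] + f * (depth_mm[i + 1] - depth_mm[i])
        return depth_mm[-1]

    depth = np.arange(0, 101, 1.0)                       # mm
    profile = np.exp(-0.5 * ((depth - 60) / 5.0) ** 2)   # toy Bragg-like peak
    r = distal_range(depth, profile)
    ```

    For the toy Gaussian peak at 60 mm with sigma 5 mm, the 50% distal crossing sits near 60 + 5·sqrt(2 ln 2) ≈ 65.9 mm.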

  18. Luminescence imaging of water during carbon-ion irradiation for range estimation

    International Nuclear Information System (INIS)

    Yamamoto, Seiichi; Komori, Masataka; Koyama, Shuji; Morishita, Yuki; Sekihara, Eri; Akagi, Takashi; Yamashita, Tomohiro; Toshito, Toshiyuki

    2016-01-01

    Purpose: The authors previously reported successful luminescence imaging of water during proton irradiation and its application to range estimation. However, since the feasibility of this approach for carbon-ion irradiation remained unclear, the authors conducted luminescence imaging during carbon-ion irradiation and estimated the ranges. Methods: The authors placed a pure-water phantom on the patient couch of a carbon-ion therapy system and measured the luminescence images with a high-sensitivity, cooled charge-coupled device camera during carbon-ion irradiation. The authors also imaged three other types of phantoms (tap water, an acrylic block, and a plastic scintillator) and compared their intensities and distributions with those of the pure-water phantom. Results: The luminescence images of the pure-water phantom during carbon-ion irradiation showed clear Bragg peaks, and the carbon-ion ranges measured from the images were almost the same as those obtained by simulation. The image of the tap-water phantom showed almost the same distribution as that of the pure-water phantom. The acrylic block phantom's luminescence image produced seven times higher luminescence and a 13% shorter range than the water phantoms; the range with the acrylic phantom generally matched the calculated value. The plastic scintillator produced ∼15 000 times higher light output than water. Conclusions: Luminescence imaging during carbon-ion irradiation of water is not only possible but also a promising method for range estimation in carbon-ion therapy.

  19. Development of a thinned back-illuminated CMOS active pixel sensor for extreme ultraviolet spectroscopy and imaging in space science

    International Nuclear Information System (INIS)

    Waltham, N.R.; Prydderch, M.; Mapson-Menard, H.; Pool, P.; Harris, A.

    2007-01-01

    We describe our programme to develop a large-format, science-grade, monolithic CMOS active pixel sensor for future space science missions, and in particular an extreme ultraviolet (EUV) spectrograph for solar physics studies on ESA's Solar Orbiter. Our route to EUV sensitivity relies on adapting the back-thinning and rear-illumination techniques first developed for CCD sensors. Our first large-format sensor consists of 4kx3k 5 μm pixels fabricated on a 0.25 μm CMOS imager process. Wafer samples of these sensors have been thinned by e2v technologies with the aim of obtaining good sensitivity at EUV wavelengths. We present results from both front- and back-illuminated versions of this sensor. We also present our plans to develop a new sensor of 2kx2k 10 μm pixels, which will be fabricated on a 0.35 μm CMOS process. In progress towards this goal, we have designed a test-structure consisting of six arrays of 512x512 10 μm pixels. Each of the arrays has been given a different pixel design to allow verification of our models, and our progress towards optimizing a design for minimal system readout noise and maximum dynamic range. These sensors will also be back-thinned for characterization at EUV wavelengths

  20. A simple and low-cost biofilm quantification method using LED and CMOS image sensor.

    Science.gov (United States)

    Kwak, Yeon Hwa; Lee, Junhee; Lee, Junghoon; Kwak, Soo Hwan; Oh, Sangwoo; Paek, Se-Hwan; Ha, Un-Hwan; Seo, Sungkyu

    2014-12-01

    A novel biofilm detection platform, which consists of a cost-effective red, green, and blue light-emitting diode (RGB LED) as a light source and a lens-free CMOS image sensor as a detector, is designed. This system can measure the diffraction patterns of cells from their shadow images, and gather light absorbance information according to the concentration of biofilms through a simple image processing procedure. Compared to a bulky and expensive commercial spectrophotometer, this platform can provide accurate and reproducible biofilm concentration detection and is simple, compact, and inexpensive. Biofilms originating from various bacterial strains, including Pseudomonas aeruginosa (P. aeruginosa), were tested to demonstrate the efficacy of this new biofilm detection approach. The results were compared with the results obtained from a commercial spectrophotometer. To utilize a cost-effective light source (i.e., an LED) for biofilm detection, the illumination conditions were optimized. For accurate and reproducible biofilm detection, a simple, custom-coded image processing algorithm was developed and applied to a five-megapixel CMOS image sensor, which is a cost-effective detector. The concentration of biofilms formed by P. aeruginosa was detected and quantified by varying the indole concentration, and the results were compared with the results obtained from a commercial spectrophotometer. The correlation value of the results from those two systems was 0.981 (N = 9, P CMOS image-sensor platform. Copyright © 2014 Elsevier B.V. All rights reserved.
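
    The absorbance readout described above can be sketched with a Beer-Lambert style computation over mean pixel intensities (an illustrative guess at the "simple image processing procedure"; the paper's actual algorithm and calibration are not given in the abstract):

    ```python
    import numpy as np

    def absorbance(sample_img, blank_img):
        """Optical density estimated from the mean pixel intensity of a
        lens-free shadow image of the sample versus a blank (no-biofilm)
        reference image: A = -log10(I_sample / I_blank)."""
        i_sample = np.asarray(sample_img, float).mean()
        i_blank = np.asarray(blank_img, float).mean()
        return -np.log10(i_sample / i_blank)

    blank = np.full((10, 10), 200.0)
    sample = np.full((10, 10), 20.0)   # 10x attenuation -> OD of 1
    od = absorbance(sample, blank)
    ```

    Averaging over the sensor area is what lets an inexpensive CMOS imager stand in for a spectrophotometer's single photodetector; the LED wavelength choice then plays the role of the monochromator.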

  1. Image sensors for radiometric measurements in the ocean

    Digital Repository Service at National Institute of Oceanography (India)

    Desa, E.S.; Desa, B.A.E.

    the sensors at a stabilised, moderately cool temperature of 15 °C and to intelligently control the exposure time of the device, so as to reliably measure flux levels in the range 1 W/m²/nm to 10⁻⁶ W/m²/nm commonly encountered in the ocean...

  2. Characterisation of a novel reverse-biased PPD CMOS image sensor

    Science.gov (United States)

    Stefanov, K. D.; Clarke, A. S.; Ivory, J.; Holland, A. D.

    2017-11-01

    A new pinned photodiode (PPD) CMOS image sensor (CIS) has been developed and characterised. The sensor can be fully depleted by means of a reverse bias applied to the substrate, and the principle of operation is applicable to very thick sensitive volumes. Additional n-type implants under the pixel p-wells, called Deep Depletion Extension (DDE), have been added in order to eliminate the large parasitic substrate current that would otherwise be present in a normal device. The first prototype has been manufactured on 18 μm thick, 1000 Ω·cm epitaxial silicon wafers using a 180 nm PPD image sensor process at TowerJazz Semiconductor. The chip contains arrays of 10 μm and 5.4 μm pixels, with variations of the shape, size, and depth of the DDE implant. Back-side illuminated (BSI) devices were manufactured in collaboration with Teledyne e2v and characterised together with the front-side illuminated (FSI) variants. The presented results show that the devices can be reverse-biased without parasitic leakage currents, in good agreement with simulations. The new 10 μm pixels in both BSI and FSI variants exhibit nearly identical photo response to the reference non-modified pixels, as characterised with the photon transfer curve. Different techniques were used to measure the depletion depth in the FSI and BSI chips, and the results are consistent with the expected full depletion.
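
    The photon transfer curve characterisation mentioned above rests on the mean-variance relation of shot noise. A minimal sketch of estimating conversion gain from a flat-field frame pair (generic PTC practice, not the authors' exact procedure):

    ```python
    import numpy as np

    def conversion_gain(flat1, flat2):
        """Mean-variance (photon transfer) estimate of conversion gain
        in e-/DN from two flat-field frames at the same illumination.
        For shot-noise-limited data, var(DN) = mean(DN) / K, so
        K = mean / variance. Differencing the frames cancels the
        fixed pattern noise that would otherwise inflate the variance."""
        flat1, flat2 = np.asarray(flat1, float), np.asarray(flat2, float)
        mean_signal = 0.5 * (flat1.mean() + flat2.mean())
        # var of (f1 - f2) is twice the per-frame temporal variance
        temporal_var = 0.5 * (flat1 - flat2).var()
        return mean_signal / temporal_var

    # Simulated Poisson-limited flats: electrons ~ Poisson(lam), DN = e/K.
    rng = np.random.default_rng(0)
    K_true = 2.0        # e-/DN
    lam = 10_000.0      # mean electrons per pixel
    f1 = rng.poisson(lam, (200, 200)) / K_true
    f2 = rng.poisson(lam, (200, 200)) / K_true
    K_est = conversion_gain(f1, f2)
    ```

    Repeating this over a range of exposures and plotting variance against mean gives the full photon transfer curve, whose slope yields the same gain and whose saturation point gives full-well capacity.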

  3. Compact, self-contained enhanced-vision system (EVS) sensor simulator

    Science.gov (United States)

    Tiana, Carlo

    2007-04-01

    We describe the model SIM-100 PC-based simulator, for imaging sensors used, or planned for use, in Enhanced Vision System (EVS) applications. Typically housed in a small-form-factor PC, it can be easily integrated into existing out-the-window visual simulators for fixed-wing or rotorcraft, to add realistic sensor imagery to the simulator cockpit. Multiple bands of infrared (short-wave, midwave, extended-midwave and longwave) as well as active millimeter-wave RADAR systems can all be simulated in real time. Various aspects of physical and electronic image formation and processing in the sensor are accurately (and optionally) simulated, including sensor random and fixed pattern noise, dead pixels, blooming, B-C scope transformation (MMWR). The effects of various obscurants (fog, rain, etc.) on the sensor imagery are faithfully represented and can be selected by an operator remotely and in real-time. The images generated by the system are ideally suited for many applications, ranging from sensor development engineering tradeoffs (Field Of View, resolution, etc.), to pilot familiarization and operational training, and certification support. The realistic appearance of the simulated images goes well beyond that of currently deployed systems, and beyond that required by certification authorities; this level of realism will become necessary as operational experience with EVS systems grows.

  4. Overview of CMOS process and design options for image sensor dedicated to space applications

    Science.gov (United States)

    Martin-Gonthier, P.; Magnan, P.; Corbiere, F.

    2005-10-01

    With the growth of huge-volume markets (mobile phones, digital cameras...), CMOS technologies for image sensors have improved significantly. New process flows have appeared in order to optimize parameters such as quantum efficiency, dark current, and conversion gain. Space applications can of course benefit from these improvements. To illustrate this evolution, this paper reports results from three technologies that have been evaluated with test vehicles composed of several sub-arrays designed with some space applications as the target. These three technologies are CMOS standard, improved, and sensor-optimized processes in the 0.35 μm generation. Measurements are focussed on quantum efficiency, dark current, conversion gain, and noise. Other measurements, such as Modulation Transfer Function (MTF) and crosstalk, are depicted in [1]. The results have been compared, and three categories of CMOS process for image sensors are identified. Radiation tolerance has also been studied for the improved CMOS process, with the imager hardened by design. Results at 4, 15, 25, and 50 krad demonstrate good ionizing-dose radiation tolerance when specific techniques are applied.

  5. Low-complex energy-aware image communication in visual sensor networks

    Science.gov (United States)

    Phamila, Yesudhas Asnath Victy; Amutha, Ramachandran

    2013-10-01

    A low-complexity, low-bit-rate, energy-efficient image compression algorithm explicitly designed for resource-constrained visual sensor networks, as used in surveillance, battlefield, and habitat monitoring, is presented, where a voluminous amount of image data has to be communicated over a bandwidth-limited wireless medium. The proposed method overcomes the energy limitation of individual nodes and is investigated in terms of image quality, entropy, processing time, overall energy consumption, and system lifetime. The algorithm is highly energy-efficient and extremely fast since it applies an energy-aware zonal binary discrete cosine transform (DCT) that computes only the few required significant coefficients and codes them using an enhanced complementary Golomb-Rice code without any floating-point operations. Experiments are performed using the Atmel ATmega128 and MSP430 processors to measure the resultant energy savings. Simulation results show that the proposed energy-aware fast zonal transform consumes only 0.3% of the energy needed by a conventional DCT. The algorithm consumes only 6% of the energy needed by the Independent JPEG Group (fast) version, and it suits embedded systems requiring low power consumption. The proposed scheme is unique since it significantly enhances the lifetime of the camera sensor node and the network without any need for the distributed processing traditionally required in existing algorithms.
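
    The zonal transform idea, computing only a low-frequency "zone" of DCT coefficients, can be sketched as follows (a floating-point NumPy illustration; the paper's version uses a binary, integer-only DCT, which is not reproduced here, and the triangular zone shape is an assumption):

    ```python
    import numpy as np

    def zonal_dct_8x8(block, zone=8):
        """2-D DCT-II of an 8x8 block, computing only the low-frequency
        coefficients with u + v < zone (a triangular 'zone'); all other
        coefficients are skipped and left at zero."""
        N = 8
        x = np.asarray(block, float)
        n = np.arange(N)
        # Orthonormal 1-D DCT-II basis: C[k, m] = alpha(k) cos(pi(2m+1)k / 2N)
        C = np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * N))
        alpha = np.full(N, np.sqrt(2.0 / N))
        alpha[0] = np.sqrt(1.0 / N)
        C = alpha[:, None] * C
        out = np.zeros((N, N))
        for u in range(N):
            for v in range(N):
                if u + v < zone:                  # zonal mask: skip the rest
                    out[u, v] = C[u] @ x @ C[v]
        return out

    flat = np.ones((8, 8))
    coeffs = zonal_dct_8x8(flat)   # constant block: only the DC term survives
    ```

    Because natural-image energy concentrates in the low-frequency corner, dropping the out-of-zone computations sacrifices little quality while avoiding most of the transform's arithmetic, which is where the reported energy savings come from.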

  6. Image Alignment for Multiple Camera High Dynamic Range Microscopy

    OpenAIRE

    Eastwood, Brian S.; Childs, Elisabeth C.

    2012-01-01

    This paper investigates the problem of image alignment for multiple camera high dynamic range (HDR) imaging. HDR imaging combines information from images taken with different exposure settings. Combining information from multiple cameras requires an alignment process that is robust to the intensity differences in the images. HDR applications that use a limited number of component images require an alignment technique that is robust to large exposure differences. We evaluate the suitability fo...

  7. High-resolution CCD imaging alternatives

    Science.gov (United States)

    Brown, D. L.; Acker, D. E.

    1992-08-01

    High-resolution CCD color cameras have recently stimulated the interest of a large number of potential end-users for a wide range of practical applications. Real-time High Definition Television (HDTV) systems are now being used or considered for use in applications ranging from entertainment program origination through digital image storage to medical and scientific research. HDTV generation of electronic images offers significant cost and time-saving advantages over the use of film in such applications. Furthermore, in still-image systems, electronic image capture is faster and more efficient than conventional image scanners: a CCD still camera can capture 3-dimensional objects directly into the computing environment without having to shoot a picture on film, develop it, and then scan the image into a computer. Extending CCD technology beyond broadcast: most standard production CCD sensor chips are made for broadcast-compatible systems. One popular CCD, the basis for this discussion, offers arrays of roughly 750 x 580 picture elements (pixels), a total array of approximately 435,000 pixels (see Fig. 1). FOR-A has developed a technique to increase the number of available pixels for a given image compared to that produced by the standard CCD itself. Using an interlined CCD with an overall spatial structure several times larger than the photo-sensitive sensor areas, each of the CCD sensors is shifted in two dimensions in order to fill in the spatial gaps between adjacent sensors.

  8. Method of orthogonally splitting imaging pose measurement

    Science.gov (United States)

    Zhao, Na; Sun, Changku; Wang, Peng; Yang, Qian; Liu, Xintong

    2018-01-01

    In order to meet aviation's and machinery manufacturing's need for pose measurement with high precision, fast speed, and wide measurement range, and to resolve the contradiction between the measurement range and resolution of a vision sensor, this paper proposes an orthogonally splitting imaging pose measurement method. This paper designs and realizes an orthogonally splitting imaging vision sensor and establishes a pose measurement system. The vision sensor consists of one imaging lens, a beam-splitter prism, cylindrical lenses, and dual linear CCDs. The dual linear CCDs each acquire one-dimensional image coordinate data of the target point, and the two data sets restore the two-dimensional image coordinates of the target point. According to the characteristics of the imaging system, this paper establishes a nonlinear distortion model to correct distortion. Based on cross-ratio invariability, a polynomial equation is established and solved by the least-squares fitting method. After completing distortion correction, this paper establishes the measurement mathematical model of the vision sensor and determines the intrinsic parameters for calibration. An array of feature points for calibration is built by placing a planar target in different positions several times. An iterative optimization method is presented to solve the parameters of the model. The experimental results show that the field angle is 52°, the focal distance is 27.40 mm, the image resolution is 5185×5117 pixels, the displacement measurement error is less than 0.1 mm, and the rotation angle measurement error is less than 0.15°. The method of orthogonally splitting imaging pose measurement can satisfy pose measurement requirements of high precision, fast speed, and wide measurement range.

  9. SEGMENTATION AND QUALITY ANALYSIS OF LONG RANGE CAPTURED IRIS IMAGE

    Directory of Open Access Journals (Sweden)

    Anand Deshpande

    2016-05-01

    Iris segmentation plays a major role in increasing the performance of an iris recognition system. This paper proposes a novel method for segmenting iris images to extract the iris part of a long-range captured eye image, along with an approach for selecting the best iris frame from an iris polar image sequence by analyzing the quality of the iris polar images. The quality of an iris image is determined by the frequency components present in the iris polar images. Experiments are carried out on CASIA long-range captured iris image sequences. The proposed segmentation method is compared with Hough-transform-based segmentation, and it has been determined that the proposed method gives higher segmentation accuracy than the Hough transform.

  10. Study on super-resolution three-dimensional range-gated imaging technology

    Science.gov (United States)

    Guo, Huichao; Sun, Huayan; Wang, Shuai; Fan, Youchen; Li, Yuanmiao

    2018-04-01

    Range-gated three-dimensional imaging has been a research hotspot in recent years because of its high spatial resolution, high range accuracy, long range, and simultaneous capture of target reflectivity information. Based on a study of the principle of the intensity-correlation method, this paper carries out theoretical analysis and experimental research. The experimental system uses a high-power pulsed semiconductor laser as the light source and a gated ICCD as the imaging device, and allows flexible adjustment of imaging depth and distance to realize different working modes. An imaging experiment with a small imaging depth was carried out on a building 500 m away, and 26 groups of images were obtained with a distance step of 1.5 m. The paper analyzes a 3D point-cloud calculation method based on the triangle method: a 15 m depth slice of the target's 3D point cloud is obtained from two frames of images, with a range precision better than 0.5 m. The influence of signal-to-noise ratio, illumination uniformity, and image brightness on range accuracy is analyzed. Based on a comparison with the time-slicing method, a method for improving the linearity of the point cloud is proposed.
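A sketch of intensity-ratio range recovery of the kind used in gated 3D imaging from two frames: within the gate-overlap region, the ratio of the two returns varies roughly linearly with distance. The linear-ratio formula is the common textbook form and may differ from the authors' exact triangle-method calibration.

```python
import numpy as np

def range_from_ratio(img_near, img_far, z0, dz):
    """Per-pixel range from two gated images (hypothetical linear model):
    z = z0 + dz * I_far / (I_near + I_far)."""
    total = img_near + img_far
    ratio = np.divide(img_far, total, out=np.zeros_like(total), where=total > 0)
    return z0 + dz * ratio

# A target midway through a 15 m slice starting at 500 m returns equal
# intensity in both gates.
z = range_from_ratio(np.array([0.5]), np.array([0.5]), z0=500.0, dz=15.0)
print(z)  # [507.5]
```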

  11. Real-time DNA Amplification and Detection System Based on a CMOS Image Sensor.

    Science.gov (United States)

    Wang, Tiantian; Devadhasan, Jasmine Pramila; Lee, Do Young; Kim, Sanghyo

    2016-01-01

    In the present study, we developed a polypropylene well-integrated complementary metal-oxide-semiconductor (CMOS) platform that performs the loop-mediated isothermal amplification (LAMP) technique for simultaneous real-time DNA amplification and detection. The amplification-coupled detection system directly measures changes in photon count arising from the generation of magnesium pyrophosphate and the accompanying color change; the photon count decreases as amplification proceeds. The CMOS image sensor detects the photons and converts them into digital units with the aid of an analog-to-digital converter (ADC). In addition, UV-spectral studies, optical color-intensity detection, pH analysis, and electrophoresis were carried out to demonstrate the efficiency of the CMOS sensor-based LAMP system. Moreover, Clostridium perfringens was used as a proof-of-concept target for the new system. We anticipate that this CMOS image sensor-based LAMP method will enable cost-effective, label-free, optical, real-time, and portable molecular diagnostic devices.

  12. Multi-Sensor Fusion of Infrared and Electro-Optic Signals for High Resolution Night Images

    Directory of Open Access Journals (Sweden)

    Victor Lawrence

    2012-07-01

    Full Text Available Electro-optic (EO) image sensors exhibit high resolution and low noise levels in daytime, but they do not work in dark environments. Infrared (IR) image sensors exhibit poor resolution and cannot separate objects with similar temperatures. We therefore propose a novel framework for IR image enhancement based on information (e.g., edges) from EO images, which improves the resolution of IR images and helps distinguish objects at night. Our framework improves resolution by superimposing/blending the edges of the EO image onto the corresponding transformed IR image. In this framework, we adopt the theoretical point spread function (PSF) proposed by Hardie et al. for the IR image, which combines the modulation transfer function (MTF) of a uniform detector array with the incoherent optical transfer function (OTF) of diffraction-limited optics. In addition, we design an inverse filter for the proposed PSF and use it for the IR image transformation. The framework requires four main steps: (1) inverse-filter-based IR image transformation; (2) EO image edge detection; (3) registration; and (4) blending/superimposing of the obtained image pair. Simulation results show both blended and superimposed IR images and demonstrate that blended IR images have better quality than superimposed ones. Additionally, using the same steps, simulation results show a blended IR image of better quality when only the original IR image is available.
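The four steps can be sketched as follows. This is a simplified stand-in: an identity OTF replaces the Hardie et al. PSF, a plain gradient-magnitude detector replaces the paper's edge detector, and the image pair is assumed already registered.

```python
import numpy as np

def inverse_filter(ir, otf, eps=1e-2):
    # (1) Regularized inverse filter in the frequency domain.
    IR = np.fft.fft2(ir)
    return np.real(np.fft.ifft2(IR * np.conj(otf) / (np.abs(otf) ** 2 + eps)))

def edge_map(eo):
    # (2) Gradient-magnitude edge detection on the EO image.
    gy, gx = np.gradient(eo.astype(float))
    return np.hypot(gx, gy)

def blend(ir_restored, edges, alpha=0.3):
    # (4) Weighted blend of EO edges into the restored IR image.
    return (1 - alpha) * ir_restored + alpha * edges

rng = np.random.default_rng(1)
eo = rng.random((32, 32))
ir = rng.random((32, 32))
otf = np.ones((32, 32))      # identity OTF for demonstration only
fused = blend(inverse_filter(ir, otf), edge_map(eo))  # (3) assumed registered
print(fused.shape)  # (32, 32)
```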

  13. Multi-Sensor Mud Detection

    Science.gov (United States)

    Rankin, Arturo L.; Matthies, Larry H.

    2010-01-01

    Robust mud detection is a critical perception requirement for Unmanned Ground Vehicle (UGV) autonomous off-road navigation. A military UGV stuck in a mud body during a mission may have to be sacrificed or rescued, both of which are unattractive options. There are several characteristics of mud that may be detectable with appropriate UGV-mounted sensors. For example, mud only occurs on the ground surface, is cooler than surrounding dry soil during the daytime under nominal weather conditions, is generally darker than surrounding dry soil in visible imagery, and is highly polarized. However, none of these cues are definitive on their own. Dry soil also occurs on the ground surface; shadows, snow, ice, and water can also be cooler than surrounding dry soil; shadows are also darker than surrounding dry soil in visible imagery; and cars, water, and some vegetation are also highly polarized. Shadows, snow, ice, water, cars, and vegetation can all be disambiguated from mud by using a suite of sensors that span multiple bands of the electromagnetic spectrum. Because there are military operations in which it is imperative for UGVs to operate without emitting strong, detectable electromagnetic signals, passive sensors are desirable. JPL has developed a daytime mud detection capability using multiple passive imaging sensors. Cues for mud from the multiple passive imaging sensors are fused into a single mud detection image using a rule base, and the resultant mud detection is localized in a terrain map using range data generated from a stereo pair of color cameras.
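A hedged sketch of rule-based cue fusion of the kind described: each passive sensor contributes a boolean cue image, and mud is declared only where supporting cues agree and disambiguating cues rule out look-alikes. The specific rules and cue names are illustrative, not JPL's actual rule base.

```python
import numpy as np

def fuse_mud_cues(cooler_than_soil, darker_than_soil, highly_polarized,
                  is_vegetation, is_water):
    """Hypothetical rule base: all supporting cues must fire, and known
    look-alike classes veto the detection."""
    candidate = cooler_than_soil & darker_than_soil & highly_polarized
    return candidate & ~is_vegetation & ~is_water

shape = (4, 4)
cool = np.ones(shape, bool)
dark = np.ones(shape, bool)
pol = np.ones(shape, bool)
veg = np.zeros(shape, bool); veg[0, 0] = True   # vegetation pixel is vetoed
water = np.zeros(shape, bool)
mud = fuse_mud_cues(cool, dark, pol, veg, water)
print(mud.sum())  # 15 of 16 pixels pass the rule base
```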

  14. The assessment of multi-sensor image fusion using wavelet transforms for mapping the Brazilian Savanna

    NARCIS (Netherlands)

    Weimar Acerbi, F.; Clevers, J.G.P.W.; Schaepman, M.E.

    2006-01-01

    Multi-sensor image fusion using the wavelet approach provides a conceptual framework for the improvement of the spatial resolution with minimal distortion of the spectral content of the source image. This paper assesses whether images with a large ratio of spatial resolution can be fused, and

  15. An efficient and secure partial image encryption for wireless multimedia sensor networks using discrete wavelet transform, chaotic maps and substitution box

    Science.gov (United States)

    Khan, Muazzam A.; Ahmad, Jawad; Javaid, Qaisar; Saqib, Nazar A.

    2017-03-01

    Wireless Sensor Networks (WSNs) are widely deployed to monitor physical activity and/or environmental conditions. Data gathered from a WSN are transmitted over the network to a central location for further processing. Applications of WSNs are found in smart homes, intelligent buildings, health care, energy-efficient smart grids, and industrial control systems. In recent years, computer scientists have focused on finding further applications of WSNs in multimedia technologies, i.e., audio, video, and digital images. Because of the bulky nature of multimedia data, a WSN must process large data volumes, which significantly increases computational complexity and hence reduces battery life. Under battery-life constraints, image compression combined with secure transmission over a wide-area sensor network is an emerging and challenging task in Wireless Multimedia Sensor Networks. Because of the open nature of the Internet, transmitted data must be secured through encryption, and there has long been intense demand for schemes that are both energy efficient and highly secure. In this paper, a discrete-wavelet-based partial image encryption scheme using a hashing algorithm, chaotic maps, and Hussain's S-box is reported. The plaintext image is compressed via the discrete wavelet transform, and the image is then shuffled column-wise and row-wise via a Piece-wise Linear Chaotic Map (PWLCM) and a Nonlinear Chaotic Algorithm, respectively. For higher security, the initial conditions of the PWLCM are made dependent on a hash function. The permuted image is bitwise XORed with a random matrix generated from an Intertwining Logistic map. To enhance security further, the final ciphertext is obtained by substituting all elements with Hussain's substitution box. Experimental and statistical results confirm the strength of the proposed scheme.
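The shuffle-then-XOR stages can be sketched as follows. This is a minimal illustration using textbook forms of the PWLCM and logistic maps; the hash-dependent seeding, the Nonlinear Chaotic Algorithm row shuffle, and Hussain's S-box of the actual scheme are omitted, and all parameters are illustrative.

```python
import numpy as np

def pwlcm(x, p=0.3):
    """Standard piece-wise linear chaotic map on [0, 1]."""
    x = x if x <= 0.5 else 1.0 - x
    return x / p if x < p else (x - p) / (0.5 - p)

def chaotic_sequence(x0, n, step):
    seq, x = [], x0
    for _ in range(n):
        x = step(x)
        seq.append(x)
    return np.array(seq)

def encrypt(img, x0=0.123456):
    # Confusion: PWLCM-driven column permutation.
    cols = np.argsort(chaotic_sequence(x0, img.shape[1], pwlcm))
    shuffled = img[:, cols]
    # Diffusion: XOR with a logistic-map keystream.
    logistic = chaotic_sequence(x0, img.size, lambda x: 3.99 * x * (1 - x))
    keystream = np.floor(logistic * 256).astype(np.uint8).reshape(img.shape)
    return shuffled ^ keystream, cols, keystream

def decrypt(cipher, cols, keystream):
    shuffled = cipher ^ keystream
    return shuffled[:, np.argsort(cols)]   # invert the column permutation

img = np.arange(64, dtype=np.uint8).reshape(8, 8)
cipher, cols, ks = encrypt(img)
print(np.array_equal(decrypt(cipher, cols, ks), img))  # True: round-trip works
```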

  16. Development of a Compact Range-gated Vision System to Monitor Structures in Low-visibility Environments

    International Nuclear Information System (INIS)

    Ahn, Yong-Jin; Park, Seung-Kyu; Baik, Sung-Hoon; Kim, Dong-Lyul; Choi, Young-Soo; Jeong, Kyung-Min

    2015-01-01

    Image acquisition in disaster areas or radiation areas of the nuclear industry is an important function for safety inspection and for preparing appropriate damage-control plans, so an automatic vision system to monitor structures and facilities in blurred, smoky environments such as the scenes of fires and detonations is essential. Vision systems cannot acquire an image when the illumination light is blocked by disturbance materials such as smoke, fog, and dust. To overcome the imaging distortion caused by such obstacles, robust vision systems need extra functions, such as active illumination through the disturbance material. One such active vision system is range-gated imaging. A vision system based on range-gated imaging can acquire image data in blurred and darkened light environments. Range-gated imaging (RGI) is a direct active visualization technique using a highly sensitive image sensor and a high-intensity illuminant, and the range-gated imaging technique, which provides 2D and range image data, is currently one of the emerging active vision technologies. A range-gated imaging system obtains vision information by summing time-sliced vision images: the illuminant flashes for an ultra-short time, and the highly sensitive image sensor is gated with an ultra-short exposure so that it captures only the illumination light, which is flashed strongly through disturbance materials such as smoke and dust particles. In contrast to passive conventional vision systems, RGI active vision enables operation even in harsh environments such as low-visibility smoke. In this paper, a compact range-gated vision system is developed to monitor structures in low-visibility environments. The system consists of an illumination light, a range-gating camera, and a control computer. Visualization experiments are carried out in a low-visibility foggy environment to assess imaging capability.

  17. Development of a Compact Range-gated Vision System to Monitor Structures in Low-visibility Environments

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Yong-Jin; Park, Seung-Kyu; Baik, Sung-Hoon; Kim, Dong-Lyul; Choi, Young-Soo; Jeong, Kyung-Min [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    Image acquisition in disaster areas or radiation areas of the nuclear industry is an important function for safety inspection and for preparing appropriate damage-control plans, so an automatic vision system to monitor structures and facilities in blurred, smoky environments such as the scenes of fires and detonations is essential. Vision systems cannot acquire an image when the illumination light is blocked by disturbance materials such as smoke, fog, and dust. To overcome the imaging distortion caused by such obstacles, robust vision systems need extra functions, such as active illumination through the disturbance material. One such active vision system is range-gated imaging. A vision system based on range-gated imaging can acquire image data in blurred and darkened light environments. Range-gated imaging (RGI) is a direct active visualization technique using a highly sensitive image sensor and a high-intensity illuminant, and the range-gated imaging technique, which provides 2D and range image data, is currently one of the emerging active vision technologies. A range-gated imaging system obtains vision information by summing time-sliced vision images: the illuminant flashes for an ultra-short time, and the highly sensitive image sensor is gated with an ultra-short exposure so that it captures only the illumination light, which is flashed strongly through disturbance materials such as smoke and dust particles. In contrast to passive conventional vision systems, RGI active vision enables operation even in harsh environments such as low-visibility smoke. In this paper, a compact range-gated vision system is developed to monitor structures in low-visibility environments. The system consists of an illumination light, a range-gating camera, and a control computer. Visualization experiments are carried out in a low-visibility foggy environment to assess imaging capability.
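The summing of time-sliced frames that RGI relies on can be sketched with a toy scene model: each exposure opens the gate for a short window delayed by one slice step, and the final image is the sum of the gated frames. The per-pixel return times and the uniform "fog" scatter term are illustrative assumptions.

```python
import numpy as np

def gated_frame(return_time, gate_open, gate_close, signal=1.0, scatter=0.05):
    """One gated exposure: only returns inside the gate window are captured;
    a small scatter term stands in for fog backscatter in every frame."""
    in_gate = (return_time >= gate_open) & (return_time < gate_close)
    return signal * in_gate + scatter

return_time = np.array([[10.0, 12.0], [14.0, 16.0]])   # ns, per pixel
slices = [gated_frame(return_time, t, t + 2.0) for t in (10, 12, 14, 16)]
summed = np.sum(slices, axis=0)
print(summed)  # each target pixel captured in exactly one slice
```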

  18. Flexible, highly sensitive pressure sensor with a wide range based on graphene-silk network structure

    Science.gov (United States)

    Liu, Ying; Tao, Lu-Qi; Wang, Dan-Yang; Zhang, Tian-Yu; Yang, Yi; Ren, Tian-Ling

    2017-03-01

    In this paper, a flexible, easily prepared, and low-cost graphene-silk pressure sensor on a soft silk substrate, fabricated by thermal reduction, is demonstrated. With the silk as a supporting body, the device forms a three-dimensional, ordered multi-layer structure. Through this simple and low-cost process, the graphene-silk pressure sensor achieves a sensitivity of 0.4 kPa⁻¹ and a measurement range as high as 140 kPa. The sensor also combines well with knitted clothing and textile products, and its signal shows good reproducibility in response to different pressures. Furthermore, the graphene-silk pressure sensor can not only detect pressures above 100 kPa but also measure weak body signals. Its high sensitivity, good repeatability, flexibility, and comfort on skin make it well suited to various wearable electronics.

  19. Wide-Range Highly-Efficient Wireless Power Receivers for Implantable Biomedical Sensors

    KAUST Repository

    Ouda, Mahmoud

    2016-11-01

    Wireless power transfer (WPT) is the key enabler for a myriad of applications, from low-power RFIDs and wireless sensors to wirelessly charged electric vehicles and even massive power transmission from space solar cells. One of the major challenges in designing implantable biomedical devices is the size and lifetime of the battery. Replacing the battery with a miniaturized wireless power receiver (WPRx) thus facilitates designing sustainable biomedical implants in smaller volumes for sentient medical applications. In the first part of this dissertation, we propose a miniaturized, fully integrated, wirelessly powered implantable sensor with an on-chip antenna, designed and implemented in a standard 0.18 μm CMOS process. As a batteryless device, it can be implanted once inside the body with no need for further invasive surgeries to replace batteries. The proposed single-chip solution is designed for intraocular pressure monitoring (IOPM) and can serve as a sustainable platform for implantable devices or IoT nodes. A custom setup is developed to test the chip in a saline solution with electrical properties similar to those of the aqueous humor of the eye. The proposed chip, in this eye-like setup, is wirelessly charged to 1 V from a 5 W transmitter 3 cm away. In the second part, we propose a self-biased differential rectifier with enhanced efficiency over an extended range of input power. A prototype is designed for the medical implant communication service (MICS) band at 433 MHz. It demonstrates an improvement of more than 40% in the rectifier power conversion efficiency (PCE) and a dynamic-range extension of more than 50% relative to the conventional cross-coupled rectifier. A sensitivity of -15.2 dBm input power for 1 V output voltage and a peak PCE of 65% are achieved for a 50 kΩ load. In the third part, we propose a wide-range, differential RF-to-DC power converter using an adaptive, self-biasing technique. The proposed architecture doubles

  20. Development of High Resolution Eddy Current Imaging Using an Electro-Mechanical Sensor (Preprint)

    Science.gov (United States)

    2011-11-01

    Primdahl, F., 1979, "The Fluxgate Magnetometer," J. Phys. E: Sci. Instrum., Vol. 12: 241-253. Ripka, P., 1992, "Review of Fluxgate Sensors," Sensors and Actuators, A. 33, Elsevier Sequoia: 129-141. A. Abedi, J. J. Fellenstein, A. J. Lucas, and J. P. Wikswo, Jr., "A superconducting quantum interference device magnetometer system for quantitative analysis and imaging of hidden corrosion activity in aircraft aluminum," ... 206 (2006).

  1. Simulation and measurement of total ionizing dose radiation induced image lag increase in pinned photodiode CMOS image sensors

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Jing [School of Materials Science and Engineering, Xiangtan University, Hunan (China); State Key Laboratory of Intense Pulsed Irradiation Simulation and Effect, Northwest Institute of Nuclear Technology, P.O.Box 69-10, Xi’an (China); Chen, Wei, E-mail: chenwei@nint.ac.cn [State Key Laboratory of Intense Pulsed Irradiation Simulation and Effect, Northwest Institute of Nuclear Technology, P.O.Box 69-10, Xi’an (China); Wang, Zujun, E-mail: wangzujun@nint.ac.cn [State Key Laboratory of Intense Pulsed Irradiation Simulation and Effect, Northwest Institute of Nuclear Technology, P.O.Box 69-10, Xi’an (China); Xue, Yuanyuan; Yao, Zhibin; He, Baoping; Ma, Wuying; Jin, Junshan; Sheng, Jiangkun; Dong, Guantao [State Key Laboratory of Intense Pulsed Irradiation Simulation and Effect, Northwest Institute of Nuclear Technology, P.O.Box 69-10, Xi’an (China)

    2017-06-01

    This paper presents an investigation of total ionizing dose (TID)-induced image lag sources in pinned-photodiode (PPD) CMOS image sensors, based on radiation experiments and TCAD simulation. The radiation experiments were carried out at a cobalt-60 gamma-ray source. The experimental results show that image lag degradation becomes increasingly severe with increasing TID. Combined with the TCAD simulation results, we confirm that the junction between the PPD and the transfer gate (TG) is an important region for image lag formation during irradiation. The simulations demonstrate that TID can generate a potential pocket that leads to incomplete charge transfer.

  2. Robot Vision to Monitor Structures in Invisible Fog Environments Using Active Imaging Technology

    Energy Technology Data Exchange (ETDEWEB)

    Park, Seungkyu; Park, Nakkyu; Baik, Sunghoon; Choi, Youngsoo; Jeong, Kyungmin [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-05-15

    Active vision is a direct visualization technique using a highly sensitive image sensor and a high-intensity illuminant. The range-gated imaging (RGI) technique, which provides 2D and 3D images, is one of the emerging active vision technologies. The RGI technique extracts vision information by summing time-sliced vision images: objects are illuminated for an ultra-short time by a high-intensity illuminant, and the light reflected from the objects is captured by a highly sensitive image sensor with an ultra-short exposure. The RGI system builds 2D and 3D image data from several images, and by summing the time-sliced images it moreover provides clear images in otherwise invisible fog and smoke environments. Range-gated (RG) imaging is now an emerging technology in the field of security surveillance, especially for visualization at night and in fog. Although RGI viewing was discovered in the 1960s, it has become increasingly practical thanks to the rapid development of optical and sensor technologies such as highly sensitive imaging sensors and ultra-short-pulse laser sources. In contrast to passive vision systems, this technology enables operation even in harsh environments like fog and smoke. Over the past decades it has been applied to target recognition and to harsh environments such as fog and underwater vision, and 3D imaging based on range gating has also been demonstrated. In this paper, a robot system to monitor structures in invisible fog environments is developed using an active range-gated imaging technique. The system consists of an ultra-short-pulse laser device and a highly sensitive imaging sensor. The developed vision system is used to monitor objects in an invisible fog environment, and the experimental results of this new vision system are described.

  3. Robot Vision to Monitor Structures in Invisible Fog Environments Using Active Imaging Technology

    International Nuclear Information System (INIS)

    Park, Seungkyu; Park, Nakkyu; Baik, Sunghoon; Choi, Youngsoo; Jeong, Kyungmin

    2014-01-01

    Active vision is a direct visualization technique using a highly sensitive image sensor and a high-intensity illuminant. The range-gated imaging (RGI) technique, which provides 2D and 3D images, is one of the emerging active vision technologies. The RGI technique extracts vision information by summing time-sliced vision images: objects are illuminated for an ultra-short time by a high-intensity illuminant, and the light reflected from the objects is captured by a highly sensitive image sensor with an ultra-short exposure. The RGI system builds 2D and 3D image data from several images, and by summing the time-sliced images it moreover provides clear images in otherwise invisible fog and smoke environments. Range-gated (RG) imaging is now an emerging technology in the field of security surveillance, especially for visualization at night and in fog. Although RGI viewing was discovered in the 1960s, it has become increasingly practical thanks to the rapid development of optical and sensor technologies such as highly sensitive imaging sensors and ultra-short-pulse laser sources. In contrast to passive vision systems, this technology enables operation even in harsh environments like fog and smoke. Over the past decades it has been applied to target recognition and to harsh environments such as fog and underwater vision, and 3D imaging based on range gating has also been demonstrated. In this paper, a robot system to monitor structures in invisible fog environments is developed using an active range-gated imaging technique. The system consists of an ultra-short-pulse laser device and a highly sensitive imaging sensor. The developed vision system is used to monitor objects in an invisible fog environment, and the experimental results of this new vision system are described.

  4. A Method for Application of Classification Tree Models to Map Aquatic Vegetation Using Remotely Sensed Images from Different Sensors and Dates

    Directory of Open Access Journals (Sweden)

    Ying Cai

    2012-09-01

    Full Text Available In previous attempts to identify aquatic vegetation from remotely sensed images using classification trees (CT), the images to which CT models were applied at different times or locations necessarily originated from the same satellite sensor as the images used in model development, greatly limiting the application of CT. We have developed an effective normalization method to improve the robustness of CT models when applied to images originating from different sensors and dates. A total of 965 ground-truth samples of aquatic vegetation types were obtained in 2009 and 2010 in Taihu Lake, China. Using relevant spectral indices (SI) as classifiers, we manually developed a stable CT model structure and then applied a standard CT algorithm to obtain quantitative (optimal) thresholds from 2009 ground-truth data and images from the Landsat7-ETM+, HJ-1B-CCD, Landsat5-TM, and ALOS-AVNIR-2 sensors. Optimal CT thresholds produced average classification accuracies of 78.1%, 84.7%, and 74.0% for emergent vegetation, floating-leaf vegetation, and submerged vegetation, respectively. However, the optimal CT thresholds for the different sensors' images differed from each other, with an average relative variation (RV) of 6.40%. We developed and evaluated three new approaches to normalizing the images. The best-performing method (the 0.1% index scaling method) normalized the SI images using tailored percentages of extreme pixel values. Using images normalized by this method, CT models for a particular sensor in which the thresholds were replaced by those from models developed for other sensors' images provided average classification accuracies of 76.0%, 82.8%, and 68.9% for emergent, floating-leaf, and submerged vegetation, respectively. Applying the CT models developed for normalized 2009 images to 2010 images resulted in high classification (78.0%–93.3%) and overall (92.0%–93.1%) accuracies. Our
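A hedged sketch of percentile-based index scaling consistent with the method's name: clip a spectral-index image at its 0.1% and 99.9% pixel values and rescale to [0, 1], so that thresholds learned on one sensor's images transfer to another's. The clipping and rescaling details are assumptions, not the paper's exact procedure.

```python
import numpy as np

def normalize_index(si_image, pct=0.1):
    """Rescale a spectral-index image to [0, 1] using the pct and
    (100 - pct) percentiles as the extreme pixel values."""
    lo, hi = np.percentile(si_image, [pct, 100.0 - pct])
    return np.clip((si_image - lo) / (hi - lo), 0.0, 1.0)

rng = np.random.default_rng(2)
si = rng.normal(0.3, 0.1, (100, 100))
si[0, 0] = 10.0                      # a single outlier no longer dominates
norm = normalize_index(si)
print(norm.min() == 0.0 and norm.max() == 1.0)  # True
```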

  5. An interferometric radar sensor for monitoring the vibrations of structures at short ranges

    Directory of Open Access Journals (Sweden)

    Luzi Guido

    2018-01-01

    Full Text Available The Real-Aperture-Radar (RAR) interferometry technique was consolidated over the last decade as an operational tool for monitoring large civil engineering structures such as bridges, towers, and buildings. Experimental campaigns collected with a well-known commercial instrument have been widely documented in the literature, while cases in which different types of sensors have been tested are few. On the basis of several experimental tests, a new sensor working at a higher frequency and offering improved performance is discussed here. The core of the proposed system is an off-the-shelf linear frequency-modulated continuous-wave device. The development of this apparatus aims at a proof of concept, tackling the operational aspects of building a low-cost and reliable system. Its capability to detect the natural frequencies of a light pole has been verified; by comparing the results of the proposed sensor with those obtained with a commercial system based on the same technique, a more detailed description of the vibrating structure was achieved. The results of this investigation confirm that developing sensors working at higher frequencies, although it deserves deeper study, is very promising and could open new applications demanding higher spatial resolution at close range.
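The sensitivity gain from a higher operating frequency follows from the standard interferometric relation between line-of-sight displacement d and phase change Δφ = 4πd/λ, sketched here. The two carrier frequencies are illustrative, not the sensors' actual bands.

```python
import numpy as np

def displacement(delta_phi_rad, wavelength_m):
    """Line-of-sight displacement from interferometric phase:
    d = lambda * delta_phi / (4 * pi)."""
    return wavelength_m * delta_phi_rad / (4.0 * np.pi)

c = 3e8
for f_hz in (17e9, 77e9):            # a 90-degree phase shift corresponds to
    lam = c / f_hz                   # a smaller displacement at higher f
    print(f_hz, displacement(np.pi / 2, lam))
```

The same phase resolution thus resolves sub-millimetre motion at the higher frequency, which is the improved performance the abstract refers to.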

  6. Understanding synthesis imaging dynamic range

    Science.gov (United States)

    Braun, R.

    2013-03-01

    We develop a general framework for quantifying the many different contributions to the noise budget of an image made with an array of dishes or aperture array stations. Each noise contribution to the visibility data is associated with a relevant correlation timescale and frequency bandwidth so that the net impact on a complete observation can be assessed when a particular effect is not captured in the instrumental calibration. All quantities are parameterised as a function of observing frequency and the visibility baseline length. We apply the resulting noise budget analysis to a wide range of existing and planned telescope systems that will operate between about 100 MHz and 5 GHz to ascertain the magnitude of the calibration challenges that they must overcome to achieve thermal noise limited performance. We conclude that calibration challenges are increased in several respects by small dimensions of the dishes or aperture array stations. It will be more challenging to achieve thermal noise limited performance using 15 m class dishes rather than the 25 m dishes of current arrays. Some of the performance risks are mitigated by the deployment of phased array feeds and further by the choice of an (alt,az,pol) mount, although a larger dish diameter offers the best prospects for risk mitigation. Many improvements to imaging performance can be anticipated at the expense of greater complexity in calibration algorithms. However, a fundamental limitation is ultimately imposed by an insufficient number of data constraints relative to calibration variables. The upcoming aperture array systems will be operating in a regime that has never previously been addressed, where a wide range of effects are expected to exceed the thermal noise by two to three orders of magnitude. Achieving routine thermal noise limited imaging performance with these systems presents an extreme challenge. The magnitude of that challenge is inversely related to the aperture array station diameter.
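The thermal-noise floor against which those effects are compared can be illustrated with the standard point-source sensitivity of an N-element interferometer, σ = SEFD / sqrt(2·N·(N-1)·Δν·τ). The SEFD and array parameters below are illustrative, not tied to any specific telescope in the paper.

```python
import numpy as np

def image_noise_jy(sefd_jy, n_elements, bandwidth_hz, t_int_s):
    """Naturally weighted point-source image noise (radiometer equation,
    SEFD per element, dual polarization absorbed into the factor 2)."""
    return sefd_jy / np.sqrt(2.0 * n_elements * (n_elements - 1)
                             * bandwidth_hz * t_int_s)

# A calibration error of a few percent per visibility can exceed this
# integrated-down noise by orders of magnitude, which is the regime the
# text describes.
sigma = image_noise_jy(sefd_jy=400.0, n_elements=64,
                       bandwidth_hz=100e6, t_int_s=3600.0)
print(sigma < 1e-3)  # sub-mJy thermal noise for this configuration
```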

  7. Selection of bi-level image compression method for reduction of communication energy in wireless visual sensor networks

    Science.gov (United States)

    Khursheed, Khursheed; Imran, Muhammad; Ahmad, Naeem; O'Nils, Mattias

    2012-06-01

    Wireless Visual Sensor Networks (WVSNs) are an emerging field combining an image sensor, an on-board computation unit, a communication component, and an energy source. Compared with a traditional wireless sensor network, which operates on one-dimensional data such as temperature or pressure values, a WVSN operates on two-dimensional data (images), which requires greater processing power and communication bandwidth. WVSNs are normally deployed in areas where installing wired solutions is not feasible, so the wireless nature of the application limits the energy budget to batteries. Given this limited energy, processing at the Visual Sensor Nodes (VSNs) and communication from the VSNs to the server should consume as little energy as possible. Transmitting raw images wirelessly consumes a great deal of energy and requires high communication bandwidth; data compression methods reduce the data volume efficiently and are therefore effective at reducing communication cost in a WVSN. In this paper, we compare the compression efficiency and complexity of six well-known bi-level image compression methods. The focus is to determine which compression algorithms can efficiently compress bi-level images with computational complexity suited to the computational platforms used in WVSNs. These results can serve as a road map for selecting compression methods under different sets of constraints in a WVSN.
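A minimal sketch of the kind of measurement such a comparison involves: compress a synthetic bi-level image (packed one bit per pixel) and report whether the codec beats raw transmission. `zlib` stands in here only because it is in the standard library; the paper evaluates six dedicated bi-level codecs, which are not reproduced.

```python
import zlib

def pack_bilevel(rows):
    """Pack rows of 0/1 pixels into bytes, 8 pixels per byte
    (row length assumed to make the total a multiple of 8)."""
    bits = [b for row in rows for b in row]
    out = bytearray()
    for i in range(0, len(bits), 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        out.append(byte)
    return bytes(out)

# Mostly-background image with a small foreground block: highly compressible.
rows = [[1 if 28 <= x < 36 and 28 <= y < 36 else 0 for x in range(64)]
        for y in range(64)]
raw = pack_bilevel(rows)
compressed = zlib.compress(raw, 9)
print(len(compressed) < len(raw))  # True: codec beats raw transmission
```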

  8. Luminescence imaging of water during proton-beam irradiation for range estimation

    Energy Technology Data Exchange (ETDEWEB)

    Yamamoto, Seiichi, E-mail: s-yama@met.nagoya-u.ac.jp; Okumura, Satoshi; Komori, Masataka [Radiological and Medical Laboratory Sciences, Nagoya University Graduate School of Medicine, Nagoya 461-8673 (Japan); Toshito, Toshiyuki [Department of Proton Therapy Physics, Nagoya Proton Therapy Center, Nagoya City West Medical Center, Nagoya 462-8508 (Japan)

    2015-11-15

    Purpose: Proton therapy has the ability to selectively deliver a dose to the target tumor, so the dose distribution should be accurately measured by a precise and efficient method. The authors found that luminescence was emitted from water during proton irradiation and conjectured that this phenomenon could be used for estimating the dose distribution. Methods: To achieve more accurate dose distribution, the authors set water phantoms on a table with a spot scanning proton therapy system and measured the luminescence images of these phantoms with a high-sensitivity, cooled charge coupled device camera during proton-beam irradiation. The authors imaged the phantoms of pure water, fluorescein solution, and an acrylic block. Results: The luminescence images of water phantoms taken during proton-beam irradiation showed clear Bragg peaks, and the measured proton ranges from the images were almost the same as those obtained with an ionization chamber. Furthermore, the image of the pure-water phantom showed almost the same distribution as the tap-water phantom, indicating that the luminescence image was not related to impurities in the water. The luminescence image of the fluorescein solution had ∼3 times higher intensity than water, with the same proton range as that of water. The luminescence image of the acrylic phantom had a 14.5% shorter proton range than that of water; the proton range in the acrylic phantom generally matched the calculated value. The luminescence images of the tap-water phantom during proton irradiation could be obtained in less than 2 s. Conclusions: Luminescence imaging during proton-beam irradiation is promising as an effective method for range estimation in proton therapy.
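As a toy illustration of range estimation from a measured depth-luminescence profile (not the authors' actual processing), the proton range can be read off as the position of the Bragg peak, i.e. the profile maximum before the distal falloff:

```python
import numpy as np

def estimated_range_mm(depths_mm, luminescence):
    """Depth of maximum luminescence, taken as the Bragg-peak position."""
    return depths_mm[np.argmax(luminescence)]

depths = np.linspace(0, 200, 2001)                         # 0.1 mm steps
profile = 1.0 / np.sqrt(np.maximum(160.0 - depths, 1e-3))  # rises to a peak
profile[depths > 160.0] = 0.0                              # distal falloff
print(estimated_range_mm(depths, profile))  # ~160 mm, the simulated range
```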

  9. Luminescence imaging of water during proton-beam irradiation for range estimation

    International Nuclear Information System (INIS)

    Yamamoto, Seiichi; Okumura, Satoshi; Komori, Masataka; Toshito, Toshiyuki

    2015-01-01

    Purpose: Proton therapy can deliver a dose selectively to the target tumor, so the dose distribution should be measured by a precise and efficient method. The authors found that luminescence was emitted from water during proton irradiation and conjectured that this phenomenon could be used for estimating the dose distribution. Methods: To obtain more accurate dose distributions, the authors set water phantoms on a table with a spot-scanning proton therapy system and measured the luminescence images of these phantoms with a high-sensitivity, cooled charge-coupled device (CCD) camera during proton-beam irradiation. The authors imaged phantoms of pure water, fluorescein solution, and an acrylic block. Results: The luminescence images of water phantoms taken during proton-beam irradiation showed clear Bragg peaks, and the proton ranges measured from the images were almost the same as those obtained with an ionization chamber. Furthermore, the image of the pure-water phantom showed almost the same distribution as that of the tap-water phantom, indicating that the luminescence was not related to impurities in the water. The luminescence image of the fluorescein solution had ∼3 times higher intensity than water, with the same proton range as water. The luminescence image of the acrylic phantom had a 14.5% shorter proton range than that of water; the proton range in the acrylic phantom generally matched the calculated value. Luminescence images of the tap-water phantom during proton irradiation could be obtained in less than 2 s. Conclusions: Luminescence imaging during proton-beam irradiation is promising as an effective method for range estimation in proton therapy.

  10. An off-on Fluorescent Sensor for Detecting a Wide Range of Water Content in Organic Solvents

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Kanghyeon; Lee, Wanjin; Kim, Jae Nyoung; Kim, Hyung Jin [Chonnam National Univ., Gwangju (Korea, Republic of)

    2013-08-15

    This paper describes the synthesis and water-sensing properties of a fluorescent photoinduced electron transfer (PET) sensor (5) with an extended operating sensing range. The 1,8-naphthalimide derivative (5), bearing a piperazine group and a carboxylic acid group, was synthesized and applied as a fluorescent water sensor in water-miscible organic solvents. The fluorescence intensity of dye 5 increased with increasing water content up to 80% (v/v), and the fluorescence intensities were enhanced 45-, 67- and 122-fold in aqueous EtOH, DMF and DMSO solutions, respectively. In aqueous acetone solution, the enhancement of the fluorescence intensities was somewhat lower (30-fold) but the response range was wider (0-90%, v/v).

  11. Development of a solid-state multi-sensor array camera for real time imaging of magnetic fields

    International Nuclear Information System (INIS)

    Benitez, D; Gaydecki, P; Quek, S; Torres, V

    2007-01-01

    The development of a real-time magnetic field imaging camera based on solid-state sensors is described. The final laboratory prototype comprises a 2D array of 33 x 33 solid-state, tri-axial magneto-inductive sensors located within a large current-carrying coil, which may be excited to produce either a steady or time-varying magnetic field. Outputs from several rows of sensors are routed to a sub-master controller, and all sub-masters route to a master controller responsible for data coordination and signal pre-processing. The data are finally streamed to a host computer via a USB interface, and the image is generated and displayed at a rate of several frames per second. Accurate image generation is predicated on knowledge of the sensor response, magnetic field perturbations, and the nature of the target with respect to permeability and conductivity. To this end, the development of the instrumentation has been complemented by extensive numerical modelling of field distribution patterns using boundary element methods. Although originally intended for deployment in the nondestructive evaluation (NDE) of reinforced concrete, it was soon realised during the course of the work that the magnetic field imaging system had many potential applications, for example in medicine, security screening, quality assurance (such as in the food industry), other areas of NDE, design tasks involving magnetic fields, teaching, and research.

  12. Development of a solid-state multi-sensor array camera for real time imaging of magnetic fields

    Science.gov (United States)

    Benitez, D.; Gaydecki, P.; Quek, S.; Torres, V.

    2007-07-01

    The development of a real-time magnetic field imaging camera based on solid-state sensors is described. The final laboratory prototype comprises a 2D array of 33 x 33 solid-state, tri-axial magneto-inductive sensors located within a large current-carrying coil, which may be excited to produce either a steady or time-varying magnetic field. Outputs from several rows of sensors are routed to a sub-master controller, and all sub-masters route to a master controller responsible for data coordination and signal pre-processing. The data are finally streamed to a host computer via a USB interface, and the image is generated and displayed at a rate of several frames per second. Accurate image generation is predicated on knowledge of the sensor response, magnetic field perturbations, and the nature of the target with respect to permeability and conductivity. To this end, the development of the instrumentation has been complemented by extensive numerical modelling of field distribution patterns using boundary element methods. Although originally intended for deployment in the nondestructive evaluation (NDE) of reinforced concrete, it was soon realised during the course of the work that the magnetic field imaging system had many potential applications, for example in medicine, security screening, quality assurance (such as in the food industry), other areas of NDE, design tasks involving magnetic fields, teaching, and research.
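
    The display stage described above can be illustrated with a toy frame-processing step: collapsing one frame from the 33 x 33 tri-axial array into a scalar field-magnitude image. This is a hedged sketch; the actual pre-processing performed on the sub-master and master controllers is not specified in the record.

```python
import numpy as np

def field_magnitude_image(frame):
    """Reduce a (33, 33, 3) frame of (Bx, By, Bz) readings to a
    (33, 33) |B| image for display."""
    frame = np.asarray(frame, dtype=float)
    return np.sqrt((frame ** 2).sum(axis=-1))
```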

  13. A novel track imaging system as a range counter

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Z. [National Institute of Radiological Sciences (Japan); Matsufuji, N. [National Institute of Radiological Sciences (Japan); Tokyo Institute of Technology (Japan); Kanayama, S. [Chiba University (Japan); Ishida, A. [National Institute of Radiological Sciences (Japan); Tokyo Institute of Technology (Japan); Kohno, T. [Tokyo Institute of Technology (Japan); Koba, Y.; Sekiguchi, M.; Kitagawa, A.; Murakami, T. [National Institute of Radiological Sciences (Japan)

    2016-05-01

    An image-intensified, camera-based track imaging system has been developed to measure the tracks of ions in a scintillator block. To study the performance of the detector unit in the system, two types of scintillators, a dosimetrically tissue-equivalent plastic scintillator EJ-240 and a CsI(Tl) scintillator, were separately irradiated with carbon ion ({sup 12}C) beams of therapeutic energy from HIMAC at NIRS. The images of individual ion tracks in the scintillators were acquired by the newly developed track imaging system, and the ranges reconstructed from the images are reported here. The range resolution of the measurements is 1.8 mm for 290 MeV/u carbon ions, a significant improvement over the energy resolution of the conventional ΔE/E method. The detector is compact and easy to handle; it can fit inside treatment rooms for in-situ studies and can satisfy clinical quality assurance requirements.

  14. Highly sensitive digital optical sensor with large measurement range based on the dual-microring resonator with waveguide-coupled feedback

    International Nuclear Information System (INIS)

    Xiang Xing-Ye; Wang Kui-Ru; Yuan Jin-Hui; Jin Bo-Yuan; Sang Xin-Zhu; Yu Chong-Xiu

    2014-01-01

    We propose a novel high-performance digital optical sensor based on the Mach-Zehnder interference effect and dual-microring resonators with waveguide-coupled feedback. Simulation results show that the sensitivity of the sensor can be orders of magnitude higher than that of a conventional sensor, and a high quality factor is not critical to its operation. Moreover, by optimizing the length of the feedback waveguide to equal the perimeter of the ring, the measurement range of the proposed sensor becomes twice that of the conventional sensor in the weak-coupling case.

  15. Development of a quartz tuning-fork-based force sensor for measurements in the tens of nanoNewton force range during nanomanipulation experiments

    Energy Technology Data Exchange (ETDEWEB)

    Oiko, V. T. A., E-mail: oiko@ifi.unicamp.br; Rodrigues, V.; Ugarte, D. [Instituto de Física “Gleb Wataghin,” Univ. Estadual de Campinas (UNICAMP), Campinas 13083-859 (Brazil); Martins, B. V. C. [Department of Physics, University of Alberta, Edmonton, Alberta T6G 2R3 (Canada); Silva, P. C. [Laboratório Nacional de Nanotecnologia, CNPEM, Campinas 13083-970 (Brazil)

    2014-03-15

    Understanding the mechanical properties of nanoscale systems requires new experimental and theoretical tools. In particular, force sensors compatible with nanomechanical testing experiments and with sensitivity in the nN range are required. Here, we report the development and testing of a tuning-fork-based force sensor for in situ nanomanipulation experiments inside a scanning electron microscope. The sensor uses a very simple electronics design and allows direct, quantitative force measurement in the 1–100 nN range. The sensor response is initially calibrated against a nN-range force standard, such as a calibrated atomic force microscopy (AFM) cantilever; subsequently, applied force values can be derived directly from the electrical signals generated by the tuning fork. Using a homemade nanomanipulator, the quantitative force sensor has been used to analyze the mechanical deformation of multi-walled carbon nanotube bundles, where we measured forces in the 5–40 nN range with an error bar of a few nN.
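
    The calibration transfer described, from a reference force standard to the tuning fork's electrical signal, amounts to fitting a linear response. A sketch under the assumption of a linear sensor (the paper does not give its fitting procedure, and the variable names are illustrative):

```python
import numpy as np

def calibrate(signal, force_ref_nN):
    """Least-squares linear fit of reference forces (e.g. from a
    calibrated AFM cantilever) against the tuning-fork signal.
    Returns (k, b) so that force_nN = k * signal + b thereafter."""
    k, b = np.polyfit(signal, force_ref_nN, 1)
    return k, b
```

    Once (k, b) are known, forces in the 5-40 nN range follow directly from the electrical signal alone.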

  16. Development of High Resolution Eddy Current Imaging Using an Electro-Mechanical Sensor (Postprint)

    Science.gov (United States)

    2011-08-01

    Primdahl, F., 1979, "The Fluxgate Magnetometer," J. Phys. E: Sci. Instrum., Vol. 12: 241-253. Ripka, P., 1992, "Review of Fluxgate Sensors," Sensors and Actuators A, Vol. 33, Elsevier Sequoia: 129-141. A. Abedi, J. J. Fellenstein, A. J. Lucas, and J. P. Wikswo, Jr., "A superconducting quantum interference device magnetometer system for quantitative analysis and imaging of hidden corrosion activity in

  17. Continued development of a portable widefield hyperspectral imaging (HSI) sensor for standoff detection of explosive, chemical, and narcotic residues

    Science.gov (United States)

    Nelson, Matthew P.; Gardner, Charles W.; Klueva, Oksana; Tomas, David

    2014-05-01

    Passive, standoff detection of chemical, explosive, and narcotic threats employing widefield, shortwave infrared (SWIR) hyperspectral imaging (HSI) continues to gain acceptance in the defense and security fields. A robust and user-friendly portable platform with such capabilities increases the effectiveness of locating and identifying threats while reducing risks to personnel. In 2013, ChemImage Sensor Systems (CISS) introduced Aperio, a handheld sensor using real-time SWIR HSI for wide-area surveillance and standoff detection of explosives, chemical threats, and narcotics. That SWIR HSI system employed a liquid-crystal tunable filter for real-time automated detection and display of threats. In these proceedings, we report on a next-generation device called VeroVision™, which incorporates an improved optical design that enhances detection performance at greater standoff distances with increased sensitivity and detection speed. A tripod-mounted sensor head unit (SHU) with an optional motorized pan-tilt unit (PTU) is available for precision pointing and sensor stabilization. This option supports longer standoff-range applications, such as checkpoint vehicle inspection, where speed and precision are necessary. The basic software has been extended to include advanced algorithms providing multi-target display functionality, automatic threshold determination, and an automated detection-recipe capability for expanding the library as new threats emerge. Test data collected during development are presented that support the targeted applications of VeroVision™ for screening residue and bulk levels of explosives and drugs on vehicles and personnel at checkpoints, as well as in various other secure areas. Additionally, we highlight a forensic application of the technology for assisting forensic

  18. Real-time, wide-area hyperspectral imaging sensors for standoff detection of explosives and chemical warfare agents

    Science.gov (United States)

    Gomer, Nathaniel R.; Tazik, Shawna; Gardner, Charles W.; Nelson, Matthew P.

    2017-05-01

    Hyperspectral imaging (HSI) is a valuable tool for the detection and analysis of targets located within complex backgrounds. HSI can detect threat materials on environmental surfaces, where the concentration of the target of interest is often very low and is typically found within complex scenery. Unfortunately, current-generation HSI systems have size, weight, and power limitations that prohibit their use for field-portable and/or real-time applications. Current-generation systems commonly provide an inefficient area search rate, require close proximity to the target for screening, and/or are not capable of making real-time measurements. ChemImage Sensor Systems (CISS) is developing a variety of real-time, wide-field hyperspectral imaging systems that utilize shortwave infrared (SWIR) absorption and Raman spectroscopy. SWIR HSI sensors provide wide-area imagery at or near real-time detection speeds. Raman HSI sensors are being developed to overcome two obstacles present in standard Raman detection systems: slow area search rates (due to small laser spot sizes) and lack of eye-safety. SWIR HSI sensors have been integrated into mobile, robot-based platforms and handheld variants for the detection of explosives and chemical warfare agents (CWAs). In addition, the fusion of these two technologies into a single system has shown the feasibility of using both techniques concurrently to provide a higher probability of detection and lower false alarm rates. This paper will provide background on Raman and SWIR HSI, discuss the applications for these techniques, and provide an overview of novel CISS HSI sensors, focusing on sensor design and detection results.

  19. Sensor Pods: Multi-Resolution Surveys from a Light Aircraft

    Directory of Open Access Journals (Sweden)

    Conor Cahalane

    2017-02-01

    Airborne remote sensing, whether performed from conventional aerial survey platforms such as light aircraft or the more recent Remotely Piloted Airborne Systems (RPAS), can complement mapping generated using earth-orbiting satellites, particularly for areas that may experience prolonged cloud cover. Traditional aerial platforms are costly but capture high-resolution imagery over large areas. RPAS are relatively low-cost and provide very-high-resolution imagery, but only over small areas. We believe that we are the first group to retrofit these new, low-cost, lightweight sensors in a traditional aircraft. Unlike RPAS surveys, which have a limited payload, this is the first time that a method has been designed to operate four distinct RPAS sensors simultaneously: hyperspectral, thermal, multispectral, RGB, and video. This means that imagery covering a broad range of the spectrum, captured during a single survey through different image-capture techniques (frame, pushbroom, video), can be applied to investigate multiple aspects of the surrounding environment, such as soil moisture, vegetation vitality, topography, or drainage. In this paper, we present the initial results validating our innovative hybrid system, which adapts dedicated RPAS sensors for a light-aircraft sensor pod, thereby providing the benefits of both methodologies. Simultaneous image capture with a Nikon D800E SLR and a series of dedicated RPAS sensors, including a FLIR thermal imager, a four-band multispectral camera, and a 100-band hyperspectral imager, was enabled by integration in a single sensor pod operating from a Cessna c172. However, to enable accurate sensor fusion for image analysis, each sensor must first be combined in a common vehicle coordinate system, and a method for triggering, time-stamping, and calculating the position/pose of each sensor at the time of image capture devised. Initial tests were carried out over agricultural regions with

  20. High frame rate multi-resonance imaging refractometry with distributed feedback dye laser sensor

    DEFF Research Database (Denmark)

    Vannahme, Christoph; Dufva, Martin; Kristensen, Anders

    2015-01-01

    imaging refractometry without moving parts is presented. DFB dye lasers are low-cost and highly sensitive refractive index sensors. The unique multi-wavelength DFB laser structure presented here comprises several areas with different grating periods. Imaging in two dimensions of space is enabled...... by analyzing laser light from all areas in parallel with an imaging spectrometer. With this multi-resonance imaging refractometry method, the spatial position in one direction is identified from the horizontal, i.e., spectral position of the multiple laser lines which is obtained from the spectrometer charged...

  1. Laser beam welding quality monitoring system based in high-speed (10 kHz) uncooled MWIR imaging sensors

    Science.gov (United States)

    Linares, Rodrigo; Vergara, German; Gutiérrez, Raúl; Fernández, Carlos; Villamayor, Víctor; Gómez, Luis; González-Camino, Maria; Baldasano, Arturo; Castro, G.; Arias, R.; Lapido, Y.; Rodríguez, J.; Romero, Pablo

    2015-05-01

    The combination of flexibility, productivity, precision, and zero-defect manufacturing in future laser-based equipment is a major challenge facing this enabling technology. New sensors for online monitoring and real-time control of laser-based processes are necessary for improving product quality and increasing manufacturing yields. New approaches to fully automated, zero-defect manufacturing demand smarter heads in which lasers, optics, actuators, sensors, and electronics are integrated in a single compact and affordable device. Many defects arising in laser-based manufacturing processes come from instabilities in the dynamics of the laser process; temperature and heat dynamics are therefore key parameters to be monitored. Low-cost infrared imagers with a high speed of response will constitute the next generation of sensors to be implemented in future monitoring and control systems for laser-based processes, capable of providing simultaneous information about heat dynamics and its spatial distribution. This work describes the results of using an innovative low-cost, high-speed infrared imager based on the first quantum infrared imager on the market monolithically integrated with a Si-CMOS ROIC. The sensor provides low-resolution images at frame rates up to 10 kHz in uncooled operation, at a cost comparable to traditional infrared spot detectors. To demonstrate the capabilities of the new sensor technology, a low-cost camera was assembled on a standard production laser welding head, allowing melt-pool images to be registered at frame rates of 10 kHz. In addition, specific software was developed for defect detection and classification. Multiple laser welding processes were recorded with the aim of studying the performance of the system and its application to real-time monitoring of laser welding processes. During the experiments, different types of defects were produced and monitored, and the classifier was fed with the experimental images obtained. Self

  2. Empirical electro-optical and x-ray performance evaluation of CMOS active pixels sensor for low dose, high resolution x-ray medical imaging

    International Nuclear Information System (INIS)

    Arvanitis, C. D.; Bohndiek, S. E.; Royle, G.; Blue, A.; Liang, H. X.; Clark, A.; Prydderch, M.; Turchetta, R.; Speller, R.

    2007-01-01

    Monolithic complementary metal oxide semiconductor (CMOS) active pixel sensors with high performance have gained attention in the last few years in many scientific and space applications. To evaluate the increasing capabilities of this technology, in particular where low-dose, high-resolution x-ray medical imaging is required, a critical electro-optical and physical x-ray performance evaluation was carried out. The electro-optical performance includes read noise, full well capacity, interacting quantum efficiency, and pixel cross talk. The x-ray performance, including x-ray sensitivity, modulation transfer function (MTF), noise power spectrum, and detective quantum efficiency (DQE), was evaluated in the mammographic energy range. The sensor is a 525x525 standard three-transistor CMOS active pixel sensor array with more than 75% fill factor and 25x25 μm pixel pitch. Read out at 10 frames/s, the sensor has 114 electrons total additive noise, a 10^5 electron full well capacity with shot-noise-limited operation, and 34% interacting quantum efficiency at 530 nm. Two structured CsI:Tl phosphors, 95 and 115 μm thick, were optically coupled via a fiber optic plate to the array, resulting in two system configurations. The sensitivity of the two configurations was 43 and 47 electrons per x-ray incident on the sensor, respectively. The MTF at 10% of the two configurations was 9.5 and 9 cycles/mm, with detective quantum efficiency of 0.45 and 0.48, respectively, close to zero frequency at ∼0.44 μC/kg (1.72 mR) detector entrance exposure. The detector was quantum limited at low spatial frequencies, and its performance was comparable with high-resolution a-Si and charge-coupled-device-based x-ray imagers. The detector also demonstrates almost an order of magnitude lower noise than active matrix flat panel imagers. The results suggest that CMOS active pixel sensors when coupled to structured CsI:Tl can
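
    The figures of merit reported here combine in the standard frequency-dependent DQE expression. A sketch of that relation (generic symbols, not the paper's notation; all quantities must be in consistent units):

```python
def dqe(mtf, nps, mean_signal, photon_fluence):
    """Detective quantum efficiency of an x-ray imager from measured
    quantities, using the standard definition:
        DQE(f) = mean_signal**2 * MTF(f)**2 / (photon_fluence * NPS(f))
    where mean_signal is the mean output for the given exposure,
    photon_fluence is the incident x-ray quanta per unit area, and
    NPS is the noise power spectrum."""
    return (mean_signal ** 2) * (mtf ** 2) / (photon_fluence * nps)
```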

  3. Image dynamic range test and evaluation of Gaofen-2 dual cameras

    Science.gov (United States)

    Zhang, Zhenhua; Gan, Fuping; Wei, Dandan

    2015-12-01

    In order to fully understand the dynamic range of Gaofen-2 satellite data and to support data processing, applications, and the development of subsequent satellites, we evaluated the dynamic range by calculating statistics (maximum, minimum, mean, and standard deviation) for four images acquired at the same time by the Gaofen-2 dual cameras over the Beijing area. The same four statistics were then computed for each longitudinal overlap of PMS1 and PMS2 to evaluate the dynamic-range consistency within each camera, and for each latitudinal overlap of PMS1 and PMS2 to evaluate the consistency between the two cameras. The results suggest that the images obtained by PMS1 and PMS2 span a wide dynamic range of DN values and contain rich information on ground objects. In general, the dynamic range is closely consistent between images from a single camera, with only small differences, and likewise between the dual cameras; the consistency within a single camera is better than that between the dual cameras.
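
    The per-image statistics used throughout this evaluation are straightforward to compute; a sketch (function and key names are illustrative, not from the paper):

```python
import numpy as np

def dynamic_range_stats(dn):
    """Maximum, minimum, mean and standard deviation of the DN values
    of one image or overlap region, as used in the evaluation."""
    dn = np.asarray(dn, dtype=float)
    return {"max": dn.max(), "min": dn.min(),
            "mean": dn.mean(), "std": dn.std()}
```

    Computing these per longitudinal overlap (within PMS1 or PMS2) or per latitudinal overlap (across the two cameras) and comparing the results gives the consistency checks described above.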

  4. A screen-printed flexible flow sensor

    International Nuclear Information System (INIS)

    Moschos, A; Kaltsas, G; Syrovy, T; Syrova, L

    2017-01-01

    A thermal flow sensor was printed on a flexible plastic substrate using exclusively screen-printing techniques. The device was implemented with custom-made, screen-printed thermistors, which allows simple, cost-efficient production on a variety of flexible substrates while maintaining the typical advantages of thermal flow sensors. Evaluation was performed under both static (zero-flow) and dynamic conditions using a combination of electrical measurements and IR imaging techniques to determine important characteristics such as temperature response and output repeatability. The flow sensor was characterized using both the hot-wire and calorimetric principles of operation, and the preliminary results appear very promising: the sensor was successfully evaluated and displayed adequate sensitivity over a relatively wide flow range. (paper)
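
    For the hot-wire mode of operation mentioned above, flow speed is conventionally recovered by inverting King's law. A hedged sketch (the coefficients are hypothetical calibration values, not taken from the paper):

```python
def king_velocity(E, A, B, n=0.5):
    """Invert King's law, E**2 = A + B * v**n, to recover the flow
    velocity v from the heater/bridge voltage E. The coefficients
    A, B and exponent n come from a prior calibration of the sensor."""
    return ((E ** 2 - A) / B) ** (1.0 / n)
```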

  5. Discrimination between sedimentary rocks from close-range visible and very-near-infrared images

    NARCIS (Netherlands)

    Pozo, Susana Del; Lindenbergh, R.C.; Rodríguez-Gonzálvez, Pablo; Blom, J.C.; González-Aguilera, Diego

    2015-01-01

    Variation in the mineral composition of rocks results in a change of their spectral response that can be studied by imaging spectroscopy. This paper proposes the use of a low-cost handheld sensor, a calibrated visible-very near infrared (VIS-VNIR) multispectral camera, for the recognition of

  6. Third-generation imaging sensor system concepts

    Science.gov (United States)

    Reago, Donald A.; Horn, Stuart B.; Campbell, James, Jr.; Vollmerhausen, Richard H.

    1999-07-01

    Second-generation forward looking infrared (FLIR) sensors, based on either parallel-scanning, long-wave (8-12 μm) time delay and integration HgCdTe detectors or mid-wave (3-5 μm), medium-format staring (640 x 480 pixels) InSb detectors, are being fielded. The science and technology community is now turning its attention toward the definition of a future third generation of FLIR sensors, based on emerging research and development efforts. Modeled third-generation sensor performance demonstrates a significant improvement over second generation, resulting in enhanced lethality and survivability on the future battlefield. In this paper we present the current thinking on what third-generation sensor systems will be and the resulting requirements for third-generation focal plane array detectors. Three classes of sensors have been identified. The high-performance sensor will contain a megapixel or larger array with at least two colors; higher operating temperatures are also a goal here so that power and weight can be reduced. A high-performance uncooled sensor is also envisioned that will perform somewhere between first- and second-generation cooled detectors, but at significantly lower cost, weight, and power. The final third-generation sensor is a very low cost micro sensor. This sensor can open up a whole new IR market because of its small size, weight, and cost: future unattended throwaway sensors, micro UAVs, and helmet-mounted IR cameras will be the result of this new class.

  7. Application Of FA Sensor 2

    International Nuclear Information System (INIS)

    Park, Seon Ho

    1993-03-01

    This book introduces FA sensors from basics to system building. It covers light sensors such as photodiodes and phototransistors, photoelectric sensors, CCD-type image sensors, MOS-type image sensors, color sensors, CdS cells, and fiber-optic scopes. It also deals with direct position sensors such as proximity switches, differential motion sensors, photoelectric linear scales, and magnet scales; rotary sensors, with a summary of rotary encoders, their types and applications; flow sensors; and sensing technology.

  8. Study of CMOS Image Sensors for the Alignment System of the CMS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Virto, A. L.; Vila, I.; Rodrigo, T.; Matorras, F.; Figueroa, C. F.; Calvo, E.; Calderon, A.; Arce, P.; Oller, J. C.; Molinero, A.; Josa, M. I.; Fuentes, J.; Ferrando, A.; Fernandez, M. G.; Barcala, J. M.

    2002-07-01

    We report on an in-depth study made on commercial CMOS image sensors in order to determine their feasibility for beam light position detection in the CMS multipoint alignment scheme. (Author) 21 refs.

  9. Gimbal Integration to Small Format, Airborne, MWIR and LWIR Imaging Sensors, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed innovation is for enhanced sensor performance and high resolution imaging for Long Wave InfraRed (LWIR) and Medium Wave IR (MWIR) camera systems used in...

  10. Imaging Intracellular pH in Live Cells with a Genetically-Encoded Red Fluorescent Protein Sensor

    OpenAIRE

    Tantama, Mathew; Hung, Yin Pun; Yellen, Gary

    2011-01-01

    Intracellular pH affects protein structure and function, and proton gradients underlie the function of organelles such as lysosomes and mitochondria. We engineered a genetically-encoded pH sensor by mutagenesis of the red fluorescent protein mKeima, providing a new tool to image intracellular pH in live cells. This sensor, named pHRed, is the first ratiometric, single-protein red fluorescent sensor of pH. Fluorescence emission of pHRed peaks at 610 nm while exhibiting dual excitation peaks at...

  11. A radiographic imaging system based upon a 2-D silicon microstrip sensor

    CERN Document Server

    Papanestis, A; Corrin, E; Raymond, M; Hall, G; Triantis, F A; Manthos, N; Evagelou, I; Van den Stelt, P; Tarrant, T; Speller, R D; Royle, G F

    2000-01-01

    A high-resolution, direct-digital detector system based upon a 2-D silicon microstrip sensor has been designed and built, and is undergoing evaluation for applications in dentistry and mammography. The sensor parameters and image requirements were selected using Monte Carlo simulations. Sensors selected for evaluation have a strip pitch of 50 μm on the p-side and 80 μm on the n-side. Front-end electronics and data acquisition are based on the APV6 chip and were adapted from systems used at CERN for high-energy physics experiments. The APV6 chip is not self-triggering, so data acquisition is done at a fixed trigger rate. This paper describes the mammographic evaluation of the double-sided microstrip sensor. Raw data correction procedures were implemented to remove the effects of dead strips and non-uniform response. Standard test objects (TORMAX) were used to determine limiting spatial resolution and detectability. MTFs were determined using the edge response. The results indicate that the spatial resolution of the...
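
    The MTF-from-edge-response step mentioned above follows a standard recipe: differentiate the edge spread function (ESF) to obtain the line spread function (LSF), then take the normalized magnitude of its Fourier transform. A minimal sketch, omitting the edge oversampling and windowing a careful measurement would add:

```python
import numpy as np

def mtf_from_edge(esf):
    """MTF from a sampled edge spread function:
    LSF = d(ESF)/dx, then MTF(f) = |FFT(LSF)| normalized to 1
    at zero spatial frequency."""
    lsf = np.gradient(np.asarray(esf, dtype=float))
    mtf = np.abs(np.fft.rfft(lsf))
    return mtf / mtf[0]
```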

  12. Reduction of CMOS Image Sensor Read Noise to Enable Photon Counting.

    Science.gov (United States)

    Guidash, Michael; Ma, Jiaju; Vogelsang, Thomas; Endsley, Jay

    2016-04-09

    Recent activity in photon counting CMOS image sensors (CIS) has been directed toward reduction of read noise, and many approaches and methods have been reported. This work focuses on achieving sub-1 e- read noise through the design and operation of the binary and small-signal readout of photon counting CIS. Compensation of transfer-gate feed-through was used to substantially reduce the correlated double sampling (CDS) time and source follower (SF) bandwidth. SF read noise was reduced by a factor of 3 with this method, which can be applied broadly to CIS devices to reduce small-signal read noise and enable their use as photon counting sensors.

  13. Target recognition of ladar range images using slice image: comparison of four improved algorithms

    Science.gov (United States)

    Xia, Wenze; Han, Shaokun; Cao, Jingya; Wang, Liang; Zhai, Yu; Cheng, Yang

    2017-07-01

    Compared with traditional 3-D shape data, ladar range images possess properties of strong noise, shape degeneracy, and sparsity, which make feature extraction and representation difficult. The slice image is an effective feature descriptor to resolve this problem. We propose four improved algorithms for target recognition of ladar range images using the slice image. In order to improve the resolution invariance of the slice image, mean value detection instead of maximum value detection is applied in all four improved algorithms. In order to improve the rotation invariance of the slice image, three new feature descriptors (the feature slice image, slice-Zernike moments, and slice-Fourier moments) are applied in the last three improved algorithms, respectively. Backpropagation neural networks are used as feature classifiers in the last two improved algorithms. The performance of the four improved recognition systems is analyzed comprehensively in terms of the three invariances, recognition rate, and execution time. The final experimental results show that the improvements in the four algorithms achieve the desired effect, that the three invariances of the feature descriptors are not directly related to the final recognition performance of the recognition systems, and that the four improved recognition systems perform differently under different conditions.
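
    The rotation invariance of a slice-Fourier-moment descriptor rests on a standard fact: rotating the target circularly shifts an angular histogram, and a circular shift changes only the phases of its Fourier coefficients, not their magnitudes. A generic sketch of this idea (not the paper's exact descriptor):

```python
import numpy as np

def fourier_descriptor(angular_bins):
    """Rotation-invariant descriptor: FFT magnitudes of an angular histogram.

    Rotating the object circularly shifts the bins, which multiplies each
    Fourier coefficient by a unit-magnitude phase factor; the magnitudes
    are therefore unchanged.
    """
    return np.abs(np.fft.fft(angular_bins))

rng = np.random.default_rng(1)
hist = rng.random(36)       # e.g. 36 angular bins of a slice image
rotated = np.roll(hist, 7)  # simulate a 70-degree rotation of the target
```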

  14. 110 °C range athermalization of wavefront coding infrared imaging systems

    Science.gov (United States)

    Feng, Bin; Shi, Zelin; Chang, Zheng; Liu, Haizheng; Zhao, Yaohong

    2017-09-01

    110 °C range athermalization is significant but difficult in the design of infrared imaging systems. Our wavefront coding athermalized infrared imaging system adopts an optical phase mask with fewer manufacturing errors and a decoding method based on a shrinkage function. Qualitative experiments prove that our wavefront coding athermalized infrared imaging system has three prominent merits: (1) it works well over a temperature range of 110 °C; (2) it extends the focal depth up to 15.2 times; (3) it achieves decoded images close to their corresponding in-focus infrared images, with a mean structural similarity index (MSSIM) value greater than 0.85.

  15. Proximity Operations and Docking Sensor Development

    Science.gov (United States)

    Howard, Richard T.; Bryan, Thomas C.; Brewster, Linda L.; Lee, James E.

    2009-01-01

    The Next Generation Advanced Video Guidance Sensor (NGAVGS) has been under development for the last three years as a long-range proximity operations and docking sensor for use in an Automated Rendezvous and Docking (AR&D) system. The first autonomous rendezvous and docking in the history of the U.S. Space Program was successfully accomplished by Orbital Express, using the Advanced Video Guidance Sensor (AVGS) as the primary docking sensor. That flight proved that the United States now has a mature and flight proven sensor technology for supporting Crew Exploration Vehicles (CEV) and Commercial Orbital Transport Systems (COTS) Automated Rendezvous and Docking (AR&D). NASA video sensors have worked well in the past: the AVGS used on the Demonstration of Autonomous Rendezvous Technology (DART) mission operated successfully in spot mode out to 2 km, and the first generation rendezvous and docking sensor, the Video Guidance Sensor (VGS), was developed and successfully flown on Space Shuttle flights in 1997 and 1998. Parts obsolescence issues prevent the construction of more AVGS units, and the next generation sensor was updated to allow it to support the CEV and COTS programs. The flight proven AR&D sensor has been redesigned to update parts and add additional capabilities for CEV and COTS with the development of the Next Generation AVGS at the Marshall Space Flight Center. The obsolete imager and processor are being replaced with new radiation tolerant parts. In addition, new capabilities include greater sensor range, auto ranging capability, and real-time video output. This paper presents some sensor hardware trades, use of highly integrated laser components, and addresses the needs of future vehicles that may rendezvous and dock with the International Space Station (ISS) and other Constellation vehicles. It also discusses approaches for upgrading AVGS to address parts obsolescence, and concepts for minimizing the sensor footprint, weight, and power requirements.

  16. Comparison and experimental validation of two potential resonant viscosity sensors in the kilohertz range

    International Nuclear Information System (INIS)

    Lemaire, Etienne; Caillard, Benjamin; Dufour, Isabelle; Heinisch, Martin; Jakoby, Bernhard

    2013-01-01

    Oscillating microstructures are well established and find application in many fields. These include force sensors, e.g. AFM micro-cantilevers or accelerometers based on resonant suspended plates. This contribution presents two vibrating mechanical structures acting as force sensors in liquid media in order to measure hydrodynamic interactions. Rectangular cross section microcantilevers as well as circular cross section wires are investigated. Each structure features specific benefits, which are discussed in detail. Furthermore, their mechanical parameters and their deflection in liquids are characterized. Finally, an inverse analytical model is applied to calculate the complex viscosity near the resonant frequency for both types of structures. With this approach it is possible to determine rheological parameters in the kilohertz range in situ within a few seconds. The monitoring of the complex viscosity of yogurt during the fermentation process is used as a proof of concept to qualify at least one of the two sensors in opaque mixtures. (paper)

  17. Comparison and experimental validation of two potential resonant viscosity sensors in the kilohertz range

    Energy Technology Data Exchange (ETDEWEB)

    Lemaire, Etienne; Caillard, Benjamin; Dufour, Isabelle [Univ. Bordeaux, IMS, UMR 5218, F-33400 Talence (France); Heinisch, Martin; Jakoby, Bernhard [Institute for Microelectronics and Microsensors, Johannes Kepler University, Linz (Austria)

    2013-08-15

    Oscillating microstructures are well established and find application in many fields. These include force sensors, e.g. AFM micro-cantilevers or accelerometers based on resonant suspended plates. This contribution presents two vibrating mechanical structures acting as force sensors in liquid media in order to measure hydrodynamic interactions. Rectangular cross section microcantilevers as well as circular cross section wires are investigated. Each structure features specific benefits, which are discussed in detail. Furthermore, their mechanical parameters and their deflection in liquids are characterized. Finally, an inverse analytical model is applied to calculate the complex viscosity near the resonant frequency for both types of structures. With this approach it is possible to determine rheological parameters in the kilohertz range in situ within a few seconds. The monitoring of the complex viscosity of yogurt during the fermentation process is used as a proof of concept to qualify at least one of the two sensors in opaque mixtures. (paper)

  18. Comparison and experimental validation of two potential resonant viscosity sensors in the kilohertz range

    Science.gov (United States)

    Lemaire, Etienne; Heinisch, Martin; Caillard, Benjamin; Jakoby, Bernhard; Dufour, Isabelle

    2013-08-01

    Oscillating microstructures are well established and find application in many fields. These include force sensors, e.g. AFM micro-cantilevers or accelerometers based on resonant suspended plates. This contribution presents two vibrating mechanical structures acting as force sensors in liquid media in order to measure hydrodynamic interactions. Rectangular cross section microcantilevers as well as circular cross section wires are investigated. Each structure features specific benefits, which are discussed in detail. Furthermore, their mechanical parameters and their deflection in liquids are characterized. Finally, an inverse analytical model is applied to calculate the complex viscosity near the resonant frequency for both types of structures. With this approach it is possible to determine rheological parameters in the kilohertz range in situ within a few seconds. The monitoring of the complex viscosity of yogurt during the fermentation process is used as a proof of concept to qualify at least one of the two sensors in opaque mixtures.

  19. Long-range surface plasmon resonance sensor with liquid micro-channels to maintain the symmetry condition of the refractive index

    International Nuclear Information System (INIS)

    Kan, Tetsuo; Kojo, Hiroyuki; Iwase, Eiji; Matsumoto, Kiyoshi; Shimoyama, Isao

    2010-01-01

    We propose a method to maintain the symmetry of the refractive index with respect to an Au film, in which the refractive indices are the same near both surfaces of the Au film, for LRSPR (long-range surface plasmon resonance) sensors. Maintenance of the symmetry is necessary for exciting the LRSPR mode. However, because the buffer layer under the Au film is usually made of a solid dielectric material with a constant refractive index, the quality of the measurement is reduced when the refractive index of the analyte used is dramatically different from that of the buffer layer. To solve this problem, the proposed sensor is equipped with liquid channels under the Au film. The Au film used in this study was supported by a thin (100 nm) polymer film forming parallel, one-dimensional liquid channels with a 29 µm pitch. Because the analyte solution flows in the channels, both surfaces of the Au film face the same analyte. Thus, this configuration automatically satisfies the symmetry condition for analytes with a wide range of refractive indices. We examined the validity of the sensor and compared it to that of a conventional sensor by measuring the LRSPR for five analyte solutions with different refractive indices. LRSPR dips were clearly observed for all of the analytes tested. The dip of the conventional LRSPR sensor became shallow when the refractive index increased, with the final dip depth being 65% of the initial dip depth for a refractive index of 1.358. In contrast, the dip depth of the proposed LRSPR sensor remained constant over the entire measured refractive index range of 1.331 to 1.358. These results indicate that the proposed sensor maintains the symmetry condition and confirm that the proposed method is effective for highly sensitive LRSPR measurements for a wide variety of analyte species.

  20. Image enhancement circuit using nonlinear processing curve and constrained histogram range equalization

    NARCIS (Netherlands)

    Cvetkovic, S.D.; With, de P.H.N.; Panchanathan, S.; Vasudev, B.

    2004-01-01

    For real-time imaging in surveillance applications, image fidelity is of primary importance to ensure customer confidence. The obtained image fidelity results from, among other factors, dynamic range expansion and video signal enhancement. The dynamic range of the signal needs adaptation, because the

  1. Multi-exposure high dynamic range image synthesis with camera shake correction

    Science.gov (United States)

    Li, Xudong; Chen, Yongfu; Jiang, Hongzhi; Zhao, Huijie

    2017-10-01

    Machine vision plays an important part in industrial online inspection. Owing to nonuniform illuminance conditions and variable working distances, the captured image tends to be over-exposed or under-exposed. As a result, when processing the image, e.g., for crack inspection, the algorithm complexity and computing time increase. Multi-exposure high dynamic range (HDR) image synthesis is used to improve the quality of the captured image, whose dynamic range is limited. Inevitably, camera shake will result in ghost effects, which blur the synthesized image to some extent. However, existing exposure fusion algorithms assume that the input images are either perfectly aligned or captured in the same scene. These assumptions limit their application. At present, widely used registration based on the Scale Invariant Feature Transform (SIFT) is usually time-consuming. In order to rapidly obtain a high quality HDR image without ghost effects, we develop an efficient Low Dynamic Range (LDR) image capturing approach and propose a registration method based on Oriented FAST and Rotated BRIEF (ORB) features and histogram equalization, which eliminates the illumination differences between the LDR images. The fusion is performed after alignment. The experimental results demonstrate that the proposed method is robust to illumination changes and local geometric distortion. Compared with other exposure fusion methods, our method is more efficient and can produce HDR images without ghost effects by registering and fusing four multi-exposure images.
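
    The fusion step after alignment can be illustrated with a simple well-exposedness weighting in the spirit of Mertens-style exposure fusion (a sketch under that assumption, not necessarily the paper's exact fusion rule): each pixel is weighted by its closeness to mid-gray and averaged across exposures.

```python
import numpy as np

def fuse_exposures(images, sigma=0.2):
    """Weighted per-pixel average of aligned LDR exposures in [0, 1].

    Pixels near mid-gray (0.5) receive high weight; near-saturated or
    near-black pixels contribute little to the fused result.
    """
    stack = np.stack(images).astype(float)
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2)) + 1e-12
    weights /= weights.sum(axis=0, keepdims=True)  # normalize per pixel
    return (weights * stack).sum(axis=0)

under = np.full((4, 4), 0.05)  # under-exposed frame
mid = np.full((4, 4), 0.50)    # well-exposed frame
over = np.full((4, 4), 0.95)   # over-exposed frame
fused = fuse_exposures([under, mid, over])
```

Ghost removal is exactly why the registration step matters: this weighting assumes each pixel location sees the same scene point in every exposure.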

  2. Wireless wearable range-of-motion sensor system for upper and lower extremity joints: a validation study.

    Science.gov (United States)

    Kumar, Yogaprakash; Yen, Shih-Cheng; Tay, Arthur; Lee, Wangwei; Gao, Fan; Zhao, Ziyi; Li, Jingze; Hon, Benjamin; Tian-Ma Xu, Tim; Cheong, Angela; Koh, Karen; Ng, Yee-Sien; Chew, Effie; Koh, Gerald

    2015-02-01

    Range-of-motion (ROM) assessment is a critical assessment tool during the rehabilitation process. The conventional approach uses the goniometer, which remains the most reliable instrument but is usually time-consuming and subject to both intra- and inter-therapist measurement errors. An automated wireless wearable sensor system for the measurement of ROM has previously been developed by the current authors. Presented are the correlation and accuracy of the automated wireless wearable sensor system against a goniometer in measuring ROM in the major joints of the upper (UEs) and lower extremities (LEs) in 19 healthy subjects and 20 newly disabled inpatients, through intra-subject comparison of ROM assessments between the sensor system and goniometer measurements by physical therapists. In healthy subjects, ROM measurements using the new sensor system were highly correlated with goniometry. In the disabled inpatients, measurements using the sensor system were also highly correlated with goniometry, with 95% of the differences being < 20° and 25° for most movements in the major joints of the UE and LE, respectively.
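
    Agreement figures of the form "95% of the differences being < 20°" correspond to Bland-Altman-style 95% limits of agreement between two measurement methods. A generic sketch of how such limits are computed, on synthetic numbers rather than the study's data:

```python
import numpy as np

def limits_of_agreement(method_a, method_b):
    """Bland-Altman 95% limits of agreement between two methods:
    mean difference (bias) +/- 1.96 standard deviations of the differences."""
    diff = np.asarray(method_a, float) - np.asarray(method_b, float)
    bias = diff.mean()
    spread = 1.96 * diff.std(ddof=1)
    return bias - spread, bias + spread

# Synthetic example: sensor readings with a small bias and random error
rng = np.random.default_rng(2)
gonio = rng.uniform(0, 150, 200)             # goniometer ROM readings (deg)
sensor = gonio + rng.normal(1.0, 5.0, 200)   # hypothetical sensor readings
lo, hi = limits_of_agreement(sensor, gonio)
```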

  3. Optical fiber sensors for image formation in radiodiagnostic - preliminary essays

    International Nuclear Information System (INIS)

    Carvalho, Cesar C. de; Werneck, Marcelo M.

    1998-01-01

    This work describes preliminary experiments intended to provide supporting data for analyzing the feasibility of implementing a system able to capture radiological images with a new sensor arrangement, comprising an FO scanning process and an I-CCD camera. The main objective of these experiments is to analyze the optical response of an FO bundle, with several types of scintillators associated with it, when submitted to medical x-ray exposure. (author)

  4. State-of-The-Art and Applications of 3D Imaging Sensors in Industry, Cultural Heritage, Medicine, and Criminal Investigation.

    Science.gov (United States)

    Sansoni, Giovanna; Trebeschi, Marco; Docchio, Franco

    2009-01-01

    3D imaging sensors for the acquisition of three dimensional (3D) shapes have created, in recent years, a considerable degree of interest for a number of applications. The miniaturization and integration of the optical and electronic components used to build them have played a crucial role in the achievement of compactness, robustness and flexibility of the sensors. Today, several 3D sensors are available on the market, even in combination with other sensors in a "sensor fusion" approach. Equally important as physical miniaturization is the portability of the measurements, via suitable interfaces, into software environments designed for their elaboration, e.g., CAD-CAM systems, virtual renderers, and rapid prototyping tools. In this paper, following an overview of the state-of-the-art of 3D imaging sensors, a number of significant examples of their use are presented, with particular reference to industry, heritage, medicine, and criminal investigation applications.

  5. A Support Vector Machine Approach for Truncated Fingerprint Image Detection from Sweeping Fingerprint Sensors

    Science.gov (United States)

    Chen, Chi-Jim; Pai, Tun-Wen; Cheng, Mox

    2015-01-01

    A sweeping fingerprint sensor captures fingerprints on a row by row basis and assembles them through image reconstruction techniques. However, a reconstructed fingerprint image might appear truncated and distorted when the finger was swept across the sensor at a non-linear speed. If the truncated fingerprint images were enrolled as reference targets and collected by any automated fingerprint identification system (AFIS), successful prediction rates for fingerprint matching applications would decrease significantly. In this paper, a novel and effective methodology with low computational time complexity was developed for detecting truncated fingerprints in real time. Several filtering rules were implemented to validate the existence of truncated fingerprints. In addition, a machine learning method, the support vector machine (SVM), based on the principle of structural risk minimization, was applied to reject pseudo truncated fingerprints containing characteristics similar to truncated ones. The experimental results show that an accuracy rate of 90.7% was achieved by successfully identifying truncated fingerprint images from testing images before AFIS enrollment procedures. The proposed effective and efficient methodology can be extensively applied to all existing fingerprint matching systems as a preliminary quality control prior to the construction of fingerprint templates. PMID:25835186
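
    One plausible filtering rule for flagging a truncated reconstruction is to test whether the final rows of the rebuilt image carry no ridge information, i.e. the sweep ended before the full fingerprint was assembled. The rule below is a hypothetical illustration only; the paper's actual filtering rules are not reproduced here:

```python
import numpy as np

def looks_truncated(image, blank_rows=8, tol=1e-6):
    """Flag images whose last rows carry no ridge information
    (near-zero variance), a symptom of an aborted sweep."""
    tail = np.asarray(image, float)[-blank_rows:]
    return float(tail.var()) < tol

rng = np.random.default_rng(5)
complete = rng.random((64, 32))  # synthetic full reconstruction
truncated = complete.copy()
truncated[-10:] = 0.0            # sweep stopped early: blank tail rows
```

Rules like this only catch obvious truncations; the paper's SVM stage exists precisely to reject the borderline ("pseudo truncated") cases such thresholds let through.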

  6. A Support Vector Machine Approach for Truncated Fingerprint Image Detection from Sweeping Fingerprint Sensors

    Directory of Open Access Journals (Sweden)

    Chi-Jim Chen

    2015-03-01

    Full Text Available A sweeping fingerprint sensor captures fingerprints on a row by row basis and assembles them through image reconstruction techniques. However, a reconstructed fingerprint image might appear truncated and distorted when the finger was swept across the sensor at a non-linear speed. If the truncated fingerprint images were enrolled as reference targets and collected by any automated fingerprint identification system (AFIS), successful prediction rates for fingerprint matching applications would decrease significantly. In this paper, a novel and effective methodology with low computational time complexity was developed for detecting truncated fingerprints in real time. Several filtering rules were implemented to validate the existence of truncated fingerprints. In addition, a machine learning method, the support vector machine (SVM), based on the principle of structural risk minimization, was applied to reject pseudo truncated fingerprints containing characteristics similar to truncated ones. The experimental results show that an accuracy rate of 90.7% was achieved by successfully identifying truncated fingerprint images from testing images before AFIS enrollment procedures. The proposed effective and efficient methodology can be extensively applied to all existing fingerprint matching systems as a preliminary quality control prior to the construction of fingerprint templates.

  7. Dense range images from sparse point clouds using multi-scale processing

    NARCIS (Netherlands)

    Do, Q.L.; Ma, L.; With, de P.H.N.

    2013-01-01

    Multi-modal data processing based on visual and depth/range images has become relevant in computer vision for 3D reconstruction applications such as city modeling, robot navigation etc. In this paper, we generate high-accuracy dense range images from sparse point clouds to facilitate such

  8. Nanosecond-laser induced crosstalk of CMOS image sensor

    Science.gov (United States)

    Zhu, Rongzhen; Wang, Yanbin; Chen, Qianrong; Zhou, Xuanfeng; Ren, Guangsen; Cui, Longfei; Li, Hua; Hao, Daoliang

    2018-02-01

    The CMOS image sensor (CIS) is a photoelectric imaging device that integrates the photosensitive array, amplifier, A/D converter, storage, DSP, and computer interface circuitry on the same silicon substrate [1]. It offers low power consumption, high integration, and low cost. As large-scale integrated circuit technology has progressed, the noise suppression of CIS devices has improved continuously and their image quality has become better and better. They are widely used in security monitoring, biometrics, detection and imaging, and even military reconnaissance. However, a CIS is easily disturbed or damaged when irradiated by a laser, so the study of laser irradiation effects is of great significance both for optoelectronic countermeasures and for hardening devices against lasers. Several researchers have studied laser-induced disturbance and damage of CIS, focusing on saturation and supersaturation effects and observing distinct regimes such as unsaturation, saturation, supersaturation, full saturation, and pixel flip. This paper investigates the 1064 nm laser interference effect in a typical front-illuminated CMOS sensor, observing saturated crosstalk and half crosstalk lines. The formation mechanism of the crosstalk line phenomenon is analyzed from the working principle and signal detection methods of CMOS devices.

  9. High speed global shutter image sensors for professional applications

    Science.gov (United States)

    Wu, Xu; Meynants, Guy

    2015-04-01

    Global shutter imagers eliminate the motion artifacts of rolling shutter imagers and thus extend their use to miscellaneous applications such as machine vision, 3D imaging, medical imaging, and space. A low noise global shutter pixel requires more than one non-light-sensitive memory to reduce the read noise, but a larger memory area reduces the fill factor of the pixels. Modern micro-lens technology can compensate for this fill-factor loss. Backside illumination (BSI) is another popular technique to improve the pixel fill factor, but some pixel architectures may not reach sufficient shutter efficiency with backside illumination. Non-light-sensitive memory elements make fabrication with BSI possible. Applications such as fast inspection systems in machine vision, 3D medical imaging, and scientific imaging demand high frame rate global shutter image sensors. Thanks to CMOS technology, fast analog-to-digital converters (ADCs) can be integrated on chip. On-chip ADCs with dual correlated double sampling (CDS) and a high-rate digital interface reduce the read noise and allow more on-chip operation control. As a result, a global shutter imager with a digital interface is a very popular solution for applications with high performance and high frame rate requirements. In this paper we review the global shutter architectures developed at CMOSIS, discuss their optimization process and compare their performances after fabrication.

  10. Area-efficient readout with 14-bit SAR-ADC for CMOS image sensors

    Directory of Open Access Journals (Sweden)

    Aziza Sassi Ben

    2016-01-01

    Full Text Available This paper proposes a readout design for CMOS image sensors. It has been squeezed into a 7.5 µm pitch in a 0.28 µm 1P3M technology. The ADC performs one 14-bit conversion in only 1.5 µs and targets a theoretical DNL of about +1.3/-1 LSB at 14-bit accuracy. Correlated Double Sampling (CDS) is performed both in the analog and digital domains to preserve the image quality.
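
    A SAR ADC resolves one bit per comparison by binary search against an internal DAC, which is why a fixed conversion time (here 1.5 µs) covers all 14 bits. A behavioral sketch of an ideal SAR loop (idealized, ignoring the DNL and CDS details of the actual design):

```python
def sar_adc(vin, vref=1.0, bits=14):
    """Ideal successive-approximation conversion: binary search from the
    MSB down to the LSB, keeping each trial bit whenever the DAC level
    stays at or below the input voltage."""
    code = 0
    for bit in range(bits - 1, -1, -1):
        trial = code | (1 << bit)
        if vin >= trial * vref / (1 << bits):  # comparator vs. DAC output
            code = trial
    return code

half_scale = sar_adc(0.5)  # mid-scale input -> MSB only
```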

  11. Image accuracy and representational enhancement through low-level, multi-sensor integration techniques

    International Nuclear Information System (INIS)

    Baker, J.E.

    1993-05-01

    Multi-Sensor Integration (MSI) is the combining of data and information from more than one source in order to generate a more reliable and consistent representation of the environment. The need for MSI derives largely from basic ambiguities inherent in our current sensor imaging technologies. These ambiguities exist as long as the mapping from reality to image is not 1-to-1. That is, if different "realities" lead to identical images, a single image cannot reveal the particular reality which was the truth. MSI techniques can be divided into three categories based on the relative information content of the original images with that of the desired representation: (1) "detail enhancement," wherein the relative information content of the original images is less rich than the desired representation; (2) "data enhancement," wherein the MSI techniques are concerned with improving the accuracy of the data rather than either increasing or decreasing the level of detail; and (3) "conceptual enhancement," wherein the image contains more detail than is desired, making it difficult to easily recognize objects of interest. In conceptual enhancement one must group pixels corresponding to the same conceptual object and thereby reduce the level of extraneous detail. This research focuses on data and conceptual enhancement algorithms. To be useful in many real-world applications, e.g., autonomous or teleoperated robotics, real-time feedback is critical. But many MSI/image processing algorithms require significant processing time. This is especially true of feature extraction, object isolation, and object recognition algorithms due to their typical reliance on global or large-neighborhood information. This research attempts to exploit the speed currently available in state-of-the-art digitizers and highly parallel processing systems by developing MSI algorithms based on pixel- rather than global-level features.

  12. Modular design strategies for protein sensors and switches

    NARCIS (Netherlands)

    Merkx, M.; Ryadnov, M.; Brunsveld, L.; Suga, H.

    2014-01-01

    Protein-based sensors and switches provide attractive tools for the real-time monitoring and control of molecular processes in complex biological environments, with applications ranging from intracellular imaging to the rewiring of signal transduction pathways and molecular diagnostics. A

  13. Researchers develop CCD image sensor with 20ns per row parallel readout time

    CERN Multimedia

    Bush, S

    2004-01-01

    "Scientists at the Rutherford Appleton Laboratory (RAL) in Oxfordshire have developed what they claim is the fastest CCD (charge-coupled device) image sensor, with a readout time which is 20ns per row" (1/2 page)

  14. Multi-sensor integration for autonomous robots in nuclear power plants

    International Nuclear Information System (INIS)

    Mann, R.C.; Jones, J.P.; Beckerman, M.; Glover, C.W.; Farkas, L.; Bilbro, G.L.; Snyder, W.

    1989-01-01

    As part of a concerted R&D program in advanced robotics for hazardous environments, scientists and engineers at the Oak Ridge National Laboratory (ORNL) are performing research in the areas of systems integration, range-sensor-based 3-D world modeling, and multi-sensor integration. This program features a unique teaming arrangement that involves the universities of Florida, Michigan, Tennessee, and Texas; Odetics Corporation; and ORNL. This paper summarizes work directed at integrating information extracted from data collected with range sensors and CCD cameras on-board a mobile robot, in order to produce reliable descriptions of the robot's environment. Specifically, the paper describes the integration of two-dimensional vision and sonar range information, and an approach to integrate registered luminance and laser range images. All operations are carried out on-board the mobile robot using a 16-processor hypercube computer. 14 refs., 4 figs

  15. Histogram Matching Extends Acceptable Signal Strength Range on Optical Coherence Tomography Images

    Science.gov (United States)

    Chen, Chieh-Li; Ishikawa, Hiroshi; Wollstein, Gadi; Bilonick, Richard A.; Sigal, Ian A.; Kagemann, Larry; Schuman, Joel S.

    2015-01-01

    Purpose. We minimized the influence of image quality variability, as measured by signal strength (SS), on optical coherence tomography (OCT) thickness measurements using the histogram matching (HM) method. Methods. We scanned 12 eyes from 12 healthy subjects with the Cirrus HD-OCT device to obtain a series of OCT images with a wide range of SS (maximal range, 1–10) at the same visit. For each eye, the histogram of an image with the highest SS (best image quality) was set as the reference. We applied HM to the images with lower SS by shaping the input histogram into the reference histogram. Retinal nerve fiber layer (RNFL) thickness was automatically measured before and after HM processing (defined as original and HM measurements), and compared to the device output (device measurements). Nonlinear mixed effects models were used to analyze the relationship between RNFL thickness and SS. In addition, the lowest tolerable SSs, which gave the RNFL thickness within the variability margin of manufacturer recommended SS range (6–10), were determined for device, original, and HM measurements. Results. The HM measurements showed less variability across a wide range of image quality than the original and device measurements (slope = 1.17 vs. 4.89 and 1.72 μm/SS, respectively). The lowest tolerable SS was successfully reduced to 4.5 after HM processing. Conclusions. The HM method successfully extended the acceptable SS range on OCT images. This would qualify more OCT images with low SS for clinical assessment, broadening the OCT application to a wider range of subjects. PMID:26066749
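
    The histogram matching step can be sketched with a standard rank-based remapping (a generic implementation, not necessarily the authors' exact algorithm): each pixel of the low-signal-strength image takes the value of the equally ranked pixel in the reference image.

```python
import numpy as np

def match_histogram(source, reference):
    """Shape source's histogram into reference's by rank mapping.

    The i-th ranked source pixel is replaced by the i-th ranked
    reference value; assumes both images have the same pixel count.
    """
    src = np.asarray(source, float).ravel()
    ref_sorted = np.sort(np.asarray(reference, float).ravel())
    out = np.empty_like(src)
    out[np.argsort(src)] = ref_sorted
    return out.reshape(np.shape(source))

# Synthetic stand-ins for a dim low-SS scan and a bright high-SS reference
rng = np.random.default_rng(3)
low_ss = rng.normal(40, 5, (16, 16))
high_ss = rng.normal(100, 20, (16, 16))
matched = match_histogram(low_ss, high_ss)
```

After matching, the output's intensity distribution is identical to the reference's, which is what makes thickness segmentation less sensitive to signal strength.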

  16. Thermal effects of an ICL-based mid-infrared CH4 sensor within a wide atmospheric temperature range

    Science.gov (United States)

    Ye, Weilin; Zheng, Chuantao; Sanchez, Nancy P.; Girija, Aswathy V.; He, Qixin; Zheng, Huadan; Griffin, Robert J.; Tittel, Frank K.

    2018-03-01

    The thermal effects of an interband cascade laser (ICL) based mid-infrared methane (CH4) sensor that uses long-path absorption spectroscopy were studied. The sensor performance in the laboratory at a constant temperature of ∼25 °C was measured for 5 h and its Allan deviation was ∼2 ppbv with a 1 s averaging time. A LabVIEW-based simulation program was developed to study thermal effects on infrared absorption and a temperature compensation technique was developed to minimize these effects. An environmental test chamber was employed to investigate the thermal effects that occur in the sensor system with variation of the test chamber temperature between 10 and 30 °C. The thermal response of the sensor in a laboratory setting was observed using a 2.1 ppm CH4 standard gas sample. Indoor/outdoor CH4 measurements were conducted to evaluate the sensor performance within a wide atmospheric temperature range.
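
    The quoted precision ("Allan deviation ∼2 ppbv with a 1 s averaging time") comes from the Allan variance of the concentration time series. A minimal non-overlapping implementation (a sketch, assuming a 1 Hz sample stream):

```python
import numpy as np

def allan_deviation(samples, bin_size):
    """Non-overlapping Allan deviation of a time series for a given
    averaging bin size (in samples): sqrt(0.5 * mean of squared
    differences between successive bin averages)."""
    samples = np.asarray(samples, float)
    n_bins = len(samples) // bin_size
    means = samples[: n_bins * bin_size].reshape(n_bins, bin_size).mean(axis=1)
    return np.sqrt(0.5 * np.mean(np.diff(means) ** 2))

# A perfectly stable reading has zero Allan deviation at any averaging time;
# e.g. a noise-free 2.1 ppm (2100 ppbv) CH4 standard.
constant = np.full(1000, 2100.0)
```

Plotting this quantity against the bin size gives the familiar Allan plot, whose minimum indicates the optimal averaging time before drift dominates.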

  17. Monitoring Pest Insect Traps by Means of Low-Power Image Sensor Technologies

    Directory of Open Access Journals (Sweden)

    Juan J. Serrano

    2012-11-01

    Full Text Available Monitoring pest insect populations is currently a key issue in agriculture and forestry protection. At the farm level, human operators typically must perform periodical surveys of the traps disseminated through the field. This is a labor-, time- and cost-consuming activity, in particular for large plantations or large forestry areas, so it would be of great advantage to have an affordable system capable of doing this task automatically in an accurate and more efficient way. This paper proposes an autonomous monitoring system based on a low-cost image sensor that is able to capture and send images of the trap contents to a remote control station with the periodicity demanded by the trapping application. Our autonomous monitoring system will be able to cover large areas with very low energy consumption. This is the main key point in our study, since the operational life of the overall monitoring system should extend to months of continuous operation without any kind of maintenance (i.e., battery replacement). The images delivered by the image sensors would be time-stamped and processed in the control station to obtain the number of individuals found at each trap. All the information would be conveniently stored at the control station, and accessible via the Internet by means of the network services available at the control station (WiFi, WiMax, 3G/4G, etc.).

  18. Monitoring Pest Insect Traps by Means of Low-Power Image Sensor Technologies

    Science.gov (United States)

    López, Otoniel; Rach, Miguel Martinez; Migallon, Hector; Malumbres, Manuel P.; Bonastre, Alberto; Serrano, Juan J.

    2012-01-01

    Monitoring pest insect populations is currently a key issue in agriculture and forestry protection. At the farm level, human operators typically must perform periodical surveys of the traps disseminated through the field. This is a labor-, time- and cost-consuming activity, in particular for large plantations or large forestry areas, so it would be of great advantage to have an affordable system capable of doing this task automatically in an accurate and more efficient way. This paper proposes an autonomous monitoring system based on a low-cost image sensor that is able to capture and send images of the trap contents to a remote control station with the periodicity demanded by the trapping application. Our autonomous monitoring system will be able to cover large areas with very low energy consumption. This is the main key point of our study, since the operational life of the overall monitoring system should extend to months of continuous operation without any kind of maintenance (i.e., battery replacement). The images delivered by the image sensors will be time-stamped and processed in the control station to obtain the number of individuals found at each trap. All the information will be conveniently stored at the control station and accessible via the Internet by means of the network services available at the control station (WiFi, WiMax, 3G/4G, etc.). PMID:23202232

  19. Range-Gated Laser Stroboscopic Imaging for Night Remote Surveillance

    International Nuclear Information System (INIS)

    Xin-Wei, Wang; Yan, Zhou; Song-Tao, Fan; Jun, He; Yu-Liang, Liu

    2010-01-01

    For night remote surveillance, we present a method, range-gated laser stroboscopic imaging (RGLSI), which uses a new kind of time delay integration mode to accumulate target signals so that night remote surveillance can be realized with a low-energy illumination laser. The time delay integration in this method has no influence on the video frame rate. Compared with traditional range-gated laser imaging, RGLSI can reduce scintillation and target speckle effects and, as our analysis shows, significantly improve the image signal-to-noise ratio. Even under low light level and low visibility conditions, the RGLSI system can work effectively. In a preliminary experiment, we detected and recognized a railway bridge one kilometer away under a visibility of six kilometers, with an effective illumination energy of only 29.5 μJ.
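    The SNR benefit of integrating gated frames can be sketched numerically (all values illustrative, not from the paper): averaging N frames of uncorrelated noise improves the SNR by roughly √N, which is what lets a low-energy laser reach remote targets.

```python
import numpy as np

def snr(signal, noisy):
    """Ratio of signal variation to residual noise level."""
    noise = noisy - signal
    return signal.std() / noise.std()

rng = np.random.default_rng(1)
target = rng.uniform(0.0, 1.0, (64, 64))   # synthetic reflectivity map
sigma = 0.5                                # per-frame noise level

# One gated frame vs. an accumulation of N frames.
single = target + rng.normal(0.0, sigma, target.shape)
n_frames = 16
stack = target + rng.normal(0.0, sigma, (n_frames, *target.shape))
integrated = stack.mean(axis=0)            # time delay integration (averaging)

# Uncorrelated noise averages down by sqrt(N) -> gain ~4 for N = 16.
gain = snr(target, integrated) / snr(target, single)
```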

  20. Highly sensitive and area-efficient CMOS image sensor using a PMOSFET-type photodetector with a built-in transfer gate

    Science.gov (United States)

    Seo, Sang-Ho; Kim, Kyoung-Do; Kong, Jae-Sung; Shin, Jang-Kyoo; Choi, Pyung

    2007-02-01

    In this paper, a new CMOS image sensor is presented, which uses a PMOSFET-type photodetector with a built-in transfer gate and offers high, variable sensitivity. The proposed CMOS image sensor has been fabricated using a 0.35 μm 2-poly 4-metal standard CMOS technology and is composed of a 256 × 256 array of 7.05 × 7.10 μm pixels. The unit pixel has the configuration of a pseudo 3-transistor active pixel sensor (APS) built around the PMOSFET-type photodetector with a transfer gate, which provides the function of a conventional 4-transistor APS. The generated photocurrent is controlled by the transfer gate of the PMOSFET-type photodetector. The maximum responsivity of the photodetector is larger than 1.0 × 10³ A/W without any optical lens. The fabricated 256 × 256 CMOS image sensor exhibits a good response at illumination levels as low as 5 lux.

  1. Microwave Sensors for Breast Cancer Detection.

    Science.gov (United States)

    Wang, Lulu

    2018-02-23

    Breast cancer is the leading cause of death among females; early diagnostic methods with suitable treatments improve the 5-year survival rates significantly. Microwave breast imaging has been reported as the most promising candidate to become an alternative or additional tool to the current gold standard, X-ray mammography, for detecting breast cancer. Microwave breast image quality is affected by the microwave sensor, the sensor array, the number of sensors in the array and the size of the sensor. In fact, the microwave sensor and sensor array play an important role in the microwave breast imaging system. Numerous microwave biosensors have been developed for biomedical applications, with particular focus on breast tumor detection. Compared to conventional medical imaging and biosensor techniques, these microwave sensors not only enable better cancer detection and improve the image resolution, but also provide attractive features such as label-free detection. This paper aims to provide an overview of recent important achievements in microwave sensors for biomedical imaging applications, with particular focus on breast cancer detection. The electric properties of biological tissues in the microwave spectrum, microwave imaging approaches, microwave biosensors, current challenges and future work are also discussed.

  2. State-of-The-Art and Applications of 3D Imaging Sensors in Industry, Cultural Heritage, Medicine, and Criminal Investigation

    Directory of Open Access Journals (Sweden)

    Giovanna Sansoni

    2009-01-01

    Full Text Available 3D imaging sensors for the acquisition of three dimensional (3D) shapes have created, in recent years, a considerable degree of interest for a number of applications. The miniaturization and integration of the optical and electronic components used to build them have played a crucial role in the achievement of compactness, robustness and flexibility of the sensors. Today, several 3D sensors are available on the market, even in combination with other sensors in a “sensor fusion” approach. Of equal importance to physical miniaturization is the portability of the measurements, via suitable interfaces, into software environments designed for their elaboration, e.g., CAD-CAM systems, virtual renderers, and rapid prototyping tools. In this paper, following an overview of the state-of-the-art of 3D imaging sensors, a number of significant examples of their use are presented, with particular reference to industry, heritage, medicine, and criminal investigation applications.

  3. High-speed imaging at high x-ray energy: CdTe sensors coupled to charge-integrating pixel array detectors

    Energy Technology Data Exchange (ETDEWEB)

    Becker, Julian; Tate, Mark W.; Shanks, Katherine S.; Philipp, Hugh T.; Weiss, Joel T.; Purohit, Prafull [Laboratory of Atomic and Solid State Physics, Cornell University, Ithaca, NY 14853 (United States); Chamberlain, Darol [Cornell High Energy Synchrotron Source (CHESS), Cornell University, Ithaca, NY 14853 (United States); Gruner, Sol M., E-mail: smg26@cornell.edu [Laboratory of Atomic and Solid State Physics, Cornell University, Ithaca, NY 14853 (United States); Cornell High Energy Synchrotron Source (CHESS), Cornell University, Ithaca, NY 14853 (United States)

    2016-07-27

    Pixel Array Detectors (PADs) consist of an x-ray sensor layer bonded pixel-by-pixel to an underlying readout chip. This approach allows both the sensor and the custom pixel electronics to be tailored independently to best match the x-ray imaging requirements. Here we describe the hybridization of CdTe sensors to two different charge-integrating readout chips, the Keck PAD and the Mixed-Mode PAD (MM-PAD), both developed previously in our laboratory. The charge-integrating architecture of each of these PADs extends the instantaneous counting rate by many orders of magnitude beyond that obtainable with photon counting architectures. The Keck PAD chip consists of rapid, 8-frame, in-pixel storage elements with framing periods <150 ns. The second detector, the MM-PAD, has an extended dynamic range by utilizing an in-pixel overflow counter coupled with charge removal circuitry activated at each overflow. This allows the recording of signals from the single-photon level to tens of millions of x-rays/pixel/frame while framing at 1 kHz. Both detector chips consist of a 128×128 pixel array with (150 µm)² pixels.
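    The MM-PAD's extended dynamic range can be illustrated with a toy model of the in-pixel overflow counter and charge-removal circuitry (the values and function names are illustrative, not the actual chip design):

```python
def integrate_with_overflow(photon_charges, full_well):
    """Charge-integrating pixel with an in-pixel overflow counter.

    Each time the integrator crosses full_well, a fixed packet of charge
    is removed and the counter increments, so the recorded signal is
    counter * full_well + residual, far beyond one well's capacity.
    """
    residual = 0.0
    counter = 0
    for q in photon_charges:
        residual += q
        while residual >= full_well:
            residual -= full_well   # charge-removal circuitry fires
            counter += 1            # overflow counter increments
    return counter, residual

full_well = 1000.0                 # arbitrary charge units
charges = [300.0] * 17             # 5100 units deposited in one frame
counter, residual = integrate_with_overflow(charges, full_well)
total = counter * full_well + residual   # reconstructed signal
```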

  4. Collusion-Aware Privacy-Preserving Range Query in Tiered Wireless Sensor Networks†

    Science.gov (United States)

    Zhang, Xiaoying; Dong, Lei; Peng, Hui; Chen, Hong; Zhao, Suyun; Li, Cuiping

    2014-01-01

    Wireless sensor networks (WSNs) are indispensable building blocks for the Internet of Things (IoT). With the development of WSNs, privacy issues have drawn more attention. Existing work on the privacy-preserving range query mainly focuses on privacy preservation and integrity verification in two-tiered WSNs in the case of compromised master nodes, but neglects the damage of node collusion. In this paper, we propose a series of collusion-aware privacy-preserving range query protocols in two-tiered WSNs. To the best of our knowledge, this paper is the first to consider collusion attacks for a range query in tiered WSNs while fulfilling the preservation of privacy and integrity. To preserve the privacy of data and queries, we propose a novel encoding scheme to conceal sensitive information. To preserve the integrity of the results, we present a verification scheme using the correlation among data. In addition, two schemes are further presented to improve result accuracy and reduce communication cost. Finally, theoretical analysis and experimental results confirm the efficiency, accuracy and privacy of our proposals. PMID:25615731

  5. Collusion-aware privacy-preserving range query in tiered wireless sensor networks.

    Science.gov (United States)

    Zhang, Xiaoying; Dong, Lei; Peng, Hui; Chen, Hong; Zhao, Suyun; Li, Cuiping

    2014-12-11

    Wireless sensor networks (WSNs) are indispensable building blocks for the Internet of Things (IoT). With the development of WSNs, privacy issues have drawn more attention. Existing work on the privacy-preserving range query mainly focuses on privacy preservation and integrity verification in two-tiered WSNs in the case of compromised master nodes, but neglects the damage of node collusion. In this paper, we propose a series of collusion-aware privacy-preserving range query protocols in two-tiered WSNs. To the best of our knowledge, this paper is the first to consider collusion attacks for a range query in tiered WSNs while fulfilling the preservation of privacy and integrity. To preserve the privacy of data and queries, we propose a novel encoding scheme to conceal sensitive information. To preserve the integrity of the results, we present a verification scheme using the correlation among data. In addition, two schemes are further presented to improve result accuracy and reduce communication cost. Finally, theoretical analysis and experimental results confirm the efficiency, accuracy and privacy of our proposals.

  6. Collusion-Aware Privacy-Preserving Range Query in Tiered Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Xiaoying Zhang

    2014-12-01

    Full Text Available Wireless sensor networks (WSNs) are indispensable building blocks for the Internet of Things (IoT). With the development of WSNs, privacy issues have drawn more attention. Existing work on the privacy-preserving range query mainly focuses on privacy preservation and integrity verification in two-tiered WSNs in the case of compromised master nodes, but neglects the damage of node collusion. In this paper, we propose a series of collusion-aware privacy-preserving range query protocols in two-tiered WSNs. To the best of our knowledge, this paper is the first to consider collusion attacks for a range query in tiered WSNs while fulfilling the preservation of privacy and integrity. To preserve the privacy of data and queries, we propose a novel encoding scheme to conceal sensitive information. To preserve the integrity of the results, we present a verification scheme using the correlation among data. In addition, two schemes are further presented to improve result accuracy and reduce communication cost. Finally, theoretical analysis and experimental results confirm the efficiency, accuracy and privacy of our proposals.

  7. Enhancing the dynamic range of Ultrasound Imaging Velocimetry using interleaved imaging

    NARCIS (Netherlands)

    Poelma, C.; Fraser, K.H.

    2013-01-01

    In recent years, non-invasive velocity field measurement based on correlation of ultrasound images has been introduced as a promising technique for fundamental research into disease processes, as well as a diagnostic tool. A major drawback of the method is the relatively limited dynamic range when

  8. Integration of computer imaging and sensor data for structural health monitoring of bridges

    International Nuclear Information System (INIS)

    Zaurin, R; Catbas, F N

    2010-01-01

    The condition of civil infrastructure systems (CIS) changes over their life cycle for different reasons such as damage, overloading, severe environmental inputs, and ageing due to normal continued use. The structural performance often decreases as a result of the change in condition. Objective condition assessment and performance evaluation are challenging activities since they require some type of monitoring to track the response over a period of time. In this paper, the integrated use of video images and sensor data in the context of structural health monitoring is demonstrated as a promising technology for the safety of civil structures in general and bridges in particular. First, the challenges and possible solutions to using video images and computer vision techniques for structural health monitoring are presented. Then, the synchronized image and sensing data are analyzed to obtain the unit influence line (UIL) as an index for monitoring bridge behavior under identified loading conditions. Subsequently, the UCF 4-span bridge model is used to demonstrate the integration and implementation of imaging devices and traditional sensing technology with UIL for evaluating and tracking the bridge behavior. It is shown that video images and computer vision techniques can be used to detect, classify and track different vehicles with synchronized sensor measurements to establish an input–output relationship and determine the normalized response of the bridge.

  9. Automatic Welding System of Aluminum Pipe by Monitoring Backside Image of Molten Pool Using Vision Sensor

    Science.gov (United States)

    Baskoro, Ario Sunar; Kabutomori, Masashi; Suga, Yasuo

    An automatic welding system using Tungsten Inert Gas (TIG) welding with a vision sensor for the welding of aluminum pipe was constructed. This research studies the intelligent welding process of aluminum alloy pipe 6063S-T5 in fixed position, with a moving welding torch and an AC welding machine. The monitoring system consists of a vision sensor using a charge-coupled device (CCD) camera to monitor the backside image of the molten pool. The captured image was processed to recognize the edge of the molten pool by an image processing algorithm. A neural network model for welding speed control was constructed to perform the process automatically. The experimental results show the effectiveness of the control system, confirmed by good detection of the molten pool and the sound welds obtained.

  10. An Image Compression Scheme in Wireless Multimedia Sensor Networks Based on NMF

    Directory of Open Access Journals (Sweden)

    Shikang Kong

    2017-02-01

    Full Text Available With the goal of addressing the issue of image compression in wireless multimedia sensor networks with high recovered quality and low energy consumption, an image compression and transmission scheme based on non-negative matrix factorization (NMF) is proposed in this paper. First, the NMF algorithm theory is studied. Then, a collaborative mechanism of image capture, block division, compression and transmission is completed. Camera nodes capture images and send them to ordinary nodes, which use an NMF algorithm for image compression. Compressed images are received from the ordinary nodes by the cluster head node and transmitted to the station, which performs the image restoration. Simulation results show that, compared with the JPEG2000 and singular value decomposition (SVD) compression schemes, the proposed scheme yields a higher quality of recovered images and lower total node energy consumption. It is beneficial for reducing the burden of energy consumption and prolonging the life of the whole network system, which has great significance for practical applications of WMSNs.
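    As a rough sketch of the compression idea (ours, not the paper's implementation), an m × n image block factored as W (m × r) times H (r × n) needs only r(m + n) stored values instead of mn. A minimal NMF with the classic multiplicative updates:

```python
import numpy as np

def nmf(V, rank, iters=300, seed=0):
    """Factor V ~= W @ H with Lee-Seung multiplicative updates."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.uniform(0.1, 1.0, (m, rank))
    H = rng.uniform(0.1, 1.0, (rank, n))
    eps = 1e-9  # guards against division by zero
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# A low-rank "image" block compresses well: 32x32 -> two rank-4 factors.
rng = np.random.default_rng(42)
image = rng.uniform(0, 1, (32, 4)) @ rng.uniform(0, 1, (4, 32))
W, H = nmf(image, rank=4)
ratio = image.size / (W.size + H.size)                    # 1024 / 256 = 4x
err = np.linalg.norm(image - W @ H) / np.linalg.norm(image)
```

    In the scheme described above, the ordinary nodes would transmit the two small factors instead of the full block; the station multiplies them back to restore the image.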

  11. Low SWaP multispectral sensors using dichroic filter arrays

    Science.gov (United States)

    Dougherty, John; Varghese, Ron

    2015-06-01

    The benefits of multispectral imaging are well established in a variety of applications including remote sensing, authentication, satellite and aerial surveillance, machine vision, biomedical, and other scientific and industrial uses. However, many of the potential solutions require more compact, robust, and cost-effective cameras to realize these benefits. The next generation of multispectral sensors and cameras needs to deliver improvements in size, weight, power, portability, and spectral band customization to support widespread deployment for a variety of purpose-built aerial, unmanned, and scientific applications. A novel implementation uses micro-patterning of dichroic filters into Bayer and custom mosaics, enabling true real-time multispectral imaging with simultaneous multi-band image acquisition. Consistent with color image processing, individual spectral channels are de-mosaiced, with each channel providing an image of the field of view. This approach can be implemented across a variety of wavelength ranges and on a variety of detector types including linear, area, silicon, and InGaAs. This dichroic filter array approach can also reduce payloads and increase range for unmanned systems, with the capability to support both handheld and autonomous systems. Recent examples and results of 4-band RGB + NIR dichroic filter arrays in multispectral cameras are discussed. Benefits and tradeoffs of multispectral sensors using dichroic filter arrays are compared with alternative approaches, including their passivity, spectral range, customization options, and scalable production.
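    The mosaic idea parallels Bayer processing: each pixel sits behind one dichroic filter, and de-mosaicing separates the bands into quarter-resolution images. A minimal sketch (the 2x2 band layout is our illustrative assumption):

```python
import numpy as np

def split_mosaic(raw):
    """Separate a 2x2 filter mosaic (e.g. R, G, B, NIR) into four
    quarter-resolution band images via strided slicing."""
    return {
        "R":   raw[0::2, 0::2],
        "G":   raw[0::2, 1::2],
        "B":   raw[1::2, 0::2],
        "NIR": raw[1::2, 1::2],
    }

raw = np.arange(16, dtype=float).reshape(4, 4)  # synthetic raw frame
bands = split_mosaic(raw)
```

    Interpolating each band back to full resolution (as in color demosaicing) would complete the pipeline; the slicing above is the band-separation step.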

  12. Edgeless silicon sensors for Medipix-based large-area X-ray imaging detectors

    International Nuclear Information System (INIS)

    Bosma, M J; Visser, J; Koffeman, E N; Evrard, O; De Moor, P; De Munck, K; Tezcan, D Sabuncuoglu

    2011-01-01

    Some X-ray imaging applications demand sensitive areas exceeding the active area of a single sensor. This requires a seamless tessellation of multiple detector modules with edgeless sensors. Our research is aimed at minimising the insensitive periphery that isolates the active area from the edge. Reduction of the edge-defect-induced charge injection, caused by the deleterious effects of dicing, is an important step. We report on the electrical characterisation of 300 μm thick edgeless silicon p⁺-ν-n⁺ diodes, diced using deep reactive ion etching. Sensors with both n-type and p-type stop rings were fabricated in various edge topologies. Leakage currents in the active area are compared with those of sensors with a conventional design. As expected, we observe an inverse correlation between leakage-current density and both the edge distance and stop-ring width. From this correlation we determine a minimum acceptable edge distance of 50 μm. We also conclude that structures with a p-type stop ring show lower leakage currents and higher breakdown voltages than the ones with an n-type stop ring.

  13. Time-domain fiber loop ringdown sensor and sensor network

    Science.gov (United States)

    Kaya, Malik

    Optical fibers have mostly been used in fiber optic communications, imaging optics, sensing technology, etc. Fiber optic sensors have gained increasing attention for scientific and structural health monitoring (SHM) applications. In this study, fiber loop ringdown (FLRD) sensors were fabricated for scientific, SHM, and sensor networking applications. FLRD biosensors were fabricated for both bulk refractive index (RI)- and surface RI-based DNA sensing and for one type of bacteria sensing. Furthermore, the effect of glucose oxidase (GOD) immobilization at the sensor head on sensor performance was evaluated for both glucose and synthetic urine solutions with glucose concentrations between 0.1% and 10%. Detection sensitivities as low as 0.05% were achieved for the glucose sensors. For chemical sensing, heavy water, ranging from 97% to 10%, and several elemental solutions were monitored using the FLRD chemical sensors. Bulk index-based FLRD sensing showed that trace elements can be detected in deionized water. For physical sensing, water and cracking sensors were fabricated and embedded into concrete. A partially etched single-mode fiber (SMF) was embedded into a concrete bar for water monitoring, while a bare SMF without any treatment was directly embedded into another concrete bar for monitoring cracks. Furthermore, detection sensitivities of the water and crack sensors were determined to be 10 ml of water and a 0.5 mm surface crack width, respectively. Additionally, fiber loop ringdown-fiber Bragg grating temperature sensors were developed in the laboratory; two sensor units for water, crack, and temperature sensing were deployed into a concrete cube in a US Department of Energy test bed (Miami, FL). Multi-sensor application in a real concrete structure was accomplished by testing the six FLRD sensors. As a final stage, a sensor network was assembled by multiplexing two or three FLRD sensors in series and parallel. Additionally, two FLRD sensors were combined in series and

  14. Coseismic displacements from SAR image offsets between different satellite sensors: Application to the 2001 Bhuj (India) earthquake

    KAUST Repository

    Wang, Teng

    2015-09-05

    Synthetic aperture radar (SAR) image offset tracking is increasingly being used for measuring ground displacements, e.g., due to earthquakes and landslide movement. However, this technique has so far been applied only to images acquired by the same or identical satellites. Here we propose a novel approach for determining offsets between images acquired by different satellite sensors, extending the usability of existing SAR image archives. The offsets are measured between two multi-image reflectivity maps obtained from different SAR data sets, which provide significantly better results than single pre-event and post-event images. Application to the 2001 Mw 7.6 Bhuj earthquake reveals, for the first time, its near-field deformation using multiple pre-earthquake ERS and post-earthquake Envisat images. The rupture model estimated from these cross-sensor offsets and teleseismic waveforms shows a compact fault slip pattern with fairly short rise times (<3 s) and a large stress drop (20 MPa), explaining the intense shaking observed in the earthquake.

  15. MHz rate X-Ray imaging with GaAs:Cr sensors using the LPD detector system

    Science.gov (United States)

    Veale, M. C.; Booker, P.; Cline, B.; Coughlan, J.; Hart, M.; Nicholls, T.; Schneider, A.; Seller, P.; Pape, I.; Sawhney, K.; Lozinskaya, A. D.; Novikov, V. A.; Tolbanov, O. P.; Tyazhev, A.; Zarubin, A. N.

    2017-02-01

    The STFC Rutherford Appleton Laboratory (U.K.) and Tomsk State University (Russia) have been working together to develop and characterise detector systems based on chromium-compensated gallium arsenide (GaAs:Cr) semiconductor material for high frame rate X-ray imaging. Previous work has demonstrated the spectroscopic performance of the material and its resistance to damage induced by high fluxes of X-rays. In this paper, recent results from experiments at the Diamond Light Source synchrotron demonstrate X-ray imaging with GaAs:Cr sensors at a frame rate of 3.7 MHz using the Large Pixel Detector (LPD) ASIC, developed by STFC for the European XFEL. Measurements were made using a monochromatic 20 keV X-ray beam delivered in a single hybrid pulse with an instantaneous flux of up to ~1 × 10¹⁰ photons s⁻¹ mm⁻². The response of 500 μm GaAs:Cr sensors is compared to that of the standard 500 μm thick LPD Si sensors.

  16. Image enhancement technology research for army applications

    NARCIS (Netherlands)

    Schwering, P.B.W.; Kemp, R.A.W.; Schutte, K.

    2013-01-01

    Recognition and identification ranges are limited by the quality of the images. Both the received contrast and the spatial resolution determine whether objects are recognizable. Several aspects affect the image quality. First of all, the sensor itself. The image quality depends on the size of the infrared

  17. NRT Lightning Imaging Sensor (LIS) on International Space Station (ISS) Science Data Vb0

    Data.gov (United States)

    National Aeronautics and Space Administration — The NRT Lightning Imaging Sensor (LIS) on International Space Station (ISS) Science Data were collected by the LIS instrument on the ISS used to detect the...

  18. Time-of-flight camera via a single-pixel correlation image sensor

    Science.gov (United States)

    Mao, Tianyi; Chen, Qian; He, Weiji; Dai, Huidong; Ye, Ling; Gu, Guohua

    2018-04-01

    A time-of-flight imager based on single-pixel correlation image sensors is proposed for noise-free depth map acquisition in the presence of ambient light. A digital micro-mirror device and a time-modulated IR laser provide spatial and temporal illumination of the unknown object. Compressed sensing and the ‘four bucket principle’ method are combined to reconstruct the depth map from a sequence of measurements at a low sampling rate. A second-order correlation transform is also introduced to reduce the noise from the detector itself and from direct ambient light. Computer simulations are presented to validate the computational models and the improvement in reconstruction quality.
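    The ‘four bucket principle’ referenced here is the standard homodyne phase-recovery step also described in the records above: four samples of the correlation waveform, 90° apart, give the modulation phase and hence the distance. A minimal sketch (all values illustrative, not from the paper):

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def four_bucket_depth(c0, c1, c2, c3, f_mod):
    """Recover distance from four correlation samples taken at
    0, 90, 180 and 270 degrees of the modulation period."""
    phase = np.arctan2(c1 - c3, c0 - c2) % (2 * np.pi)
    return phase * C / (4 * np.pi * f_mod)

# Simulate the four buckets for a target at 3 m, 20 MHz modulation.
f_mod = 20e6
true_dist = 3.0
phi = 4 * np.pi * f_mod * true_dist / C        # round-trip phase shift
amp, offset = 1.0, 2.0                         # modulation depth, DC level
buckets = [offset + amp * np.cos(phi - k * np.pi / 2) for k in range(4)]
est = four_bucket_depth(*buckets, f_mod)       # recovers ~3.0 m
```

    Note the unambiguous range is c / (2 f_mod), about 7.5 m at 20 MHz; beyond that the phase wraps.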

  19. Study of CT-based positron range correction in high resolution 3D PET imaging

    Energy Technology Data Exchange (ETDEWEB)

    Cal-Gonzalez, J., E-mail: jacobo@nuclear.fis.ucm.es [Grupo de Fisica Nuclear, Dpto. Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain); Herraiz, J.L. [Grupo de Fisica Nuclear, Dpto. Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain); Espana, S. [Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Vicente, E. [Grupo de Fisica Nuclear, Dpto. Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain); Instituto de Estructura de la Materia, Consejo Superior de Investigaciones Cientificas (CSIC), Madrid (Spain); Herranz, E. [Grupo de Fisica Nuclear, Dpto. Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain); Desco, M. [Unidad de Medicina y Cirugia Experimental, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Vaquero, J.J. [Dpto. de Bioingenieria e Ingenieria Espacial, Universidad Carlos III, Madrid (Spain); Udias, J.M. [Grupo de Fisica Nuclear, Dpto. Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain)

    2011-08-21

    Positron range limits the spatial resolution of PET images and has a different effect for different isotopes and positron propagation materials. Therefore it is important to consider it during image reconstruction in order to obtain optimal image quality. Positron range distributions for the most common isotopes used in PET were computed in different materials using Monte Carlo simulations with PeneloPET. The range profiles were introduced into the 3D OSEM image reconstruction software FIRST and employed to blur the image either in the forward projection only or in both the forward and backward projections. The blurring introduced takes into account the different materials in which the positron propagates. Information on these materials may be obtained, for instance, from a segmentation of a CT image. The results of introducing positron blurring in both forward and backward projection operations were compared to using it only during forward projection. Further, the effect of different shapes of the positron range profile on the quality of the reconstructed images with positron range correction was studied. For high positron energy isotopes, the reconstructed images show significant improvement in spatial resolution when positron range is taken into account during reconstruction, compared to reconstructions without positron range modeling.
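    Incorporating the range profile into the forward projection amounts to convolving the activity estimate with a material- and isotope-dependent kernel before projecting. A minimal 1-D sketch (the kernels are illustrative Gaussians, not PeneloPET profiles, and the geometric projection is reduced to the identity):

```python
import numpy as np

def forward_project(activity, range_kernel):
    """Forward projection with positron-range blurring: the activity is
    convolved with the isotope/material range profile before the (here
    trivial, identity) geometric projection is applied."""
    return np.convolve(activity, range_kernel, mode="same")

# Illustrative range profiles: broad for a high-energy emitter
# (Ga-68-like), narrow for F-18-like positrons. Grid step 0.25 mm.
x = np.linspace(-5, 5, 41)                               # mm
narrow = np.exp(-0.5 * (x / 0.5) ** 2); narrow /= narrow.sum()
broad = np.exp(-0.5 * (x / 2.0) ** 2); broad /= broad.sum()

activity = np.zeros(101); activity[50] = 1.0             # point source
blurred_narrow = forward_project(activity, narrow)
blurred_broad = forward_project(activity, broad)
# The broad kernel spreads counts further, lowering the peak while
# conserving the total number of counts.
```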

  20. Study of CT-based positron range correction in high resolution 3D PET imaging

    International Nuclear Information System (INIS)

    Cal-Gonzalez, J.; Herraiz, J.L.; Espana, S.; Vicente, E.; Herranz, E.; Desco, M.; Vaquero, J.J.; Udias, J.M.

    2011-01-01

    Positron range limits the spatial resolution of PET images and has a different effect for different isotopes and positron propagation materials. Therefore it is important to consider it during image reconstruction in order to obtain optimal image quality. Positron range distributions for the most common isotopes used in PET were computed in different materials using Monte Carlo simulations with PeneloPET. The range profiles were introduced into the 3D OSEM image reconstruction software FIRST and employed to blur the image either in the forward projection only or in both the forward and backward projections. The blurring introduced takes into account the different materials in which the positron propagates. Information on these materials may be obtained, for instance, from a segmentation of a CT image. The results of introducing positron blurring in both forward and backward projection operations were compared to using it only during forward projection. Further, the effect of different shapes of the positron range profile on the quality of the reconstructed images with positron range correction was studied. For high positron energy isotopes, the reconstructed images show significant improvement in spatial resolution when positron range is taken into account during reconstruction, compared to reconstructions without positron range modeling.

  1. Influence of long-range Coulomb interaction in velocity map imaging.

    Science.gov (United States)

    Barillot, T; Brédy, R; Celep, G; Cohen, S; Compagnon, I; Concina, B; Constant, E; Danakas, S; Kalaitzis, P; Karras, G; Lépine, F; Loriot, V; Marciniak, A; Predelus-Renois, G; Schindler, B; Bordas, C

    2017-07-07

    The standard velocity-map imaging (VMI) analysis relies on the simple approximation that the residual Coulomb field experienced by the photoelectron ejected from a neutral or ion system may be neglected. Under this almost universal approximation, the photoelectrons follow ballistic (parabolic) trajectories in the externally applied electric field, and the recorded image may be considered as a 2D projection of the initial photoelectron velocity distribution. There are, however, several circumstances where this approximation is not justified and the influence of long-range forces must absolutely be taken into account for the interpretation and analysis of the recorded images. The aim of this paper is to illustrate this influence by discussing two different situations involving isolated atoms or molecules where the analysis of experimental images cannot be performed without considering long-range Coulomb interactions. The first situation occurs when slow (meV) photoelectrons are photoionized from a neutral system and strongly interact with the attractive Coulomb potential of the residual ion. The result of this interaction is the formation of a more complex structure in the image, as well as the appearance of an intense glory at the center of the image. The second situation, observed also at low energy, occurs in the photodetachment from a multiply charged anion and it is characterized by the presence of a long-range repulsive potential. Then, while the standard VMI approximation is still valid, the very specific features exhibited by the recorded images can be explained only by taking into consideration tunnel detachment through the repulsive Coulomb barrier.

  2. Improved Feature Detection in Fused Intensity-Range Images with Complex SIFT (ℂSIFT)

    Directory of Open Access Journals (Sweden)

    Boris Jutzi

    2011-09-01

    Full Text Available The real and imaginary parts are proposed as an alternative to the usual Polar representation of complex-valued images. It is proven that the transformation from Polar to Cartesian representation contributes to decreased mutual information, and hence to greater distinctiveness. The Complex Scale-Invariant Feature Transform (ℂSIFT) detects distinctive features in complex-valued images. An evaluation method for estimating the uniformity of feature distributions in complex-valued images derived from intensity-range images is proposed. In order to experimentally evaluate the proposed methodology on intensity-range images, three different kinds of active sensing systems were used: Range Imaging, Laser Scanning, and Structured Light Projection devices (PMD CamCube 2.0, Z+F IMAGER 5003, Microsoft Kinect).
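
The Cartesian-versus-Polar distinction at the heart of this record is easy to make concrete: a complex-valued image fuses co-registered intensity and range channels, and the same data can be read out in either representation (the toy pixel values below are illustrative, not from the paper):

```python
import numpy as np

# Co-registered intensity and range channels of a tiny 2x2 image.
intensity = np.array([[0.2, 0.8], [0.5, 0.1]])
range_img = np.array([[1.0, 2.0], [1.5, 0.5]])

# Cartesian representation: real part = intensity, imaginary part = range.
z = intensity + 1j * range_img

# Polar representation of the same complex-valued image.
mag, phase = np.abs(z), np.angle(z)
```

The paper's claim is that feature detection on the Cartesian parts is more distinctive than on the magnitude/phase pair, even though the two representations carry the same data.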

  3. Model-based restoration using light vein for range-gated imaging systems.

    Science.gov (United States)

    Wang, Canjin; Sun, Tao; Wang, Tingfeng; Wang, Rui; Guo, Jin; Tian, Yuzhen

    2016-09-10

    The images captured by an airborne range-gated imaging system are degraded by many factors, such as light scattering, noise, defocus of the optical system, atmospheric disturbances, and platform vibrations. The characteristics of low illumination, few details, and high noise cause state-of-the-art restoration methods to fail. In this paper, we present a restoration method designed specifically for range-gated imaging systems. The degradation process is divided into a static part and a dynamic part. For the static part, we establish a physical model of the imaging system according to laser transmission theory and estimate the static point spread function (PSF). For the dynamic part, a so-called light-vein feature extraction method is presented to estimate the blur parameters of the atmospheric disturbance and platform movement, which contribute to the dynamic PSF. Finally, combining the static and dynamic PSFs, an iterative updating framework is used to restore the image. Compared with state-of-the-art methods, the proposed method effectively suppresses ringing artifacts and achieves better performance in range-gated imaging systems.
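
The abstract does not spell out the iterative updating framework; a common choice for restoring an image from a known PSF is Richardson-Lucy deconvolution, sketched below as a stand-in for the authors' method (the Gaussian PSF and the scene are synthetic):

```python
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(observed, psf, iterations=50):
    """Iteratively restore an image blurred by a known PSF (Richardson-Lucy)."""
    psf = psf / psf.sum()
    estimate = np.full_like(observed, observed.mean())
    for _ in range(iterations):
        # Blur the current estimate, compare to the observation, and correct.
        conv = fftconvolve(estimate, psf, mode='same')
        ratio = observed / np.maximum(conv, 1e-12)
        estimate = estimate * fftconvolve(ratio, psf[::-1, ::-1], mode='same')
    return estimate

# Demo: blur a simple scene with a Gaussian PSF, then restore it.
x, y = np.meshgrid(np.arange(9) - 4, np.arange(9) - 4)
psf = np.exp(-(x**2 + y**2) / 4.0); psf /= psf.sum()
scene = np.zeros((64, 64)); scene[28:36, 28:36] = 1.0
blurred = fftconvolve(scene, psf, mode='same')
restored = richardson_lucy(blurred, psf)
```

In the paper's setting the PSF passed in would be the combination of the static (laser-transmission) and dynamic (disturbance/vibration) PSFs estimated beforehand.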

  4. Visual Sensor Based Image Segmentation by Fuzzy Classification and Subregion Merge

    Directory of Open Access Journals (Sweden)

    Huidong He

    2017-01-01

    Full Text Available The extraction and tracking of targets in images shot by visual sensors have been studied extensively. Image segmentation plays an important role in such tracking systems. This paper presents a new approach to color image segmentation based on a fuzzy color extractor (FCE). Different from many existing methods, the proposed approach provides a new classification of pixels in a source color image, which may assign an individual pixel to several subimages via fuzzy sets. The approach exhibits two unique features, spatial proximity and color similarity, and mainly consists of two algorithms: CreateSubImage and MergeSubImage. We apply the FCE to segment colors of test images from the UC Berkeley database in three different color spaces: RGB, HSV, and YUV. The comparative studies show that the FCE applied in the RGB space is superior to the HSV and YUV spaces. Finally, we compare the segmentation results with the Canny and LoG edge detection algorithms. The results show that the FCE-based approach performs best in color image segmentation.
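
The FCE itself is not specified in the abstract; the sketch below only illustrates the stated idea that fuzzy membership can place one pixel in several subimages. The Gaussian membership function, the seed colors, and the threshold are all assumptions for illustration, not the paper's algorithm:

```python
import numpy as np

def fuzzy_color_extract(image, seeds, sigma=60.0, threshold=0.5):
    """Assign each pixel to every seed color whose fuzzy membership
    (a Gaussian of RGB distance) exceeds the threshold, so one pixel
    may land in several subimages."""
    subimages = []
    for seed in seeds:
        d2 = ((image.astype(float) - np.asarray(seed, float))**2).sum(axis=2)
        membership = np.exp(-d2 / (2 * sigma**2))
        mask = membership >= threshold
        sub = np.zeros_like(image)
        sub[mask] = image[mask]            # keep only member pixels
        subimages.append((mask, sub))
    return subimages

# Toy image: left half red-ish, right half blue-ish.
img = np.zeros((4, 8, 3), dtype=np.uint8)
img[:, :4] = (200, 10, 10)
img[:, 4:] = (10, 10, 200)
subs = fuzzy_color_extract(img, seeds=[(255, 0, 0), (0, 0, 255)])
```

A MergeSubImage-style step would then recombine overlapping subimages using spatial proximity, which this sketch omits.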

  5. Image sensor pixel with on-chip high extinction ratio polarizer based on 65-nm standard CMOS technology.

    Science.gov (United States)

    Sasagawa, Kiyotaka; Shishido, Sanshiro; Ando, Keisuke; Matsuoka, Hitoshi; Noda, Toshihiko; Tokuda, Takashi; Kakiuchi, Kiyomi; Ohta, Jun

    2013-05-06

    In this study, we demonstrate a polarization-sensitive pixel for a complementary metal-oxide-semiconductor (CMOS) image sensor based on 65-nm standard CMOS technology. With such a deep-submicron CMOS technology, it is possible to create fine metal patterns smaller than the wavelengths of visible light using a metal wire layer. We designed and fabricated a metal wire grid polarizer on a 20 × 20 μm² pixel for an image sensor. An extinction ratio of 19.7 dB was observed at a wavelength of 750 nm.

  6. Extended Special Sensor Microwave Imager (SSM/I) Temperature Data Record (TDR) in netCDF

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Special Sensor Microwave Imager (SSM/I) is a seven-channel linearly polarized passive microwave radiometer that operates at frequencies of 19.36 (vertically and...

  7. NOAA JPSS Visible Infrared Imaging Radiometer Suite (VIIRS) Sensor Data Record (SDR) from IDPS

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Sensor Data Records (SDRs), or Level 1b data, from the Visible Infrared Imaging Radiometer Suite (VIIRS) are the calibrated and geolocated radiance and reflectance...

  8. Low Computational-Cost Footprint Deformities Diagnosis Sensor through Angles, Dimensions Analysis and Image Processing Techniques

    Directory of Open Access Journals (Sweden)

    J. Rodolfo Maestre-Rendon

    2017-11-01

    Full Text Available Manual measurements of foot anthropometry can lead to errors since this task involves the experience of the specialist who performs them, resulting in different subjective measures from the same footprint. Moreover, some of the diagnoses that are given to classify a footprint deformity are based on a qualitative interpretation by the physician; there is no quantitative interpretation of the footprint. The importance of providing a correct and accurate diagnosis lies in the need to ensure that an appropriate treatment is provided for the improvement of the patient without risking his or her health. Therefore, this article presents a smart sensor that integrates the capture of the footprint, a low computational-cost analysis of the image, and the interpretation of the results through a quantitative evaluation. The implemented smart sensor uses a camera (Logitech C920) connected to a Raspberry Pi 3, with a graphical interface for the capture and processing of the image, and was adapted to a podoscope conventionally used by specialists such as orthopedists, physiotherapists, and podiatrists. The footprint diagnosis smart sensor (FPDSS) has proven to be robust to different types of deformity, precise, sensitive, and correlated at 0.99 with measurements from the digitalized image of the ink mat.

  9. Color sensitivity of the multi-exposure HDR imaging process

    Science.gov (United States)

    Lenseigne, Boris; Jacobs, Valéry Ann; Withouck, Martijn; Hanselaer, Peter; Jonker, Pieter P.

    2013-04-01

    Multi-exposure high dynamic range (HDR) imaging builds HDR radiance maps by stitching together different views of the same scene taken with varying exposures. In practice, this process involves converting raw sensor data into low dynamic range (LDR) images, estimating the camera response curves, and using them to recover the irradiance for every pixel. During this export, white balance settings are applied and the images are stitched, both of which influence the color balance in the final image. In this paper, we use a calibrated quasi-monochromatic light source, an integrating sphere, and a spectrograph to evaluate and compare the average spectral response of the image sensor. We finally draw some conclusions about the color consistency of HDR imaging and the additional steps necessary to use multi-exposure HDR imaging as a tool to measure physical quantities such as radiance and luminance.
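
The irradiance-recovery step can be illustrated under the simplifying assumption of a linear camera response (the paper's point is precisely that the real response curve and white-balance steps complicate this). A minimal weighted merge over exposures:

```python
import numpy as np

def merge_hdr(ldr_stack, exposure_times):
    """Merge LDR frames into a radiance map assuming a *linear* response:
    each pixel's irradiance is a weighted average of code/exposure-time,
    with a hat weight discounting under- and over-exposed codes."""
    z = np.stack([f.astype(float) for f in ldr_stack])       # (n, h, w)
    t = np.asarray(exposure_times, float)[:, None, None]
    w = 1.0 - np.abs(z / 255.0 - 0.5) * 2.0                  # hat weighting
    w = np.maximum(w, 1e-4)
    return (w * z / t).sum(axis=0) / w.sum(axis=0)

# Two synthetic exposures of a flat scene with irradiance E = 100:
# with a linear response, codes are z = E * t, i.e. 50 and 100.
frames = [np.full((2, 2), 50, np.uint8), np.full((2, 2), 100, np.uint8)]
radiance = merge_hdr(frames, [0.5, 1.0])
```

With a real camera, `z/t` would be replaced by `g(z)/t` using the estimated inverse response curve `g`, which is where the spectral characterization in this paper becomes relevant.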

  10. Using the Medipix3 detector for direct electron imaging in the range 60 keV to 200 keV in electron microscopy

    Science.gov (United States)

    Mir, J. A.; Plackett, R.; Shipsey, I.; dos Santos, J. M. F.

    2017-11-01

    Hybrid pixel sensor technology such as the Medipix3 represents a unique tool for electron imaging. We have investigated its performance as a direct imaging detector using a Transmission Electron Microscope (TEM) incorporating a Medipix3 detector with a 300 μm thick silicon layer comprising 256×256 pixels at 55 μm pixel pitch. We present results taken with the Medipix3 in Single Pixel Mode (SPM) with electron beam energies in the range 60-200 keV. Measurements of the Modulation Transfer Function (MTF) and the Detective Quantum Efficiency (DQE) were performed. At a given beam energy, the MTF data were acquired using the established knife-edge technique. Similarly, the experimental data required to determine the DQE were obtained by acquiring a stack of images of a focused beam and of free space (flatfield) to determine the Noise Power Spectrum (NPS).
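
The knife-edge technique mentioned here reduces to a short pipeline: edge spread function (ESF) → derivative (line spread function, LSF) → normalised |FFT| (MTF). A minimal sketch on a synthetic edge, not the authors' analysis code:

```python
import numpy as np

def mtf_from_edge(esf):
    """Knife-edge MTF estimate: differentiate the edge spread function
    to get the line spread function, then take the normalised |FFT|."""
    lsf = np.gradient(esf)
    mtf = np.abs(np.fft.rfft(lsf))
    return mtf / mtf[0]

# A perfectly sharp edge: the discrete derivative spans two samples,
# so the resulting MTF rolls off smoothly to zero at Nyquist.
edge = np.concatenate([np.zeros(32), np.ones(32)])
mtf = mtf_from_edge(edge)
```

In practice the ESF is oversampled by projecting a slightly tilted edge, and the LSF is windowed before the FFT to suppress noise; both refinements are omitted here.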

  11. Edge pixel response studies of edgeless silicon sensor technology for pixellated imaging detectors

    Science.gov (United States)

    Maneuski, D.; Bates, R.; Blue, A.; Buttar, C.; Doonan, K.; Eklund, L.; Gimenez, E. N.; Hynds, D.; Kachkanov, S.; Kalliopuska, J.; McMullen, T.; O'Shea, V.; Tartoni, N.; Plackett, R.; Vahanen, S.; Wraight, K.

    2015-03-01

    Silicon sensor technologies with reduced dead area at the sensor's perimeter are under development at a number of institutes. Several fabrication methods for sensors which are sensitive close to the physical edge of the device are under investigation, utilising techniques such as active edges, passivated edges and current-terminating rings. Such technologies offer the goal of a seamlessly tiled detection surface with minimum dead space between the individual modules. In order to quantify the performance of different geometries and different bulk and implant types, characterisation of several sensors fabricated using active-edge technology was performed at the B16 beam line of the Diamond Light Source. The sensors were fabricated by VTT and bump-bonded to Timepix ROICs. They were 100 and 200 μm thick sensors, with a last pixel-to-edge distance of either 50 or 100 μm, fabricated as either n-on-n or n-on-p type devices. Using 15 keV monochromatic X-rays with a beam spot of 2.5 μm, the performance of the outer edge and corner pixels of the sensors was evaluated at three bias voltages. The results indicate a significant change in the charge collection properties between the edge pixel and the 5th pixel from the edge (up to 275 μm) for the 200 μm thick n-on-n sensor. The edge pixel performance of the 100 μm thick n-on-p sensors is affected only for the last two pixels (up to 110 μm), depending on biasing conditions. Imaging characteristics of all sensor types investigated are stable over time, and the non-uniformities can be minimised by flat-field corrections. The results from the synchrotron tests combined with lab measurements are presented along with an explanation of the observed effects.

  12. Engineering workstation: Sensor modeling

    Science.gov (United States)

    Pavel, M.; Sweet, B.

    1993-01-01

    The purpose of the engineering workstation is to provide an environment for rapid prototyping and evaluation of fusion and image processing algorithms. Ideally, the algorithms are designed to optimize the extraction of information that is useful to a pilot for all phases of flight operations. Successful design of effective fusion algorithms depends on the ability to characterize both the information available from the sensors and the information useful to a pilot. The workstation comprises subsystems for simulation of sensor-generated images, image processing, image enhancement, and fusion algorithms. As such, the workstation can be used to implement and evaluate both short-term and long-term solutions. The short-term solutions are being developed to enhance a pilot's situational awareness by providing information in addition to his direct vision. The long-term solutions are aimed at the development of complete synthetic vision systems. One of the important functions of the engineering workstation is to simulate the images that would be generated by the sensors. The simulation system is designed to use the graphics modeling and rendering capabilities of various workstations manufactured by Silicon Graphics Inc. The workstation simulates various aspects of the sensor-generated images arising from the phenomenology of the sensors. In addition, the workstation can be used to simulate a variety of impairments due to mechanical limitations of the sensor placement and due to the motion of the airplane. Although the simulation is currently not performed in real time, sequences of individual frames can be processed, stored, and recorded in a video format. In that way, it is possible to examine the appearance of different dynamic sensor-generated and fused images.

  13. Target recognition of ladar range images using even-order Zernike moments.

    Science.gov (United States)

    Liu, Zheng-Jun; Li, Qi; Xia, Zhi-Wei; Wang, Qi

    2012-11-01

    Ladar range images have attracted considerable attention in automatic target recognition fields. In this paper, Zernike moments (ZMs) are applied to classify the target of the range image from an arbitrary azimuth angle. However, ZMs suffer from high computational costs. To improve the performance of target recognition based on small samples, even-order ZMs with serial-parallel backpropagation neural networks (BPNNs) are applied to recognize the target of the range image. It is found that the rotation invariance and classification performance of the even-order ZMs are both better than those of odd-order moments and of moments compressed by principal component analysis. The experimental results demonstrate that combining the even-order ZMs with serial-parallel BPNNs can significantly improve the recognition rate for small samples.
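
The rotation invariance that makes ZM magnitudes useful as azimuth-independent features can be checked directly: |Z_nm| is unchanged when the image is rotated. A compact sketch of a single Zernike moment on the unit disk (the serial-parallel BPNN classifier is omitted; a 90° rotation is used because it is exact on a square grid):

```python
import numpy as np
from math import factorial

def zernike_magnitude(img, n, m):
    """|Z_nm| of an image mapped onto the unit disk: a rotation-invariant
    descriptor, since rotation only shifts the phase of Z_nm."""
    h, w = img.shape
    y, x = np.mgrid[:h, :w]
    xr = (2 * x - w + 1) / (w - 1)          # map pixel grid to [-1, 1]
    yr = (2 * y - h + 1) / (h - 1)
    rho, theta = np.hypot(xr, yr), np.arctan2(yr, xr)
    mask = rho <= 1.0
    # Radial polynomial R_nm(rho)
    R = np.zeros_like(rho)
    for s in range((n - abs(m)) // 2 + 1):
        c = ((-1)**s * factorial(n - s)
             / (factorial(s)
                * factorial((n + abs(m)) // 2 - s)
                * factorial((n - abs(m)) // 2 - s)))
        R += c * rho**(n - 2 * s)
    Z = (img[mask] * R[mask] * np.exp(-1j * m * theta[mask])).sum()
    return abs(Z * (n + 1) / np.pi)

rng = np.random.default_rng(0)
img = rng.random((65, 65))
z = zernike_magnitude(img, 2, 2)            # an even-order moment, n=2, m=2
z_rot = zernike_magnitude(np.rot90(img), 2, 2)
```

A feature vector for the classifier would collect such magnitudes over a set of even (n, m) pairs.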

  14. Characterisation of a monolithic active pixel sensor for electron detection in the energy range 10-20 keV

    International Nuclear Information System (INIS)

    Matheson, J.; Moldovan, G.; Clark, A.; Prydderch, M.; Turchetta, R.; Derbyshire, G.; Kirkland, A.; Allinson, N.

    2009-01-01

    As part of a feasibility study into the use of novel electron detectors for X-ray photoelectron emission microscopes (XPEEM), we have characterised the imaging performance of a back-illuminated monolithic active pixel sensor (MAPS) operating under both integrating and counting modes for electrons in the energy range 10-20 keV. For integrating mode, we present the detective quantum efficiency (DQE), which shows marked improvements over conventional indirect detectors based on microchannel plates. We also present the modulation transfer function (MTF) and noise power spectrum (NPS), again demonstrating significantly improved performance. For counting mode, we present the quantum efficiency (QE) as a function of incident electron energy. We have evaluated the charge collection efficiency (CCE) and we thereby demonstrate the presence of a ∼200 nm thick dead layer that is linked with reduced CCE at low electron energies. Based on our findings, we believe that the MAPS technology is well matched to future XPEEM instruments using aberration correction.

  15. Simultaneous live cell imaging using dual FRET sensors with a single excitation light.

    Directory of Open Access Journals (Sweden)

    Yusuke Niino

    Full Text Available Fluorescence resonance energy transfer (FRET) between fluorescent proteins is a powerful tool for visualization of signal transduction in living cells, and recently, some strategies for imaging of dual FRET pairs in a single cell have been reported. However, these necessitate alternating the excitation light between two different wavelengths to avoid spectral overlap, resulting in sequential detection with a lag time. Thus, to follow fast signal dynamics or signal changes in highly motile cells, a single-excitation dual-FRET method is required. Here we report such a method, based on four-color imaging with a single excitation light and subsequent linear unmixing to distinguish the fluorescent proteins. We constructed new FRET sensors with Sapphire/RFP to combine with CFP/YFP, and accomplished simultaneous imaging of cAMP and cGMP in single cells. We confirmed that the signal amplitude of our dual FRET measurement is comparable to that of a conventional single FRET measurement. Finally, we demonstrated monitoring of both intracellular Ca²⁺ and cAMP in highly motile cardiac myocytes. By cancelling out artifacts caused by the movement of the cell, this method expands the applicability of combined dual FRET sensors to cell samples with high motility.

  16. A 256×256 low-light-level CMOS imaging sensor with digital CDS

    Science.gov (United States)

    Zou, Mei; Chen, Nan; Zhong, Shengyou; Li, Zhengfen; Zhang, Jicun; Yao, Li-bin

    2016-10-01

    In order to achieve high sensitivity for low-light-level CMOS image sensors (CIS), a capacitive transimpedance amplifier (CTIA) pixel circuit with a small integration capacitor is used. As the pixel and column areas are highly constrained, it is difficult to implement analog correlated double sampling (CDS) to remove noise in a low-light-level CIS. Digital CDS is therefore adopted, performing the subtraction of the reset signal from the pixel signal off-chip. The pixel reset noise and part of the column fixed-pattern noise (FPN) can be greatly reduced. A 256×256 CIS with a CTIA array and digital CDS is implemented in 0.35 μm CMOS technology. The chip size is 7.7 mm × 6.75 mm, and the pixel size is 15 μm × 15 μm with a fill factor of 20.6%. The measured pixel noise with digital CDS is 24 LSB RMS in the dark, a 7.8× reduction compared to the image sensor without digital CDS. Running at 7 fps, this low-light-level CIS can capture recognizable images at illumination levels down to 0.1 lux.
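
The off-chip subtraction behind digital CDS is simple to model: the reset (kTC) noise and the per-column fixed-pattern offset appear identically in the reset read and the signal read, so both cancel in the difference, leaving only the uncorrelated read noise. A sketch with illustrative noise magnitudes (not the paper's measured values):

```python
import numpy as np

rng = np.random.default_rng(1)
H, W = 256, 256

reset_noise = rng.normal(0, 20, (H, W))   # kTC noise, same in both reads
column_fpn = rng.normal(0, 10, (1, W))    # fixed per-column offset
scene = rng.uniform(0, 50, (H, W))        # photo signal, in LSB
read_noise = rng.normal(0, 2, (H, W))     # uncorrelated read noise

reset_frame = reset_noise + column_fpn
signal_frame = reset_noise + column_fpn + scene + read_noise

# Off-chip digital CDS: subtract the reset frame from the signal frame.
cds_frame = signal_frame - reset_frame
```

The residual error of `cds_frame` is just the read noise, far below the raw frame's reset-noise-dominated error, which is the large noise reduction the abstract reports.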

  17. A Low Power Digital Accumulation Technique for Digital-Domain CMOS TDI Image Sensor.

    Science.gov (United States)

    Yu, Changwei; Nie, Kaiming; Xu, Jiangtao; Gao, Jing

    2016-09-23

    In this paper, an accumulation technique suitable for digital-domain CMOS time delay integration (TDI) image sensors is proposed to reduce power consumption without degrading the imaging rate. Exploiting the slight variations of quantization codes among different pixel exposures of the same object, the pixel array is divided into two groups: one for coarse quantization of the high bits only, and the other for fine quantization of the low bits. The complete quantization codes are then composed from both the coarse and fine results. This equivalent operation reduces the total number of bits that must be quantized. In a 0.18 µm CMOS process, two versions of a 16-stage digital-domain CMOS TDI image sensor chain based on a 10-bit successive approximation register (SAR) analog-to-digital converter (ADC), with and without the proposed technique, were designed. The simulation results show average per-slice power consumptions of 6.47 × 10⁻⁸ J/line and 7.4 × 10⁻⁸ J/line, respectively, while the linearity of the two versions is 99.74% and 99.99%, respectively.
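
The code-composition step can be sketched as follows. This naive recombination assumes the two pixel groups see values in the same coarse bin; the 5/5 bit split and the helper name are assumptions for illustration, only the 10-bit total comes from the abstract:

```python
def compose(sample_a, sample_b, total_bits=10, coarse_bits=5):
    """Group A is quantised coarsely (high bits only), group B finely
    (low bits only); the full code is recomposed from the two results."""
    fine_bits = total_bits - coarse_bits
    coarse = int(sample_a) >> fine_bits              # high bits from group A
    fine = int(sample_b) & ((1 << fine_bits) - 1)    # low bits from group B
    return (coarse << fine_bits) | fine

# Both groups observe (nearly) the same 10-bit value for the same object.
value = 0b1011010011                                 # 723
code = compose(value, value)
```

Because each group resolves only part of the word, each conversion toggles fewer comparator/DAC cycles than a full 10-bit SAR conversion, which is where the power saving comes from. A real implementation must also handle samples that straddle a coarse-bin boundary.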

  18. Optimizing Floating Guard Ring Designs for FASPAX N-in-P Silicon Sensors

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Kyung-Wook [Argonne; Bradford, Robert [Argonne; Lipton, Ronald [Fermilab; Deptuch, Gregory [Fermilab; Fahim, Farah [Fermilab; Madden, Tim [Argonne; Zimmerman, Tom [Fermilab

    2016-10-06

    FASPAX (Fermi-Argonne Semiconducting Pixel Array X-ray detector) is being developed as a fast integrating area detector with wide dynamic range for time-resolved applications at the upgraded Advanced Photon Source (APS). A burst-mode detector with an intended 13 MHz image rate, FASPAX will also incorporate a novel integration circuit to achieve wide dynamic range, from single-photon sensitivity to 10⁵ X-rays/pixel/pulse. To achieve these ambitious goals, a novel silicon sensor design is required. This paper details the early design of the FASPAX sensor. Results from TCAD optimization studies and characterization of prototype sensors are presented.

  19. Imaging Voltage in Genetically Defined Neuronal Subpopulations with a Cre Recombinase-Targeted Hybrid Voltage Sensor.

    Science.gov (United States)

    Bayguinov, Peter O; Ma, Yihe; Gao, Yu; Zhao, Xinyu; Jackson, Meyer B

    2017-09-20

    Genetically encoded voltage indicators create an opportunity to monitor electrical activity in defined sets of neurons as they participate in the complex patterns of coordinated electrical activity that underlie nervous system function. Taking full advantage of genetically encoded voltage indicators requires a generalized strategy for targeting the probe to genetically defined populations of cells. To this end, we have generated a mouse line with an optimized hybrid voltage sensor (hVOS) probe within a locus designed for efficient Cre recombinase-dependent expression. Crossing this mouse with Cre drivers generated double transgenics expressing hVOS probe in GABAergic, parvalbumin, and calretinin interneurons, as well as hilar mossy cells, new adult-born neurons, and recently active neurons. In each case, imaging in brain slices from male or female animals revealed electrically evoked optical signals from multiple individual neurons in single trials. These imaging experiments revealed action potentials, dynamic aspects of dendritic integration, and trial-to-trial fluctuations in response latency. The rapid time response of hVOS imaging revealed action potentials with high temporal fidelity, and enabled accurate measurements of spike half-widths characteristic of each cell type. Simultaneous recording of rapid voltage changes in multiple neurons with a common genetic signature offers a powerful approach to the study of neural circuit function and the investigation of how neural networks encode, process, and store information. SIGNIFICANCE STATEMENT Genetically encoded voltage indicators hold great promise in the study of neural circuitry, but realizing their full potential depends on targeting the sensor to distinct cell types. Here we present a new mouse line that expresses a hybrid optical voltage sensor under the control of Cre recombinase. Crossing this line with Cre drivers generated double-transgenic mice, which express this sensor in targeted cell types.

  20. A generalized logarithmic image processing model based on the gigavision sensor model.

    Science.gov (United States)

    Deng, Guang

    2012-03-01

    The logarithmic image processing (LIP) model is a mathematical theory providing generalized linear operations for image processing. The gigavision sensor (GVS) is a new imaging device that can be described by a statistical model. In this paper, by studying these two seemingly unrelated models, we develop a generalized LIP (GLIP) model. With the LIP model being its special case, the GLIP model not only provides new insights into the LIP model but also defines new image representations and operations for solving general image processing problems that are not necessarily related to the GVS. A new parametric LIP model is also developed. To illustrate the application of the new scalar multiplication operation, we propose an energy-preserving algorithm for tone mapping, which is a necessary step in image dehazing. By comparing with results using two state-of-the-art algorithms, we show that the new scalar multiplication operation is an effective tool for tone mapping.
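
For reference, the classical LIP operations that the GLIP model generalises can be written down directly. The paper's GLIP operations differ; shown here are only the standard LIP addition and scalar multiplication on gray tones in [0, M):

```python
M = 256.0  # upper bound of the gray-tone range

def lip_add(a, b):
    """Classical LIP addition: a (+) b = a + b - a*b/M, closed in [0, M)."""
    return a + b - a * b / M

def lip_scalar(c, a):
    """Classical LIP scalar multiplication: c (x) a = M - M*(1 - a/M)**c."""
    return M - M * (1.0 - a / M) ** c

a = 100.0
twice = lip_add(a, a)        # adding a gray tone to itself...
scaled = lip_scalar(2, a)    # ...agrees with scalar multiplication by 2
```

Unlike ordinary addition, these operations never overflow the gray-scale range, which is what makes LIP-style scalar multiplication attractive for tone mapping.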