WorldWideScience

Sample records for imaging perimeter sensor

  1. System overview and applications of a panoramic imaging perimeter sensor

    International Nuclear Information System (INIS)

    Pritchard, D.A.

    1995-01-01

    This paper presents an overview of the design and potential applications of a 360-degree scanning, multi-spectral intrusion detection sensor. This moderate-resolution, true panoramic imaging sensor is intended for exterior use at ranges from 50 to 1,500 meters. This Advanced Exterior Sensor (AES) simultaneously uses three sensing technologies (infrared, visible, and radar) along with advanced data processing methods to provide low false-alarm intrusion detection, tracking, and immediate visual assessment. The images from the infrared and visible detector sets and the radar range data are updated as the sensors rotate once per second. The radar provides range data with one-meter resolution. This sensor has been designed for easy use and rapid deployment to cover wide areas beyond or in place of typical perimeters, and for tactical applications around fixed or temporary high-value assets. AES prototypes are in development. Applications discussed in this paper include replacements, augmentations, or new installations at fixed sites where topological features, atmospheric conditions, environmental restrictions, ecological regulations, and archaeological features limit the use of conventional security components and systems.

  2. Perimeter intrusion sensors

    International Nuclear Information System (INIS)

    Eaton, M.J.

    1977-01-01

    Obtaining an effective perimeter intrusion detection system requires careful sensor selection, procurement, and installation. The selection process involves a thorough understanding of the unique site features and how these features affect the performance of each type of sensor. It is necessary to develop procurement specifications to establish acceptable sensor performance limits. Careful explanation and inspection of critical installation dimensions is required during on-site construction. The implementation of these activities at a particular site is discussed.

  3. An armored-cable-based fiber Bragg grating sensor array for perimeter fence intrusion detection

    Science.gov (United States)

    Hao, Jianzhong; Dong, Bo; Varghese, Paulose; Phua, Jiliang; Foo, Siang Fook

    2012-01-01

    In this paper, an armored-cable-based optical fiber Bragg grating (FBG) sensor array, for perimeter fence intrusion detection, is demonstrated and some of the field trial results are reported. The field trial was conducted at a critical local installation in Singapore in December 2010. The sensor array was put through a series of both simulated and live intrusion scenarios to test the stability and suitability of operation in the local environmental conditions and to determine its capabilities in detecting and reporting these intrusions accurately to the control station. Such a sensor array can provide perimeter intrusion detection with fine granularity and preset pin-pointing accuracy. The various types of intrusions included aided or unaided climbs, tampering and cutting of the fence, etc. The unique sensor packaging structure provides high sensitivity, crush resistance and protection against rodents. It is also capable of resolving nuisance events such as rain, birds sitting on the fence or seismic vibrations. These sensors are extremely sensitive with a response time of a few seconds. They can be customized for a desired spatial resolution and pre-determined sensitivity. Furthermore, it is easy to cascade a series of such sensors to monitor and detect intrusion events over a long stretch of fence line. Such sensors can be applied to real-time intrusion detection for perimeter security, pipeline security and communications link security.

  4. DAVID: A new video motion sensor for outdoor perimeter applications

    International Nuclear Information System (INIS)

    Alexander, J.C.

    1986-01-01

    To be effective, a perimeter intrusion detection system must comprise both sensor and rapid assessment components. The use of closed circuit television (CCTV) to provide the rapid assessment capability makes possible the use of video motion detection (VMD) processing as a system sensor component. Despite its conceptual appeal, video motion detection has not been widely used in outdoor perimeter systems because of an inability to discriminate between genuine intrusions and numerous environmental effects such as cloud shadows, wind motion, reflections, precipitation, etc. The result has been an unacceptably high false alarm rate and operator workload. DAVID (Digital Automatic Video Intrusion Detector) utilizes new digital signal processing techniques to achieve a dramatic improvement in discrimination performance, thereby making video motion detection practical for outdoor applications. This paper begins with a discussion of the key considerations in implementing an outdoor video intrusion detection system, followed by a description of the DAVID design in light of these considerations.

  5. Integrated multisensor perimeter detection systems

    Science.gov (United States)

    Kent, P. J.; Fretwell, P.; Barrett, D. J.; Faulkner, D. A.

    2007-10-01

    The report describes the results of a multi-year programme of research aimed at the development of an integrated multi-sensor perimeter detection system capable of being deployed at an operational site. The research was driven by end user requirements in protective security, particularly in threat detection and assessment, where effective capability was either not available or prohibitively expensive. Novel video analytics have been designed to provide robust detection of pedestrians in clutter, while new radar detection and tracking algorithms provide wide-area day/night surveillance. A modular integrated architecture based on commercially available components has been developed. A graphical user interface allows intuitive interaction with the sensors and visualisation of their data. The fusion of video, radar and other sensor data provides the basis of a threat detection capability for real-life conditions. The system was designed to be modular and extendable in order to accommodate future and legacy surveillance sensors. The current sensor mix includes stereoscopic video cameras, mmWave ground movement radar, CCTV and a commercially available perimeter detection cable. The paper outlines the development of the system and describes the lessons learnt after deployment in a pilot trial.

  6. Perimeter intrusion detection and assessment system

    International Nuclear Information System (INIS)

    Eaton, M.J.; Jacobs, J.; McGovern, D.E.

    1977-11-01

    Obtaining an effective perimeter intrusion detection system requires careful sensor selection, procurement, and installation. The selection process involves a thorough understanding of the unique site features and how these features affect the performance of each type of sensor. It is necessary to develop procurement specifications to establish acceptable sensor performance limits. Careful explanation and inspection of critical installation dimensions is required during on-site construction. The implementation of these activities at a particular site is discussed.

  7. Sparse Detector Imaging Sensor with Two-Class Silhouette Classification

    Directory of Open Access Journals (Sweden)

    David Russomanno

    2008-12-01

    This paper presents the design and test of a simple active near-infrared sparse detector imaging sensor. The prototype of the sensor is novel in that it can capture remarkable silhouettes or profiles of a wide variety of moving objects, including humans, animals, and vehicles, using a sparse detector array comprising only sixteen sensing elements deployed in a vertical configuration. The prototype sensor was built to collect silhouettes for a variety of objects and to evaluate several algorithms for classifying the data obtained from the sensor into two classes: human versus non-human. Initial tests show that the classification of individually sensed objects into two classes can be achieved with accuracy greater than ninety-nine percent (99%) with a subset of the sixteen detectors, using a representative dataset consisting of 512 signatures. The prototype also includes a Web service interface so that the sensor can be tasked in a network-centric environment. The sensor appears to be a low-cost alternative to traditional, high-resolution focal plane array imaging sensors for some applications. After a power optimization study, appropriate packaging, and testing with more extensive datasets, the sensor may be a good candidate for deployment in vast geographic regions for a myriad of intelligent electronic fence and persistent surveillance applications, including perimeter security scenarios.
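
    As an illustrative aside (not taken from the paper), the silhouette idea can be sketched in a few lines: each time step contributes one column of beam-break bits from the 16-element vertical array, and simple shape statistics of the accumulated binary image could feed a two-class human/non-human decision rule. All names and thresholds below are assumptions.

    ```python
    # Toy sketch (assumptions throughout, not the authors' classifier) of silhouette
    # capture from a 16-element vertical sparse detector column.
    import numpy as np

    N_DETECTORS = 16

    def build_silhouette(samples):
        """samples: list of length-16 0/1 tuples, one per time step, bottom detector first."""
        return np.array(samples, dtype=np.uint8).T      # rows = detector height, cols = time

    def simple_features(silhouette):
        occupied_rows = np.flatnonzero(silhouette.any(axis=1))
        height = occupied_rows.size                      # how many detectors were ever broken
        width = int(silhouette.any(axis=0).sum())        # how many time steps had any break
        return height, width, height / max(width, 1)     # aspect ratio as a crude cue

    # A tall, narrow signature (upright walker) versus a long, low one (e.g. an animal).
    walker = [(1,) * 14 + (0, 0)] * 6
    animal = [(1,) * 5 + (0,) * 11] * 20
    print(simple_features(build_silhouette(walker)))
    print(simple_features(build_silhouette(animal)))
    ```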

  8. Nuclear-power-plant perimeter-intrusion alarm systems

    International Nuclear Information System (INIS)

    Halsey, D.J.

    1982-04-01

    Timely intercept of an intruder requires the examination of perimeter barriers and sensors in terms of reliable detection, immediate assessment and prompt response provisions. Perimeter security equipment and operations must at the same time meet the requirements of the Code of Federal Regulations, 10 CFR 73.55, with some attention to the performance and testing figures of Nuclear Regulatory Guide 5.44, Revision 2, May 1980. A baseline system is defined which recommends a general approach to implementing perimeter security elements: barriers, lighting, intrusion detection, alarm assessment. The baseline approach emphasizes cost-effectiveness achieved by detector layering and logic processing of alarm signals to produce reliable alarms and low nuisance alarm rates. A cost benefit of layering, along with video assessment, is a reduction in operating expense. The concept of layering is also shown to minimize testing costs where detectability testing, as suggested by Regulatory Guide 5.44, is to be performed. Synthesis of the perimeter intrusion alarm system and limited testing of CCTV and Video Motion Detectors (VMD) were performed at E-Systems, Greenville Division, Greenville, Texas during 1981.
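
    The layering-and-logic idea lends itself to a compact illustration. The sketch below is not from the report; it simply assumes that a zone alarm is declared only when at least two distinct sensor layers report within a short coincidence window, which is one common way to trade nuisance alarms against detection delay.

    ```python
    # Minimal sketch of alarm layering: coincident reports from two or more layers in the
    # same zone within a short window produce a validated alarm; isolated single-layer
    # reports are treated as nuisance alarms. Window and layer count are assumed values.
    from collections import defaultdict

    COINCIDENCE_WINDOW = 5.0   # seconds (illustrative)
    REQUIRED_LAYERS = 2        # distinct layers that must agree

    def layered_alarm(events):
        """events: iterable of (timestamp, zone, layer) tuples, e.g. (2.1, 3, 'microwave')."""
        recent = defaultdict(list)          # zone -> [(timestamp, layer), ...]
        alarms = []
        for t, zone, layer in sorted(events):
            recent[zone] = [(ts, ly) for ts, ly in recent[zone] if t - ts <= COINCIDENCE_WINDOW]
            recent[zone].append((t, layer))
            if len({ly for _, ly in recent[zone]}) >= REQUIRED_LAYERS:
                alarms.append((t, zone))
        return alarms

    # Only zone 3 produces a validated alarm, because two different layers agree in time.
    print(layered_alarm([(0.0, 3, 'fence'), (2.1, 3, 'microwave'), (10.0, 7, 'fence')]))
    ```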

  9. Design and evaluation of the ReKon : an integrated detection and assessment perimeter system.

    Energy Technology Data Exchange (ETDEWEB)

    Dabling, Jeffrey Glenn; Andersen, Jason Jann; McLaughlin, James O. [Stonewater Control Systems, Inc., Kannapolis, NC]

    2013-02-01

    Kontek Industries (Kannapolis, NC) and their subsidiary, Stonewater Control Systems (Kannapolis, NC), have entered into a cooperative research and development agreement with Sandia to jointly develop and evaluate an integrated perimeter security system solution, one that couples access delay with detection and assessment. This novel perimeter solution was designed to be configurable for use at facilities ranging from high-security military sites to commercial power plants, to petro/chemical facilities of various kinds. A prototype section of the perimeter has been produced and installed at the Sandia Test and Evaluation Center in Albuquerque, NM. This prototype system integrated fiber optic break sensors, active infrared sensors, fence disturbance sensors, video motion detection, and ground sensors. This report documents the design, testing, and performance evaluation of the developed ReKon system. The ability of the system to properly detect pedestrian or vehicle attempts to bypass, breach, or otherwise defeat the system is characterized, as well as the Nuisance Alarm Rate.

  10. Hardware implementation of adaptive filtering using charge-coupled devices. [For perimeter security sensors

    Energy Technology Data Exchange (ETDEWEB)

    Donohoe, G.W.

    1977-01-01

    Sandia Laboratories' Digital Systems Division/1734, as part of its work on the Base and Installation Security Systems (BISS) program, has been making use of adaptive digital filters to improve the signal-to-noise ratio of perimeter sensor signals. In particular, the Widrow-Hoff least-mean-squares algorithm has been used extensively. This non-recursive linear predictor has been successful in extracting aperiodic signals from periodic noise. The adaptive filter generates a predictor signal which is subtracted from the input signal to produce an error signal. The value of this error is fed back to the filter to improve the quality of the next prediction. Implementation of the Widrow adaptive filter using a Charge-Coupled Device tapped analog delay line, analog voltage multipliers and operational amplifiers is described. The resulting filter adapts to signals with frequency components as high as several megahertz.
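
    For readers who want the algorithm in software form, the following is a minimal sketch of the Widrow-Hoff LMS predictor described above (the report implements it in CCD/analog hardware); parameter values are illustrative.

    ```python
    # Widrow-Hoff LMS predictor: the filter predicts the next sample from the previous
    # N samples; the prediction error is both the "whitened" output and the feedback
    # used to adapt the tap weights, so aperiodic events stand out against periodic noise.
    import numpy as np

    def lms_predictor(x, n_taps=8, mu=0.01):
        w = np.zeros(n_taps)                 # adaptive tap weights
        error = np.zeros_like(x, dtype=float)
        for k in range(n_taps, len(x)):
            u = x[k - n_taps:k][::-1]        # most recent samples first
            prediction = w @ u               # predicted value of x[k]
            e = x[k] - prediction            # error = input minus prediction
            w += 2 * mu * e * u              # Widrow-Hoff weight update
            error[k] = e
        return error

    # Periodic "noise" plus an aperiodic intrusion-like transient: the predictor learns
    # the periodic part, so the transient stands out in the error signal.
    t = np.arange(2000)
    signal = np.sin(2 * np.pi * t / 50)
    signal[1000:1010] += 2.0
    residual = lms_predictor(signal, n_taps=16, mu=0.005)
    print(np.abs(residual[1000:1010]).max(), np.abs(residual[500:510]).max())
    ```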

  11. Fiber optic perimeter system for security in smart city

    Science.gov (United States)

    Cubik, Jakub; Kepak, Stanislav; Nedoma, Jan; Fajkus, Marcel; Zboril, Ondrej; Novak, Martin; Jargus, Jan; Vasinek, Vladimir

    2017-10-01

    Protection of persons and assets is a key challenge for Smart City security technologies. Conventional security technologies are often outdated and easy to breach. Therefore, new technologies that could complement existing systems or replace them are being developed. The use of optical fibers and their application in sensing is a trend of recent years. This article discusses the use of fiber-optic sensors in perimeter protection. The sensor consists only of optical fibers and couplers, and being constructed without wires or metal parts brings many advantages. These include immunity to electromagnetic interference, and the fact that the presence of the system is difficult to detect and its operation difficult to disturb. A test installation of the perimeter system was carried out under a reinforced concrete structure. Subjects walked over the bridge at different speeds and over different routes. The task for the system was the detection of all subjects without exception. The proposed system should find application mainly in areas with the presence of volatile substances, strong electromagnetic fields, or in explosive areas.

  12. Fiber-optic perimeter security system based on WDM technology

    Science.gov (United States)

    Polyakov, Alexandre V.

    2017-10-01

    An intelligent underground fiber-optic perimeter security system is presented. Its structure, operation, software and hardware, including neural network elements, are described. The system allows not only the fact of a perimeter violation to be established but also the violation to be located. This is achieved through the use of WDM technology to divide the spectral information channels. A quasi-distributed optoelectronic recirculation system is used as a discrete sensor. The principle of operation is based on registering the change of the recirculation period in the closed optoelectronic circuit at different wavelengths when the optical fiber is exposed to microstrain. Microstrain of the fiber introduces additional power loss for the optical pulse propagating in the fiber, which shifts the switching moments of the threshold device and thus produces a measurable time delay. To separate intruder-generated signals from noise and interference, a signal analyzer based on the principle of a neural network is used. The system detects a walking, running or crawling intruder, as well as attempts to tunnel under the perimeter line. These alarm systems can be used to protect the perimeters of facilities such as airports, nuclear reactors, power plants, warehouses, and other extended territories.

  13. Nanophotonic Image Sensors.

    Science.gov (United States)

    Chen, Qin; Hu, Xin; Wen, Long; Yu, Yan; Cumming, David R S

    2016-09-01

    The increasing miniaturization and resolution of image sensors bring challenges to conventional optical elements such as spectral filters and polarizers, the properties of which are determined mainly by the materials used, including dye polymers. Recent developments in spectral filtering and optical manipulation techniques based on nanophotonics have opened up the possibility of an alternative method to control light spectrally and spatially. By integrating these technologies into image sensors, it will become possible to achieve high compactness, improved process compatibility, robust stability and tunable functionality. In this Review, recent representative achievements on nanophotonic image sensors are presented and analyzed, including image sensors with nanophotonic color filters and polarizers, metamaterial-based THz image sensors, filter-free nanowire image sensors and nanostructure-based multispectral image sensors. This novel combination of cutting-edge photonics research and well-developed commercial products may not only lead to an important application of nanophotonics but also offer great potential for next-generation image sensors beyond Moore's Law expectations. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Photon-counting image sensors

    CERN Document Server

    Teranishi, Nobukazu; Theuwissen, Albert; Stoppa, David; Charbon, Edoardo

    2017-01-01

    The field of photon-counting image sensors is advancing rapidly with the development of various solid-state image sensor technologies including single photon avalanche detectors (SPADs) and deep-sub-electron read noise CMOS image sensor pixels. This foundational platform technology will enable opportunities for new imaging modalities and instrumentation for science and industry, as well as new consumer applications. Papers discussing various photon-counting image sensor technologies and selected new applications are presented in this all-invited Special Issue.

  15. Thermal infrared panoramic imaging sensor

    Science.gov (United States)

    Gutin, Mikhail; Tsui, Eddy K.; Gutin, Olga; Wang, Xu-Ming; Gutin, Alexey

    2006-05-01

    Panoramic cameras offer true real-time, 360-degree coverage of the surrounding area, valuable for a variety of defense and security applications, including force protection, asset protection, asset control, security including port security, perimeter security, video surveillance, border control, airport security, coastguard operations, search and rescue, intrusion detection, and many others. Automatic detection, location, and tracking of targets outside the protected area ensures maximum protection and at the same time reduces the workload on personnel, increases the reliability and confidence of target detection, and enables both man-in-the-loop and fully automated system operation. Thermal imaging provides the benefits of all-weather, 24-hour day/night operation with no downtime. In addition, thermal signatures of different target types facilitate better classification, beyond the limits set by the camera's spatial resolution. The useful range of catadioptric panoramic cameras is affected by their limited resolution. In many existing systems the resolution is optics-limited. Reflectors customarily used in catadioptric imagers introduce aberrations that may become significant at large camera apertures, such as those required in low-light and thermal imaging. Advantages of panoramic imagers with high image resolution include increased area coverage with fewer cameras, instantaneous full-horizon detection, location and tracking of multiple targets simultaneously, extended range, and others. The Automatic Panoramic Thermal Integrated Sensor (APTIS), being jointly developed by Applied Science Innovative, Inc. (ASI) and the Armament Research, Development and Engineering Center (ARDEC), combines the strengths of improved, high-resolution panoramic optics with thermal imaging in the 8 - 14 micron spectral range, leveraged by intelligent video processing for automated detection, location, and tracking of moving targets. The work in progress supports the Future Combat Systems (FCS) and the

  16. Data fusion concept in multispectral system for perimeter protection of stationary and moving objects

    Science.gov (United States)

    Ciurapiński, Wieslaw; Dulski, Rafal; Kastek, Mariusz; Szustakowski, Mieczyslaw; Bieszczad, Grzegorz; Życzkowski, Marek; Trzaskawka, Piotr; Piszczek, Marek

    2009-09-01

    The paper presents the concept of a multispectral system for perimeter protection of stationary and moving objects. The system consists of an active ground radar and thermal and visible cameras. The radar allows the system to locate potential intruders and to direct the observation area of the system cameras. The multisensor construction of the system ensures a significant improvement in intruder detection probability and a reduction in false alarms. The final decision of the system is worked out using image data. The method of data fusion used in the system is presented. The system works under the control of the FLIR Nexus system. Nexus offers complete technology and components to create network-based, high-end integrated systems for security and surveillance applications. Based on a unique "plug and play" architecture, the system provides unmatched flexibility and simple integration of sensors and devices in TCP/IP networks. Using a graphical user interface, it is possible to control sensors and monitor streaming video and other data over the network, visualize the results of the data fusion process and obtain detailed information about detected intruders over a digital map. The system provides high-level applications and operator workload reduction with features such as sensor-to-sensor cueing from detection devices, automatic e-mail notification and alarm triggering.

  17. Perimeter security for Minnesota correctional facilities

    Energy Technology Data Exchange (ETDEWEB)

    Crist, D. [Minnesota Department of Corrections, St. Paul, MN (United States)]; Spencer, D.D. [Sandia National Labs., Albuquerque, NM (United States)]

    1996-12-31

    For the past few years, the Minnesota Department of Corrections, assisted by Sandia National Laboratories, has developed a set of standards for perimeter security at medium, close, and maximum custody correctional facilities in the state. During this process, the threat to perimeter security was examined and concepts about correctional perimeter security were developed. This presentation and paper will review the outcomes of this effort, some of the lessons learned, and the concepts developed during this process and in the course of working with architects, engineers and construction firms as the state upgraded perimeter security at some facilities and planned new construction at other facilities.

  18. Focus on image sensors

    NARCIS (Netherlands)

    Jos Gunsing; Daniël Telgen; Johan van Althuis; Jaap van de Loosdrecht; Mark Stappers; Peter Klijn

    2013-01-01

    Robots need sensors to operate properly. Using a single image sensor, various aspects of a robot operating in its environment can be measured or monitored. Over the past few years, image sensors have improved a lot: frame rate and resolution have increased, while prices have fallen. As a result,

  19. Fire Perimeters

    Data.gov (United States)

    California Natural Resource Agency — The Fire Perimeters data consists of CDF fires 300 acres and greater in size and USFS fires 10 acres and greater throughout California from 1950 to 2003. Some fires...

  20. Large area CMOS image sensors

    International Nuclear Information System (INIS)

    Turchetta, R; Guerrini, N; Sedgwick, I

    2011-01-01

    CMOS image sensors, also known as CMOS Active Pixel Sensors (APS) or Monolithic Active Pixel Sensors (MAPS), are today the dominant imaging devices. They are omnipresent in our daily life, as image sensors in cellular phones, web cams, digital cameras, ... In these applications, the pixels can be very small, in the micron range, and the sensors themselves tend to be limited in size. However, many scientific applications, like particle or X-ray detection, require large format, often with large pixels, as well as other specific performance, like low noise, radiation hardness or very fast readout. The sensors are also required to be sensitive to a broad spectrum of radiation: photons from the silicon cut-off in the IR down to UV and X- and gamma-rays through the visible spectrum as well as charged particles. This requirement calls for modifications to the substrate to be introduced to provide optimized sensitivity. This paper will review existing CMOS image sensors, whose size can be as large as a single CMOS wafer, and analyse the technical requirements and specific challenges of large format CMOS image sensors.

  1. Edge pixel response studies of edgeless silicon sensor technology for pixellated imaging detectors

    Science.gov (United States)

    Maneuski, D.; Bates, R.; Blue, A.; Buttar, C.; Doonan, K.; Eklund, L.; Gimenez, E. N.; Hynds, D.; Kachkanov, S.; Kalliopuska, J.; McMullen, T.; O'Shea, V.; Tartoni, N.; Plackett, R.; Vahanen, S.; Wraight, K.

    2015-03-01

    Silicon sensor technologies with reduced dead area at the sensor's perimeter are under development at a number of institutes. Several fabrication methods for sensors which are sensitive close to the physical edge of the device are under investigation, utilising techniques such as active edges, passivated edges and current-terminating rings. Such technologies offer the goal of a seamlessly tiled detection surface with minimum dead space between the individual modules. In order to quantify the performance of different geometries and different bulk and implant types, characterisation of several sensors fabricated using active-edge technology was performed at the B16 beam line of the Diamond Light Source. The sensors were fabricated by VTT and bump-bonded to Timepix ROICs. They were 100 and 200 μm thick sensors, with a last-pixel-to-edge distance of either 50 or 100 μm. The sensors were fabricated as either n-on-n or n-on-p type devices. Using 15 keV monochromatic X-rays with a beam spot of 2.5 μm, the performance of the outer edge and corner pixels of the sensors was evaluated at three bias voltages. The results indicate a significant change in the charge collection properties between the edge pixel and the 5th pixel from the edge (up to 275 μm) for the 200 μm thick n-on-n sensor. The edge pixel performance of the 100 μm thick n-on-p sensors is affected only for the last two pixels (up to 110 μm), depending on biasing conditions. The imaging characteristics of all sensor types investigated are stable over time and the non-uniformities can be minimised by flat-field corrections. The results from the synchrotron tests combined with lab measurements are presented along with an explanation of the observed effects.

  2. Temperature Sensors Integrated into a CMOS Image Sensor

    NARCIS (Netherlands)

    Abarca Prouza, A.N.; Xie, S.; Markenhof, Jules; Theuwissen, A.J.P.

    2017-01-01

    In this work, a novel approach is presented for measuring relative temperature variations inside the pixel array of a CMOS image sensor itself. This approach can give important information when compensation for dark (current) fixed pattern noise (FPN) is needed. The test image sensor consists of

  3. 24 CFR 3285.307 - Perimeter support piers.

    Science.gov (United States)

    2010-04-01

    ... URBAN DEVELOPMENT MODEL MANUFACTURED HOME INSTALLATION STANDARDS Foundations § 3285.307 Perimeter support piers. (a) Piers required at mate-line supports, perimeter piers, and piers at exterior wall...

  4. Experimental study of abdominal CT scanning exposal doses adjusted on the basis of pediatric abdominal perimeter

    International Nuclear Information System (INIS)

    Wei Wenzhou; Zhu Gongsheng; Zeng Lingyan; Yin Xianglin; Yang Fuwen; Liu Changsheng

    2006-01-01

    Objective: To optimize the abdominal helical CT scanning parameters in pediatric patients and to reduce radiation hazards. Methods: 60 canines were evenly divided into 4 groups on the basis of abdominal perimeter, scanned with 110, 150, 190 and 240 mAs, and the quality of the canine CT images was analyzed. 120 pediatric patients with clinically suspected abdominal diseases were divided into 4 groups on the basis of abdominal perimeter, scanned with the optimized parameters, and their image qualities were analyzed. Results: After the CT exposure was reduced, the combined percentages of grade A and B images were 90.9% and 92.0% in the experimental canines and in the pediatric patients, respectively. Compared with conventional CT scanning, the exposure and the single-slice weighted CT dose index (CTDIw) were reduced to 45.8%-79.17%. Conclusion: By adjusting the pediatric helical CT parameters based on the pediatric abdominal perimeter, exposure of the patient to the hazards of radiation is reduced. (authors)

  5. Divergence-Measure Fields, Sets of Finite Perimeter, and Conservation Laws

    Science.gov (United States)

    Chen, Gui-Qiang; Torres, Monica

    2005-02-01

    Divergence-measure fields in L∞ over sets of finite perimeter are analyzed. A notion of normal traces over boundaries of sets of finite perimeter is introduced, and the Gauss-Green formula over sets of finite perimeter is established for divergence-measure fields in L∞. The normal trace introduced here over a class of surfaces of finite perimeter is shown to be the weak-star limit of the normal traces introduced in Chen & Frid [6] over the Lipschitz deformation surfaces, which implies their consistency. As a corollary, an extension theorem of divergence-measure fields in L∞ over sets of finite perimeter is also established. Then we apply the theory to the initial-boundary value problem of nonlinear hyperbolic conservation laws over sets of finite perimeter.
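
    For orientation, the classical Gauss-Green identity that the paper generalizes to divergence-measure fields in L∞ can be written as

    $$\int_E \operatorname{div} F \, dx \;=\; \int_{\partial^* E} F \cdot \nu \, d\mathcal{H}^{n-1},$$

    where $\partial^* E$ is the reduced boundary of the set of finite perimeter $E$ and $\nu$ its measure-theoretic outer unit normal; the contribution described above is the construction of the normal trace $F \cdot \nu$ and the extension of this formula to fields $F$ that are merely bounded divergence-measure fields.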

  6. Automated Registration Of Images From Multiple Sensors

    Science.gov (United States)

    Rignot, Eric J. M.; Kwok, Ronald; Curlander, John C.; Pang, Shirley S. N.

    1994-01-01

    Images of terrain scanned in common by multiple Earth-orbiting remote sensors are registered automatically with each other and, where possible, on a geographic coordinate grid. A simulated image of the terrain viewed by a sensor is computed from ancillary data, the viewing geometry, and a mathematical model of the physics of imaging. In the proposed registration algorithm, simulated and actual sensor images are matched by an area-correlation technique.

  7. Fire Perimeters - Southern California, Fall 2007 [ds385

    Data.gov (United States)

    California Natural Resource Agency — Southern California fire perimeters for the Fall 2007 wildfires. The perimeters were assembled from various sources by California Department of Fish and Game (DFG)...

  8. Tooth angulation and dental arch perimeter-the effect of orthodontic bracket prescription.

    Science.gov (United States)

    Pontes, Luana F; Cecim, Rodolpho L; Machado, Sissy M; Normando, David

    2015-08-01

    The aim of this study was to evaluate the effects of upper incisor and canine angulations introduced by different bracket prescriptions on dental arch perimeter. Cone beam computerized tomography scans collected using I-Cat (Imaging Sciences International, Hatfield, PA, USA) were selected by convenience from a database of routine exams of a clinical radiology center. Crown and radicular measurements of upper incisors and canines were made and exported to the Autocad 2011 software to create a virtual dental model. The virtual teeth were positioned with an angulation of zero; thereafter, a reference value for the perimeter of the arch was measured. Furthermore, tooth angulations were applied according to the standards of the Edgewise bracket system and the Straight-wire systems: MBT, Capelozza, Andrews, and Roth. The largest linear distances for the tooth crowns (anterior arch perimeter) and roots (radicular distance) were obtained for each bracket prescription. The anterior perimeter for well-aligned incisors and canines without angulation was used as reference (crown: 47.34 mm; root: 39.13 mm). An increase in the arch perimeter was obtained for all bracket prescriptions evaluated, which ranged from 0.28 and 3.19 mm in the Edgewise technique, for the crown and root measurements, respectively, to 1.09 and 11.28 mm for the Roth prescription. Bracket prescriptions with greater angulation led to an increased use of space within the dental arch, mainly in the radicular region. The consequence of this radicular angular displacement will need to be further investigated. © The Author 2014. Published by Oxford University Press on behalf of the European Orthodontic Society. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  9. Image-based occupancy sensor

    Science.gov (United States)

    Polese, Luigi Gentile; Brackney, Larry

    2015-05-19

    An image-based occupancy sensor includes a motion detection module that receives and processes an image signal to generate a motion detection signal, a people detection module that receives the image signal and processes the image signal to generate a people detection signal, a face detection module that receives the image signal and processes the image signal to generate a face detection signal, and a sensor integration module that receives the motion detection signal from the motion detection module, receives the people detection signal from the people detection module, receives the face detection signal from the face detection module, and generates an occupancy signal using the motion detection signal, the people detection signal, and the face detection signal, with the occupancy signal indicating vacancy or occupancy, with an occupancy indication specifying that one or more people are detected within the monitored volume.
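
    A minimal software sketch of the sensor-integration step (not the patented implementation) might fuse the three per-frame detection signals with a hold-off timer so that brief detection gaps do not immediately report vacancy; the hold-off length below is an assumed parameter.

    ```python
    # Illustrative fusion of motion, people and face detection signals into a single
    # occupancy/vacancy output with a hold-off timer. Parameter values are assumptions.
    from dataclasses import dataclass

    @dataclass
    class OccupancyFuser:
        holdoff_frames: int = 30          # frames to keep "occupied" after last detection
        _countdown: int = 0

        def update(self, motion: bool, people: bool, face: bool) -> bool:
            if motion or people or face:
                self._countdown = self.holdoff_frames
            elif self._countdown > 0:
                self._countdown -= 1
            return self._countdown > 0     # True = occupied, False = vacant

    fuser = OccupancyFuser()
    print(fuser.update(motion=True, people=False, face=False))   # occupied
    print(fuser.update(motion=False, people=False, face=False))  # still occupied (hold-off)
    ```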

  10. CMOS foveal image sensor chip

    Science.gov (United States)

    Bandera, Cesar (Inventor); Scott, Peter (Inventor); Sridhar, Ramalingam (Inventor); Xia, Shu (Inventor)

    2002-01-01

    A foveal image sensor integrated circuit comprising a plurality of CMOS active pixel sensors arranged both within and about a central fovea region of the chip. The pixels in the central fovea region have a smaller size than the pixels arranged in peripheral rings about the central region. A new photocharge normalization scheme and associated circuitry normalizes the output signals from the different size pixels in the array. The pixels are assembled into a multi-resolution rectilinear foveal image sensor chip using a novel access scheme to reduce the number of analog RAM cells needed. Localized spatial resolution declines monotonically with offset from the imager's optical axis, analogous to biological foveal vision.

  11. CMOS image sensor-based implantable glucose sensor using glucose-responsive fluorescent hydrogel.

    Science.gov (United States)

    Tokuda, Takashi; Takahashi, Masayuki; Uejima, Kazuhiro; Masuda, Keita; Kawamura, Toshikazu; Ohta, Yasumi; Motoyama, Mayumi; Noda, Toshihiko; Sasagawa, Kiyotaka; Okitsu, Teru; Takeuchi, Shoji; Ohta, Jun

    2014-11-01

    An implantable glucose sensor based on a CMOS image sensor and an optical sensing scheme is proposed and experimentally verified. A glucose-responsive fluorescent hydrogel is used as the mediator in the measurement scheme. The wired implantable glucose sensor was realized by integrating a CMOS image sensor, hydrogel, UV light emitting diodes, and an optical filter on a flexible polyimide substrate. Feasibility of the glucose sensor was verified by both in vitro and in vivo experiments.

  12. A 128 x 128 CMOS Active Pixel Image Sensor for Highly Integrated Imaging Systems

    Science.gov (United States)

    Mendis, Sunetra K.; Kemeny, Sabrina E.; Fossum, Eric R.

    1993-01-01

    A new CMOS-based image sensor that is intrinsically compatible with on-chip CMOS circuitry is reported. The new CMOS active pixel image sensor achieves low noise, high sensitivity, X-Y addressability, and has simple timing requirements. The image sensor was fabricated using a 2 micrometer p-well CMOS process, and consists of a 128 x 128 array of 40 micrometer x 40 micrometer pixels. The CMOS image sensor technology enables highly integrated smart image sensors, and makes the design, incorporation and fabrication of such sensors widely accessible to the integrated circuit community.

  13. CMOS Image Sensors: Electronic Camera On A Chip

    Science.gov (United States)

    Fossum, E. R.

    1995-01-01

    Recent advancements in CMOS image sensor technology are reviewed, including both passive pixel sensors and active pixel sensors. On-chip analog-to-digital converters and on-chip timing and control circuits permit realization of an electronic camera-on-a-chip. Highly miniaturized imaging systems based on CMOS image sensor technology are emerging as a competitor to charge-coupled devices for low-cost uses.

  14. Image Sensor

    OpenAIRE

    Jerram, Paul; Stefanov, Konstantin

    2017-01-01

    An image sensor of the type that provides charge multiplication by impact ionisation has a plurality of multiplication elements. Each element is arranged to receive charge from photosensitive elements of an image area and each element comprises a sequence of electrodes to move charge along a transport path. Each of the electrodes has an edge defining a boundary with a first electrode, a maximum width across the charge transport path and a leading edge that defines a boundary with a second elect...

  15. Priority image transmission in wireless sensor networks

    International Nuclear Information System (INIS)

    Nasri, M.; Helali, A.; Sghaier, H.; Maaref, H.

    2011-01-01

    The emerging technology of recent years has allowed the development of new sensors equipped with wireless communication which can be organized into a cooperative autonomous network. Some application areas for wireless sensor networks (WSNs) are home automation, health care services, the military domain, and environment monitoring. The constraints are limited processing capacity, limited storage capability, and especially limited node energy. In addition, such nodes are powered by tiny batteries, so their lifetime is very limited. During image processing and transmission to the destination, the lifetime of the sensor network decreases quickly due to battery and processing power constraints. Therefore, digital image transmission is a significant challenge for image-sensor-based Wireless Sensor Networks (WSNs). Based on wavelet image compression, we propose a novel, robust and energy-efficient scheme, called Priority Image Transmission (PIT), in WSN by providing various priority levels during image transmission. Different priorities in the compressed image are considered. The information for the significant wavelet coefficients is transmitted with higher quality assurance, whereas relatively less important coefficients are transmitted with lower overhead. Simulation results show that the proposed scheme prolongs the system lifetime and achieves higher energy efficiency in WSN with an acceptable compromise on image quality.
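
    A minimal sketch of the priority idea, assuming the PyWavelets package: coarse approximation coefficients receive the highest priority, and finer detail sub-bands receive progressively lower priority so they can be sent with less protection (or dropped) under energy constraints. The function and parameter names are illustrative, not the authors' PIT implementation.

    ```python
    # Assign priority levels to wavelet sub-bands: level 0 (approximation) is most
    # important; higher numbers mark detail bands that can be sent with less overhead.
    import numpy as np
    import pywt

    def prioritize_wavelet_coeffs(image, wavelet="haar", levels=3):
        coeffs = pywt.wavedec2(image, wavelet, level=levels)
        packets = [(0, coeffs[0])]                      # priority 0 = approximation
        for priority, details in enumerate(coeffs[1:], start=1):
            for band in details:                        # (horizontal, vertical, diagonal)
                packets.append((priority, band))
        return packets                                  # transmit low priority numbers first

    image = np.random.rand(64, 64)
    for priority, band in prioritize_wavelet_coeffs(image):
        print(priority, band.shape)
    ```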

  16. Imaging in scattering media using correlation image sensors and sparse convolutional coding

    KAUST Repository

    Heide, Felix; Xiao, Lei; Kolb, Andreas; Hullin, Matthias B.; Heidrich, Wolfgang

    2014-01-01

    Correlation image sensors have recently become popular low-cost devices for time-of-flight, or range, cameras. They usually operate under the assumption of a single light path contributing to each pixel. We show that a more thorough analysis of the sensor data from correlation sensors can be used to analyze the light transport in much more complex environments, including applications for imaging through scattering and turbid media. The key of our method is a new convolutional sparse coding approach for recovering transient (light-in-flight) images from correlation image sensors. This approach is enabled by an analysis of sparsity in complex transient images, and the derivation of a new physically-motivated model for transient images with drastically improved sparsity.

  17. Fleet Protection Using a Small UAV Based IR Sensor

    National Research Council Canada - National Science Library

    Buss, James R; Ax, Jr, George R

    2005-01-01

    A study was performed to define candidate electro-optical and infrared (EO/IR) sensor configurations and assess their potential utility as small UAV-based sensors surveilling a perimeter around surface fleet assets...

  18. Beam imaging sensor and method for using same

    Energy Technology Data Exchange (ETDEWEB)

    McAninch, Michael D.; Root, Jeffrey J.

    2017-01-03

    The present invention relates generally to the field of sensors for beam imaging and, in particular, to a new and useful beam imaging sensor for use in determining, for example, the power density distribution of a beam including, but not limited to, an electron beam or an ion beam. In one embodiment, the beam imaging sensor of the present invention comprises, among other items, a circumferential slit that is either circular, elliptical or polygonal in nature. In another embodiment, the beam imaging sensor of the present invention comprises, among other things, a discontinuous partially circumferential slit. Also disclosed is a method for using the various beam sensor embodiments of the present invention.

  19. Intrusion recognition for optic fiber vibration sensor based on the selective attention mechanism

    Science.gov (United States)

    Xu, Haiyan; Xie, Yingjuan; Li, Min; Zhang, Zhuo; Zhang, Xuewu

    2017-11-01

    Distributed fiber-optic vibration sensors have received extensive investigation and play a significant role in the sensor panorama. A fiber-optic perimeter detection system based on an all-fiber interferometric sensor is proposed; through back-end analysis, processing and intelligent identification, it can distinguish the effects of different intrusion activities. In this paper, an intrusion recognition method based on the auditory selective attention mechanism is proposed. Firstly, considering the time-frequency characteristics of the vibration signal, its spectrogram is calculated. Secondly, imitating the selective attention mechanism, the color, direction and brightness maps of the spectrogram are computed. Based on these maps, the feature matrix is formed after normalization. The system can recognize intrusion activities occurring along the perimeter sensors. Experimental results show that the proposed method is able to differentiate intrusion signals from ambient noise. Moreover, the recognition rate of the system is improved while the false alarm rate is reduced; the approach has been validated in extensive practical experiments and projects.
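
    The first feature-extraction step (the spectrogram of the vibration signal) is easy to sketch; the attention-style color, direction and brightness maps of the paper are not reproduced here, and the sampling rate and segment length below are assumptions.

    ```python
    # Compute a normalized time-frequency feature matrix from a vibration signal.
    import numpy as np
    from scipy.signal import spectrogram

    def vibration_features(signal, fs=4000, nperseg=256):
        f, t, sxx = spectrogram(signal, fs=fs, nperseg=nperseg)
        sxx_db = 10 * np.log10(sxx + 1e-12)              # log power
        feat = (sxx_db - sxx_db.min()) / (sxx_db.max() - sxx_db.min() + 1e-12)
        return f, t, feat                                # normalized feature matrix

    fs = 4000
    x = np.random.randn(fs * 2)                          # 2 s of stand-in sensor data
    x[3000:3400] += np.sin(2 * np.pi * 300 * np.arange(400) / fs) * 5  # intrusion-like burst
    f, t, feat = vibration_features(x, fs=fs)
    print(feat.shape, feat.min(), feat.max())
    ```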

  1. Image-based environmental monitoring sensor application using an embedded wireless sensor network.

    Science.gov (United States)

    Paek, Jeongyeup; Hicks, John; Coe, Sharon; Govindan, Ramesh

    2014-08-28

    This article discusses the experiences from the development and deployment of two image-based environmental monitoring sensor applications using an embedded wireless sensor network. Our system uses low-power image sensors and the Tenet general purpose sensing system for tiered embedded wireless sensor networks. It leverages Tenet's built-in support for reliable delivery of high rate sensing data, scalability and its flexible scripting language, which enables mote-side image compression and ease of deployment. Our first deployment of a pitfall trap monitoring application at the James San Jacinto Mountain Reserve provided us with insights and lessons learned into the deployment of, and compression schemes for, these embedded wireless imaging systems. Our three-month-long deployment of a bird nest monitoring application resulted in over 100,000 images collected from a 19-camera node network deployed over an area of 0.05 square miles, despite highly variable environmental conditions. Our biologists found the on-line, near-real-time access to images to be useful for obtaining data to answer their biological questions.

  2. Image-Based Environmental Monitoring Sensor Application Using an Embedded Wireless Sensor Network

    Directory of Open Access Journals (Sweden)

    Jeongyeup Paek

    2014-08-01

    This article discusses the experiences from the development and deployment of two image-based environmental monitoring sensor applications using an embedded wireless sensor network. Our system uses low-power image sensors and the Tenet general purpose sensing system for tiered embedded wireless sensor networks. It leverages Tenet’s built-in support for reliable delivery of high rate sensing data, scalability and its flexible scripting language, which enables mote-side image compression and the ease of deployment. Our first deployment of a pitfall trap monitoring application at the James San Jacinto Mountain Reserve provided us with insights and lessons learned into the deployment of and compression schemes for these embedded wireless imaging systems. Our three month-long deployment of a bird nest monitoring application resulted in over 100,000 images collected from a 19-camera node network deployed over an area of 0.05 square miles, despite highly variable environmental conditions. Our biologists found the on-line, near-real-time access to images to be useful for obtaining data on answering their biological questions.

  3. Micro-digital sun sensor: an imaging sensor for space applications

    NARCIS (Netherlands)

    Xie, N.; Theuwissen, A.J.P.; Büttgen, B.; Hakkesteegt, H.C.; Jasen, H.; Leijtens, J.A.P.

    2010-01-01

    The Micro-Digital Sun Sensor is an attitude sensor which senses the position of micro-satellites relative to the sun in space. It is composed of a solar cell power supply, an RF communication block and an imaging chip which is called the APS+. The APS+ integrates a CMOS Active Pixel Sensor (APS) of 512×512

  4. Toward CMOS image sensor based glucose monitoring.

    Science.gov (United States)

    Devadhasan, Jasmine Pramila; Kim, Sanghyo

    2012-09-07

    The complementary metal oxide semiconductor (CMOS) image sensor is a powerful tool for biosensing applications. In the present study, a CMOS image sensor has been exploited for detecting glucose levels from simple photon count variation with high sensitivity. Various concentrations of glucose (100 mg dL(-1) to 1000 mg dL(-1)) were added onto a simple poly-dimethylsiloxane (PDMS) chip and the oxidation of glucose was catalyzed by an enzymatic reaction. Oxidized glucose produces a brown color with the help of a chromogen during the enzymatic reaction, and the color density varies with the glucose concentration. Photons pass through the PDMS chip with varying color density and hit the sensor surface. The photon count registered by the CMOS image sensor depends on the color density, and hence on the glucose concentration, and is converted into digital form. By correlating the obtained digital results with glucose concentration, it is possible to measure a wide range of blood glucose levels with good linearity based on the CMOS image sensor, and this technique could therefore promote convenient point-of-care diagnosis.

  5. Establishing imaging sensor specifications for digital still cameras

    Science.gov (United States)

    Kriss, Michael A.

    2007-02-01

    Digital Still Cameras (DSCs) have now displaced conventional still cameras in most markets. The heart of a DSC is thought to be the imaging sensor, be it a full-frame CCD, an interline CCD, a CMOS sensor or the newer Foveon buried-photodiode sensor. There is a strong tendency by consumers to consider only the number of mega-pixels in a camera and not to consider the overall performance of the imaging system, including sharpness, artifact control, noise, color reproduction, exposure latitude and dynamic range. This paper will provide a systematic method to characterize the physical requirements of an imaging sensor and supporting system components based on the desired usage. The analysis is based on two software programs that determine the "sharpness", potential for artifacts, sensor "photographic speed", dynamic range and exposure latitude based on the physical nature of the imaging optics and sensor characteristics (including size of pixels, sensor architecture, noise characteristics, surface states that cause dark current, quantum efficiency, effective MTF, and the intrinsic full-well capacity in terms of electrons per square centimeter). Examples will be given for consumer, pro-consumer, and professional camera systems. Where possible, these results will be compared to imaging systems currently on the market.
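
    A small worked example of the kind of figures such an analysis ties together is shown below; the numeric values are illustrative assumptions, not data from the paper. Dynamic range is taken as full-well capacity over read noise, and peak SNR as the shot-noise limit at full well.

    ```python
    # Illustrative sensor figures of merit from assumed pixel parameters.
    import math

    pixel_pitch_um = 4.0                   # assumed pixel pitch
    full_well_per_um2 = 1500.0             # assumed full-well density, electrons / um^2
    read_noise_e = 5.0                     # assumed read noise, electrons rms

    full_well = full_well_per_um2 * pixel_pitch_um ** 2
    dynamic_range_db = 20 * math.log10(full_well / read_noise_e)
    snr_at_full_well_db = 20 * math.log10(math.sqrt(full_well))   # shot-noise-limited SNR

    print(f"full well: {full_well:.0f} e-")
    print(f"dynamic range: {dynamic_range_db:.1f} dB")
    print(f"peak SNR: {snr_at_full_well_db:.1f} dB")
    ```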

  6. Fusion of Images from Dissimilar Sensor Systems

    National Research Council Canada - National Science Library

    Chow, Khin

    2004-01-01

    Different sensors exploit different regions of the electromagnetic spectrum; therefore a multi-sensor image fusion system can take full advantage of the complementary capabilities of individual sensors in the suit...

  7. Collaborative Image Coding and Transmission over Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Min Wu

    2007-01-01

    Imaging sensors are able to provide intuitive visual information for quick recognition and decision-making. However, imaging sensors usually generate vast amounts of data. Therefore, processing and coding of image data collected in a sensor network for the purpose of energy-efficient transmission poses a significant technical challenge. In particular, multiple sensors may be collecting similar visual information simultaneously. We propose in this paper a novel collaborative image coding and transmission scheme to minimize the energy for data transmission. First, we apply a shape matching method to coarsely register images and find the maximal overlap, in order to exploit the spatial correlation between images acquired from neighboring sensors. For a given image sequence, we transmit the background image only once. A lightweight and efficient background subtraction method is employed to detect targets. Only the target regions and their spatial locations are transmitted to the monitoring center. The whole image can then be reconstructed by fusing the background and the target images as well as their spatial locations. Experimental results show that the energy for image transmission can indeed be greatly reduced with collaborative image coding and transmission.
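
    A minimal sketch of the target-extraction step described above: subtract a static background frame, threshold the difference, and transmit only the cropped target region plus its location instead of the whole frame. The threshold and array sizes are illustrative assumptions.

    ```python
    # Background subtraction followed by bounding-box extraction of the changed region.
    import numpy as np

    def extract_target_region(frame, background, threshold=25):
        diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
        mask = diff > threshold
        if not mask.any():
            return None                              # nothing to transmit for this frame
        ys, xs = np.nonzero(mask)
        y0, y1, x0, x1 = ys.min(), ys.max() + 1, xs.min(), xs.max() + 1
        return (y0, x0), frame[y0:y1, x0:x1]         # location + cropped target patch

    background = np.zeros((120, 160), dtype=np.uint8)
    frame = background.copy()
    frame[40:60, 70:90] = 200                        # synthetic "target"
    location, patch = extract_target_region(frame, background)
    print(location, patch.shape)                     # -> (40, 70) (20, 20)
    ```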

  8. An Imaging Sensor-Aided Vision Navigation Approach that Uses a Geo-Referenced Image Database.

    Science.gov (United States)

    Li, Yan; Hu, Qingwu; Wu, Meng; Gao, Yang

    2016-01-28

    Vision navigation determines position and attitude via real-time processing of data collected from imaging sensors, and can proceed without a high-performance global positioning system (GPS) or an inertial measurement unit (IMU). Vision navigation is widely used in indoor navigation, deep space navigation, and multiple-sensor-integrated mobile mapping. This paper proposes a novel vision navigation approach aided by imaging sensors that uses a high-accuracy geo-referenced image database (GRID) for high-precision navigation of multiple sensor platforms in environments with poor GPS coverage. First, the framework of GRID-aided vision navigation is developed with sequence images from land-based mobile mapping systems that integrate multiple sensors. Second, a highly efficient GRID storage management model is established based on the linear index of a road segment for fast image search and retrieval. Third, a robust image matching algorithm is presented to search and match a real-time image with the GRID. Subsequently, the image matched with the real-time scene is used to calculate the 3D navigation parameters of the multiple sensor platforms. Experimental results show that the proposed approach retrieves images efficiently and achieves navigation accuracies of 1.2 m in the horizontal plane and 1.8 m in height during GPS outages of 5 min and within 1500 m.

  9. CMOS Active-Pixel Image Sensor With Intensity-Driven Readout

    Science.gov (United States)

    Langenbacher, Harry T.; Fossum, Eric R.; Kemeny, Sabrina

    1996-01-01

    Proposed complementary metal oxide/semiconductor (CMOS) integrated-circuit image sensor automatically provides readouts from pixels in order of decreasing illumination intensity. Sensor operated in integration mode. Particularly useful in number of image-sensing tasks, including diffractive laser range-finding, three-dimensional imaging, event-driven readout of sparse sensor arrays, and star tracking.

  10. Commercial CMOS image sensors as X-ray imagers and particle beam monitors

    International Nuclear Information System (INIS)

    Castoldi, A.; Guazzoni, C.; Maffessanti, S.; Montemurro, G.V.; Carraresi, L.

    2015-01-01

    CMOS image sensors are widely used in several applications such as mobile handsets, webcams and digital cameras, among others. Furthermore, they are available across a wide range of resolutions with excellent spectral and chromatic responses. In order to fulfill the need for cheap systems as beam monitors and high-resolution image sensors for scientific applications, we exploited the possibility of using commercial CMOS image sensors as X-ray and proton detectors. Two different sensors have been mounted and tested. An Aptina MT9V034, featuring 752 × 480 pixels with a 6 μm × 6 μm pixel size, has been mounted and successfully tested as a two-dimensional beam profile monitor, able to take pictures of the incoming proton bunches at the DeFEL beamline (1–6 MeV pulsed proton beam) of the LaBeC of INFN in Florence. The naked sensor is able to successfully detect the interactions of single protons. The sensor point-spread function (PSF) has been qualified with 1 MeV protons and is equal to one pixel (6 μm) r.m.s. in both directions. A second sensor, an MT9M032, featuring 1472 × 1096 pixels with a 2.2 × 2.2 μm pixel size, has been mounted on a dedicated board as a high-resolution imager to be used in X-ray imaging experiments with table-top generators. In order to ease and simplify the data transfer and the image acquisition, the system is controlled by a dedicated micro-processor board (DM3730 1 GHz SoC ARM Cortex-A8) on which a modified LINUX kernel has been implemented. The paper presents the architecture of the sensor systems and the results of the experimental measurements.

  11. Virtual View Image over Wireless Visual Sensor Network

    Directory of Open Access Journals (Sweden)

    Gamantyo Hendrantoro

    2011-12-01

    In general, visual sensors are applied to build virtual view images. As the number of visual sensors increases, the quantity and quality of the information improve. However, virtual view image generation is a challenging task in a Wireless Visual Sensor Network environment due to energy restrictions, computational complexity, and bandwidth limitations. Hence, this paper presents a new method of virtual view image generation from selected cameras in a Wireless Visual Sensor Network. The aim of the paper is to meet bandwidth and energy limitations without reducing information quality. The experimental results show that this method can minimize the number of transmitted images while retaining sufficient information.

  12. Imaging system design and image interpolation based on CMOS image sensor

    Science.gov (United States)

    Li, Yu-feng; Liang, Fei; Guo, Rui

    2009-11-01

    An image acquisition system is introduced, which consists of a color CMOS image sensor (OV9620), SRAM (CY62148), a CPLD (EPM7128AE) and a DSP (TMS320VC5509A). The CPLD implements the logic and timing control of the system. The SRAM stores the image data, and the DSP controls the image acquisition system through the SCCB (OmniVision Serial Camera Control Bus). The timing sequence of the CMOS image sensor OV9620 is analyzed. The imaging part and the high-speed image data memory unit are designed. The hardware and software design of the image acquisition and processing system is given. CMOS digital cameras use color filter arrays to sample different spectral components, such as red, green, and blue. At the location of each pixel only one color sample is taken, and the other colors must be interpolated from neighboring samples. We use an edge-oriented adaptive interpolation algorithm for the edge pixels and a bilinear interpolation algorithm for the non-edge pixels to improve the visual quality of the interpolated images. This method achieves high processing speed, reduces computational complexity, and effectively preserves image edges.
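
    A simplified sketch of the interpolation strategy for a Bayer (RGGB) mosaic is shown below: at pixels where green is missing, the horizontal and vertical gradients of the neighboring green samples are compared and the direction with the smaller gradient is averaged (edge-oriented), while a plain bilinear average uses all four neighbors. This is an illustration of the general technique, not the paper's exact algorithm.

    ```python
    # Edge-oriented green-channel interpolation for an RGGB Bayer mosaic.
    import numpy as np

    def interpolate_green(raw):
        """raw: 2-D Bayer mosaic, RGGB layout; returns the interpolated green plane."""
        h, w = raw.shape
        g = raw.astype(float)
        for y in range(2, h - 2):
            for x in range(2, w - 2):
                if (y % 2, x % 2) in ((0, 0), (1, 1)):        # red or blue site: green missing
                    dh = abs(g[y, x - 1] - g[y, x + 1])       # horizontal green gradient
                    dv = abs(g[y - 1, x] - g[y + 1, x])       # vertical green gradient
                    if dh < dv:
                        g[y, x] = (g[y, x - 1] + g[y, x + 1]) / 2
                    elif dv < dh:
                        g[y, x] = (g[y - 1, x] + g[y + 1, x]) / 2
                    else:                                     # no dominant edge: bilinear
                        g[y, x] = (g[y, x - 1] + g[y, x + 1] + g[y - 1, x] + g[y + 1, x]) / 4
        return g

    mosaic = np.random.randint(0, 255, (64, 64)).astype(np.uint8)
    green = interpolate_green(mosaic)
    print(green.shape)
    ```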

  13. Smart CMOS image sensor for lightning detection and imaging.

    Science.gov (United States)

    Rolando, Sébastien; Goiffon, Vincent; Magnan, Pierre; Corbière, Franck; Molina, Romain; Tulet, Michel; Bréart-de-Boisanger, Michel; Saint-Pé, Olivier; Guiry, Saïprasad; Larnaudie, Franck; Leone, Bruno; Perez-Cuevas, Leticia; Zayer, Igor

    2013-03-01

    We present a CMOS image sensor dedicated to lightning detection and imaging. The detector has been designed to evaluate the potentiality of an on-chip lightning detection solution based on a smart sensor. This evaluation is performed in the frame of the predevelopment phase of the lightning detector that will be implemented in the Meteosat Third Generation Imager satellite for the European Space Agency. The lightning detection process is performed by a smart detector combining an in-pixel frame-to-frame difference comparison with an adjustable threshold and on-chip digital processing allowing an efficient localization of a faint lightning pulse on the entire large format array at a frequency of 1 kHz. A CMOS prototype sensor with a 256×256 pixel array and a 60 μm pixel pitch has been fabricated using a 0.35 μm 2P 5M technology and tested to validate the selected detection approach.
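
    A software analogue of the in-pixel frame-to-frame difference test could look like the following sketch; the threshold and frame size are arbitrary illustrative values, not the flight design.

```python
import numpy as np

def detect_transients(prev_frame, curr_frame, threshold=40):
    """Flag pixels whose frame-to-frame increase exceeds an adjustable threshold.

    Mimics, in software, the in-pixel difference comparison used for lightning
    detection; the threshold is an arbitrary illustrative value.
    """
    diff = curr_frame.astype(int) - prev_frame.astype(int)
    return np.argwhere(diff > threshold)     # (row, col) of candidate lightning pixels

rng = np.random.default_rng(1)
prev = rng.integers(0, 30, (256, 256))
curr = prev.copy()
curr[100, 128] += 200                        # simulate a faint, short optical pulse
print(detect_transients(prev, curr))         # -> [[100 128]]
```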

  14. Robust Dehaze Algorithm for Degraded Image of CMOS Image Sensors

    Directory of Open Access Journals (Sweden)

    Chen Qu

    2017-09-01

    Full Text Available The CMOS (Complementary Metal-Oxide-Semiconductor) sensor is a type of solid-state image sensor device widely used in object tracking, object recognition, intelligent navigation, and related fields. However, images captured by outdoor CMOS sensor devices are usually affected by suspended atmospheric particles (such as haze), causing reduced image contrast, color distortion, and other degradations. In view of this, we propose a novel dehazing approach based on a locally consistent Markov random field (MRF) framework. The neighboring clique in the traditional MRF is extended to a non-neighboring clique defined on locally consistent blocks based on two clues, where both the atmospheric light and the transmission map satisfy the property of local consistency. In this framework, our model can strengthen the constraints over the whole image while incorporating more sophisticated statistical priors, resulting in more expressive modeling power, thus solving inadequate detail recovery effectively and alleviating color distortion. Moreover, the locally consistent MRF framework obtains details while maintaining better dehazing results, which effectively improves the quality of images captured by the CMOS image sensor. Experimental results verified that the proposed method combines the advantages of detail recovery and color preservation.
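
    Dehazing methods of this kind ultimately invert the standard atmospheric scattering model I = J·t + A·(1 − t). The sketch below shows only that generic inversion, with the transmission map and atmospheric light assumed to be already estimated (for example by the local-consistency MRF of the paper); it is not the paper's own formulation.

```python
import numpy as np

def recover_scene_radiance(hazy, transmission, airlight, t_min=0.1):
    """Invert I = J*t + A*(1-t) for the haze-free radiance J.

    `transmission` (H, W) and `airlight` (scalar or per-channel) are assumed
    to come from some estimator; t_min avoids division by near-zero
    transmission in dense haze.
    """
    t = np.clip(transmission, t_min, 1.0)[..., None]     # broadcast over RGB
    return (hazy.astype(float) - airlight) / t + airlight
```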

  15. Oriented Edge-Based Feature Descriptor for Multi-Sensor Image Alignment and Enhancement

    Directory of Open Access Journals (Sweden)

    Myung-Ho Ju

    2013-10-01

    Full Text Available In this paper, we present an efficient image alignment and enhancement method for multi-sensor images. The shape of an object captured in multi-sensor images can be determined by comparing the variability of contrast of corresponding edges across the multi-sensor images. Using this cue, we construct a robust feature descriptor based on the magnitudes of the oriented edges. Our proposed method enables fast image alignment by identifying matching features in multi-sensor images. We enhance the aligned multi-sensor images through fusion of the salient regions from each image. The results of stitching the multi-sensor images and enhancing them demonstrate that our proposed method can align and enhance multi-sensor images more efficiently than previous methods.
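
    A generic descriptor built from the magnitudes of oriented edges, in the spirit of the approach described above, can be sketched as follows; the bin count and normalization are illustrative choices, not the authors' exact design.

```python
import numpy as np

def oriented_edge_descriptor(patch, n_bins=8):
    """Histogram of gradient magnitudes accumulated per edge-orientation bin.

    Magnitude-of-oriented-edge histograms stay comparable across sensors whose
    absolute intensities differ; bin count and normalization are illustrative.
    """
    gy, gx = np.gradient(patch.astype(float))
    magnitude = np.hypot(gx, gy)
    orientation = np.arctan2(gy, gx) % np.pi            # edge orientation, sign-agnostic
    bins = (orientation / np.pi * n_bins).astype(int) % n_bins
    desc = np.bincount(bins.ravel(), weights=magnitude.ravel(), minlength=n_bins)
    norm = np.linalg.norm(desc)
    return desc / norm if norm > 0 else desc
```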

  16. Contact CMOS imaging of gaseous oxygen sensor array.

    Science.gov (United States)

    Daivasagaya, Daisy S; Yao, Lei; Yi Yung, Ka; Hajj-Hassan, Mohamad; Cheung, Maurice C; Chodavarapu, Vamsy P; Bright, Frank V

    2011-10-01

    We describe a compact luminescent gaseous oxygen (O2) sensor microsystem based on the direct integration of sensor elements with a polymeric optical filter, placed on a low-power complementary metal-oxide semiconductor (CMOS) imager integrated circuit (IC). The sensor operates on the measurement of excited-state emission intensity of O2-sensitive luminophore molecules tris(4,7-diphenyl-1,10-phenanthroline) ruthenium(II) ([Ru(dpp)3]2+) encapsulated within sol-gel-derived xerogel thin films. The polymeric optical filter is made of polydimethylsiloxane (PDMS) mixed with a dye (Sudan-II). The PDMS membrane surface is molded to incorporate arrays of trapezoidal microstructures that serve to focus the optical sensor signals onto the imager pixels. The molded PDMS membrane is then attached to the PDMS color filter. The xerogel sensor arrays are contact-printed on top of the PDMS trapezoidal lens-like microstructures. The CMOS imager uses a 32 × 32 (1024 elements) array of active pixel sensors, and each pixel includes a high-gain phototransistor to convert the detected optical signals into electrical currents. A correlated double sampling circuit, pixel address, digital control and signal integration circuits are also implemented on-chip. The CMOS imager data are read out as a serial coded signal. The CMOS imager consumes a static power of 320 µW and an average dynamic power of 625 µW when operating at a 100 Hz sampling frequency and 1.8 V DC. This CMOS sensor system provides a useful platform for the development of miniaturized optical chemical gas sensors.

  17. Multi-sensor image fusion and its applications

    CERN Document Server

    Blum, Rick S

    2005-01-01

    Taking another lesson from nature, the latest advances in image processing technology seek to combine image data from several diverse types of sensors in order to obtain a more accurate view of the scene: very much the same as we rely on our five senses. Multi-Sensor Image Fusion and Its Applications is the first text dedicated to the theory and practice of the registration and fusion of image data, covering such approaches as statistical methods, color-related techniques, model-based methods, and visual information display strategies.After a review of state-of-the-art image fusion techniques,

  18. Fragmentation of percolation cluster perimeters

    Science.gov (United States)

    Debierre, Jean-Marc; Bradley, R. Mark

    1996-05-01

    We introduce a model for the fragmentation of porous random solids under the action of an external agent. In our model, the solid is represented by a bond percolation cluster on the square lattice and bonds are removed only at the external perimeter (or `hull') of the cluster. This model is shown to be related to the self-avoiding walk on the Manhattan lattice and to the disconnection events at a diffusion front. These correspondences are used to predict the leading and the first correction-to-scaling exponents for several quantities defined for hull fragmentation. Our numerical results support these predictions. In addition, the algorithm used to construct the perimeters reveals itself to be a very efficient tool for detecting subtle correlations in the pseudo-random number generator used. We present a quantitative test of two generators which supports recent results reported in more systematic studies.

  19. Fully wireless pressure sensor based on endoscopy images

    Science.gov (United States)

    Maeda, Yusaku; Mori, Hirohito; Nakagawa, Tomoaki; Takao, Hidekuni

    2018-04-01

    In this paper, the result of developing a fully wireless pressure sensor based on endoscopy images for endoscopic surgery is reported for the first time. The sensor device has a structural color with a nm-scale narrow gap, and the gap changes with air pressure. The structural color of the sensor is acquired from camera images, so pressure detection can be realized with existing endoscope configurations only. The sensor is intended to measure the inner air pressure of the human body during flexible-endoscope operation. Air pressure monitoring has two important purposes. The first is to quantitatively measure tumor size under a constant air pressure for treatment selection. The second is to prevent endangering the patient through over-transmission of air. The developed sensor was evaluated, and the detection principle based solely on endoscopy images has been successfully demonstrated.

  20. 3D-LSI technology for image sensor

    International Nuclear Information System (INIS)

    Motoyoshi, Makoto; Koyanagi, Mitsumasa

    2009-01-01

    Recently, the development of three-dimensional large-scale integration (3D-LSI) technologies has accelerated and has advanced from the research level or the limited production level to the investigation level, which might lead to mass production. By separating 3D-LSI technology into elementary technologies such as (1) through silicon via (TSV) formation, (2) bump formation, (3) wafer thinning, (4) chip/wafer alignment, and (5) chip/wafer stacking and reconstructing the entire process and structure, many methods to realize 3D-LSI devices can be developed. However, by considering a specific application, the supply chain of base wafers, and the purpose of 3D integration, a few suitable combinations can be identified. In this paper, we focus on the application of 3D-LSI technologies to image sensors. We describe the process and structure of the chip size package (CSP), developed on the basis of current and advanced 3D-LSI technologies, to be used in CMOS image sensors. Using the current LSI technologies, CSPs for 1.3 M, 2 M, and 5 M pixel CMOS image sensors were successfully fabricated without any performance degradation. 3D-LSI devices can be potentially employed in high-performance focal-plane-array image sensors. We propose a high-speed image sensor with an optical fill factor of 100% to be developed using next-generation 3D-LSI technology and fabricated using micro(μ)-bumps and micro(μ)-TSVs.

  1. Image acquisition system using on sensor compressed sampling technique

    Science.gov (United States)

    Gupta, Pravir Singh; Choi, Gwan Seong

    2018-01-01

    Advances in CMOS technology have made high-resolution image sensors possible. These image sensors pose significant challenges in terms of the amount of raw data generated, energy efficiency, and frame rate. This paper presents a design methodology for an imaging system and a simplified image sensor pixel design to be used in the system so that the compressed sensing (CS) technique can be implemented easily at the sensor level. This results in significant energy savings as it not only cuts the raw data rate but also reduces transistor count per pixel; decreases pixel size; increases fill factor; simplifies analog-to-digital converter, JPEG encoder, and JPEG decoder design; decreases wiring; and reduces the decoder size by half. Thus, CS has the potential to increase the resolution of image sensors for a given technology and die size while significantly decreasing the power consumption and design complexity. We show that it has potential to reduce power consumption by about 23% to 65%.
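
    One way to picture on-sensor compressed sampling is a set of random binary measurement patterns applied during exposure, so that only M << N coded sums leave the sensor. The sketch below shows the measurement side only, with array sizes and mask statistics chosen arbitrarily; sparse reconstruction would run off-sensor and is omitted.

```python
import numpy as np

rng = np.random.default_rng(42)
H, W = 64, 64                      # illustrative pixel array size
N = H * W
M = N // 4                         # keep 25% as many measurements as pixels

# Random binary sensing patterns, as might be applied at the pixel level.
Phi = rng.integers(0, 2, (M, N)).astype(float)

scene = rng.random((H, W))          # stand-in for the optical image on the sensor
measurements = Phi @ scene.ravel()  # the only data that needs to be read out

# Reconstruction (sparse recovery in a DCT/wavelet basis with an iterative
# solver) would be performed off-sensor and is omitted here.
print(measurements.shape)           # (1024,) instead of 4096 raw pixel values
```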

  2. CMOS sensors for atmospheric imaging

    Science.gov (United States)

    Pratlong, Jérôme; Burt, David; Jerram, Paul; Mayer, Frédéric; Walker, Andrew; Simpson, Robert; Johnson, Steven; Hubbard, Wendy

    2017-09-01

    Recent European atmospheric imaging missions have seen a move towards the use of CMOS sensors for the visible and NIR parts of the spectrum. These applications have particular challenges that are completely different to those that have driven the development of commercial sensors for applications such as cell-phone or SLR cameras. This paper will cover the design and performance of general-purpose image sensors that are to be used in the MTG (Meteosat Third Generation) and MetImage satellites and the technology challenges that they have presented. We will discuss how CMOS imagers have been designed with 4T pixel sizes of up to 250 μm square achieving good charge transfer efficiency, or low lag, with signal levels up to 2M electrons and with high line rates. In both devices a low noise analogue read-out chain is used with correlated double sampling to suppress the readout noise and give a maximum dynamic range that is significantly larger than in standard commercial devices. Radiation hardness is a particular challenge for CMOS detectors and both of these sensors have been designed to be fully radiation hard with high latch-up and single-event-upset tolerances, which is now silicon proven on MTG. We will also cover the impact of ionising radiation on these devices. Because with such large pixels the photodiodes have a large open area, front illumination technology is sufficient to meet the detection efficiency requirements but with thicker than standard epitaxial silicon to give improved IR response (note that this makes latch up protection even more important). However with narrow band illumination reflections from the front and back of the dielectric stack on the top of the sensor produce Fabry-Perot étalon effects, which have been minimised with process modifications. We will also cover the addition of precision narrow band filters inside the MTG package to provide a complete imaging subsystem. Control of reflected light is also critical in obtaining the

  3. A Wildlife Monitoring System Based on Wireless Image Sensor Networks

    Directory of Open Access Journals (Sweden)

    Junguo Zhang

    2014-10-01

    Full Text Available The survival and development of wildlife sustain the balance and stability of the entire ecosystem. Wildlife monitoring can provide much information, such as wildlife species, quantity, habits, quality of life and habitat conditions, to help researchers grasp the status and dynamics of wildlife resources and to provide a basis for the effective protection, sustainable use, and scientific management of wildlife resources. Wildlife monitoring is the foundation of wildlife protection and management. Wireless Sensor Network (WSN) technology has become one of the most popular technologies in the information field. With advances in CMOS image sensor technology, wireless sensor networks combined with image sensors, namely Wireless Image Sensor Network (WISN) technology, have emerged as an alternative for monitoring applications, and monitoring wildlife is one of their most promising applications. In this paper, the system architecture of a wildlife monitoring system based on wireless image sensor networks is presented to overcome the shortcomings of traditional monitoring methods. Specifically, key issues including the design of the wireless image sensor nodes and the software process design have been studied and presented. A self-powered, rotatable wireless infrared image sensor node based on ARM and an aggregation node designed for large amounts of data were developed, and their corresponding software was designed. The proposed system is able to monitor wildlife accurately, automatically, and remotely in all weather conditions, which lays the foundation for applications of wireless image sensor networks in wildlife monitoring.

  4. A time-resolved image sensor for tubeless streak cameras

    Science.gov (United States)

    Yasutomi, Keita; Han, SangMan; Seo, Min-Woong; Takasawa, Taishi; Kagawa, Keiichiro; Kawahito, Shoji

    2014-03-01

    This paper presents a time-resolved CMOS image sensor with draining-only modulation (DOM) pixels for tube-less streak cameras. Although the conventional streak camera has high time resolution, it requires high voltage and a bulky system because of its vacuum-tube structure. The proposed time-resolved imager with simple optics realizes a streak camera without any vacuum tubes. The proposed image sensor has DOM pixels, a delay-based pulse generator, and readout circuitry. The delay-based pulse generator, in combination with in-pixel logic, allows us to create and provide a short gating clock to the pixel array. A prototype time-resolved CMOS image sensor with the proposed pixel was designed and implemented using 0.11 μm CMOS image sensor technology. The image array has 30 (vertical) × 128 (memory length) pixels with a pixel pitch of 22.4 μm.

  5. CMOS Imaging Sensor Technology for Aerial Mapping Cameras

    Science.gov (United States)

    Neumann, Klaus; Welzenbach, Martin; Timm, Martin

    2016-06-01

    In June 2015 Leica Geosystems launched the first large-format aerial mapping camera using CMOS sensor technology, the Leica DMC III. This paper describes the motivation to change from CCD sensor technology to CMOS for the development of this new aerial mapping camera. In 2002 the first-generation DMC was developed by Z/I Imaging; it was the first large-format digital frame sensor designed for mapping applications. In 2009 Z/I Imaging designed the DMC II, which was the first digital aerial mapping camera to use a single ultra-large CCD sensor to avoid stitching of smaller CCDs. The DMC III is now the third generation of large-format frame sensor developed by Z/I Imaging and Leica Geosystems for the DMC camera family. It is an evolution of the DMC II, using the same system design with one large monolithic PAN sensor and four multispectral camera heads for R, G, B and NIR. For the first time a 391-megapixel CMOS sensor has been used as the panchromatic sensor, which is an industry record. Along with CMOS technology comes a range of technical benefits: the dynamic range of the CMOS sensor is approximately twice that of a comparable CCD sensor, and the signal-to-noise ratio is significantly better than with CCDs. Finally, results from the first DMC III customer installations and test flights are presented and compared with other CCD-based aerial sensors.

  6. Improved Denoising via Poisson Mixture Modeling of Image Sensor Noise.

    Science.gov (United States)

    Zhang, Jiachao; Hirakawa, Keigo

    2017-04-01

    This paper describes a study aimed at comparing the real image sensor noise distribution to the noise models often assumed in image denoising designs. A quantile analysis in the pixel, wavelet transform, and variance stabilization domains reveals that the tails of the Poisson, signal-dependent Gaussian, and Poisson-Gaussian models are too short to capture real sensor noise behavior. A new Poisson mixture noise model is proposed to correct the mismatch in tail behavior. Based on the fact that noise model mismatch results in image denoising that undersmooths real sensor data, we propose a mixture-of-Poisson denoising method to remove the denoising artifacts without affecting image details, such as edges and textures. Experiments with real sensor data verify that denoising of real image sensor data is indeed improved by this new technique.
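
    For intuition, the widely assumed Poisson-Gaussian sensor noise model, which the study argues has tails too short for real sensor data, can be simulated directly. Gain and read-noise values below are arbitrary illustrations.

```python
import numpy as np

def poisson_gaussian_noise(clean, gain=2.0, read_noise_sigma=3.0, rng=None):
    """Simulate the signal-dependent Poisson-Gaussian sensor noise model.

    Photon shot noise is Poisson in the photo-electron count; readout adds
    zero-mean Gaussian noise. Gain and sigma are illustrative values only.
    """
    if rng is None:
        rng = np.random.default_rng()
    electrons = rng.poisson(np.maximum(clean, 0.0) / gain) * gain       # shot noise
    return electrons + rng.normal(0.0, read_noise_sigma, clean.shape)   # read noise

clean = np.full((512, 512), 100.0)
noisy = poisson_gaussian_noise(clean, rng=np.random.default_rng(0))
print(noisy.std())   # roughly sqrt(gain*mean + sigma^2) = sqrt(209) ~ 14.5
```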

  7. Self-Similarity Superresolution for Resource-Constrained Image Sensor Node in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Yuehai Wang

    2014-01-01

    Full Text Available Wireless sensor networks, in combination with image sensors, open up a broad sensing application field. It is a challenging problem to recover a high-resolution (HR) image from its low-resolution (LR) counterpart, especially for low-cost, resource-constrained image sensors with limited resolution. Sparse-representation-based techniques have been developed recently and are increasingly used to solve this ill-posed inverse problem. Most of these solutions are based on an external dictionary learned from a huge image gallery, and consequently need tremendous iteration and a long time to match. In this paper, we explore the self-similarity inside the image itself and propose a new combined self-similarity super-resolution (SR) solution with low computation cost and high recovery performance. In the self-similarity image super-resolution model (SSIR), a small sparse dictionary is learned from the image itself by methods such as K-SVD. The most similar patch is searched and specially combined during the sparse regularization iteration. Detailed information, such as edge sharpness, is preserved more faithfully and clearly. Experimental results confirm the effectiveness and efficiency of this double self-learning method for image super-resolution.

  8. Coupled wave sensor technology

    International Nuclear Information System (INIS)

    Maki, M.C.

    1988-01-01

    Buried-line guided radar sensors have been used successfully for a number of years to provide perimeter security for high-value resources. This paper introduces a new complementary sensor advancement at Computing Devices termed 'coupled wave device technology' (CWD). It provides many of the inherent advantages of leaky-cable sensors, such as terrain-following and the ability to discriminate between humans and small animals. It is also able to provide a high or wide detection zone, and allows the sensor to be mounted aerially and adjacent to a wall or fence. Several alternative sensors have been developed, including a single-line sensor, a dual-line hybrid sensor that combines the elements of ported coax and CWD technology, and a rapid-deployment portable sensor for temporary or mobile applications. A description of the technology, the sensors, and their characteristics is provided.

  9. High-speed imaging using CMOS image sensor with quasi pixel-wise exposure

    Science.gov (United States)

    Sonoda, T.; Nagahara, H.; Endo, K.; Sugiyama, Y.; Taniguchi, R.

    2017-02-01

    Several recent studies in compressive video sensing have realized scene capture beyond the fundamental trade-off limit between spatial resolution and temporal resolution by using random space-time sampling. However, most of these studies showed results for higher-frame-rate video produced by simulation experiments or using an optically simulated random sampling camera, because there are currently no commercially available image sensors with random exposure or sampling capabilities. We fabricated a prototype complementary metal oxide semiconductor (CMOS) image sensor with quasi pixel-wise exposure timing that can realize nonuniform space-time sampling. The prototype sensor can reset exposures independently by columns and fix the amount of exposure by rows for each 8 × 8 pixel block. This CMOS sensor is not fully controllable at the pixel level and has line-dependent controls, but it offers flexibility compared with regular CMOS or charge-coupled device sensors with global or rolling shutters. We propose a method to realize pseudo-random sampling for high-speed video acquisition that exploits the flexibility of the CMOS sensor, and we reconstruct the high-speed video sequence from the images produced by pseudo-random sampling using an over-complete dictionary.
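
    The line-dependent control described above (exposure start reset per column, exposure length fixed per row, within 8 × 8 blocks) can be emulated to see which space-time sampling patterns are reachable. The block size follows the text; everything else is an illustrative assumption.

```python
import numpy as np

def block_exposure_mask(n_frames=16, block=8, rng=None):
    """Build a space-time sampling mask for one 8 x 8 pixel block.

    The column index sets when a pixel's exposure starts (reset by columns)
    and the row index sets how long it stays open (length fixed by rows),
    a constrained, line-dependent approximation of fully random sampling.
    """
    if rng is None:
        rng = np.random.default_rng()
    start = rng.integers(0, n_frames, block)            # one start time per column
    length = rng.integers(1, n_frames // 2, block)       # one exposure length per row
    mask = np.zeros((n_frames, block, block), dtype=bool)
    for r in range(block):
        for c in range(block):
            s = start[c]
            mask[s:min(s + length[r], n_frames), r, c] = True   # pixel integrates here
    return mask

mask = block_exposure_mask(rng=np.random.default_rng(3))
print(mask.mean())    # fraction of space-time samples captured by this block
```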

  10. 21 CFR 886.1605 - Perimeter.

    Science.gov (United States)

    2010-04-01

    21 CFR 886.1605 (Food and Drugs; Food and Drug Administration, Department of Health and Human Services; Medical Devices) defines a perimeter as an AC-powered or manual device intended to determine the extent of the peripheral visual field of a patient.

  11. Thermoelectric infrared imaging sensors for automotive applications

    Science.gov (United States)

    Hirota, Masaki; Nakajima, Yasushi; Saito, Masanori; Satou, Fuminori; Uchiyama, Makoto

    2004-07-01

    This paper describes three low-cost thermoelectric infrared imaging sensors having 1,536-, 2,304-, and 10,800-element thermoelectric focal plane arrays (FPAs), respectively, and two experimental automotive application systems. The FPAs are fabricated with a conventional IC process plus micromachining technologies and therefore have low-cost potential. Among these sensors, the 2,304-element sensor provides a high responsivity of 5,500 V/W and a very small size by adopting a vacuum-sealed package integrated with a wide-angle ZnS lens. One experimental system, incorporated in the Nissan ASV-2, is a blind-spot pedestrian warning system that employs four infrared imaging sensors. This system helps alert the driver to the presence of a pedestrian in a blind spot by detecting the infrared radiation emitted from the person's body, and it can also prevent the vehicle from moving in the direction of the pedestrian. The other is a rearview camera system with an infrared detection function. This system consists of a visible camera and infrared sensors, and it helps alert the driver to the presence of a pedestrian in a rear blind spot. Various issues that will need to be addressed in order to expand the automotive applications of IR imaging sensors in the future are also summarized. This performance is suitable for consumer electronics as well as automotive applications.

  12. CMOS image sensor-based immunodetection by refractive-index change.

    Science.gov (United States)

    Devadhasan, Jasmine P; Kim, Sanghyo

    2012-01-01

    A complementary metal oxide semiconductor (CMOS) image sensor is an intriguing technology for the development of novel biosensors. Indeed, the mechanism by which a CMOS image sensor detects antigen-antibody (Ag-Ab) interactions at the nanoscale has been ambiguous so far, and more extensive research has been necessary to understand it and to achieve point-of-care diagnostic devices. This research demonstrates CMOS image sensor-based analysis of Ag-Ab interactions of cardiovascular disease markers, such as C-reactive protein (CRP) and troponin I, on indium nanoparticle (InNP) substrates through simple photon count variation. The developed sensor can detect proteins even at fg/mL concentrations under ordinary room light. Possible mechanisms, such as dielectric constant and refractive-index changes, have been studied and proposed. A dramatic change in the refractive index after protein adsorption on the InNP substrate was observed to be the predominant factor in CMOS image sensor-based immunoassay.

  13. Numerical quantification and minimization of perimeter losses in high-efficiency silicon solar cells

    Energy Technology Data Exchange (ETDEWEB)

    Altermatt, P.P.; Heiser, Gernot; Green, M.A. [New South Wales Univ., Kensington, NSW (Australia)

    1996-09-01

    This paper presents a quantitative analysis of perimeter losses in high-efficiency silicon solar cells. A new method of numerical modelling is used, which provides the means to simulate a full-sized solar cell, including its perimeter region. We analyse the reduction in efficiency due to perimeter losses as a function of the distance between the active cell area and the cut edge. It is shown how the optimum distance depends on whether the cells in the panel are shingled or not. The simulations also indicate that passivating the cut-face with a thermal oxide does not increase cell efficiency substantially. Therefore, doping schemes for the perimeter domain are suggested in order to increase efficiency levels above present standards. Finally, perimeter effects in cells that remain embedded in the wafer during the efficiency measurement are outlined. (author)

  14. Vision communications based on LED array and imaging sensor

    Science.gov (United States)

    Yoo, Jong-Ho; Jung, Sung-Yoon

    2012-11-01

    In this paper, we propose a new communication concept, called "vision communication", based on an LED array and an image sensor. This system consists of an LED array as the transmitter and a digital device that includes an image sensor, such as a CCD or CMOS sensor, as the receiver. To transmit data, the proposed communication scheme simultaneously uses digital image processing and optical wireless communication techniques; therefore, a cognitive communication scheme is possible with the help of recognition techniques used in vision systems. To increase the data rate, our scheme can use an LED array consisting of several multi-spectral LEDs. Because each LED in the array can emit a multi-spectral optical signal, such as visible, infrared or ultraviolet light, the data rate can be increased in a manner similar to the WDM and MIMO techniques used in traditional optical and wireless communications. In addition, this multi-spectral capability also makes it possible to avoid optical noise in the communication environment. In our vision communication scheme, the data packet is composed of Sync. data and information data. The Sync. data are used to detect the transmitter area and calibrate the distorted image snapshots obtained by the image sensor. By matching the optical signalling rate of the LED array to the frame rate (frames per second) of the image sensor, we can decode the information data included in each image snapshot based on image processing and optical wireless communication techniques. Through experiments on a practical test-bed system, we confirm the feasibility of the proposed vision communication based on an LED array and an image sensor.
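
    On the receiver side, decoding one snapshot reduces to sampling the image at the LED grid positions recovered from the Sync. data and thresholding each region of interest. Grid geometry, ROI size, and threshold in the sketch below are illustrative assumptions.

```python
import numpy as np

def decode_led_array(frame, grid_centers, roi=5, threshold=128):
    """Read one bit per LED from a calibrated camera frame.

    `grid_centers` is the list of (row, col) LED positions recovered from the
    Sync. data; ROI size and intensity threshold are illustrative choices.
    """
    half = roi // 2
    bits = []
    for r, c in grid_centers:
        patch = frame[r - half:r + half + 1, c - half:c + half + 1]
        bits.append(1 if patch.mean() > threshold else 0)
    return bits

# Example: a 4x4 LED array imaged by the sensor (synthetic frame).
frame = np.zeros((120, 160), dtype=np.uint8)
centers = [(30 + 20 * i, 40 + 20 * j) for i in range(4) for j in range(4)]
for (r, c), bit in zip(centers, np.random.default_rng(7).integers(0, 2, 16)):
    if bit:
        frame[r - 2:r + 3, c - 2:c + 3] = 255
print(decode_led_array(frame, centers))
```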

  15. Design and Fabrication of Vertically-Integrated CMOS Image Sensors

    Science.gov (United States)

    Skorka, Orit; Joseph, Dileepan

    2011-01-01

    Technologies to fabricate integrated circuits (IC) with 3D structures are an emerging trend in IC design. They are based on vertical stacking of active components to form heterogeneous microsystems. Electronic image sensors will benefit from these technologies because they allow increased pixel-level data processing and device optimization. This paper covers general principles in the design of vertically-integrated (VI) CMOS image sensors that are fabricated by flip-chip bonding. These sensors are composed of a CMOS die and a photodetector die. As a specific example, the paper presents a VI-CMOS image sensor that was designed at the University of Alberta, and fabricated with the help of CMC Microsystems and Micralyne Inc. To realize prototypes, CMOS dies with logarithmic active pixels were prepared in a commercial process, and photodetector dies with metal-semiconductor-metal devices were prepared in a custom process using hydrogenated amorphous silicon. The paper also describes a digital camera that was developed to test the prototype. In this camera, scenes captured by the image sensor are read using an FPGA board, and sent in real time to a PC over USB for data processing and display. Experimental results show that the VI-CMOS prototype has a higher dynamic range and a lower dark limit than conventional electronic image sensors. PMID:22163860

  16. Historical Fire Perimeters - Southern California [ds384

    Data.gov (United States)

    California Natural Resource Agency — CDF, USDA Forest Service Region 5, BLM, NPS, Contract Counties and other agencies jointly maintain a comprehensive fire perimeter GIS layer for public and private...

  17. CMOS-sensors for energy-resolved X-ray imaging

    International Nuclear Information System (INIS)

    Doering, D.; Amar-Youcef, S.; Deveaux, M.; Linnik, B.; Müntz, C.; Stroth, Joachim; Baudot, J.; Dulinski, W.; Kachel, M.

    2016-01-01

    Due to their low noise, CMOS Monolithic Active Pixel Sensors are suited to sensing X-rays with quantum energies of a few keV, which is of interest for high-resolution X-ray imaging. Moreover, the good energy resolution of the silicon sensors might be used to measure this quantum energy. Combining both features with the good spatial resolution of CMOS sensors opens the potential to build 'color-sensitive' X-ray cameras. Taking such colored images is hampered by the need to operate the CMOS sensors in a single-photon-counting mode, which restricts the photon flux capability of the sensors. More importantly, the charge sharing between pixels smears the potentially good energy resolution of the sensors. Based on our experience with CMOS sensors for charged-particle tracking, we studied techniques to overcome the latter by means of offline processing of the data obtained from a CMOS sensor prototype. We found that the energy resolution of the pixels can be recovered at the expense of reduced quantum efficiency. We introduce the results of our study and discuss the feasibility of taking colored X-ray pictures with CMOS sensors.
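
    Recovering per-photon energy despite charge sharing typically means clustering neighbouring above-threshold pixels in a single-photon-counting frame and summing their charge. The sketch below is one generic way to do that, with arbitrary seed and neighbour thresholds; it is not the specific offline processing used in the study.

```python
import numpy as np

def sum_photon_clusters(frame, seed_thresh=30.0, neigh_thresh=5.0):
    """Sum charge-shared clusters so each photon yields one energy value.

    Pixels above `seed_thresh` start a cluster; 4-connected neighbours above
    `neigh_thresh` are added to it. Both thresholds are illustrative.
    """
    energies = []
    claimed = np.zeros(frame.shape, dtype=bool)
    for r, c in zip(*np.nonzero(frame > seed_thresh)):
        if claimed[r, c]:
            continue
        stack, total = [(r, c)], 0.0
        while stack:
            y, x = stack.pop()
            if claimed[y, x] or frame[y, x] <= neigh_thresh:
                continue
            claimed[y, x] = True
            total += frame[y, x]
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < frame.shape[0] and 0 <= nx < frame.shape[1]:
                    stack.append((ny, nx))
        energies.append(total)
    return energies   # histogram these to recover the X-ray spectrum ("colors")

frame = np.zeros((8, 8))
frame[3, 3], frame[3, 4] = 40.0, 25.0    # one photon, charge shared over two pixels
print(sum_photon_clusters(frame))         # -> [65.0]
```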

  18. Study of photoconductor-based radiological image sensors

    International Nuclear Information System (INIS)

    Beaumont, Francois

    1989-01-01

    Because of the evolution of medical imaging techniques towards digital systems, it is necessary to replace radiological film, which has many drawbacks, with a detector that is just as efficient and quickly yields a digitizable signal. The purpose of this thesis is to find new X-ray digital imaging processes using photoconductor materials such as amorphous selenium. After reviewing the principle of direct radiology and the functions to be served by the X-ray sensor (i.e. detection, memory, assignment, visualization), we explain the specifications. We especially show the constraints due to the object to be radiographed (condition of minimal exposure) and to the readout signal (electronic detection noise associated with a readout frequency). As a result of this study, a first photoconductor sensor could be designed. Its principle is based on photo-carrier trapping at the dielectric-photoconductor interface, and readout requires scanning a laser beam across the sensor surface. The dielectric-photoconductor structure enabled us to estimate the possibilities offered by the sensor and to build a complete X-ray imaging system. The originality of the thermo-dielectric sensor, studied next, is that it allows a thermally assigned readout. The chosen system consists in varying the capacitance of a ferroelectric polymer whose dielectric permittivity is weak at room temperature. The thermo-dielectric material was studied under thermal or Joule-effect stimulation. During our experiments, trapping was found in a sensor made of amorphous selenium between two electrodes. This new effect was characterized and enabled us to propose a first interpretation. Finally, the comparison of these new sensor concepts with radiological film shows the advantage of the proposed solution. (author) [fr]

  19. Fabricating Optical Fiber Imaging Sensors Using Inkjet Printing Technology: a pH Sensor Proof-of-Concept

    Energy Technology Data Exchange (ETDEWEB)

    Carter, J C; Alvis, R M; Brown, S B; Langry, K C; Wilson, T S; McBride, M T; Myrick, M L; Cox, W R; Grove, M E; Colston, B W

    2005-03-01

    We demonstrate the feasibility of using Drop-on-Demand microjet printing technology for fabricating imaging sensors by reproducibly printing an array of photopolymerizable sensing elements, containing a pH-sensitive indicator, on the surface of an optical fiber image guide. The reproducibility of the microjet printing process is excellent for microdot (i.e. micron-sized polymer) sensor diameter (92.2 ± 2.2 microns), height (35.0 ± 1.0 microns), and roundness (0.00072 ± 0.00023). The pH sensors were evaluated in terms of pH sensing ability (≤2% sensor variation), response time, and hysteresis using a custom fluorescence imaging system. In addition, the microjet technique has distinct advantages over other fabrication methods, which are discussed in detail.

  20. CMOS Imaging of Pin-Printed Xerogel-Based Luminescent Sensor Microarrays.

    Science.gov (United States)

    Yao, Lei; Yung, Ka Yi; Khan, Rifat; Chodavarapu, Vamsy P; Bright, Frank V

    2010-12-01

    We present the design and implementation of a luminescence-based miniaturized multisensor system using pin-printed xerogel materials which act as host media for chemical recognition elements. We developed a CMOS imager integrated circuit (IC) to image the luminescence response of the xerogel-based sensor array. The imager IC uses a 26 × 20 (520 elements) array of active pixel sensors, and each active pixel includes a high-gain phototransistor to convert the detected optical signals into electrical currents. The imager includes a correlated double sampling circuit and a pixel address/digital control circuit; the image data are read out as a coded serial signal. The sensor system uses a light-emitting diode (LED) to excite the target-analyte-responsive luminophores doped within discrete xerogel-based sensor elements. As a prototype, we developed a 4 × 4 (16 elements) array of oxygen (O2) sensors. Each group of 4 sensor elements in the array (arranged in a row) is designed to provide a different and specific sensitivity to the target gaseous O2 concentration. This property of multiple sensitivities is achieved by using a strategic mix of two oxygen-sensitive luminophores ([Ru(dpp)3]2+ and [Ru(bpy)3]2+) in each pin-printed xerogel sensor element. The CMOS imager consumes an average power of 8 mW operating at a 1 kHz sampling frequency driven at 5 V. The developed prototype demonstrates a low-cost and miniaturized luminescence multisensor system.

  1. Lightning Imaging Sensor (LIS) on TRMM Science Data V4

    Data.gov (United States)

    National Aeronautics and Space Administration — The Lightning Imaging Sensor (LIS) Science Data was collected by the Lightning Imaging Sensor (LIS), which was an instrument on the Tropical Rainfall Measurement...

  2. Fingerprint image reconstruction for swipe sensor using Predictive Overlap Method

    Directory of Open Access Journals (Sweden)

    Mardiansyah Ahmad Zafrullah

    2018-01-01

    Full Text Available The swipe sensor is one of many types of biometric authentication sensors widely applied in embedded devices. The sensor produces an overlap in every pixel block of the image, so the picture requires a reconstruction process before the feature extraction process. Conventional reconstruction methods require extensive computation, making them difficult to apply to embedded devices with limited computing capability. In this paper, image reconstruction using a predictive overlap method is proposed, which determines the image block shift from the previous set of shift data. The experiments were performed using 36 images generated by a swipe sensor with an area of 128 × 8 pixels, where each image has an overlap in each block. The results reveal that computational performance can improve by up to 86.44% compared with conventional methods, with accuracy decreasing by only 0.008% on average.
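
    Reconstruction from a swipe sensor amounts to estimating the row overlap between consecutive slices and appending only the new rows. The exhaustive search below is a generic baseline; a predictive-overlap variant would seed the search with the shift found for the previous slice pair. All sizes and the error metric are illustrative assumptions.

```python
import numpy as np

def estimate_shift(prev_slice, curr_slice, max_shift=7):
    """Return the row shift (0..max_shift) that best aligns two 8-row slices."""
    best_shift, best_err = 0, np.inf
    for s in range(max_shift + 1):
        overlap = prev_slice.shape[0] - s
        if overlap <= 0:
            break
        err = np.mean((prev_slice[s:].astype(float)
                       - curr_slice[:overlap].astype(float)) ** 2)
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift

def stitch(slices):
    """Stitch overlapping slices; only the rows that moved past the overlap are new.

    A predictive-overlap variant would start the shift search at the value
    found for the previous slice pair instead of scanning the full range.
    """
    image = slices[0].astype(float)
    for prev, curr in zip(slices, slices[1:]):
        s = estimate_shift(prev, curr)
        if s:
            image = np.vstack([image, curr[-s:].astype(float)])
    return image
```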

  3. The challenge of sCMOS image sensor technology to EMCCD

    Science.gov (United States)

    Chang, Weijing; Dai, Fang; Na, Qiyue

    2018-02-01

    In the field of low-illumination image sensors, the noise of the latest scientific-grade CMOS image sensors is close to that of EMCCDs, and the industry considers sCMOS to have the potential to compete with and even replace EMCCD. We therefore selected several typical sCMOS and EMCCD image sensors and cameras and compared their performance parameters. The results show that the signal-to-noise ratio of sCMOS is close to that of EMCCD, and the other parameters are superior. However, the signal-to-noise ratio is very important for low-illumination imaging, and the actual imaging results of sCMOS are not ideal. EMCCD therefore remains the first choice in high-performance application fields.

  4. A bio-image sensor for simultaneous detection of multi-neurotransmitters.

    Science.gov (United States)

    Lee, You-Na; Okumura, Koichi; Horio, Tomoko; Iwata, Tatsuya; Takahashi, Kazuhiro; Hattori, Toshiaki; Sawada, Kazuaki

    2018-03-01

    We report here a new bio-image sensor for simultaneous detection of the spatial and temporal distribution of multiple neurotransmitters. It consists of multiple enzyme-immobilized membranes on a 128 × 128 pixel array with readout circuitry. Apyrase and acetylcholinesterase (AChE), as selective elements, are used to recognize adenosine 5'-triphosphate (ATP) and acetylcholine (ACh), respectively. To enhance the spatial resolution, hydrogen ion (H+) diffusion barrier layers are deposited on top of the bio-image sensor, and their prevention capability is demonstrated. The results are used to design the spacing among enzyme-immobilized pixels and the null H+ sensor so as to minimize undesired signal overlap caused by H+ diffusion. Using this bio-image sensor, we can obtain H+-diffusion-independent imaging of concentration gradients of ATP and ACh in real time. The sensing characteristics, such as sensitivity and limit of detection, are determined experimentally. With the proposed bio-image sensor, it becomes possible to customize monitoring of the activities of various neurochemicals by using different kinds of proton-consuming or proton-generating enzymes. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. High dynamic range imaging sensors and architectures

    CERN Document Server

    Darmont, Arnaud

    2013-01-01

    Illumination is a crucial element in many applications, matching the luminance of the scene with the operational range of a camera. When luminance cannot be adequately controlled, a high dynamic range (HDR) imaging system may be necessary. These systems are being increasingly used in automotive on-board systems, road traffic monitoring, and other industrial, security, and military applications. This book provides readers with an intermediate discussion of HDR image sensors and techniques for industrial and non-industrial applications. It describes various sensor and pixel architectures capable

  6. Experimental single-chip color HDTV image acquisition system with 8M-pixel CMOS image sensor

    Science.gov (United States)

    Shimamoto, Hiroshi; Yamashita, Takayuki; Funatsu, Ryohei; Mitani, Kohji; Nojiri, Yuji

    2006-02-01

    We have developed an experimental single-chip color HDTV image acquisition system using an 8M-pixel CMOS image sensor. The sensor has 3840 × 2160 effective pixels and is progressively scanned at 60 frames per second. We describe the color filter array and interpolation method used to improve image quality with a high-pixel-count single-chip sensor. We also describe the experimental image acquisition system we used to measure spatial frequency characteristics in the horizontal direction. The results indicate good prospects for achieving a high-quality single-chip HDTV camera that reduces pseudo signals and maintains high spatial frequency characteristics within the frequency band for HDTV.

  7. Real-time estimation of wildfire perimeters from curated crowdsourcing

    Science.gov (United States)

    Zhong, Xu; Duckham, Matt; Chong, Derek; Tolhurst, Kevin

    2016-04-01

    Real-time information about the spatial extents of evolving natural disasters, such as wildfire or flood perimeters, can assist both emergency responders and the general public during an emergency. However, authoritative information sources can suffer from bottlenecks and delays, while user-generated social media data usually lacks the necessary structure and trustworthiness for reliable automated processing. This paper describes and evaluates an automated technique for real-time tracking of wildfire perimeters based on publicly available “curated” crowdsourced data about telephone calls to the emergency services. Our technique is based on established data mining tools, and can be adjusted using a small number of intuitive parameters. Experiments using data from the devastating Black Saturday wildfires (2009) in Victoria, Australia, demonstrate the potential for the technique to detect and track wildfire perimeters automatically, in real time, and with moderate accuracy. Accuracy can be further increased through combination with other authoritative demographic and environmental information, such as population density and dynamic wind fields. These results are also independently validated against data from the more recent 2014 Mickleham-Dalrymple wildfires.
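
    As a simplified stand-in for that pipeline, a perimeter estimate can be derived from clustered call locations; the sketch below uses a plain convex hull over entirely synthetic coordinates, whereas the paper's technique relies on more sophisticated data-mining and shape-estimation steps.

```python
import numpy as np

def convex_hull(points):
    """Andrew's monotone chain convex hull; returns hull vertices in order."""
    pts = sorted(map(tuple, points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

# Synthetic geocoded emergency-call locations scattered around a burning area.
rng = np.random.default_rng(5)
calls = rng.normal(loc=[145.0, -37.5], scale=[0.05, 0.03], size=(200, 2))
perimeter_estimate = convex_hull(calls)
print(len(perimeter_estimate), "perimeter vertices")
```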

  8. Performance study of double SOI image sensors

    Science.gov (United States)

    Miyoshi, T.; Arai, Y.; Fujita, Y.; Hamasaki, R.; Hara, K.; Ikegami, Y.; Kurachi, I.; Nishimura, R.; Ono, S.; Tauchi, K.; Tsuboyama, T.; Yamada, M.

    2018-02-01

    Double silicon-on-insulator (DSOI) sensors composed of two thin silicon layers and one thick silicon layer have been developed since 2011. The thick substrate consists of high resistivity silicon with p-n junctions while the thin layers are used as SOI-CMOS circuitry and as shielding to reduce the back-gate effect and crosstalk between the sensor and the circuitry. In 2014, a high-resolution integration-type pixel sensor, INTPIX8, was developed based on the DSOI concept. This device is fabricated using a Czochralski p-type (Cz-p) substrate in contrast to a single SOI (SSOI) device having a single thin silicon layer and a Float Zone p-type (FZ-p) substrate. In the present work, X-ray spectra of both DSOI and SSOI sensors were obtained using an Am-241 radiation source at four gain settings. The gain of the DSOI sensor was found to be approximately three times that of the SSOI device because the coupling capacitance is reduced by the DSOI structure. An X-ray imaging demonstration was also performed and high spatial resolution X-ray images were obtained.

  9. Special Sensor Microwave Imager/Sounder (SSMIS) Sensor Data Record (SDR) in netCDF

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Special Sensor Microwave Imager/Sounder (SSMIS) is a series of passive microwave conically scanning imagers and sounders onboard the DMSP satellites beginning...

  10. APPLICATION OF SENSOR FUSION TO IMPROVE UAV IMAGE CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    S. Jabari

    2017-08-01

    Full Text Available Image classification is one of the most important tasks of remote sensing projects including the ones that are based on using UAV images. Improving the quality of UAV images directly affects the classification results and can save a huge amount of time and effort in this area. In this study, we show that sensor fusion can improve image quality which results in increasing the accuracy of image classification. Here, we tested two sensor fusion configurations by using a Panchromatic (Pan) camera along with either a colour camera or a four-band multi-spectral (MS) camera. We use the Pan camera to benefit from its higher sensitivity and the colour or MS camera to benefit from its spectral properties. The resulting images are then compared to the ones acquired by a high resolution single Bayer-pattern colour camera (here referred to as HRC). We assessed the quality of the output images by performing image classification tests. The outputs prove that the proposed sensor fusion configurations can achieve higher accuracies compared to the images of the single Bayer-pattern colour camera. Therefore, incorporating a Pan camera on-board in the UAV missions and performing image fusion can help achieving higher quality images and accordingly higher accuracy classification results.

  11. Application of Sensor Fusion to Improve Uav Image Classification

    Science.gov (United States)

    Jabari, S.; Fathollahi, F.; Zhang, Y.

    2017-08-01

    Image classification is one of the most important tasks of remote sensing projects including the ones that are based on using UAV images. Improving the quality of UAV images directly affects the classification results and can save a huge amount of time and effort in this area. In this study, we show that sensor fusion can improve image quality which results in increasing the accuracy of image classification. Here, we tested two sensor fusion configurations by using a Panchromatic (Pan) camera along with either a colour camera or a four-band multi-spectral (MS) camera. We use the Pan camera to benefit from its higher sensitivity and the colour or MS camera to benefit from its spectral properties. The resulting images are then compared to the ones acquired by a high resolution single Bayer-pattern colour camera (here referred to as HRC). We assessed the quality of the output images by performing image classification tests. The outputs prove that the proposed sensor fusion configurations can achieve higher accuracies compared to the images of the single Bayer-pattern colour camera. Therefore, incorporating a Pan camera on-board in the UAV missions and performing image fusion can help achieving higher quality images and accordingly higher accuracy classification results.
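
    A common way to exploit a high-sensitivity Pan channel together with lower-resolution colour or MS bands is intensity-substitution (Brovey-style) pan-sharpening. The code below is a generic illustration of that idea, not the specific fusion used in the study.

```python
import numpy as np

def brovey_pansharpen(ms, pan, eps=1e-6):
    """Brovey-style fusion: rescale each MS band by pan / mean(MS).

    `ms` is an upsampled (H, W, B) multispectral image co-registered with the
    (H, W) panchromatic image; this generic scheme is only one of several
    possible fusion configurations.
    """
    intensity = ms.mean(axis=2) + eps
    return ms * (pan / intensity)[..., None]

rng = np.random.default_rng(2)
ms = rng.random((100, 100, 4))      # 4-band MS, already resampled to the Pan grid
pan = rng.random((100, 100))
fused = brovey_pansharpen(ms, pan)
print(fused.shape)                   # (100, 100, 4)
```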

  12. Two-Level Evaluation on Sensor Interoperability of Features in Fingerprint Image Segmentation

    Directory of Open Access Journals (Sweden)

    Ya-Shuo Li

    2012-03-01

    Full Text Available Features used in fingerprint segmentation significantly affect the segmentation performance. Various features exhibit different discriminating abilities on fingerprint images derived from different sensors. One feature which has better discriminating ability on images derived from a certain sensor may not adapt to segment images derived from other sensors. This degrades the segmentation performance. This paper empirically analyzes the sensor interoperability problem of segmentation feature, which refers to the feature’s ability to adapt to the raw fingerprints captured by different sensors. To address this issue, this paper presents a two-level feature evaluation method, including the first level feature evaluation based on segmentation error rate and the second level feature evaluation based on decision tree. The proposed method is performed on a number of fingerprint databases which are obtained from various sensors. Experimental results show that the proposed method can effectively evaluate the sensor interoperability of features, and the features with good evaluation results acquire better segmentation accuracies of images originating from different sensors.

  13. Computing nonsimple polygons of minimum perimeter

    NARCIS (Netherlands)

    Fekete, S.P.; Haas, A.; Hemmer, M.; Hoffmann, M.; Kostitsyna, I.; Krupke, D.; Maurer, F.; Mitchell, J.S.B.; Schmidt, A.; Schmidt, C.; Troegel, J.

    2018-01-01

    We consider the Minimum Perimeter Polygon Problem (MP3): for a given set V of points in the plane, find a polygon P with holes that has vertex set V, such that the total boundary length is smallest possible. The MP3 can be considered a natural geometric generalization of the Traveling Salesman Problem.

  14. Parametric Optimization of Lateral NIPIN Phototransistors for Flexible Image Sensors

    Directory of Open Access Journals (Sweden)

    Min Seok Kim

    2017-08-01

    Full Text Available Curved image sensors, which are a key component of bio-inspired imaging systems, have been widely studied because they can improve an imaging system in various respects, such as low optical aberrations, small form factor, and simple optics configuration. Many methods and materials for realizing a curvilinear imager have been proposed to address the drawbacks of conventional imaging/optical systems. However, there have been few theoretical studies, in terms of electronics, on the use of a lateral photodetector as a flexible image sensor. In this paper, we demonstrate the applicability of a Si-based lateral phototransistor as the pixel of a high-efficiency curved photodetector by conducting various electrical simulations with technology computer-aided design (TCAD). The single phototransistor is analyzed with different device parameters: the thickness of the active cell, the doping concentration, and the structure geometry. This work presents a method to improve the external quantum efficiency (EQE), linear dynamic range (LDR), and mechanical stability of the phototransistor. We also evaluated the dark current in a matrix of phototransistors to estimate the feasibility of the device as a flexible image sensor. Moreover, we fabricated and demonstrated an array of phototransistors based on our study. The theoretical study and design guidelines for the lateral phototransistor create new opportunities in flexible image sensors.

  15. Research-grade CMOS image sensors for demanding space applications

    Science.gov (United States)

    Saint-Pé, Olivier; Tulet, Michel; Davancens, Robert; Larnaudie, Franck; Magnan, Pierre; Corbière, Franck; Martin-Gonthier, Philippe; Belliot, Pierre

    2017-11-01

    Imaging detectors are key elements for optical instruments and sensors on board space missions dedicated to Earth observation (high-resolution imaging, atmosphere spectroscopy...), Solar System exploration (micro cameras, guidance for autonomous vehicles...) and Universe observation (space telescope focal planes, guiding sensors...). This market was long dominated by CCD technology. Since the mid-90s, CMOS Image Sensors (CIS) have been competing with CCDs in more and more consumer domains (webcams, cell phones, digital cameras...). Featuring significant advantages over CCD sensors for space applications (lower power consumption, smaller system size, better radiation behaviour...), CMOS technology is also expanding in this field, justifying specific R&D and development programs funded by national and European space agencies (mainly CNES, DGA, and ESA). Throughout the 90s, and thanks to their steadily improving performance, CIS started to be used successfully for more and more demanding applications, from vision and control functions requiring low-level performance to guidance applications requiring medium-level performance. Recent technology improvements have made possible the manufacturing of research-grade CIS that are able to compete with CCDs in the high-performance arena. After an introduction outlining the growing interest of optical instrument designers in CMOS image sensors, this talk will present the existing and foreseen ways to reach high-level electro-optical performance with CIS. The development of CIS prototypes built using an imaging CMOS process and of devices based on improved designs will be presented.

  16. Autonomous vision networking: miniature wireless sensor networks with imaging technology

    Science.gov (United States)

    Messinger, Gioia; Goldberg, Giora

    2006-09-01

    The recent emergence of integrated PicoRadio technology and the rise of low-power, low-cost, System-On-Chip (SOC) CMOS imagers, coupled with the fast evolution of networking protocols and digital signal processing (DSP), have created a unique opportunity to achieve the goal of deploying large-scale, low-cost, intelligent, ultra-low-power distributed wireless sensor networks for visualization of the environment. Of all sensors, vision is the most desired, but its applications in distributed sensor networks have been elusive so far. Not any more. The practicality and viability of ultra-low-power vision networking have been proven, and its applications are countless: from security and chemical analysis to industrial monitoring, asset tracking and visual recognition, vision networking represents a truly disruptive technology applicable to many industries. The presentation discusses some of the critical components and technologies necessary to make these networks and products affordable and ubiquitous - specifically PicoRadios, CMOS imagers, imaging DSP, networking and overall wireless sensor network (WSN) system concepts. The paradigm shift from large, centralized and expensive sensor platforms to small, low-cost, distributed sensor networks is possible due to the emergence and convergence of a few innovative technologies. Avaak has developed a vision network that is aided by other sensors such as motion, acoustic and magnetic sensors, and plans to deploy it for use in military and commercial applications. In comparison to other sensors, imagers produce large data files that require pre-processing and a certain level of compression before they are transmitted to a network server, in order to minimize the load on the network. Some of the most innovative chemical detectors currently in development are based on sensors that change color or pattern in the presence of the desired analytes. These changes are easily recorded and analyzed by a CMOS imager and an on-board DSP processor.

  17. Single-event transient imaging with an ultra-high-speed temporally compressive multi-aperture CMOS image sensor.

    Science.gov (United States)

    Mochizuki, Futa; Kagawa, Keiichiro; Okihara, Shin-ichiro; Seo, Min-Woong; Zhang, Bo; Takasawa, Taishi; Yasutomi, Keita; Kawahito, Shoji

    2016-02-22

    In the work described in this paper, an image reproduction scheme with an ultra-high-speed temporally compressive multi-aperture CMOS image sensor was demonstrated. The sensor captures an object by compressing a sequence of images with focal-plane temporally random-coded shutters, followed by reconstruction of time-resolved images. Because signals are modulated pixel-by-pixel during capturing, the maximum frame rate is defined only by the charge transfer speed and can thus be higher than those of conventional ultra-high-speed cameras. The frame rate and optical efficiency of the multi-aperture scheme are discussed. To demonstrate the proposed imaging method, a 5×3 multi-aperture image sensor was fabricated. The average rising and falling times of the shutters were 1.53 ns and 1.69 ns, respectively. The maximum skew among the shutters was 3 ns. The sensor observed plasma emission by compressing it to 15 frames, and a series of 32 images at 200 Mfps was reconstructed. In the experiment, by correcting disparities and considering temporal pixel responses, artifacts in the reconstructed images were reduced. An improvement in PSNR from 25.8 dB to 30.8 dB was confirmed in simulations.

  18. Visualization of heavy ion-induced charge production in a CMOS image sensor

    CERN Document Server

    Végh, J; Klamra, W; Molnár, J; Norlin, LO; Novák, D; Sánchez-Crespo, A; Van der Marel, J; Fenyvesi, A; Valastyan, I; Sipos, A

    2004-01-01

    A commercial CMOS image sensor was irradiated with heavy ion beams in the several MeV energy range. The image sensor is equipped with a standard video output. The data were collected on-line through frame grabbing and analysed off-line after digitisation. It was shown that the response of the image sensor to the heavy ion bombardment varied with the type and energy of the projectiles. The sensor will be used for the CMS Barrel Muon Alignment system.

  19. Ageing effects on image sensors due to terrestrial cosmic radiation

    NARCIS (Netherlands)

    Nampoothiri, G.G.; Horemans, M.L.R.; Theuwissen, A.J.P.

    2011-01-01

    We analyze the “ageing” effect on image sensors introduced by neutrons present in natural (terrestrial) cosmic environment. The results obtained at sea level are corroborated for the first time with accelerated neutron beam tests and for various image sensor operation conditions. The results reveal

  20. CMOS Imaging of Temperature Effects on Pin-Printed Xerogel Sensor Microarrays.

    Science.gov (United States)

    Lei Yao; Ka Yi Yung; Chodavarapu, Vamsy P; Bright, Frank V

    2011-04-01

    In this paper, we study the effect of temperature on the operation and performance of xerogel-based sensor microarrays coupled to a complementary metal-oxide semiconductor (CMOS) imager integrated circuit (IC) that images the photoluminescence response from the sensor microarray. The CMOS imager uses a 32 × 32 (1024 elements) array of active pixel sensors and each pixel includes a high-gain phototransistor to convert the detected optical signals into electrical currents. A correlated double sampling circuit and pixel address/digital control/signal integration circuit are also implemented on-chip. The CMOS imager data are read out as a serial coded signal. The sensor system uses a light-emitting diode to excite target analyte responsive organometallic luminophores doped within discrete xerogel-based sensor elements. As a prototype, we developed a 3 × 3 (9 elements) array of oxygen (O2) sensors. Each group of three sensor elements in the array (arranged in a column) is designed to provide a different and specific sensitivity to the target gaseous O2 concentration. This property of multiple sensitivities is achieved by using a mix of two O2 sensitive luminophores in each pin-printed xerogel sensor element. The CMOS imager is designed to be low noise and consumes a static power of 320.4 μW and an average dynamic power of 624.6 μW when operating at 100-Hz sampling frequency and 1.8-V dc power supply.

  1. Characterization of modulated time-of-flight range image sensors

    Science.gov (United States)

    Payne, Andrew D.; Dorrington, Adrian A.; Cree, Michael J.; Carnegie, Dale A.

    2009-01-01

    A number of full field image sensors have been developed that are capable of simultaneously measuring intensity and distance (range) for every pixel in a given scene using an indirect time-of-flight measurement technique. A light source is intensity modulated at a frequency between 10 and 100 MHz, and an image sensor is modulated at the same frequency, synchronously sampling light reflected from objects in the scene (homodyne detection). The time of flight is manifested as a phase shift in the illumination modulation envelope, which can be determined from the sampled data simultaneously for each pixel in the scene. This paper presents a method of characterizing the high frequency modulation response of these image sensors, using a picosecond laser pulser. The characterization results allow the optimal operating parameters, such as the modulation frequency, to be identified in order to maximize the range measurement precision for a given sensor. A number of potential sources of error exist when using these sensors, including deficiencies in the modulation waveform shape, duty cycle, or phase, resulting in contamination of the resultant range data. From the characterization data these parameters can be identified and compensated for by modifying the sensor hardware or through post processing of the acquired range measurements.
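
    For illustration, a minimal sketch of the indirect time-of-flight principle described above: with four synchronous samples per pixel taken at 0°, 90°, 180° and 270° of the modulation envelope (a common four-bucket homodyne scheme, assumed here rather than taken from the paper), the phase shift and hence the range can be computed as follows.

    ```python
    import numpy as np

    C = 299_792_458.0  # speed of light in m/s

    def tof_range(a0, a1, a2, a3, mod_freq_hz):
        """Per-pixel range from four homodyne samples of the modulation envelope.
        a0..a3 are arrays (or scalars) of equal shape, one sample set per pixel."""
        phase = np.arctan2(a3 - a1, a0 - a2)           # phase shift of the envelope
        phase = np.mod(phase, 2 * np.pi)               # wrap to [0, 2*pi)
        return C * phase / (4 * np.pi * mod_freq_hz)   # range within one ambiguity interval

    # Example: one pixel sampled with 30 MHz modulation
    print(tof_range(1.2, 0.4, 0.2, 1.0, 30e6))
    ```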

  2. ANALYSIS OF SPECTRAL CHARACTERISTICS AMONG DIFFERENT SENSORS BY USE OF SIMULATED RS IMAGES

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    This research used an RS image-simulating method to simulate apparent-reflectance images at sensor level and ground-reflectance images for the corresponding bands of SPOT-HRV, CBERS-CCD, Landsat-TM and NOAA14-AVHRR. These images were used to analyze the differences among sensors caused by spectral sensitivity and atmospheric effects. The differences were analyzed using the Normalized Difference Vegetation Index (NDVI). The results showed that differences in the sensors' spectral characteristics cause changes in their NDVI and reflectance values. When data from multiple sensors are used in digital analysis, this error should be taken into account. Atmospheric effects make NDVI smaller, and atmospheric correction tends to increase NDVI values. The reflectance values and NDVIs of different sensors can be used to analyze the differences among sensor features. The spectral analysis method based on simulated RS images can provide a new way to design the spectral characteristics of new sensors.
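
    The band-dependence discussed above can be made concrete with the NDVI formula itself; the sketch below is a minimal illustration (reflectance values are invented) of how two sensors whose spectral responses weight the red and NIR bands differently report different NDVI for the same target.

    ```python
    import numpy as np

    def ndvi(nir, red):
        """Normalized Difference Vegetation Index from NIR and red reflectance."""
        nir = np.asarray(nir, dtype=float)
        red = np.asarray(red, dtype=float)
        return (nir - red) / (nir + red + 1e-12)  # epsilon avoids division by zero

    # Illustrative reflectances of the same vegetated pixel seen by two sensors
    # whose spectral response functions weight the red and NIR bands differently.
    print(ndvi(0.42, 0.08))   # sensor A
    print(ndvi(0.40, 0.10))   # sensor B: slightly different bands -> different NDVI
    ```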

  3. Three dimensional multi perspective imaging with randomly distributed sensors

    International Nuclear Information System (INIS)

    DaneshPanah, Mehdi; Javidi, Bahram

    2008-01-01

    In this paper, we review a three dimensional (3D) passive imaging system that exploits the visual information captured from the scene from multiple perspectives to reconstruct the scene voxel by voxel in 3D space. The primary contribution of this work is to provide a computational reconstruction scheme based on randomly distributed sensor locations in space. In virtually all multi-perspective techniques (e.g. integral imaging, synthetic aperture integral imaging, etc.), there is an implicit assumption that the sensors lie on a simple, regular pickup grid. Here, we relax this assumption and suggest a computational reconstruction framework that unifies the available methods as its special cases. The importance of this work is that it enables three dimensional imaging technology to be implemented in a multitude of novel application domains such as 3D aerial imaging, collaborative imaging and long range 3D imaging, where sustaining a regular pickup grid is not possible and/or the parallax requirements call for an irregular or sparse synthetic aperture mode. Although the sensors can be distributed in any random arrangement, we assume that the pickup position is measured at the time of capture of each elemental image. We demonstrate the feasibility of the methods proposed here by experimental results.
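
    As a rough sketch of the kind of computational reconstruction described above (not the authors' exact framework), the function below back-projects elemental images onto a chosen depth plane by shifting each image in proportion to its measured pickup position, using a simple pinhole model; all parameter names are illustrative.

    ```python
    import numpy as np
    from scipy.ndimage import shift as subpixel_shift

    def reconstruct_plane(elemental_images, positions_m, focal_m, depth_m, pitch_m):
        """Back-project elemental images onto a plane at distance depth_m.
        positions_m: (N, 2) measured (x, y) pickup positions of the N sensors.
        pitch_m: pixel pitch of the sensors. Returns the averaged reconstruction."""
        acc = np.zeros_like(elemental_images[0], dtype=float)
        for img, (x, y) in zip(elemental_images, positions_m):
            # Parallax of a point at depth_m, converted to pixels (pinhole model)
            dx = x * focal_m / (depth_m * pitch_m)
            dy = y * focal_m / (depth_m * pitch_m)
            acc += subpixel_shift(img.astype(float), (dy, dx), order=1, mode="nearest")
        return acc / len(elemental_images)
    ```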

  4. SENSOR CORRECTION AND RADIOMETRIC CALIBRATION OF A 6-BAND MULTISPECTRAL IMAGING SENSOR FOR UAV REMOTE SENSING

    Directory of Open Access Journals (Sweden)

    J. Kelcey

    2012-07-01

    The increased availability of unmanned aerial vehicles (UAVs) has resulted in their frequent adoption for a growing range of remote sensing tasks which include precision agriculture, vegetation surveying and fine-scale topographic mapping. The development and utilisation of UAV platforms requires broad technical skills covering the three major facets of remote sensing: data acquisition, data post-processing, and image analysis. In this study, UAV image data acquired by a miniature 6-band multispectral imaging sensor was corrected and calibrated using practical image-based data post-processing techniques. Data correction techniques included dark offset subtraction to reduce sensor noise, flat-field derived per-pixel look-up-tables to correct vignetting, and implementation of the Brown-Conrady model to correct lens distortion. Radiometric calibration was conducted with an image-based empirical line model using pseudo-invariant features (PIFs). Sensor corrections and radiometric calibration improve the quality of the data, aiding quantitative analysis and generating consistency with other calibrated datasets.
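
    A minimal per-band sketch of the correction and calibration chain described above (dark offset subtraction, flat-field vignetting correction, empirical line fit to pseudo-invariant features); array names and normalizations are assumptions, not taken from the study.

    ```python
    import numpy as np

    def correct_and_calibrate(raw, dark, flat, pif_dn, pif_reflectance):
        """Dark-offset subtraction, flat-field vignetting correction and an
        empirical-line radiometric calibration for one band.
        raw, dark: 2-D arrays; flat: vignetting image normalized to its mean.
        pif_dn / pif_reflectance: corrected DNs and known reflectances of the
        pseudo-invariant features used to fit the line."""
        corrected = (raw.astype(float) - dark) / flat          # noise + vignetting
        gain, offset = np.polyfit(pif_dn, pif_reflectance, 1)  # empirical line fit
        return gain * corrected + offset                       # reflectance estimate
    ```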

  5. Impact of sensor-scene interaction on the design of an IR security surveillance system

    International Nuclear Information System (INIS)

    Claassen, J.P.; Phipps, G.S.

    1982-01-01

    Recent encouraging developments in infrared staring arrays with CCD readouts and in real time image processors working on and off the focal plane have suggested that technologies suitable for infrared security surveillance may be available in a two-to-five year time frame. In anticipation of these emerging technologies, an investigation has been undertaken to establish the design potential of a passive IR perimeter security system incorporating both detection and verification capabilities. To establish the design potential, it is necessary to characterize the interactions between the scene and the sensor. To this end, theoretical and experimental findings were employed to document (1) the emission properties of scenes to include an intruder, (2) the propagation and emission characteristics of the intervening atmosphere, and (3) the reception properties of the imaging sensor. The impact of these findings is summarized in the light of the application constraints. Optimal wavelengths, intruder and background emission characteristics, weather limitations, and basic sensor design considerations are treated. Although many system design features have been identified to date, continued efforts are required to complete a detailed system design, including identification of the processing requirements. A program to accomplish these objectives is presented

  6. Study of x-ray CCD image sensor and application

    Science.gov (United States)

    Wang, Shuyun; Li, Tianze

    2008-12-01

    In this paper, we describe the composition, characteristics, parameters and working process of charge coupled devices (CCD), together with key techniques and methods for two-value (binary) processing of the CCD signal. The processing chain for CCD video signal quantification is explained, and the constituents and functions of the X-ray image intensifier, as well as the technique for coupling the X-ray image intensifier to the CCD, are analyzed. We analyzed two effective methods to reduce the harm to human beings when X-rays are used in medical imaging. One is to reduce the X-ray dose and intensify the image formed by the penetrating X-rays to obtain the same effect; the other is to use the image sensor to transfer the images to a safe area for observation. On this basis, a new method is presented in which the CCD image sensor and the X-ray image intensifier are combined organically. A practical medical X-ray photoelectric system was designed which can be used to record and time the patient's penetrating images. The system is mainly made up of the medical X-ray source, the X-ray image intensifier, a high-resolution CCD camera, an image processor, a display and so on. Its characteristics are: it changes the invisible X-ray image into a visible light image; it outputs vivid images; and the image recording time is short. We also analyzed the main aspects which affect the system's resolution. A medical photoelectric system using an X-ray image sensor can sharply reduce the X-ray harm to humans when it is used in medical diagnosis. Finally, we analyze the prospects for the system's application in medical engineering and related fields.

  7. A Wireless Sensor Network for Vineyard Monitoring That Uses Image Processing

    Science.gov (United States)

    Lloret, Jaime; Bosch, Ignacio; Sendra, Sandra; Serrano, Arturo

    2011-01-01

    The first step to detect when a vineyard has any type of deficiency, pest or disease is to observe its stems, its grapes and/or its leaves. To place a sensor in each leaf of every vineyard is obviously not feasible in terms of cost and deployment. We should thus look for new methods to detect these symptoms precisely and economically. In this paper, we present a wireless sensor network where each sensor node takes images from the field and internally uses image processing techniques to detect any unusual status in the leaves. This symptom could be caused by a deficiency, pest, disease or other harmful agent. When it is detected, the sensor node sends a message to a sink node through the wireless sensor network in order to notify the problem to the farmer. The wireless sensor uses the IEEE 802.11 a/b/g/n standard, which allows connections from large distances in open air. This paper describes the wireless sensor network design, the wireless sensor deployment, how the node processes the images in order to monitor the vineyard, and the sensor network traffic obtained from a test bed performed in a flat vineyard in Spain. Although the system is not able to distinguish between deficiency, pest, disease or other harmful agents, a symptoms image database and a neuronal network could be added in order to learn from the experience and provide an accurate problem diagnosis. PMID:22163948

  8. A wireless sensor network for vineyard monitoring that uses image processing.

    Science.gov (United States)

    Lloret, Jaime; Bosch, Ignacio; Sendra, Sandra; Serrano, Arturo

    2011-01-01

    The first step to detect when a vineyard has any type of deficiency, pest or disease is to observe its stems, its grapes and/or its leaves. To place a sensor in each leaf of every vineyard is obviously not feasible in terms of cost and deployment. We should thus look for new methods to detect these symptoms precisely and economically. In this paper, we present a wireless sensor network where each sensor node takes images from the field and internally uses image processing techniques to detect any unusual status in the leaves. This symptom could be caused by a deficiency, pest, disease or other harmful agent. When it is detected, the sensor node sends a message to a sink node through the wireless sensor network in order to notify the problem to the farmer. The wireless sensor uses the IEEE 802.11 a/b/g/n standard, which allows connections from large distances in open air. This paper describes the wireless sensor network design, the wireless sensor deployment, how the node processes the images in order to monitor the vineyard, and the sensor network traffic obtained from a test bed performed in a flat vineyard in Spain. Although the system is not able to distinguish between deficiency, pest, disease or other harmful agents, a symptoms image database and a neuronal network could be added in order to learn from the experience and provide an accurate problem diagnosis.

  9. Low-power high-accuracy micro-digital sun sensor by means of a CMOS image sensor

    NARCIS (Netherlands)

    Xie, N.; Theuwissen, A.J.P.

    2013-01-01

    A micro-digital sun sensor (µDSS) is a sun detector which senses a satellite’s instant attitude angle with respect to the sun. The core of this sensor is a system-on-chip imaging chip which is referred to as APS+. The APS+ integrates a CMOS active pixel sensor (APS) array of 368×368 pixels, a

  10. Camera sensor arrangement for crop/weed detection accuracy in agronomic images.

    Science.gov (United States)

    Romeo, Juan; Guerrero, José Miguel; Montalvo, Martín; Emmi, Luis; Guijarro, María; Gonzalez-de-Santos, Pablo; Pajares, Gonzalo

    2013-04-02

    In Precision Agriculture, images coming from camera-based sensors are commonly used for weed identification and crop line detection, either to apply specific treatments or for vehicle guidance purposes. Accuracy of identification and detection is an important issue to be addressed in image processing. There are two main types of parameters affecting the accuracy of the images, namely: (a) extrinsic, related to the sensor's positioning on the tractor; (b) intrinsic, related to the sensor specifications, such as CCD resolution, focal length or iris aperture, among others. Moreover, in agricultural applications, the uncontrolled illumination that exists in outdoor environments is also an important factor affecting image accuracy. This paper is exclusively focused on two main issues, always with the goal of achieving the highest image accuracy in Precision Agriculture applications, making the following two main contributions: (a) camera sensor arrangement, to adjust extrinsic parameters, and (b) design of strategies for controlling the adverse illumination effects.

  11. Current Density Distribution on the Perimeter of Waveguide Exciter Cylindrical Vibrator Conductor

    OpenAIRE

    Zakharia, Yosyp

    2010-01-01

    Based on an electrodynamic analysis, the nonuniformity of the surface current distribution around the perimeter of the cylindrical conductor of a waveguide exciter is found. A considerable influence of the current distribution nonuniformity on the exciter input reactance is established. It is also shown that the current distribution on the vibrator perimeter is approximately uniform for conductor radii no greater than 0.07 of the waveguide cross-section breadth.

  12. Space-based infrared sensors of space target imaging effect analysis

    Science.gov (United States)

    Dai, Huayu; Zhang, Yasheng; Zhou, Haijun; Zhao, Shuang

    2018-02-01

    The target identification problem is one of the core problems of a ballistic missile defense system, and infrared imaging simulation is an important means of target detection and recognition. This paper first establishes a point-source imaging model of a ballistic target above the atmosphere for space-based infrared sensors; it then simulates the infrared imaging of the exo-atmospheric ballistic target from two aspects, the space-based sensor's camera parameters and the target characteristics, and analyzes the imaging effects of camera line-of-sight jitter, camera system noise and different wavebands on the target.

  13. Single Photon Counting Performance and Noise Analysis of CMOS SPAD-Based Image Sensors

    Science.gov (United States)

    Dutton, Neale A. W.; Gyongy, Istvan; Parmesan, Luca; Henderson, Robert K.

    2016-01-01

    SPAD-based solid state CMOS image sensors utilising analogue integrators have attained deep sub-electron read noise (DSERN) permitting single photon counting (SPC) imaging. A new method is proposed to determine the read noise in DSERN image sensors by evaluating the peak separation and width (PSW) of single photon peaks in a photon counting histogram (PCH). The technique is used to identify and analyse cumulative noise in analogue integrating SPC SPAD-based pixels. The DSERN of our SPAD image sensor is exploited to confirm recent multi-photon threshold quanta image sensor (QIS) theory. Finally, various single and multiple photon spatio-temporal oversampling techniques are reviewed. PMID:27447643
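
    The peak separation and width idea can be illustrated with a short sketch (an assumption-laden simplification, using scipy peak finding rather than the authors' fitting procedure): the spacing of adjacent single-photon peaks in the histogram gives the conversion gain, and the peak width divided by that spacing gives the read noise in electrons.

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    def read_noise_from_pch(signal_dn, bins=512):
        """Estimate read noise (e- rms) from a photon counting histogram (PCH)."""
        hist, edges = np.histogram(signal_dn, bins=bins)
        centers = 0.5 * (edges[:-1] + edges[1:])
        bin_width = centers[1] - centers[0]
        peaks, props = find_peaks(hist, prominence=hist.max() * 0.05, width=1)
        if len(peaks) < 2:
            raise ValueError("need at least two photon peaks to measure separation")
        separation_dn = np.mean(np.diff(centers[peaks]))          # DN per electron
        sigma_dn = np.mean(props["widths"]) * bin_width / 2.355   # FWHM -> sigma
        return sigma_dn / separation_dn
    ```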

  14. Determination of a BNI perimeter. Guide nr 9, Release of the 2013/10/31

    International Nuclear Information System (INIS)

    2013-01-01

    This guide specifies modalities for defining the perimeter of a BNI (basic nuclear installation), and elements for assessing the criteria for including installations, works and equipment within this perimeter, in compliance with legal measures specified in the French Code of the Environment and in various decrees. It first proposes an overview of the relevant articles in these legal texts, and notes their articulation with other legal texts which do not address nuclear issues. It specifies the various criteria for defining a BNI perimeter by distinguishing installations, works and equipment under the responsibility of the operator and needed for the BNI exploitation, from installations, works and equipment not needed for the BNI exploitation but under the responsibility of the operator and susceptible to modify the risks or inconveniences of the BNI. It addresses some particular situations: effluent processing installations and plants, control piezometers, water intake and release sampling works, underground or underwater installations, and some specific departments (medicine, laundry, and so on). It outlines some important aspects of the procedure for modifying a BNI perimeter. An appendix addresses the methodology for including installations, equipment or works within the perimeter

  15. The lucky image-motion prediction for simple scene observation based soft-sensor technology

    Science.gov (United States)

    Li, Yan; Su, Yun; Hu, Bin

    2015-08-01

    High resolution is important for earth remote sensors, and vibration of the remote sensor platform is a major factor restricting high resolution imaging. Image-motion prediction and real-time compensation are key technologies for solving this problem. Because the traditional autocorrelation image algorithm cannot meet the demands of simple-scene image stabilization, this paper proposes to utilize soft-sensor technology for image-motion prediction, and focuses on algorithm optimization for the image-motion prediction. Simulation results indicate that the improved lucky image-motion stabilization algorithm, combining a Back Propagation neural network (BP NN) and a support vector machine (SVM), is the most suitable for simple-scene image stabilization. The relative error of the image-motion prediction based on the soft-sensor technology is below 5%, and the training and computing speed of the mathematical prediction model is fast enough for real-time image stabilization in aerial photography.

  16. Laser Doppler Blood Flow Imaging Using a CMOS Imaging Sensor with On-Chip Signal Processing

    Directory of Open Access Journals (Sweden)

    Cally Gill

    2013-09-01

    The first fully integrated 2D CMOS imaging sensor with on-chip signal processing for applications in laser Doppler blood flow (LDBF) imaging has been designed and tested. To obtain a space efficient design over 64 × 64 pixels means that standard processing electronics used off-chip cannot be implemented. Therefore the analog signal processing at each pixel is a tailored design for LDBF signals with balanced optimization for signal-to-noise ratio and silicon area. This custom made sensor offers key advantages over conventional sensors, viz. the analog signal processing at the pixel level carries out signal normalization; the AC amplification in combination with an anti-aliasing filter allows analog-to-digital conversion with a low number of bits; low resource implementation of the digital processor enables on-chip processing and the data bottleneck that exists between the detector and processing electronics has been overcome. The sensor demonstrates good agreement with simulation at each design stage. The measured optical performance of the sensor is demonstrated using modulated light signals and in vivo blood flow experiments. Images showing blood flow changes with arterial occlusion and an inflammatory response to a histamine skin-prick demonstrate that the sensor array is capable of detecting blood flow signals from tissue.

  17. Laser doppler blood flow imaging using a CMOS imaging sensor with on-chip signal processing.

    Science.gov (United States)

    He, Diwei; Nguyen, Hoang C; Hayes-Gill, Barrie R; Zhu, Yiqun; Crowe, John A; Gill, Cally; Clough, Geraldine F; Morgan, Stephen P

    2013-09-18

    The first fully integrated 2D CMOS imaging sensor with on-chip signal processing for applications in laser Doppler blood flow (LDBF) imaging has been designed and tested. To obtain a space efficient design over 64 × 64 pixels means that standard processing electronics used off-chip cannot be implemented. Therefore the analog signal processing at each pixel is a tailored design for LDBF signals with balanced optimization for signal-to-noise ratio and silicon area. This custom made sensor offers key advantages over conventional sensors, viz. the analog signal processing at the pixel level carries out signal normalization; the AC amplification in combination with an anti-aliasing filter allows analog-to-digital conversion with a low number of bits; low resource implementation of the digital processor enables on-chip processing and the data bottleneck that exists between the detector and processing electronics has been overcome. The sensor demonstrates good agreement with simulation at each design stage. The measured optical performance of the sensor is demonstrated using modulated light signals and in vivo blood flow experiments. Images showing blood flow changes with arterial occlusion and an inflammatory response to a histamine skin-prick demonstrate that the sensor array is capable of detecting blood flow signals from tissue.

  18. Wireless image-data transmission from an implanted image sensor through a living mouse brain by intra body communication

    Science.gov (United States)

    Hayami, Hajime; Takehara, Hiroaki; Nagata, Kengo; Haruta, Makito; Noda, Toshihiko; Sasagawa, Kiyotaka; Tokuda, Takashi; Ohta, Jun

    2016-04-01

    Intra-body communication technology allows the fabrication of more compact implantable biomedical sensors than RF wireless technology. In this paper, we report the fabrication of an implantable image sensor of 625 µm width and 830 µm length and the demonstration of wireless image-data transmission through the brain tissue of a living mouse. The sensor was designed to transmit the output signals of pixel values by pulse width modulation (PWM). The PWM signals from the sensor, transmitted through the brain tissue, were detected by a receiver electrode. Wireless data transmission of a two-dimensional image was successfully demonstrated in a living mouse brain. The technique reported here is expected to provide useful methods of data transmission for micro-sized implantable biomedical sensors.
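
    A toy sketch of the pulse width modulation scheme mentioned above (pixel value encoded as the high time of a pulse, recovered at the receiver from the duty cycle); sample counts and thresholds are illustrative, not the sensor's actual timing.

    ```python
    import numpy as np

    def pwm_encode(pixel_value, bits=8, samples=256):
        """Encode a pixel value as a pulse whose high time is proportional to it."""
        high = int(round(pixel_value / (2**bits - 1) * samples))
        return np.concatenate([np.ones(high), np.zeros(samples - high)])

    def pwm_decode(waveform, bits=8, threshold=0.5):
        """Recover the pixel value from the fraction of samples above threshold."""
        duty = np.mean(waveform > threshold)
        return int(round(duty * (2**bits - 1)))

    print(pwm_decode(pwm_encode(200)))  # -> 200
    ```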

  19. CMOS SPAD-based image sensor for single photon counting and time of flight imaging

    OpenAIRE

    Dutton, Neale Arthur William

    2016-01-01

    The facility to capture the arrival of a single photon is the fundamental limit to the detection of quantised electromagnetic radiation. An image sensor capable of capturing a picture with this ultimate optical and temporal precision is the pinnacle of photo-sensing. The creation of high spatial resolution, single photon sensitive, and time-resolved image sensors in complementary metal oxide semiconductor (CMOS) technology offers numerous benefits in a wide field of applications....

  20. A Biologically Inspired CMOS Image Sensor

    CERN Document Server

    Sarkar, Mukul

    2013-01-01

    Biological systems are a source of inspiration in the development of small autonomous sensor nodes. The two major types of optical vision systems found in nature are the single aperture human eye and the compound eye of insects. The latter are among the most compact and smallest vision sensors. The compound eye is composed of individual lenses, each with its own photoreceptor array. The visual system of insects allows them to fly with limited intelligence and brain processing power. A CMOS image sensor replicating the perception of vision in insects is discussed and designed in this book for industrial (machine vision) and medical applications. The CMOS metal layer is used to create an embedded micro-polarizer able to sense polarization information. This polarization information is shown to be useful in applications like real time material classification and autonomous agent navigation. Further the sensor is equipped with in-pixel analog and digital memories which allow variation of the dynamic range and in-pixel b...

  1. Technical guidance for the development of a solid state image sensor for human low vision image warping

    Science.gov (United States)

    Vanderspiegel, Jan

    1994-01-01

    This report surveys different technologies and approaches to realize sensors for image warping. The goal is to study the feasibility, technical aspects, and limitations of making an electronic camera with special geometries which implements certain transformations for image warping. This work was inspired by the research done by Dr. Juday at NASA Johnson Space Center on image warping. The study has looked into different solid-state technologies to fabricate image sensors. It is found that among the available technologies, CMOS is preferred over CCD technology. CMOS provides more flexibility to design different functions into the sensor, is more widely available, and is a lower cost solution. By using an architecture with row and column decoders one has the added flexibility of addressing the pixels at random, or read out only part of the image.

  2. Highly curved image sensors: a practical approach for improved optical performance.

    Science.gov (United States)

    Guenter, Brian; Joshi, Neel; Stoakley, Richard; Keefe, Andrew; Geary, Kevin; Freeman, Ryan; Hundley, Jake; Patterson, Pamela; Hammon, David; Herrera, Guillermo; Sherman, Elena; Nowak, Andrew; Schubert, Randall; Brewer, Peter; Yang, Louis; Mott, Russell; McKnight, Geoff

    2017-06-12

    The significant optical and size benefits of using a curved focal surface for imaging systems have been well studied yet never brought to market for lack of a high-quality, mass-producible, curved image sensor. In this work we demonstrate that commercial silicon CMOS image sensors can be thinned and formed into accurate, highly curved optical surfaces with undiminished functionality. Our key development is a pneumatic forming process that avoids rigid mechanical constraints and suppresses wrinkling instabilities. A combination of forming-mold design, pressure membrane elastic properties, and controlled friction forces enables us to gradually contact the die at the corners and smoothly press the sensor into a spherical shape. Allowing the die to slide into the concave target shape enables a threefold increase in the spherical curvature over prior approaches having mechanical constraints that resist deformation, and create a high-stress, stretch-dominated state. Our process creates a bridge between the high precision and low-cost but planar CMOS process, and ideal non-planar component shapes such as spherical imagers for improved optical systems. We demonstrate these curved sensors in prototype cameras with custom lenses, measuring exceptional resolution of 3220 line-widths per picture height at an aperture of f/1.2 and nearly 100% relative illumination across the field. Though we use a 1/2.3" format image sensor in this report, we also show this process is generally compatible with many state of the art imaging sensor formats. By example, we report photogrammetry test data for an APS-C sized silicon die formed to a 30° subtended spherical angle. These gains in sharpness and relative illumination enable a new generation of ultra-high performance, manufacturable, digital imaging systems for scientific, industrial, and artistic use.

  3. X-ray imaging characterization of active edge silicon pixel sensors

    International Nuclear Information System (INIS)

    Ponchut, C; Ruat, M; Kalliopuska, J

    2014-01-01

    The aim of this work was the experimental characterization of edge effects in active-edge silicon pixel sensors, in the frame of X-ray pixel detector developments for synchrotron experiments. We produced a set of active edge pixel sensors with 300 to 500 μm thickness, edge widths ranging from 100 μm to 150 μm, and n or p pixel contact types. The sensors, with 256 × 256 pixels and 55 × 55 μm² pixel pitch, were then bump-bonded to Timepix readout chips for X-ray imaging measurements. The reduced edge widths make the edge pixels more sensitive to the electrical field distribution at the sensor boundaries. We characterized this effect by mapping the spatial response of the sensor edges with a finely focused X-ray synchrotron beam. One of the samples showed a distortion-free response on all four edges, whereas others showed variable degrees of distortion extending at most 300 μm from the sensor edge. An application of active edge pixel sensors to coherent diffraction imaging with synchrotron beams is described

  4. RADIOMETRIC NORMALIZATION OF LARGE AIRBORNE IMAGE DATA SETS ACQUIRED BY DIFFERENT SENSOR TYPES

    Directory of Open Access Journals (Sweden)

    S. Gehrke

    2016-06-01

    Generating seamless mosaics of aerial images is a particularly challenging task when the mosaic comprises a large number of images, collected over longer periods of time and with different sensors under varying imaging conditions. Such large mosaics typically consist of very heterogeneous image data, both spatially (different terrain types and atmosphere) and temporally (unstable atmospheric properties and even changes in land coverage). We present a new radiometric normalization or, respectively, radiometric aerial triangulation approach that takes advantage of our knowledge about each sensor’s properties. The current implementation supports medium and large format airborne imaging sensors of the Leica Geosystems family, namely the ADS line-scanner as well as DMC and RCD frame sensors. A hierarchical modelling – with parameters for the overall mosaic, the sensor type, different flight sessions, strips and individual images – allows for adaptation to each sensor’s geometric and radiometric properties. Additional parameters at different hierarchy levels can compensate for radiometric differences of various origins, correcting shortcomings of the preceding radiometric sensor calibration as well as BRDF and atmospheric corrections. The final, relative normalization is based on radiometric tie points in overlapping images, absolute radiometric control points and image statistics. It is computed in a global least squares adjustment for the entire mosaic by altering each image’s histogram using a location-dependent mathematical model. This model involves contrast and brightness corrections at radiometric fix points with bilinear interpolation for corrections in-between. The distribution of the radiometry fixes is adaptive to each image and generally increases with image size, hence enabling optimal local adaptation even for very long image strips as typically captured by a line-scanner sensor. The normalization approach is implemented in

  5. Photon detection with CMOS sensors for fast imaging

    International Nuclear Information System (INIS)

    Baudot, J.; Dulinski, W.; Winter, M.; Barbier, R.; Chabanat, E.; Depasse, P.; Estre, N.

    2009-01-01

    Pixel detectors employed in high energy physics aim to detect single minimum ionizing particles with micrometric positioning resolution. Monolithic CMOS sensors succeed in this task thanks to a low equivalent noise charge per pixel of around 10 to 15 e-, and a pixel pitch varying from 10 to a few tens of microns. Additionally, due to the possibility of integrating some data treatment in the sensor itself, readout times of 100 μs have been reached for 100 kilo-pixel sensors. These aspects of CMOS sensors are attractive for applications in photon imaging. For X-rays of a few keV, the efficiency is limited to a few % due to the thin sensitive volume. For visible photons, the back-thinned version of the CMOS sensor is sensitive to low intensity sources of a few hundred photons. When a back-thinned CMOS sensor is combined with a photo-cathode, a new hybrid detector results (EBCMOS) which operates as a fast single photon imager. The first EBCMOS was produced in 2007 and demonstrated single photon counting with low dark current in laboratory conditions. It has been compared, in two different biological laboratories, with existing CCD-based 2D cameras for fluorescence microscopy. The current EBCMOS sensitivity and frame rate are comparable to existing EMCCDs. On-going developments aim at increasing this frame rate by at least an order of magnitude. We report, in conclusion, the first test of a new CMOS sensor, LUCY, which reaches 1000 frames per second.

  6. Evaluation of a HDR image sensor with logarithmic response for mobile video-based applications

    Science.gov (United States)

    Tektonidis, Marco; Pietrzak, Mateusz; Monnin, David

    2017-10-01

    The performance of mobile video-based applications using conventional LDR (Low Dynamic Range) image sensors highly depends on the illumination conditions. As an alternative, HDR (High Dynamic Range) image sensors with logarithmic response are capable to acquire illumination-invariant HDR images in a single shot. We have implemented a complete image processing framework for a HDR sensor, including preprocessing methods (nonuniformity correction (NUC), cross-talk correction (CTC), and demosaicing) as well as tone mapping (TM). We have evaluated the HDR sensor for video-based applications w.r.t. the display of images and w.r.t. image analysis techniques. Regarding the display we have investigated the image intensity statistics over time, and regarding image analysis we assessed the number of feature correspondences between consecutive frames of temporal image sequences. For the evaluation we used HDR image data recorded from a vehicle on outdoor or combined outdoor/indoor itineraries, and we performed a comparison with corresponding conventional LDR image data.

  7. Imaging moving objects from multiply scattered waves and multiple sensors

    International Nuclear Information System (INIS)

    Miranda, Analee; Cheney, Margaret

    2013-01-01

    In this paper, we develop a linearized imaging theory that combines the spatial, temporal and spectral components of multiply scattered waves as they scatter from moving objects. In particular, we consider the case of multiple fixed sensors transmitting and receiving information from multiply scattered waves. We use a priori information about the multipath background. We use a simple model for multiple scattering, namely scattering from a fixed, perfectly reflecting (mirror) plane. We base our image reconstruction and velocity estimation technique on a modification of a filtered backprojection method that produces a phase-space image. We plot examples of point-spread functions for different geometries and waveforms, and from these plots, we estimate the resolution in space and velocity. Through this analysis, we are able to identify how the imaging system depends on parameters such as bandwidth and number of sensors. We ultimately show that enhanced phase-space resolution for a distribution of moving and stationary targets in a multipath environment may be achieved using multiple sensors. (paper)

  8. Implementation of large area CMOS image sensor module using the precision align inspection

    International Nuclear Information System (INIS)

    Kim, Byoung Wook; Kim, Toung Ju; Ryu, Cheol Woo; Lee, Kyung Yong; Kim, Jin Soo; Kim, Myung Soo; Cho, Gyu Seong

    2014-01-01

    This paper describes the implementation of a large area CMOS image sensor module using a precision align inspection program. This work is needed because the wafer cutting system does not always have high precision. The program checks more than 8 points on the sensor edges and aligns the sensors with a moving table. The size of a 2×1 butted CMOS image sensor module, excluding the PCB, is 170 mm×170 mm. The pixel size is 55 μm×55 μm and the number of pixels is 3,072×3,072. The gap between the two CMOS image sensor modules was arranged to be less than one pixel in size

  9. Implementation of large area CMOS image sensor module using the precision align inspection

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Byoung Wook; Kim, Toung Ju; Ryu, Cheol Woo [Radiation Imaging Technology Center, JBTP, Iksan (Korea, Republic of); Lee, Kyung Yong; Kim, Jin Soo [Nano Sol-Tech INC., Iksan (Korea, Republic of); Kim, Myung Soo; Cho, Gyu Seong [Dept. of Nuclear and Quantum Engineering, KAIST, Daejeon (Korea, Republic of)

    2014-12-15

    This paper describes the implementation of a large area CMOS image sensor module using a precision align inspection program. This work is needed because the wafer cutting system does not always have high precision. The program checks more than 8 points on the sensor edges and aligns the sensors with a moving table. The size of a 2×1 butted CMOS image sensor module, excluding the PCB, is 170 mm×170 mm. The pixel size is 55 μm×55 μm and the number of pixels is 3,072×3,072. The gap between the two CMOS image sensor modules was arranged to be less than one pixel in size.

  10. Optical and Electric Multifunctional CMOS Image Sensors for On-Chip Biosensing Applications

    Directory of Open Access Journals (Sweden)

    Kiyotaka Sasagawa

    2010-12-01

    In this review, the concept, design, performance, and a functional demonstration of multifunctional complementary metal-oxide-semiconductor (CMOS) image sensors dedicated to on-chip biosensing applications are described. We developed a sensor architecture that allows flexible configuration of a sensing pixel array consisting of optical and electric sensing pixels, and designed multifunctional CMOS image sensors that can sense light intensity and electric potential or apply a voltage to an on-chip measurement target. We describe the sensors’ architecture on the basis of the type of electric measurement or imaging functionalities.

  11. Unconventional applications of conventional intrusion detection sensors

    International Nuclear Information System (INIS)

    Williams, J.D.; Matter, J.C.

    1983-01-01

    A number of conventional intrusion detection sensors exist for the detection of persons entering buildings, moving within a given volume, and crossing a perimeter isolation zone. Unconventional applications of some of these sensors have recently been investigated. Some of the applications which are discussed include detection on the edges and tops of buildings, detection in storm sewers, detection on steam and other types of large pipes, and detection of unauthorized movement within secure enclosures. The enclosures can be used around complicated control valves, electrical control panels, emergency generators, etc

  12. Visual field examination method using virtual reality glasses compared with the Humphrey perimeter

    Directory of Open Access Journals (Sweden)

    Tsapakis S

    2017-08-01

    Purpose: To present a visual field examination method using virtual reality glasses and evaluate the reliability of the method by comparing the results with those of the Humphrey perimeter. Materials and methods: Virtual reality glasses, a smartphone with a 6 inch display, and software that implements a fast-threshold 3 dB step staircase algorithm for the central 24° of visual field (52 points) were used to test 20 eyes of 10 patients, who were tested in a random and consecutive order as they appeared in our glaucoma department. The results were compared with those obtained from the same patients using the Humphrey perimeter. Results: A high correlation coefficient (r=0.808, P<0.0001) was found between the virtual reality visual field test and the Humphrey perimeter visual field. Conclusion: Visual field examination results using virtual reality glasses have a high correlation with the Humphrey perimeter, allowing the method to be suitable for probable clinical use. Keywords: visual fields, virtual reality glasses, perimetry, visual fields software, smartphone
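
    For illustration only, a minimal sketch of a fast-threshold staircase at a single test location, stepping in 3 dB until a fixed number of response reversals; the stopping rule and parameters are assumptions, not the authors' implementation.

    ```python
    def staircase_threshold(respond, start_db=25, step_db=3,
                            floor_db=0, ceil_db=40, reversals=2, max_steps=20):
        """Estimate the sensitivity threshold (dB) at one visual-field location.
        `respond(level_db)` returns True if the patient saw the stimulus shown
        at that attenuation level (higher dB = dimmer stimulus)."""
        level, n_rev, last_seen = start_db, 0, None
        for _ in range(max_steps):
            seen = respond(level)
            if last_seen is not None and seen != last_seen:
                n_rev += 1                      # a change of direction is a reversal
            last_seen = seen
            if n_rev >= reversals:
                break
            # Go dimmer if the stimulus was seen, brighter if it was missed
            level = min(ceil_db, level + step_db) if seen else max(floor_db, level - step_db)
        return level
    ```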

  13. X-ray detectors based on image sensors

    International Nuclear Information System (INIS)

    Costa, A.P.R.

    1983-01-01

    X-ray detectors based on image sensors are described and a comparison is made between the advantages and the disadvantages of such a kind of detectors with the position sensitive detectors. (L.C.) [pt

  14. Multi sensor satellite imagers for commercial remote sensing

    Science.gov (United States)

    Cronje, T.; Burger, H.; Du Plessis, J.; Du Toit, J. F.; Marais, L.; Strumpfer, F.

    2005-10-01

    This paper will discuss and compare recent refractive and catadioptric imager designs developed and manufactured at SunSpace for Multi Sensor Satellite Imagers with panchromatic, multi-spectral, area and hyperspectral sensors on a single Focal Plane Array (FPA). These satellite optical systems were designed with applications such as monitoring food supplies, crop yield and disasters in mind. The aim of these imagers is to achieve medium to high resolution (2.5m to 15m) spatial sampling, wide swaths (up to 45km) and noise equivalent reflectance (NER) values of less than 0.5%. State-of-the-art FPA designs are discussed, addressing the choice of detectors to achieve these performances. Special attention is given to thermal robustness and compactness, the use of folding prisms to place multiple detectors in a large FPA, and a specially developed process to customize the spectral selection while minimizing mass, power and cost. A refractive imager with up to 6 spectral bands (6.25m GSD) and a catadioptric imager with panchromatic (2.7m GSD), multi-spectral (6 bands, 4.6m GSD) and hyperspectral (400nm to 2.35μm, 200 bands, 15m GSD) sensors on the same FPA will be discussed. Both of these imagers are also equipped with real time video view finding capabilities. The electronic units can be subdivided into the Front-End Electronics and the Control Electronics with analogue and digital signal processing. A dedicated Analogue Front-End is used for Correlated Double Sampling (CDS), black level correction, variable gain, up to 12-bit digitizing and a high speed LVDS data link to a mass memory unit.

  15. Miniature infrared hyperspectral imaging sensor for airborne applications

    Science.gov (United States)

    Hinnrichs, Michele; Hinnrichs, Bradford; McCutchen, Earl

    2017-05-01

    Pacific Advanced Technology (PAT) has developed an infrared hyperspectral camera, both MWIR and LWIR, small enough to serve as a payload on a miniature unmanned aerial vehicle. The optical system has been integrated into the cold-shield of the sensor, enabling the small size and weight of the sensor. This new and innovative approach to an infrared hyperspectral imaging spectrometer uses micro-optics and will be explained in this paper. The micro-optics are made up of an area array of diffractive optical elements where each element is tuned to image a different spectral region on a common focal plane array. The lenslet array is embedded in the cold-shield of the sensor and actuated with a miniature piezo-electric motor. This approach enables rapid infrared spectral imaging, with multiple spectral images collected and processed simultaneously in each frame of the camera. This paper will present our opto-mechanical design approach, which results in an infrared hyperspectral imaging system that is small enough for a payload on a mini-UAV or commercial quadcopter. The diffractive optical elements used in the lenslet array are blazed gratings where each lenslet is tuned for a different spectral bandpass. The lenslets are configured in an area array placed a few millimeters above the focal plane and embedded in the cold-shield to reduce the background signal normally associated with the optics. We have developed various systems using different numbers of lenslets in the area array. The size of the focal plane and the diameter of the lenslet array determine the spatial resolution. A 2 x 2 lenslet array will image four different spectral images of the scene each frame and, when coupled with a 512 x 512 focal plane array, will give a spatial resolution of 256 x 256 pixels for each spectral image. Another system that we developed uses a 4 x 4 lenslet array on a 1024 x 1024 pixel element focal plane array which gives 16 spectral images of 256 x 256 pixel resolution each

  16. High-content analysis of single cells directly assembled on CMOS sensor based on color imaging.

    Science.gov (United States)

    Tanaka, Tsuyoshi; Saeki, Tatsuya; Sunaga, Yoshihiko; Matsunaga, Tadashi

    2010-12-15

    A complementary metal oxide semiconductor (CMOS) image sensor was applied to high-content analysis of single cells which were assembled closely or directly onto the CMOS sensor surface. The direct assembling of cell groups on CMOS sensor surface allows large-field (6.66 mm×5.32 mm in entire active area of CMOS sensor) imaging within a second. Trypan blue-stained and non-stained cells in the same field area on the CMOS sensor were successfully distinguished as white- and blue-colored images under white LED light irradiation. Furthermore, the chemiluminescent signals of each cell were successfully visualized as blue-colored images on CMOS sensor only when HeLa cells were placed directly on the micro-lens array of the CMOS sensor. Our proposed approach will be a promising technique for real-time and high-content analysis of single cells in a large-field area based on color imaging. Copyright © 2010 Elsevier B.V. All rights reserved.

  17. Multi-image acquisition-based distance sensor using agile laser spot beam.

    Science.gov (United States)

    Riza, Nabeel A; Amin, M Junaid

    2014-09-01

    We present a novel laser-based distance measurement technique that uses multiple-image-based spatial processing to enable distance measurements. Compared with the first-generation distance sensor using spatial processing, the modified sensor is no longer hindered by the classic Rayleigh axial resolution limit for the propagating laser beam at its minimum beam waist location. The proposed high-resolution distance sensor design uses an electronically controlled variable focus lens (ECVFL) in combination with an optical imaging device, such as a charge-coupled device (CCD), to produce and capture different laser spot size images on a target, with these beam spot sizes different from the minimal spot size possible at this target distance. By exploiting the unique relationship of the target-located spot sizes with the varying ECVFL focal length for each target distance, the proposed distance sensor can compute the target distance with a distance measurement resolution better than the axial resolution via the Rayleigh resolution criterion. Using a 30 mW 633 nm He-Ne laser coupled with an electromagnetically actuated liquid ECVFL, along with a 20 cm focal length bias lens, and using five spot images captured per target position by a CCD-based Nikon camera, a proof-of-concept version of the proposed distance sensor is successfully implemented in the laboratory over target ranges from 10 to 100 cm with a demonstrated sub-cm axial resolution, which is better than the axial Rayleigh resolution limit at these target distances. Applications for the proposed potentially cost-effective distance sensor are diverse and include industrial inspection and measurement and 3D object shape mapping and imaging.

  18. Urban Design Dimension Of Informality At The Perimeter Of Brawijaya University And UIN Maliki Malang

    Directory of Open Access Journals (Sweden)

    Tyaghita Cesarin Binar

    2018-01-01

    Informality is one of the commonly emerging issues in urban design, yet one which is rarely explored, especially informality within a university's perimeter. Brawijaya, as one of the biggest and oldest universities in Malang, has over time boosted the development of several parts of its perimeter, providing several hotspots for students and youth. This rapid hotspot growth is related to the growth of informal practices. For cities that are developed by their universities, it is necessary to understand both the formal and informal practices within their perimeters. Through this study I would like to identify the characteristics of informality within the university perimeter formed by Brawijaya, UIN Maliki and ITN, and to frame it within Carmona's urban design dimensions. Mapping is utilized as the primary method to analyze both formality and informality within the site. The formal mapping covers formal activities, functions and site users, while the informality mapping covers street vendors and street art. The research found that the urban informality within the perimeters of Brawijaya and UIN Maliki is related to the character and morphology of their formal structure; the relation is reciprocal. The morphological, visual and functional dimensions of the university perimeter are driven by the social, perceptual and temporal dimensions formed by its users and expressed through informality.

  19. Wearable sensors for patient-specific boundary shape estimation to improve the forward model for electrical impedance tomography (EIT) of neonatal lung function.

    Science.gov (United States)

    Khor, Joo Moy; Tizzard, Andrew; Demosthenous, Andreas; Bayford, Richard

    2014-06-01

    Electrical impedance tomography (EIT) could be significantly advantageous to continuous monitoring of lung development in newborn and, in particular, preterm infants as it is non-invasive and safe to use within the intensive care unit. It has been demonstrated that accurate boundary form of the forward model is important to minimize artefacts in reconstructed electrical impedance images. This paper presents the outcomes of initial investigations for acquiring patient-specific thorax boundary information using a network of flexible sensors that imposes no restrictions on the patient's normal breathing and movements. The investigations include: (1) description of the basis of the reconstruction algorithms, (2) tests to determine a minimum number of bend sensors, (3) validation of two approaches to reconstruction and (4) an example of a commercially available bend sensor and its performance. Simulation results using ideal sensors show that, in the worst case, a total shape error of less than 6% with respect to its total perimeter can be achieved.
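
    One simple way to turn a handful of bend-sensor readings into a boundary for the forward model (a sketch under the assumption that each sensor reports an approximately constant curvature over its segment of the chest circumference, which is not necessarily the authors' reconstruction) is to integrate heading along the perimeter:

    ```python
    import numpy as np

    def boundary_from_bend_sensors(curvatures, perimeter_m, pts_per_segment=50):
        """Rebuild a closed 2-D boundary from sparse curvature samples (1/m),
        one per bend sensor, assumed constant over each sensor's segment."""
        seg_len = perimeter_m / len(curvatures)
        ds = seg_len / pts_per_segment
        theta, x, y, pts = 0.0, 0.0, 0.0, []
        for kappa in curvatures:
            for _ in range(pts_per_segment):
                theta += kappa * ds          # heading change = curvature * arc length
                x += np.cos(theta) * ds
                y += np.sin(theta) * ds
                pts.append((x, y))
        return np.array(pts)

    # Sanity check: constant curvature 2*pi (1/m) over a 1 m perimeter is a circle
    circle = boundary_from_bend_sensors([2 * np.pi] * 8, perimeter_m=1.0)
    ```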

  20. Multiple-Event, Single-Photon Counting Imaging Sensor

    Science.gov (United States)

    Zheng, Xinyu; Cunningham, Thomas J.; Sun, Chao; Wang, Kang L.

    2011-01-01

    The single-photon counting imaging sensor is typically an array of silicon Geiger-mode avalanche photodiodes that are monolithically integrated with CMOS (complementary metal oxide semiconductor) readout, signal processing, and addressing circuits located in each pixel and the peripheral area of the chip. The major problem is its single-event method for photon count registration. A single-event single-photon counting imaging array only allows registration of up to one photon count in each of its pixels during a frame time, i.e., the interval between two successive pixel reset operations. Since the frame time can't be too short, this leads to very low dynamic range and makes the sensor useful only for very low flux environments. The second problem of the prior technique is a limited fill factor resulting from consumption of chip area by the monolithically integrated CMOS readout in pixels. The resulting low photon collection efficiency will substantially ruin any benefit gained from the very sensitive single-photon counting detection. The single-photon counting imaging sensor developed in this work has a novel multiple-event architecture, which allows each of its pixels to register as many as one million (or more) photon-counting events during a frame time. Because of the consequently boosted dynamic range, the imaging array of the invention is capable of performing single-photon counting under ultra-low light through high-flux environments. On the other hand, since the multiple-event architecture is implemented in a hybrid structure, back-illumination and a close-to-unity fill factor can be realized, and maximized quantum efficiency can also be achieved in the detector array.

  1. Object-Oriented Hierarchy Radiation Consistency for Different Temporal and Different Sensor Images

    Directory of Open Access Journals (Sweden)

    Nan Su

    2018-02-01

    In the paper, we propose a novel object-oriented hierarchical radiation consistency method for dense matching of different-temporal and different-sensor data in 3D reconstruction. For different-temporal images, our illumination consistency method addresses both the illumination uniformity of a single image and the relative illumination normalization of image pairs. In the relative illumination normalization step in particular, singular value equalization and a linear relationship of the invariant pixels are combined and used for the initial global illumination normalization and the refined object-oriented illumination normalization, respectively. For different-sensor images, we propose the union group sparse method, which is based on improving the original group sparse model. The different-sensor images are set to a similar smoothness level by applying the same threshold to the singular values of the union group matrix. Our method comprehensively considers the factors influencing the dense matching of different-temporal and different-sensor stereoscopic image pairs and simultaneously improves the illumination consistency and the smoothness consistency. The radiation consistency experiments verify the effectiveness and superiority of the proposed method by comparison with two other methods. Moreover, in the dense matching experiment on mixed stereoscopic image pairs, our method shows more advantages for objects in the urban area.

  2. Intelligent Luminance Control of Lighting Systems Based on Imaging Sensor Feedback

    Directory of Open Access Journals (Sweden)

    Haoting Liu

    2017-02-01

    Full Text Available An imaging sensor-based intelligent Light Emitting Diode (LED) lighting system for desk use is proposed. In contrast to traditional intelligent lighting systems, such as photosensitive-resistance-based or infrared-sensor-based systems, the imaging sensor can realize a finer perception of the environmental light and can thus guide more precise lighting control. Before the system is used, a large set of typical imaging lighting data for the desk application is first accumulated. Second, a series of subjective and objective Lighting Effect Evaluation Metrics (LEEMs) are defined and assessed for these datasets. The cluster benchmarks of the objective LEEMs can then be obtained. Third, both a single-LEEM-based control and a multiple-LEEMs-based control are developed to realize optimal luminance tuning. When the system works, it first captures the lighting image using a wearable camera. Then it computes the objective LEEMs of the captured image and compares them with the cluster benchmarks of the objective LEEMs. Finally, the single-LEEM-based or the multiple-LEEMs-based control can be applied to obtain an optimal lighting effect. Extensive experimental results show that the proposed system can tune the LED lamp automatically according to environmental luminance changes.
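
    A minimal Python sketch of the single-LEEM feedback idea described above, assuming mean frame luminance as a stand-in objective LEEM and a simple proportional update of the LED duty cycle; the function names, benchmark value and gain are illustrative, not from the paper.

      import numpy as np

      def mean_luminance(img):
          # one stand-in objective LEEM: mean pixel intensity of the captured frame
          return float(img.mean())

      def update_led_duty(duty, img, benchmark, gain=0.05):
          # single-LEEM proportional control toward the cluster benchmark (all values assumed)
          error = benchmark - mean_luminance(img)
          return float(np.clip(duty + gain * error / 255.0, 0.0, 1.0))

      # usage with a synthetic 8-bit frame; a real system would grab frames from the wearable camera
      frame = np.full((120, 160), 90, dtype=np.uint8)
      duty = 0.5
      for _ in range(5):
          duty = update_led_duty(duty, frame, benchmark=140)
          print(f"LED duty cycle -> {duty:.3f}")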

  3. An improved triangulation laser rangefinder using a custom CMOS HDR linear image sensor

    Science.gov (United States)

    Liscombe, Michael

    3-D triangulation laser rangefinders are used in many modern applications, from terrain mapping to biometric identification. Although a wide variety of designs have been proposed, laser speckle noise still imposes a fundamental limitation on range accuracy. This work proposes a new triangulation laser rangefinder designed specifically to mitigate the effects of laser speckle noise. The proposed rangefinder uses a precision linear translator to laterally reposition the imaging system (e.g., image sensor and imaging lens). For a given spatial location of the laser spot, capturing N spatially uncorrelated laser spot profiles is shown to improve range accuracy by a factor of √N. This technique has many advantages over past speckle-reduction technologies, such as a fixed system cost and form factor, and the ability to virtually eliminate laser speckle noise. These advantages are made possible through spatial diversity and come at the cost of increased acquisition time. The rangefinder makes use of the ICFYKWG1 linear image sensor, a custom CMOS sensor developed at the Vision Sensor Laboratory (York University). Tests are performed on the image sensor's innovative high dynamic range technology to determine its effects on range accuracy. As expected, experimental results have shown that the sensor provides a trade-off between dynamic range and range accuracy.
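
    A short Python sketch of the averaging argument, under the simplifying assumption that the speckle-induced centroid error of each repositioned capture is independent, zero-mean noise (the sigma value is arbitrary): averaging N uncorrelated spot profiles shrinks the centroid error roughly as 1/√N.

      import numpy as np

      rng = np.random.default_rng(1)
      speckle_sigma = 0.8                                 # px, assumed per-capture centroid jitter

      for n in (1, 4, 16, 64):
          # average the centroid estimates of n spatially uncorrelated spot profiles
          estimates = rng.normal(0.0, speckle_sigma, size=(10000, n)).mean(axis=1)
          print(f"N={n:3d}  centroid std = {estimates.std():.3f} px"
                f"   (sigma/sqrt(N) = {speckle_sigma / np.sqrt(n):.3f})")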

  4. Retinal fundus imaging with a plenoptic sensor

    Science.gov (United States)

    Thurin, Brice; Bloch, Edward; Nousias, Sotiris; Ourselin, Sebastien; Keane, Pearse; Bergeles, Christos

    2018-02-01

    Vitreoretinal surgery is moving towards 3D visualization of the surgical field. This requires an acquisition system capable of recording such 3D information. We propose a proof-of-concept imaging system based on a light-field camera, where an array of micro-lenses is placed in front of a conventional sensor. With a single snapshot, a stack of images focused at different depths is produced on the fly, which provides enhanced depth perception for the surgeon. Difficulty in depth localization of features and frequent focus changes during surgery make current vitreoretinal heads-up surgical imaging systems cumbersome to use. To improve depth perception and eliminate the need to manually refocus on the instruments during surgery, we designed and implemented a proof-of-concept ophthalmoscope equipped with a commercial light-field camera. The sensor of our camera is composed of an array of micro-lenses which projects an array of overlapping micro-images. We show that with a single light-field snapshot we can digitally refocus between the retina and a tool located in front of the retina, or display an extended depth-of-field image where everything is in focus. The design and system performance of the plenoptic fundus camera are detailed, and we conclude by showing in vivo data recorded with our device.

  5. Two-dimensional pixel image lag simulation and optimization in a 4-T CMOS image sensor

    Energy Technology Data Exchange (ETDEWEB)

    Yu Junting; Li Binqiao; Yu Pingping; Xu Jiangtao [School of Electronics Information Engineering, Tianjin University, Tianjin 300072 (China); Mou Cun, E-mail: xujiangtao@tju.edu.c [Logistics Management Office, Hebei University of Technology, Tianjin 300130 (China)

    2010-09-15

    Pixel image lag in a 4-T CMOS image sensor is analyzed and simulated in a two-dimensional model. Strategies for reducing image lag are discussed in terms of transfer gate channel threshold voltage doping adjustment, PPD N-type doping dose/implant tilt adjustment and transfer gate operation voltage adjustment for signal electron transfer. With the computer analysis tool ISE-TCAD, simulation results show that minimum image lag can be obtained at a pinned photodiode n-type doping dose of 7.0 × 10¹² cm⁻², an implant tilt of −2°, a transfer gate channel doping dose of 3.0 × 10¹² cm⁻² and an operation voltage of 3.4 V. The conclusions of this theoretical analysis can serve as a guideline for pixel design to improve the performance of 4-T CMOS image sensors. (semiconductor devices)

  6. 77 FR 26787 - Certain CMOS Image Sensors and Products Containing Same; Notice of Receipt of Complaint...

    Science.gov (United States)

    2012-05-07

    ... INTERNATIONAL TRADE COMMISSION [Docket No. 2895] Certain CMOS Image Sensors and Products.... International Trade Commission has received a complaint entitled Certain CMOS Image Sensors and Products... importation, and the sale within the United States after importation of certain CMOS image sensors and...

  7. Methods and apparatuses for detection of radiation with semiconductor image sensors

    Science.gov (United States)

    Cogliati, Joshua Joseph

    2018-04-10

    A semiconductor image sensor is repeatedly exposed to high-energy photons while a visible light obstructer is in place to block visible light from impinging on the sensor to generate a set of images from the exposures. A composite image is generated from the set of images with common noise substantially removed so the composite image includes image information corresponding to radiated pixels that absorbed at least some energy from the high-energy photons. The composite image is processed to determine a set of bright points in the composite image, each bright point being above a first threshold. The set of bright points is processed to identify lines with two or more bright points that include pixels therebetween that are above a second threshold and identify a presence of the high-energy particles responsive to a number of lines.
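
    One plausible Python realization (not necessarily the patented procedure) of the composite-image idea described above: the per-pixel median over the dark exposures estimates the noise common to all frames, the excess signal above it forms the composite, and pixels above a first threshold become candidate bright points. The thresholds and synthetic frames are assumptions.

      import numpy as np

      def radiation_bright_points(frames, first_threshold):
          # per-pixel median over the dark exposures estimates the common (fixed-pattern) noise;
          # the excess above it in any frame forms the composite used for bright-point detection
          frames = np.asarray(frames, dtype=float)        # shape (n_frames, rows, cols)
          composite = frames.max(axis=0) - np.median(frames, axis=0)
          return np.argwhere(composite > first_threshold) # (row, col) of candidate bright points

      # usage with synthetic dark frames containing one simulated high-energy-photon streak
      rng = np.random.default_rng(2)
      frames = rng.normal(10.0, 2.0, size=(8, 64, 64))
      frames[3, 20, 30:34] += 80.0                        # energy deposited along part of a row
      print(radiation_bright_points(frames, first_threshold=30.0))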

  8. Median filters as a tool to determine dark noise thresholds in high resolution smartphone image sensors for scientific imaging

    Science.gov (United States)

    Igoe, Damien P.; Parisi, Alfio V.; Amar, Abdurazaq; Rummenie, Katherine J.

    2018-01-01

    An evaluation of the use of median filters to reduce dark noise in high-resolution smartphone image sensors is presented. The Sony Xperia Z1 employed has a maximum image sensor resolution of 20.7 Mpixels, with each pixel having a side length of just over 1 μm. The large number of photosites provides an image sensor with very high sensitivity but also makes it prone to noise effects such as hot-pixels. Similar to earlier research with older smartphone models, no appreciable temperature effects were observed in the overall average pixel values for images taken at ambient temperatures between 5 °C and 25 °C. In this research, hot-pixels are defined as pixels with intensities above a specific threshold. The threshold is determined using the distribution of pixel values of a set of images with uniform statistical properties associated with the application of median filters of increasing size. An image with uniform statistics was employed as a training set from 124 dark images, and the threshold was determined to be 9 digital numbers (DN). The threshold remained constant for multiple resolutions and did not appreciably change even after a year of extensive field use and exposure to solar ultraviolet radiation. Although the uniformity of the temperature effects masked an increase in hot-pixel occurrences, the total number of occurrences represented less than 0.1% of the total image. Hot-pixels were removed by applying a median filter, with an optimum filter size of 7 × 7; similar trends were observed for four additional smartphone image sensors used for validation. Hot-pixels were also reduced by decreasing image resolution. This research provides a methodology to characterise the dark noise behavior of high-resolution image sensors for use in scientific investigations, especially as pixel sizes decrease.
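
    A minimal Python sketch of the thresholding-plus-median-filtering procedure, using the 9 DN threshold and 7 × 7 optimum filter size reported above; the synthetic dark frame, its Poisson noise model and the replacement strategy are illustrative assumptions rather than the paper's exact pipeline.

      import numpy as np
      from scipy.ndimage import median_filter

      def clean_hot_pixels(dark_frame, threshold_dn=9, filter_size=7):
          # 9 DN threshold and 7x7 filter size are the values reported above; the rest is a sketch
          dark = np.asarray(dark_frame, dtype=float)
          hot = dark > threshold_dn                       # hot-pixels: intensities above threshold
          cleaned = np.where(hot, median_filter(dark, size=filter_size), dark)
          return hot, cleaned

      # usage on a synthetic dark image with one injected hot pixel
      rng = np.random.default_rng(3)
      dark = rng.poisson(2.0, size=(200, 200)).astype(float)
      dark[50, 60] = 40.0
      mask, cleaned = clean_hot_pixels(dark)
      print("hot pixels flagged:", int(mask.sum()), " fraction of image:", float(mask.mean()))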

  9. Giga-pixel lensfree holographic microscopy and tomography using color image sensors.

    Directory of Open Access Journals (Sweden)

    Serhan O Isikman

    Full Text Available We report Giga-pixel lensfree holographic microscopy and tomography using color sensor-arrays such as CMOS imagers that exhibit Bayer color filter patterns. Without physically removing these color filters coated on the sensor chip, we synthesize pixel super-resolved lensfree holograms, which are then reconstructed to achieve ~350 nm lateral resolution, corresponding to a numerical aperture of ~0.8, across a field-of-view of ~20.5 mm². This constitutes a digital image with ~0.7 Billion effective pixels in both amplitude and phase channels (i.e., ~1.4 Giga-pixels total). Furthermore, by changing the illumination angle (e.g., ±50°) and scanning a partially-coherent light source across two orthogonal axes, super-resolved images of the same specimen from different viewing angles are created, which are then digitally combined to synthesize tomographic images of the object. Using this dual-axis lensfree tomographic imager running on a color sensor-chip, we achieve a 3D spatial resolution of ~0.35 µm × 0.35 µm × ~2 µm, in x, y and z, respectively, creating an effective voxel size of ~0.03 µm³ across a sample volume of ~5 mm³, which is equivalent to >150 Billion voxels. We demonstrate the proof-of-concept of this lensfree optical tomographic microscopy platform on a color CMOS image sensor by creating tomograms of micro-particles as well as a wild-type C. elegans nematode.

  10. CMOS Active-Pixel Image Sensor With Simple Floating Gates

    Science.gov (United States)

    Fossum, Eric R.; Nakamura, Junichi; Kemeny, Sabrina E.

    1996-01-01

    Experimental complementary metal-oxide/semiconductor (CMOS) active-pixel image sensor integrated circuit features simple floating-gate structure, with metal-oxide/semiconductor field-effect transistor (MOSFET) as active circuit element in each pixel. Provides flexibility of readout modes, no kTC noise, and relatively simple structure suitable for high-density arrays. Features desirable for "smart sensor" applications.

  11. Development of a 750x750 pixels CMOS imager sensor for tracking applications

    Science.gov (United States)

    Larnaudie, Franck; Guardiola, Nicolas; Saint-Pé, Olivier; Vignon, Bruno; Tulet, Michel; Davancens, Robert; Magnan, Pierre; Corbière, Franck; Martin-Gonthier, Philippe; Estribeau, Magali

    2017-11-01

    Solid-state optical sensors are now commonly used in space applications (navigation cameras, astronomy imagers, tracking sensors...). Although charge-coupled devices are still widely used, the CMOS image sensor (CIS), whose performance is continuously improving, is a strong challenger for Guidance, Navigation and Control (GNC) systems. This paper describes a 750x750 pixels CMOS image sensor that has been specially designed and developed for star tracker and tracking sensor applications. Such a detector, featuring a smart architecture that enables very simple and powerful operation, is built using the AMIS 0.5μm CMOS technology. It contains 750x750 rectangular pixels with a 20μm pitch. The geometry of the pixel sensitive zone is optimized for applications based on centroiding measurements. The main feature of this device is the on-chip control and timing function, which makes device operation easier by drastically reducing the number of clocks to be applied. This powerful function allows the user to operate the sensor with high flexibility: measurement of dark level from masked lines, direct access to the windows of interest… A temperature probe is also integrated within the CMOS chip, allowing a very precise measurement through the video stream. A complete electro-optical characterization of the sensor has been performed. The major parameters have been evaluated: dark current and its uniformity, read-out noise, conversion gain, Fixed Pattern Noise, Photo Response Non Uniformity, quantum efficiency, Modulation Transfer Function, intra-pixel scanning. The characterization tests are detailed in the paper. Co60 and proton irradiation tests have also been carried out on the image sensor and the results are presented. The specific features of the 750x750 image sensor such as low power CMOS design (3.3V, power consumption<100mW), natural windowing (that allows efficient and robust tracking algorithms), simple proximity electronics (because of the on

  12. 77 FR 74513 - Certain CMOS Image Sensors and Products Containing Same; Investigations: Terminations...

    Science.gov (United States)

    2012-12-14

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-846] Certain CMOS Image Sensors and Products Containing Same; Investigations: Terminations, Modifications and Rulings AGENCY: U.S... United States after importation of certain CMOS image sensors and products containing the same based on...

  13. Cloud Classification in Wide-Swath Passive Sensor Images Aided by Narrow-Swath Active Sensor Data

    Directory of Open Access Journals (Sweden)

    Hongxia Wang

    2018-05-01

    Full Text Available It is a challenge to distinguish between different cloud types because of the complexity and diversity of cloud coverage, which is a significant clutter source that impacts target detection and identification in the images of space-based infrared sensors. In this paper, a novel strategy for cloud classification in wide-swath passive sensor images is developed, aided by narrow-swath active sensor data. The strategy consists of three steps: orbit registration, most-matching donor pixel selection, and cloud type assignment for each recipient pixel. A new criterion for orbit registration is proposed so as to improve the matching accuracy. The most-matching donor pixel is selected via the Euclidean distance and the square sum of the radiance relative differences between the recipient and the potential donor pixels. Each recipient pixel is then assigned the cloud type that corresponds to its most-matching donor. The cloud classification of Moderate Resolution Imaging Spectroradiometer (MODIS) images is performed with the aid of data from the Cloud Profiling Radar (CPR). The results are compared with the CloudSat product 2B-CLDCLASS, as well as those obtained using the method of the International Satellite Cloud Climatology Project (ISCCP), which demonstrates the superior classification performance of the proposed strategy.
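
    A minimal Python sketch of the donor-to-recipient matching step described above, combining the Euclidean distance and the square sum of relative radiance differences into one score; the alpha weighting, the toy radiances and the cloud-type labels are illustrative assumptions, not the paper's values.

      import numpy as np

      def assign_cloud_types(recip_rad, donor_rad, donor_types, alpha=1.0):
          # match score = Euclidean distance + alpha * square sum of relative radiance differences
          recip = np.asarray(recip_rad, dtype=float)      # (n_recipients, n_channels)
          donor = np.asarray(donor_rad, dtype=float)      # (n_donors, n_channels)
          out = np.empty(len(recip), dtype=int)
          for i, r in enumerate(recip):
              eucl = np.linalg.norm(donor - r, axis=1)
              rel = np.sum(((donor - r) / np.maximum(np.abs(r), 1e-6)) ** 2, axis=1)
              out[i] = donor_types[np.argmin(eucl + alpha * rel)]
          return out

      donor_rad = np.array([[0.2, 0.4, 0.9], [0.8, 0.7, 0.3], [0.5, 0.5, 0.5]])
      donor_types = np.array([1, 2, 3])                   # e.g. cirrus / stratocumulus / cumulus
      recip_rad = np.array([[0.75, 0.68, 0.35], [0.22, 0.41, 0.88]])
      print(assign_cloud_types(recip_rad, donor_rad, donor_types))   # -> [2 1]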

  14. Low-Power Smart Imagers for Vision-Enabled Sensor Networks

    CERN Document Server

    Fernández-Berni, Jorge; Rodríguez-Vázquez, Ángel

    2012-01-01

    This book presents a comprehensive, systematic approach to the development of vision system architectures that employ sensory-processing concurrency and parallel processing to meet the autonomy challenges posed by a variety of safety and surveillance applications.  Coverage includes a thorough analysis of resistive diffusion networks embedded within an image sensor array. This analysis supports a systematic approach to the design of spatial image filters and their implementation as vision chips in CMOS technology. The book also addresses system-level considerations pertaining to the embedding of these vision chips into vision-enabled wireless sensor networks.  Describes a system-level approach for designing of vision devices and  embedding them into vision-enabled, wireless sensor networks; Surveys state-of-the-art, vision-enabled WSN nodes; Includes details of specifications and challenges of vision-enabled WSNs; Explains architectures for low-energy CMOS vision chips with embedded, programmable spatial f...

  15. A multimodal image sensor system for identifying water stress in grapevines

    Science.gov (United States)

    Zhao, Yong; Zhang, Qin; Li, Minzan; Shao, Yongni; Zhou, Jianfeng; Sun, Hong

    2012-11-01

    Water stress is one of the most common limitations to fruit growth, and water is the most limiting resource for crop growth. In grapevines, as well as in other fruit crops, fruit quality benefits from a certain level of water deficit, which helps to balance vegetative and reproductive growth and the flow of carbohydrates to reproductive structures. In this paper, a multi-modal sensor system was designed to measure the reflectance signature of grape plant surfaces and identify different water stress levels. The multi-modal sensor system was equipped with one 3CCD camera (three channels in R, G, and IR). The multi-modal sensor can capture and analyze the grape canopy from its reflectance features and identify the different water stress levels. This research aims at solving the aforementioned problems. The core technology of this multi-modal sensor system could further be used in a decision support system that combines multi-modal sensory data to improve plant stress detection and identify the causes of stress. The images were taken by the multi-modal sensor, which outputs images in the near-infrared, green and red spectral bands. Based on analysis of the acquired images, color features based on color space and reflectance features based on image processing methods were calculated. The results showed that these parameters have potential as water stress indicators. More experiments and analysis are needed to validate the conclusion.

  16. Image sensor system with bio-inspired efficient coding and adaptation.

    Science.gov (United States)

    Okuno, Hirotsugu; Yagi, Tetsuya

    2012-08-01

    We designed and implemented an image sensor system equipped with three bio-inspired coding and adaptation strategies: logarithmic transform, local average subtraction, and feedback gain control. The system comprises a field-programmable gate array (FPGA), a resistive network, and active pixel sensors (APS), whose light intensity-voltage characteristics are controllable. The system employs multiple time-varying reset voltage signals for APS in order to realize multiple logarithmic intensity-voltage characteristics, which are controlled so that the entropy of the output image is maximized. The system also employs local average subtraction and gain control in order to obtain images with an appropriate contrast. The local average is calculated by the resistive network instantaneously. The designed system was successfully used to obtain appropriate images of objects that were subjected to large changes in illumination.
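
    A software-only Python sketch of the three strategies listed above (logarithmic transform, local average subtraction and gain control); the resistive network and the FPGA feedback loop of the actual system are replaced here by a uniform filter and a fixed gain, and all parameter values are assumptions.

      import numpy as np
      from scipy.ndimage import uniform_filter

      def bio_inspired_encode(img, gain=2.0, window=9):
          # software stand-ins: log1p for the logarithmic transform, a uniform filter for the
          # resistive-network local average, and a fixed gain instead of the feedback gain control
          log_img = np.log1p(np.asarray(img, dtype=float))
          contrast = log_img - uniform_filter(log_img, size=window)   # local average subtraction
          return np.clip(gain * contrast, -1.0, 1.0)

      rng = np.random.default_rng(4)
      scene = rng.uniform(0, 4095, size=(64, 64))         # synthetic wide-dynamic-range input
      print(bio_inspired_encode(scene).std())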

  17. CMOS image sensors: State-of-the-art

    Science.gov (United States)

    Theuwissen, Albert J. P.

    2008-09-01

    This paper gives an overview of the state-of-the-art of CMOS image sensors. The main focus is put on the shrinkage of the pixels: what is the effect on the performance characteristics of the imagers and on the various physical parameters of the camera? How is the CMOS pixel architecture optimized to cope with the negative performance effects of the ever-shrinking pixel size? On the other hand, the smaller dimensions in CMOS technology allow further integration at column level and even at pixel level. This will make CMOS imagers even smarter than they already are.

  18. Convex lattice polygons of fixed area with perimeter-dependent weights.

    Science.gov (United States)

    Rajesh, R; Dhar, Deepak

    2005-01-01

    We study fully convex polygons with a given area and variable perimeter length on square and hexagonal lattices. We attach a weight t^m to a convex polygon of perimeter m and show that the sum of weights of all polygons with a fixed area s varies as s^(-θ_conv) e^(K(t)√s) for large s and t less than a critical threshold t_c, where K(t) is a t-dependent constant and θ_conv is a critical exponent which does not change with t. Using heuristic arguments, we find that θ_conv is 1/4 for the square lattice, but -1/4 for the hexagonal lattice. The reason for this unexpected nonuniversality of θ_conv is traced to the existence of sharp corners in the asymptotic shape of these polygons.
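
    Written out as a displayed formula (reconstructed in LaTeX from the in-line description above), the asymptotic behaviour of the perimeter-weighted sum over convex polygons of fixed area s is:

      \sum_{P:\, \mathrm{Area}(P) = s} t^{\,m(P)}
          \;\sim\; s^{-\theta_{\mathrm{conv}}}\, e^{K(t)\sqrt{s}}, \qquad t < t_c,
      \quad\text{with } \theta_{\mathrm{conv}} = \tfrac{1}{4} \text{ (square lattice)},\;
          \theta_{\mathrm{conv}} = -\tfrac{1}{4} \text{ (hexagonal lattice)}.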

  19. Evaluation of the AN/SAY-1 Thermal Imaging Sensor System

    National Research Council Canada - National Science Library

    Smith, John G; Middlebrook, Christopher T

    2002-01-01

    The AN/SAY-1 Thermal Imaging Sensor System "TISS" was developed to provide surface ships with a day/night imaging capability to detect low radar reflective, small cross-sectional area targets such as floating mines...

  20. 77 FR 33488 - Certain CMOS Image Sensors and Products Containing Same; Institution of Investigation Pursuant to...

    Science.gov (United States)

    2012-06-06

    ... INTERNATIONAL TRADE COMMISSION [Investigation No. 337-TA-846] Certain CMOS Image Sensors and... image sensors and products containing same by reason of infringement of certain claims of U.S. Patent No... image sensors and products containing same that infringe one or more of claims 1 and 2 of the `126...

  1. BIOME: An Ecosystem Remote Sensor Based on Imaging Interferometry

    Science.gov (United States)

    Peterson, David L.; Hammer, Philip; Smith, William H.; Lawless, James G. (Technical Monitor)

    1994-01-01

    Until recently, optical remote sensing of ecosystem properties from space has been limited to broad-band multispectral scanners such as Landsat and AVHRR. While these sensor data can be used to derive important information about ecosystem parameters, they are very limited for measuring key biogeochemical cycling parameters such as the chemical content of plant canopies. Such parameters, for example the lignin and nitrogen contents, are potentially amenable to measurement by very high spectral resolution instruments using a spectroscopic approach. Airborne sensors based on grating imaging spectrometers gave the first promise of such potential, but the recent decision not to deploy the space version has left the community without many alternatives. In the past few years, advancements in high-performance deep-well digital sensor arrays, coupled with a patented design for a two-beam interferometer, have produced an entirely new design for acquiring imaging spectroscopic data at the signal-to-noise levels necessary for quantitatively estimating chemical composition (1000:1 at 2 microns). This design has been assembled as a laboratory instrument and the principles demonstrated for acquiring remote scenes. An airborne instrument is in production and spaceborne sensors are being proposed. The instrument is extremely promising because of its low cost, low power requirements, very low weight, simplicity (no moving parts), and high performance. For these reasons, we have called it the first instrument optimized for ecosystem studies as part of a Biological Imaging and Observation Mission to Earth (BIOME).

  2. Low-Power Low-Noise CMOS Imager Design : In Micro-Digital Sun Sensor Application

    NARCIS (Netherlands)

    Xie, N.

    2012-01-01

    A digital sun sensor is superior to an analog sun sensor in terms of resolution, albedo immunity, and integration. The proposed Micro-Digital Sun Sensor (µDSS) is an autonomous digital sun sensor implemented by means of a CMOS image sensor named APS+. The µDSS is designed

  3. Soft sensor design by multivariate fusion of image features and process measurements

    DEFF Research Database (Denmark)

    Lin, Bao; Jørgensen, Sten Bay

    2011-01-01

    This paper presents a multivariate data fusion procedure for design of dynamic soft sensors where suitably selected image features are combined with traditional process measurements to enhance the performance of data-driven soft sensors. A key issue of fusing multiple sensor data, i.e. to determine...... with a multivariate analysis technique from RGB pictures. The color information is also transformed to hue, saturation and intensity components. Both sets of image features are combined with traditional process measurements to obtain an inferential model by partial least squares (PLS) regression. A dynamic PLS model...... oxides (NOx) emission of cement kilns. On-site tests demonstrate improved performance over soft sensors based on conventional process measurements only....

  4. Broadband image sensor array based on graphene-CMOS integration

    Science.gov (United States)

    Goossens, Stijn; Navickaite, Gabriele; Monasterio, Carles; Gupta, Shuchi; Piqueras, Juan José; Pérez, Raúl; Burwell, Gregory; Nikitskiy, Ivan; Lasanta, Tania; Galán, Teresa; Puma, Eric; Centeno, Alba; Pesquera, Amaia; Zurutuza, Amaia; Konstantatos, Gerasimos; Koppens, Frank

    2017-06-01

    Integrated circuits based on complementary metal-oxide-semiconductors (CMOS) are at the heart of the technological revolution of the past 40 years, enabling compact and low-cost microelectronic circuits and imaging systems. However, the diversification of this platform into applications other than microcircuits and visible-light cameras has been impeded by the difficulty to combine semiconductors other than silicon with CMOS. Here, we report the monolithic integration of a CMOS integrated circuit with graphene, operating as a high-mobility phototransistor. We demonstrate a high-resolution, broadband image sensor and operate it as a digital camera that is sensitive to ultraviolet, visible and infrared light (300-2,000 nm). The demonstrated graphene-CMOS integration is pivotal for incorporating 2D materials into the next-generation microelectronics, sensor arrays, low-power integrated photonics and CMOS imaging systems covering visible, infrared and terahertz frequencies.

  5. Extended Special Sensor Microwave Imager (SSM/I) Sensor Data Record (SDR) in netCDF

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Special Sensor Microwave Imager (SSM/I) is a seven-channel linearly polarized passive microwave radiometer that operates at frequencies of 19.36 (vertically and...

  6. A CMOS image sensor with row and column profiling means

    NARCIS (Netherlands)

    Xie, N.; Theuwissen, A.J.P.; Wang, X.; Leijtens, J.A.P.; Hakkesteegt, H.; Jansen, H.

    2008-01-01

    This paper describes the implementation and first measurement results of a new way of obtaining row and column profile data from a CMOS Image Sensor, which is developed for a micro-Digital Sun Sensor (μDSS). The basic profiling action is achieved by the pixels with p-type MOS transistors which realize

  7. Development of on-chip multi-imaging flow cytometry for identification of imaging biomarkers of clustered circulating tumor cells.

    Directory of Open Access Journals (Sweden)

    Hyonchol Kim

    Full Text Available An on-chip multi-imaging flow cytometry system has been developed to obtain morphometric parameters of cell clusters such as cell number, perimeter, total cross-sectional area, number of nuclei and size of clusters as "imaging biomarkers", with simultaneous acquisition and analysis of both bright-field (BF) and fluorescent (FL) images at 200 frames per second (fps); by using this system, we examined the effectiveness of using imaging biomarkers for the identification of clustered circulating tumor cells (CTCs). Sample blood of rats in which a prostate cancer cell line (MAT-LyLu) had been pre-implanted was applied to a microchannel on a disposable microchip after staining the nuclei using fluorescent dye for their visualization, and the acquired images were measured and compared with those of healthy rats. In terms of the results, clustered cells having (1) a cell area larger than 200 µm² and (2) a nucleus area larger than 90 µm² were specifically observed in cancer cell-implanted blood, but were not observed in healthy rats. In addition, (3) clusters having more than 3 nuclei were specific for cancer-implanted blood and (4) a ratio between the actual perimeter and the perimeter calculated from the obtained area, which reflects a shape distorted from ideal roundness, of less than 0.90 was specific for all clusters having more than 3 nuclei and was also specific for cancer-implanted blood. The collected clusters larger than 300 µm² were examined by quantitative gene copy number assay, and were identified as being CTCs. These results indicate the usefulness of the imaging biomarkers for characterizing clusters, and all four of the examined imaging biomarkers (cluster area, nuclei area, nuclei number, and perimeter ratio) can identify clustered CTCs in blood with the same level of preciseness using multi-imaging cytometry.

  8. Optical Inspection In Hostile Industrial Environments: Single-Sensor VS. Imaging Methods

    Science.gov (United States)

    Cielo, P.; Dufour, M.; Sokalski, A.

    1988-11-01

    On-line and unsupervised industrial inspection for quality control and process monitoring is increasingly required in the modern automated factory. Optical techniques are particularly well suited to industrial inspection in hostile environments because of their noncontact nature, fast response time and imaging capabilities. Optical sensors can be used for remote inspection of high temperature products or otherwise inaccessible parts, provided they are in a line-of-sight relation with the sensor. Moreover, optical sensors are much easier to adapt to a variety of part shapes, position or orientation and conveyor speeds as compared to contact-based sensors. This is an important requirement in a flexible automation environment. A number of choices are possible in the design of optical inspection systems. General-purpose two-dimensional (2-D) or three-dimensional (3-D) imaging techniques have advanced very rapidly in the last years thanks to a substantial research effort as well as to the availability of increasingly powerful and affordable hardware and software. Imaging can be realized using 2-D arrays or simpler one-dimensional (1-D) line-array detectors. Alternatively, dedicated single-spot sensors require a smaller amount of data processing and often lead to robust sensors which are particularly appropriate to on-line operation in hostile industrial environments. Many specialists now feel that dedicated sensors or clusters of sensors are often more effective for specific industrial automation and control tasks, at least in the short run. This paper will discuss optomechanical and electro-optical choices with reference to the design of a number of on-line inspection sensors which have been recently developed at our institute. Case studies will include real-time surface roughness evaluation on polymer cables extruded at high speed, surface characterization of hot-rolled or galvanized-steel sheets, temperature evaluation and pinhole detection in aluminum foil, multi

  9. SNAPSHOT SPECTRAL AND COLOR IMAGING USING A REGULAR DIGITAL CAMERA WITH A MONOCHROMATIC IMAGE SENSOR

    Directory of Open Access Journals (Sweden)

    J. Hauser

    2017-10-01

    Full Text Available Spectral imaging (SI) refers to the acquisition of the three-dimensional (3D) spectral cube of spatial and spectral data of a source object at a limited number of wavelengths in a given wavelength range. Snapshot spectral imaging (SSI) refers to the instantaneous acquisition (in a single shot) of the spectral cube, a process suitable for fast changing objects. Known SSI devices exhibit large total track length (TTL), weight and production costs and relatively low optical throughput. We present a simple SSI camera based on a regular digital camera with (i) an added diffusing and dispersing phase-only static optical element at the entrance pupil (diffuser) and (ii) tailored compressed sensing (CS) methods for digital processing of the diffused and dispersed (DD) image recorded on the image sensor. The diffuser is designed to mix the spectral cube data spectrally and spatially and thus to enable convergence in its reconstruction by CS-based algorithms. In addition to performing SSI, this SSI camera is capable of performing color imaging using a monochromatic or gray-scale image sensor without color filter arrays.

  10. Fast regional readout CMOS Image Sensor for dynamic MLC tracking

    Science.gov (United States)

    Zin, H.; Harris, E.; Osmond, J.; Evans, P.

    2014-03-01

    Advanced radiotherapy techniques such as volumetric modulated arc therapy (VMAT) require verification of the complex beam delivery, including tracking of multileaf collimators (MLC) and monitoring the dose rate. This work explores the feasibility of a prototype Complementary metal-oxide semiconductor Image Sensor (CIS) for tracking these complex treatments by utilising fast, region of interest (ROI) read-out functionality. An automatic edge tracking algorithm was used to locate the MLC leaf edges moving at various speeds (from a moving triangle field shape) and imaged with various sensor frame rates. The CIS demonstrates successful edge detection of the dynamic MLC motion within an accuracy of 1.0 mm. This demonstrates the feasibility of the sensor to verify treatment delivery involving dynamic MLC at up to ~400 frames per second (equivalent to the linac pulse rate), which is superior to current techniques such as using electronic portal imaging devices (EPID). CIS provides the basis for an essential real-time verification tool, useful in assessing accurate delivery of complex high energy radiation to the tumour and ultimately in achieving better cure rates for cancer patients.
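
    A toy Python sketch of per-row leaf-edge localisation on a synthetic triangle-shaped field; the actual tracking algorithm, the sensor's ROI readout and the pixel-to-millimetre calibration are not reproduced here, and the gradient-maximum rule is only an illustrative stand-in.

      import numpy as np

      def leaf_edge_positions(frame):
          # toy rule: per detector row, take the column of steepest intensity change as the edge
          grad = np.abs(np.diff(np.asarray(frame, dtype=float), axis=1))
          return grad.argmax(axis=1)

      # synthetic triangle-shaped field edge sweeping across a 100 x 120 frame, plus sensor noise
      cols = np.arange(120)
      edge_per_row = np.linspace(30, 80, 100)[:, None]
      frame = (cols[None, :] < edge_per_row).astype(float) * 100.0
      frame += np.random.default_rng(5).normal(0.0, 1.0, frame.shape)
      print(leaf_edge_positions(frame)[:10])              # edge column index for the first 10 rows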

  11. Fast regional readout CMOS image sensor for dynamic MLC tracking

    International Nuclear Information System (INIS)

    Zin, H; Harris, E; Osmond, J; Evans, P

    2014-01-01

    Advanced radiotherapy techniques such as volumetric modulated arc therapy (VMAT) require verification of the complex beam delivery, including tracking of multileaf collimators (MLC) and monitoring the dose rate. This work explores the feasibility of a prototype Complementary metal-oxide semiconductor Image Sensor (CIS) for tracking these complex treatments by utilising fast, region of interest (ROI) read-out functionality. An automatic edge tracking algorithm was used to locate the MLC leaf edges moving at various speeds (from a moving triangle field shape) and imaged with various sensor frame rates. The CIS demonstrates successful edge detection of the dynamic MLC motion within an accuracy of 1.0 mm. This demonstrates the feasibility of the sensor to verify treatment delivery involving dynamic MLC at up to ∼400 frames per second (equivalent to the linac pulse rate), which is superior to current techniques such as using electronic portal imaging devices (EPID). CIS provides the basis for an essential real-time verification tool, useful in assessing accurate delivery of complex high energy radiation to the tumour and ultimately in achieving better cure rates for cancer patients.

  12. Developmental Dyscalculia and Automatic Magnitudes Processing: Investigating Interference Effects between Area and Perimeter

    Directory of Open Access Journals (Sweden)

    Hili Eidlin-Levy

    2017-12-01

    Full Text Available The relationship between numbers and other magnitudes has been extensively investigated in the scientific literature. Here, the objectives were to examine whether two continuous magnitudes, area and perimeter, are automatically processed and whether adults with developmental dyscalculia (DD) are deficient in their ability to automatically process one or both of these magnitudes. Fifty-seven students (30 with DD and 27 with typical development) performed a novel Stroop-like task requiring estimation of one aspect (area or perimeter) while ignoring the other. In order to track possible changes in automaticity due to practice, we measured performance after initial and continuous exposure to stimuli. Similar to previous findings, current results show a significant group × congruency interaction, evident beyond exposure level or magnitude type. That is, the DD group systematically showed larger Stroop effects. However, analysis of each exposure period showed that during initial exposure to stimuli the DD group showed larger Stroop effects in the perimeter and not in the area task. In contrast, during continuous exposure to stimuli no triple interaction was evident. It is concluded that both magnitudes are automatically processed. Nevertheless, individuals with DD are deficient in inhibiting irrelevant magnitude information in general and, specifically, struggle to inhibit salient area information after initial exposure to a perimeter comparison task. Accordingly, the findings support the assumption that DD involves a deficiency in multiple cognitive components, which include domain-specific and domain-general cognitive functions.

  13. Developmental Dyscalculia and Automatic Magnitudes Processing: Investigating Interference Effects between Area and Perimeter.

    Science.gov (United States)

    Eidlin-Levy, Hili; Rubinsten, Orly

    2017-01-01

    The relationship between numbers and other magnitudes has been extensively investigated in the scientific literature. Here, the objectives were to examine whether two continuous magnitudes, area and perimeter, are automatically processed and whether adults with developmental dyscalculia (DD) are deficient in their ability to automatically process one or both of these magnitudes. Fifty-seven students (30 with DD and 27 with typical development) performed a novel Stroop-like task requiring estimation of one aspect (area or perimeter) while ignoring the other. In order to track possible changes in automaticity due to practice, we measured performance after initial and continuous exposure to stimuli. Similar to previous findings, current results show a significant group × congruency interaction, evident beyond exposure level or magnitude type. That is, the DD group systematically showed larger Stroop effects. However, analysis of each exposure period showed that during initial exposure to stimuli the DD group showed larger Stroop effects in the perimeter and not in the area task. In contrast, during continuous exposure to stimuli no triple interaction was evident. It is concluded that both magnitudes are automatically processed. Nevertheless, individuals with DD are deficient in inhibiting irrelevant magnitude information in general and, specifically, struggle to inhibit salient area information after initial exposure to a perimeter comparison task. Accordingly, the findings support the assumption that DD involves a deficiency in multiple cognitive components, which include domain-specific and domain-general cognitive functions.

  14. VLC-based indoor location awareness using LED light and image sensors

    Science.gov (United States)

    Lee, Seok-Ju; Yoo, Jong-Ho; Jung, Sung-Yoon

    2012-11-01

    Recently, indoor LED lighting can be considered for constructing green infrastructure with energy saving, while additionally providing LED-IT convergence services such as visible light communication (VLC) based location awareness and navigation services. For example, in a large complex shopping mall, location awareness for navigating to a destination is a very important issue. However, conventional navigation using GPS does not work indoors. An alternative location service based on WLAN has the problem that its position accuracy is low. For example, it is difficult to estimate the height exactly; if the position error in height is greater than the height between floors, it may cause a big problem. Therefore, conventional navigation is inappropriate for indoor use. A possible alternative solution for indoor navigation is a VLC-based location awareness scheme. Because indoor LED infrastructure will definitely be installed to provide lighting, indoor LED lighting combined with VLC technology has the potential to provide relatively high position estimation accuracy. In this paper, we provide a new VLC-based positioning system using visible LED lights and image sensors. Our system uses the location of the image sensor lens and the location of the reception plane. By using more than two image sensors, we can determine the transmitter position with less than 1 m of position error. Through simulation, we verify the validity of the proposed VLC-based positioning system using visible LED light and image sensors.
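
    A generic Python triangulation sketch in the spirit of the scheme above: each image sensor defines a ray from its lens centre toward the LED (direction taken from the image location on the reception plane), and the transmitter position is estimated as the least-squares point nearest to all rays. The lens positions, the noise-free directions and the solver itself are illustrative assumptions, not the paper's exact formulation.

      import numpy as np

      def nearest_point_to_rays(origins, directions):
          # least-squares point closest to a set of 3-D rays (lens centre + direction to the LED)
          A = np.zeros((3, 3))
          b = np.zeros(3)
          for o, d in zip(origins, directions):
              d = d / np.linalg.norm(d)
              P = np.eye(3) - np.outer(d, d)              # projector perpendicular to this ray
              A += P
              b += P @ o
          return np.linalg.solve(A, b)

      # two assumed lens positions observing an LED transmitter at (1.0, 2.0, 3.0), noise-free
      led = np.array([1.0, 2.0, 3.0])
      origins = [np.array([0.0, 0.0, 0.0]), np.array([4.0, 0.0, 0.0])]
      directions = [led - o for o in origins]             # viewing directions from the image plane
      print(nearest_point_to_rays(origins, directions))   # -> approximately [1. 2. 3.]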

  15. Coded aperture detector: an image sensor with sub 20-nm pixel resolution.

    Science.gov (United States)

    Miyakawa, Ryan; Mayer, Rafael; Wojdyla, Antoine; Vannier, Nicolas; Lesser, Ian; Aron-Dine, Shifrah; Naulleau, Patrick

    2014-08-11

    We describe the coded aperture detector, a novel image sensor based on uniformly redundant arrays (URAs) with customizable pixel size, resolution, and operating photon energy regime. In this sensor, a coded aperture is scanned laterally at the image plane of an optical system, and the transmitted intensity is measured by a photodiode. The image intensity is then digitally reconstructed using a simple convolution. We present results from a proof-of-principle optical prototype, demonstrating high-fidelity image sensing comparable to a CCD. A 20-nm half-pitch URA fabricated by the Center for X-ray Optics (CXRO) nano-fabrication laboratory is presented that is suitable for high-resolution image sensing at EUV and soft X-ray wavelengths.
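
    A 1-D Python sketch of the scan-and-reconstruct principle: laterally scanning a binary mask over the image-plane intensity while recording the photodiode signal is a circular correlation, which can be inverted digitally. Here a random binary mask and a regularized FFT inversion stand in for the URA and its matched decoding correlation, so the numbers are purely illustrative.

      import numpy as np

      rng = np.random.default_rng(6)
      n = 64
      aperture = (rng.random(n) < 0.5).astype(float)      # binary mask standing in for a URA

      obj = np.zeros(n)                                   # 1-D image-plane intensity, two sources
      obj[20], obj[35] = 1.0, 0.6

      # scanning the mask laterally while recording the photodiode signal is a circular correlation
      scan = np.array([np.dot(obj, np.roll(aperture, shift)) for shift in range(n)])

      # digital reconstruction: invert the correlation in the Fourier domain (tiny regularizer);
      # with a true URA this inversion reduces to one more correlation with the decoding pattern
      A = np.fft.rfft(aperture)
      recon = np.fft.irfft(np.fft.rfft(scan) * A / (np.abs(A) ** 2 + 1e-9), n)
      print(np.argsort(recon)[-2:])                       # expect the source positions, 35 and 20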

  16. The Design of a Single-Bit CMOS Image Sensor for Iris Recognition Applications

    Directory of Open Access Journals (Sweden)

    Keunyeol Park

    2018-02-01

    Full Text Available This paper presents a single-bit CMOS image sensor (CIS) that uses a data processing technique with an edge detection block for simple iris segmentation. In order to recognize the iris image, the image sensor conventionally captures high-resolution image data in digital code, extracts the iris data, and then compares it with a reference image through a recognition algorithm. However, in this case, the frame rate decreases by the time required for digital signal conversion of multi-bit digital data through the analog-to-digital converter (ADC) in the CIS. In order to reduce the overall processing time as well as the power consumption, we propose a data processing technique with an exclusive OR (XOR) logic gate to obtain single-bit and edge detection image data instead of multi-bit image data through the ADC. In addition, we propose a logarithmic counter to efficiently measure single-bit image data that can be applied to the iris recognition algorithm. The effective area of the proposed single-bit image sensor (174 × 144 pixel) is 2.84 mm² with a 0.18 μm 1-poly 4-metal CMOS image sensor process. The power consumption of the proposed single-bit CIS is 2.8 mW with a 3.3 V of supply voltage and 520 frame/s of the maximum frame rates. The error rate of the ADC is 0.24 least significant bit (LSB) on an 8-bit ADC basis at a 50 MHz sampling frequency.

  17. The Design of a Single-Bit CMOS Image Sensor for Iris Recognition Applications.

    Science.gov (United States)

    Park, Keunyeol; Song, Minkyu; Kim, Soo Youn

    2018-02-24

    This paper presents a single-bit CMOS image sensor (CIS) that uses a data processing technique with an edge detection block for simple iris segmentation. In order to recognize the iris image, the image sensor conventionally captures high-resolution image data in digital code, extracts the iris data, and then compares it with a reference image through a recognition algorithm. However, in this case, the frame rate decreases by the time required for digital signal conversion of multi-bit digital data through the analog-to-digital converter (ADC) in the CIS. In order to reduce the overall processing time as well as the power consumption, we propose a data processing technique with an exclusive OR (XOR) logic gate to obtain single-bit and edge detection image data instead of multi-bit image data through the ADC. In addition, we propose a logarithmic counter to efficiently measure single-bit image data that can be applied to the iris recognition algorithm. The effective area of the proposed single-bit image sensor (174 × 144 pixel) is 2.84 mm² with a 0.18 μm 1-poly 4-metal CMOS image sensor process. The power consumption of the proposed single-bit CIS is 2.8 mW with a 3.3 V of supply voltage and 520 frame/s of the maximum frame rates. The error rate of the ADC is 0.24 least significant bit (LSB) on an 8-bit ADC basis at a 50 MHz sampling frequency.

  18. Crop status sensing system by multi-spectral imaging sensor, 1: Image processing and paddy field sensing

    International Nuclear Information System (INIS)

    Ishii, K.; Sugiura, R.; Fukagawa, T.; Noguchi, N.; Shibata, Y.

    2006-01-01

    The objective of the study is to construct a sensing system for precision farming. A Multi-Spectral Imaging Sensor (MSIS), which can obtain three images (G, R and NIR) simultaneously, was used for detecting the growth status of plants. The sensor was mounted on an unmanned helicopter. An image processing method for acquiring information on crop status with high accuracy was developed. Crop parameters that were measured include SPAD, leaf height, and stem number. Both a direct seeding variety and a transplant variety of paddy rice were adopted in the research. The result of a field test showed that the crop status of both varieties could be detected with sufficient accuracy to apply to precision farming.

  19. Accuracy of Shack-Hartmann wavefront sensor using a coherent wound fibre image bundle

    Science.gov (United States)

    Zheng, Jessica R.; Goodwin, Michael; Lawrence, Jon

    2018-03-01

    Shack-Hartmann wavefront sensors using wound fibre image bundles are desired for multi-object adaptive optical systems to provide a large multiplex, positioned by Starbugs. The use of a large-sized wound fibre image bundle provides the flexibility to use more sub-apertures in the wavefront sensor for ELTs. These compact wavefront sensors take advantage of large focal surfaces such as that of the Giant Magellan Telescope. The focus of this paper is to study the effect of wound fibre image bundle structure defects on the centroid measurement accuracy of a Shack-Hartmann wavefront sensor. We use the first-moment centroid method to estimate the centroid of a focused Gaussian beam sampled by a simulated bundle. Spot estimation accuracy with the wound fibre image bundle and the impact of its structure on wavefront measurement accuracy statistics are addressed. Our results show that when the measurement signal-to-noise ratio is high, the centroid measurement accuracy is dominated by the wound fibre image bundle structure, e.g. tile angle and gap spacing. For measurements with low signal-to-noise ratio, the accuracy is influenced by the read noise of the detector rather than by the wound fibre image bundle structure defects. We demonstrate this both in simulation and experimentally. We provide a statistical model of the centroid and wavefront error of a wound fibre image bundle found through experiment.
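
    A minimal Python sketch of the first-moment (centre-of-mass) centroid estimator used above, applied to a synthetic Gaussian spot; background subtraction, thresholding and the simulated bundle geometry are omitted, and the grid size and spot parameters are arbitrary.

      import numpy as np

      def first_moment_centroid(spot):
          # centre-of-mass of one sub-aperture spot; background subtraction/thresholding omitted
          spot = np.asarray(spot, dtype=float)
          ys, xs = np.indices(spot.shape)
          total = spot.sum()
          return (xs * spot).sum() / total, (ys * spot).sum() / total

      # Gaussian spot on a coarse grid standing in for the wound-fibre bundle sampling
      y, x = np.indices((32, 32))
      spot = np.exp(-((x - 17.3) ** 2 + (y - 14.8) ** 2) / (2.0 * 3.0 ** 2))
      print(first_moment_centroid(spot))                  # close to (17.3, 14.8)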

  20. Image interpolation and denoising for division of focal plane sensors using Gaussian processes.

    Science.gov (United States)

    Gilboa, Elad; Cunningham, John P; Nehorai, Arye; Gruev, Viktor

    2014-06-16

    Image interpolation and denoising are important techniques in image processing. These methods are inherent to digital image acquisition, as most digital cameras are composed of a 2D grid of heterogeneous imaging sensors. Current polarization imagers employ four different pixelated polarization filters, commonly referred to as division of focal plane polarization sensors. The sensors capture only partial information of the true scene, leading to a loss of spatial resolution as well as inaccuracy of the captured polarization information. Interpolation is a standard technique to recover the missing information and increase the accuracy of the captured polarization information. Here we focus specifically on Gaussian process regression as a way to perform statistical image interpolation, where estimates of sensor noise are used to improve the accuracy of the estimated pixel information. We further exploit the inherent grid structure of this data to create a fast exact algorithm that operates in O(N^(3/2)) (vs. the naive O(N³)), thus making the Gaussian process method computationally tractable for image data. This modeling advance and the enabling computational advance combine to produce significant improvements over previously published interpolation methods for polarimeters, which is most pronounced in cases of low signal-to-noise ratio (SNR). We provide the comprehensive mathematical model as well as experimental results of the GP interpolation performance for division of focal plane polarimeters.

  1. Efficient demodulation scheme for rolling-shutter-patterning of CMOS image sensor based visible light communications.

    Science.gov (United States)

    Chen, Chia-Wei; Chow, Chi-Wai; Liu, Yang; Yeh, Chien-Hung

    2017-10-02

    Recently even the low-end mobile-phones are equipped with a high-resolution complementary-metal-oxide-semiconductor (CMOS) image sensor. This motivates using a CMOS image sensor for visible light communication (VLC). Here we propose and demonstrate an efficient demodulation scheme to synchronize and demodulate the rolling shutter pattern in image sensor based VLC. The implementation algorithm is discussed. The bit-error-rate (BER) performance and processing latency are evaluated and compared with other thresholding schemes.

  2. A 10-bit column-parallel cyclic ADC for high-speed CMOS image sensors

    International Nuclear Information System (INIS)

    Han Ye; Li Quanliang; Shi Cong; Wu Nanjian

    2013-01-01

    This paper presents a high-speed column-parallel cyclic analog-to-digital converter (ADC) for a CMOS image sensor. A correlated double sampling (CDS) circuit is integrated in the ADC, which avoids a stand-alone CDS circuit block. An offset cancellation technique is also introduced, which reduces the column fixed-pattern noise (FPN) effectively. A single-channel ADC with an area of less than 0.02 mm² was implemented in a 0.13 μm CMOS image sensor process. The resolution of the proposed ADC is 10-bit, and the conversion rate is 1.6 MS/s. The measured differential nonlinearity and integral nonlinearity are 0.89 LSB and 6.2 LSB together with CDS, respectively. The power consumption from a 3.3 V supply is only 0.66 mW. An array of 48 10-bit column-parallel cyclic ADCs was integrated into an array of CMOS image sensor pixels. The measured results indicate that the ADC circuit is suitable for high-speed CMOS image sensors. (semiconductor integrated circuits)

  3. Retina-like sensor image coordinates transformation and display

    Science.gov (United States)

    Cao, Fengmei; Cao, Nan; Bai, Tingzhu; Song, Shengyu

    2015-03-01

    For a new kind of retina-like sensor camera, image acquisition, coordinate transformation and interpolation need to be realized. Both the coordinate transformation and the interpolation are computed in polar coordinates due to the sensor's particular pixel distribution. The image interpolation is based on sub-pixel interpolation, and its relative weights are obtained in polar coordinates. The hardware platform is composed of the retina-like sensor camera, an image grabber and a PC. Combining the MIL and OpenCV libraries, the software was written in VC++ on VS 2010. Experimental results show that the system realizes real-time image acquisition, coordinate transformation and interpolation.
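
    A minimal Python sketch of a polar-to-Cartesian transform with bilinear sub-pixel interpolation of the kind described above; it assumes a simple ring-by-spoke pixel layout with linear ring spacing (a real retina-like sensor is typically log-polar, which would only change the radius mapping), so all sizes and the sampling rule are illustrative.

      import numpy as np

      def polar_to_cartesian(polar_img, out_size):
          # polar_img rows are rings (radius), columns are spokes (angle); linear ring spacing
          # is assumed here, so only the radius mapping would change for a log-polar sensor
          n_rings, n_spokes = polar_img.shape
          yy, xx = np.indices((out_size, out_size))
          dx = xx - (out_size - 1) / 2.0
          dy = yy - (out_size - 1) / 2.0
          r = np.hypot(dx, dy) / ((out_size - 1) / 2.0) * (n_rings - 1)        # fractional ring
          t = (np.arctan2(dy, dx) % (2 * np.pi)) / (2 * np.pi) * n_spokes      # fractional spoke
          r0 = np.clip(np.floor(r).astype(int), 0, n_rings - 2)
          t0 = np.floor(t).astype(int) % n_spokes
          fr, ft = r - r0, t - np.floor(t)
          sample = lambda ri, ti: polar_img[ri, ti % n_spokes]                 # angular wrap-around
          out = ((1 - fr) * (1 - ft) * sample(r0, t0) + (1 - fr) * ft * sample(r0, t0 + 1)
                 + fr * (1 - ft) * sample(r0 + 1, t0) + fr * ft * sample(r0 + 1, t0 + 1))
          out[r > n_rings - 1] = 0.0                       # outside the outermost ring
          return out

      demo = np.tile(np.linspace(0.0, 1.0, 64)[:, None], (1, 128))   # 64 rings x 128 spokes
      print(polar_to_cartesian(demo, 65).shape)                       # -> (65, 65)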

  4. Measuring the Contractile Response of Isolated Tissue Using an Image Sensor

    Directory of Open Access Journals (Sweden)

    David Díaz-Martín

    2015-04-01

    Full Text Available Isometric or isotonic transducers have traditionally been used to study the contractile/relaxation effects of drugs on isolated tissues. However, these mechanical sensors are expensive and delicate, and they are associated with certain disadvantages when performing experiments in the laboratory. In this paper, a method that uses an image sensor to measure the contractile effect of drugs on blood vessel rings and other luminal organs is presented. The new method is based on an image-processing algorithm, and it provides a fast, easy and non-expensive way to analyze the effects of such drugs. In our tests, we have obtained dose-response curves from rat aorta rings that are equivalent to those achieved with classical mechanic sensors.

  5. White-light full-field OCT resolution improvement by image sensor colour balance adjustment: numerical simulation

    International Nuclear Information System (INIS)

    Kalyanov, A L; Lychagov, V V; Ryabukho, V P; Smirnov, I V

    2012-01-01

    The possibility of improving white-light full-field optical coherence tomography (OCT) resolution by image sensor colour balance tuning is shown numerically. We calculated the full-width at half-maximum (FWHM) of a coherence pulse registered by a silicon colour image sensor under various colour balance settings. The calculations were made for both a halogen lamp and white LED sources. The results show that the interference pulse width can be reduced by the proper choice of colour balance coefficients. The reduction is up to 18%, as compared with a colour image sensor with regular settings, and up to 20%, as compared with a monochrome sensor. (paper)

  6. Extracellular Bio-imaging of Acetylcholine-stimulated PC12 Cells Using a Calcium and Potassium Multi-ion Image Sensor.

    Science.gov (United States)

    Matsuba, Sota; Kato, Ryo; Okumura, Koichi; Sawada, Kazuaki; Hattori, Toshiaki

    2018-01-01

    In biochemistry, Ca²⁺ and K⁺ play essential roles in controlling signal transduction. Much interest has been focused on ion imaging, which facilitates understanding of their ion flux dynamics. In this paper, we report a calcium and potassium multi-ion image sensor and its application to living cells (PC12). The multi-ion sensor had two selective plasticized poly(vinyl chloride) membranes containing ionophores. Each region on the sensor responded only to the corresponding ion. The multi-ion sensor has many advantages, including not only label-free and real-time measurement but also simultaneous detection of Ca²⁺ and K⁺. Cultured PC12 cells treated with nerve growth factor were prepared, and a practical observation of the cells was conducted with the sensor. After the PC12 cells were stimulated by acetylcholine, only the extracellular Ca²⁺ concentration increased, while there was no increase in the extracellular K⁺ concentration. Through this practical observation, we demonstrated that the sensor is helpful for analyzing cell events involving changing Ca²⁺ and/or K⁺ concentrations.

  7. Optical Imaging Sensors and Systems for Homeland Security Applications

    CERN Document Server

    Javidi, Bahram

    2006-01-01

    Optical and photonic systems and devices have significant potential for homeland security. Optical Imaging Sensors and Systems for Homeland Security Applications presents original and significant technical contributions from leaders of industry, government, and academia in the field of optical and photonic sensors, systems and devices for detection, identification, prevention, sensing, security, verification and anti-counterfeiting. The chapters have recent and technically significant results, ample illustrations, figures, and key references. This book is intended for engineers and scientists in the relevant fields, graduate students, industry managers, university professors, government managers, and policy makers. Advanced Sciences and Technologies for Security Applications focuses on research monographs in the areas of -Recognition and identification (including optical imaging, biometrics, authentication, verification, and smart surveillance systems) -Biological and chemical threat detection (including bios...

  8. INTEGRATED GEOREFERENCING OF STEREO IMAGE SEQUENCES CAPTURED WITH A STEREOVISION MOBILE MAPPING SYSTEM – APPROACHES AND PRACTICAL RESULTS

    Directory of Open Access Journals (Sweden)

    H. Eugster

    2012-07-01

    Full Text Available Stereovision based mobile mapping systems enable the efficient capturing of directly georeferenced stereo pairs. With today's camera and onboard storage technologies imagery can be captured at high data rates resulting in dense stereo sequences. These georeferenced stereo sequences provide a highly detailed and accurate digital representation of the roadside environment, which builds the foundation for a wide range of 3d mapping applications and image-based geo web-services. Georeferenced stereo images are ideally suited for the 3d mapping of street furniture and visible infrastructure objects, pavement inspection, asset management tasks or image-based change detection. As in most mobile mapping systems, the georeferencing of the mapping sensors and observations – in our case of the imaging sensors – normally relies on direct georeferencing based on INS/GNSS navigation sensors. However, in urban canyons the achievable direct georeferencing accuracy of the dynamically captured stereo image sequences is often insufficient or at least degraded. Furthermore, many of the mentioned application scenarios require homogeneous georeferencing accuracy within a local reference frame over the entire mapping perimeter. To meet these demands, georeferencing approaches are presented and cost-efficient workflows are discussed which allow validating and updating the INS/GNSS-based trajectory with independently estimated positions during prolonged GNSS signal outages, in order to raise the georeferencing accuracy to the project requirements.

  9. Laser Doppler perfusion imaging with a complimentary metal oxide semiconductor image sensor

    NARCIS (Netherlands)

    Serov, Alexander; Steenbergen, Wiendelt; de Mul, F.F.M.

    2002-01-01

    We utilized a complementary metal oxide semiconductor video camera for fast flow imaging with the laser Doppler technique. A single sensor is used for both observation of the area of interest and measurements of the interference signal caused by dynamic light scattering from moving particles inside

  10. Real-time method for establishing a detection map for a network of sensors

    Science.gov (United States)

    Nguyen, Hung D; Koch, Mark W; Giron, Casey; Rondeau, Daniel M; Russell, John L

    2012-09-11

    A method for establishing a detection map of a dynamically configurable sensor network. This method determines an appropriate set of locations for a plurality of sensor units of a sensor network and establishes a detection map for the network of sensors while the network is being set up; the detection map includes the effects of the local terrain and individual sensor performance. Sensor performance is characterized during the placement of the sensor units, which enables dynamic adjustment or reconfiguration of the placement of individual elements of the sensor network during network set-up to accommodate variations in local terrain and individual sensor performance. The reconfiguration of the network during initial set-up to accommodate deviations from idealized individual sensor detection zones improves the effectiveness of the sensor network in detecting activities at a detection perimeter and can provide the desired sensor coverage of an area while minimizing unintentional gaps in coverage.
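
    As a rough illustration of what such a detection map might look like on a grid, the sketch below combines hypothetical per-sensor detection radii with a terrain mask; the patented method instead characterizes each sensor's measured performance during placement, which this toy model does not attempt. Variable names and the fixed-radius assumption are illustrative only.

```python
# Illustrative sketch only: build a simple detection map for a set of
# sensors on a grid, masking out cells blocked by terrain.  A fixed
# detection radius per sensor stands in for measured sensor performance.
import numpy as np

def detection_map(shape, sensors, terrain_blocked):
    """shape: (rows, cols); sensors: list of (row, col, radius);
    terrain_blocked: boolean array of cells where detection is impossible."""
    rows, cols = np.indices(shape)
    covered = np.zeros(shape, dtype=bool)
    for r0, c0, radius in sensors:
        covered |= (rows - r0) ** 2 + (cols - c0) ** 2 <= radius ** 2
    return covered & ~terrain_blocked

# Example: two sensors on a 100 x 100 grid with a blocked strip of terrain.
blocked = np.zeros((100, 100), dtype=bool)
blocked[40:45, :] = True
coverage = detection_map((100, 100), [(20, 20, 15), (70, 60, 20)], blocked)
print(f"covered fraction: {coverage.mean():.2f}")
```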

  11. Light-Addressable Potentiometric Sensors for Quantitative Spatial Imaging of Chemical Species.

    Science.gov (United States)

    Yoshinobu, Tatsuo; Miyamoto, Ko-Ichiro; Werner, Carl Frederik; Poghossian, Arshak; Wagner, Torsten; Schöning, Michael J

    2017-06-12

    A light-addressable potentiometric sensor (LAPS) is a semiconductor-based chemical sensor, in which a measurement site on the sensing surface is defined by illumination. This light addressability can be applied to visualize the spatial distribution of pH or the concentration of a specific chemical species, with potential applications in the fields of chemistry, materials science, biology, and medicine. In this review, the features of this chemical imaging sensor technology are compared with those of other technologies. Instrumentation, principles of operation, and various measurement modes of chemical imaging sensor systems are described. The review discusses and summarizes state-of-the-art technologies, especially with regard to the spatial resolution and measurement speed; for example, a high spatial resolution in a submicron range and a readout speed in the range of several tens of thousands of pixels per second have been achieved with the LAPS. The possibility of combining this technology with microfluidic devices and other potential future developments are discussed.

  12. Real-time biochemical sensor based on Raman scattering with CMOS contact imaging.

    Science.gov (United States)

    Muyun Cao; Yuhua Li; Yadid-Pecht, Orly

    2015-08-01

    This work presents a biochemical sensor based on Raman scattering with complementary metal-oxide-semiconductor (CMOS) contact imaging. This biochemical optical sensor is designed for detecting the concentration of solutions. The system is built with a laser diode, an optical filter, a sample holder and a commercial CMOS sensor. The output of the system is analyzed by an image-processing program. The system provides instant measurements with a resolution of 0.2 to 0.4 mol/L. This low-cost, easy-to-operate, small-scale system is useful in chemical, biomedical and environmental labs for quantitative biochemical concentration detection, with results comparable to those of a high-cost commercial spectrometer.

  13. Decoding mobile-phone image sensor rolling shutter effect for visible light communications

    Science.gov (United States)

    Liu, Yang

    2016-01-01

    Optical wireless communication (OWC) using visible lights, also known as visible light communication (VLC), has attracted significant attention recently. As the traditional OWC and VLC receivers (Rxs) are based on PIN photo-diode or avalanche photo-diode, deploying the complementary metal-oxide-semiconductor (CMOS) image sensor as the VLC Rx is attractive since nowadays nearly every person has a smart phone with embedded CMOS image sensor. However, deploying the CMOS image sensor as the VLC Rx is challenging. In this work, we propose and demonstrate two simple contrast ratio (CR) enhancement schemes to improve the contrast of the rolling shutter pattern. Then we describe their processing algorithms one by one. The experimental results show that both the proposed CR enhancement schemes can significantly mitigate the high-intensity fluctuations of the rolling shutter pattern and improve the bit-error-rate performance.
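
    The abstract does not spell out the enhancement algorithms; the sketch below only illustrates the basic rolling-shutter decoding step that such schemes build on, assuming a grayscale frame whose rows correspond to successive exposure times. It is not the authors' CR-enhancement method, and the 0.5 decision level is an arbitrary choice.

```python
# Minimal sketch, not the authors' CR-enhancement schemes: decode a
# rolling-shutter stripe pattern by averaging each row, stretching the
# contrast, and thresholding against the mid-level to recover a bit stream.
import numpy as np

def decode_rolling_shutter(frame: np.ndarray) -> np.ndarray:
    """frame: 2-D grayscale image whose rows correspond to exposure times."""
    profile = frame.mean(axis=1)                      # one value per row
    lo, hi = profile.min(), profile.max()
    stretched = (profile - lo) / (hi - lo + 1e-9)     # simple contrast stretch
    return (stretched > 0.5).astype(np.uint8)         # bright rows -> 1
```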

  14. High-resolution dynamic pressure sensor array based on piezo-phototronic effect tuned photoluminescence imaging.

    Science.gov (United States)

    Peng, Mingzeng; Li, Zhou; Liu, Caihong; Zheng, Qiang; Shi, Xieqing; Song, Ming; Zhang, Yang; Du, Shiyu; Zhai, Junyi; Wang, Zhong Lin

    2015-03-24

    A high-resolution dynamic tactile/pressure display is indispensable to the comprehensive perception of force/mechanical stimulations such as electronic skin, biomechanical imaging/analysis, or personalized signatures. Here, we present a dynamic pressure sensor array based on pressure/strain tuned photoluminescence imaging without the need for electricity. Each sensor is a nanopillar that consists of InGaN/GaN multiple quantum wells. Its photoluminescence intensity can be modulated dramatically and linearly by small strain (0-0.15%) owing to the piezo-phototronic effect. The sensor array has a high pixel density of 6350 dpi and exceptional small standard deviation of photoluminescence. High-quality tactile/pressure sensing distribution can be real-time recorded by parallel photoluminescence imaging without any cross-talk. The sensor array can be inexpensively fabricated over large areas by semiconductor product lines. The proposed dynamic all-optical pressure imaging with excellent resolution, high sensitivity, good uniformity, and ultrafast response time offers a suitable way for smart sensing, micro/nano-opto-electromechanical systems.

  15. Comparison of the performance of intraoral X-ray sensors using objective image quality assessment.

    Science.gov (United States)

    Hellén-Halme, Kristina; Johansson, Curt; Nilsson, Mats

    2016-05-01

    The main aim of this study was to evaluate the performance of 10 individual sensors of the same make, using objective measures of key image quality parameters. A further aim was to compare 8 brands of sensors. Ten new sensors of 8 different models from 6 manufacturers (i.e., 80 sensors) were included in the study. All sensors were exposed in a standardized way using an X-ray tube voltage of 60 kVp and different exposure times. Sensor response, noise, low-contrast resolution, spatial resolution and uniformity were measured. Individual differences between sensors of the same brand were surprisingly large in some cases. There were clear differences in the characteristics of the different brands of sensors. The largest variations were found for individual sensor response for some of the brands studied. Also, noise level and low contrast resolution showed large variations between brands. Sensors, even of the same brand, vary significantly in their quality. It is thus valuable to establish action levels for the acceptance of newly delivered sensors and to use objective image quality control for commissioning purposes and periodic checks to ensure high performance of individual digital sensors. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. An Approach for Unsupervised Change Detection in Multitemporal VHR Images Acquired by Different Multispectral Sensors

    Directory of Open Access Journals (Sweden)

    Yady Tatiana Solano-Correa

    2018-03-01

    Full Text Available This paper proposes an approach for the detection of changes in multitemporal Very High Resolution (VHR) optical images acquired by different multispectral sensors. The proposed approach, which is inspired by a recent framework developed to support the design of change-detection systems for single-sensor VHR remote sensing images, addresses and integrates in the general approach a strategy to effectively deal with multisensor information, i.e., to perform change detection between VHR images acquired by different multispectral sensors on two dates. This is achieved by the definition of procedures for the homogenization of radiometric, spectral and geometric image properties. These procedures map images into a common feature space where the information acquired by different multispectral sensors becomes comparable across time. Although the approach is general, here we optimize it for the detection of changes in vegetation and urban areas by employing features based on linear transformations (Tasseled Caps and Orthogonal Equations), which are shown to be effective for representing the multisensor information in a homogeneous physical way irrespective of the considered sensor. Experiments on multitemporal images acquired by different VHR satellite systems (i.e., QuickBird, WorldView-2 and GeoEye-1) confirm the effectiveness of the proposed approach.

  17. Development of integrated semiconductor optical sensors for functional brain imaging

    Science.gov (United States)

    Lee, Thomas T.

    Optical imaging of neural activity is a widely accepted technique for imaging brain function in the field of neuroscience research, and has been used to study the cerebral cortex in vivo for over two decades. Maps of brain activity are obtained by monitoring intensity changes in back-scattered light, called Intrinsic Optical Signals (IOS), that correspond to fluctuations in blood oxygenation and volume associated with neural activity. Current imaging systems typically employ bench-top equipment including lamps and CCD cameras to study animals using visible light. Such systems require the use of anesthetized or immobilized subjects with craniotomies, which imposes limitations on the behavioral range and duration of studies. The ultimate goal of this work is to overcome these limitations by developing a single-chip semiconductor sensor using arrays of sources and detectors operating at near-infrared (NIR) wavelengths. A single-chip implementation, combined with wireless telemetry, will eliminate the need for immobilization or anesthesia of subjects and allow in vivo studies of free behavior. NIR light offers additional advantages because it experiences less absorption in animal tissue than visible light, which allows for imaging through superficial tissues. This, in turn, reduces or eliminates the need for traumatic surgery and enables long-term brain-mapping studies in freely-behaving animals. This dissertation concentrates on key engineering challenges of implementing the sensor. This work shows the feasibility of using a GaAs-based array of vertical-cavity surface emitting lasers (VCSELs) and PIN photodiodes for IOS imaging. I begin with in-vivo studies of IOS imaging through the skull in mice, and use these results along with computer simulations to establish minimum performance requirements for light sources and detectors. I also evaluate the performance of a current commercial VCSEL for IOS imaging, and conclude with a proposed prototype sensor.

  18. Evaluation of onboard hyperspectral-image compression techniques for a parallel push-broom sensor

    Energy Technology Data Exchange (ETDEWEB)

    Briles, S.

    1996-04-01

    A single hyperspectral imaging sensor can produce frames with spatially-continuous rows of differing, but adjacent, spectral wavelength. If the frame sample-rate of the sensor is such that subsequent hyperspectral frames are spatially shifted by one row, then the sensor can be thought of as a parallel (in wavelength) push-broom sensor. An examination of data compression techniques for such a sensor is presented. The compression techniques are intended to be implemented onboard a space-based platform and to have implementation speeds that match the data rate of the sensor. Data partitions examined extend from individually operating on a single hyperspectral frame to operating on a data cube comprising the two spatial axes and the spectral axis. Compression algorithms investigated utilize JPEG-based image compression, wavelet-based compression and differential pulse code modulation. Algorithm performance is quantitatively presented in terms of root-mean-squared error and root-mean-squared correlation coefficient error. Implementation issues are considered in algorithm development.
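
    Of the algorithm families mentioned, differential pulse code modulation (DPCM) is the simplest to sketch. The following example is illustrative only (not the report's implementation): it applies DPCM along the spectral axis of a (bands, rows, cols) cube with a uniform quantizer whose step size is an arbitrary placeholder, and reports the root-mean-squared reconstruction error used as one of the quality metrics.

```python
# Illustrative DPCM sketch: encode each spectral band as a quantised
# difference from the previously reconstructed band, then report the
# root-mean-squared error between the reconstruction and the original.
import numpy as np

def dpcm_spectral(cube: np.ndarray, step: float = 4.0):
    """cube: hyperspectral data ordered (bands, rows, cols)."""
    recon = np.empty_like(cube, dtype=float)
    prev = np.zeros(cube.shape[1:], dtype=float)
    for b in range(cube.shape[0]):
        residual = cube[b].astype(float) - prev
        quantised = np.round(residual / step) * step   # coarse uniform quantiser
        recon[b] = prev + quantised
        prev = recon[b]
    rmse = np.sqrt(np.mean((recon - cube) ** 2))
    return recon, rmse
```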

  19. Top scientists join Stephen Hawking at Perimeter Institute

    Science.gov (United States)

    Banks, Michael

    2009-03-01

    Nine leading researchers are to join Stephen Hawking as visiting fellows at the Perimeter Institute for Theoretical Physics in Ontario, Canada. The researchers, who include string theorists Leonard Susskind from Stanford University and Ashoke Sen from the Harish-Chandra Research Institute in India, will each spend a few months of the year at the institute as "distinguished research chairs". They will be joined by another 30 scientists to be announced at a later date.

  20. Honeywell's Compact, Wide-angle Uv-visible Imaging Sensor

    Science.gov (United States)

    Pledger, D.; Billing-Ross, J.

    1993-01-01

    Honeywell is currently developing the Earth Reference Attitude Determination System (ERADS). ERADS determines attitude by imaging the entire Earth's limb and a ring of the adjacent star field in the 2800-3000 Å band of the ultraviolet. This is achieved through the use of a highly nonconventional optical system, an intensifier tube, and a mega-element CCD array. The optics image a 30 degree region in the center of the field, and an outer region typically from 128 to 148 degrees, which can be adjusted up to 180 degrees. Because of the design employed, the illumination at the outer edge of the field is only some 15 percent below that at the center, in contrast to the drastic rolloffs encountered in conventional wide-angle sensors. The outer diameter of the sensor is only 3 in; the volume and weight of the entire system, including processor, are 1000 cc and 6 kg, respectively.

  1. Distributed Fiber-Optic Sensors for Vibration Detection.

    Science.gov (United States)

    Liu, Xin; Jin, Baoquan; Bai, Qing; Wang, Yu; Wang, Dong; Wang, Yuncai

    2016-07-26

    Distributed fiber-optic vibration sensors have received extensive investigation and play a significant role in the sensing landscape. Optical parameters such as light intensity, phase, polarization state, or light frequency will change when external vibration is applied to the sensing fiber. In this paper, various technologies of distributed fiber-optic vibration sensing are reviewed, from interferometric sensing technology, such as Sagnac, Mach-Zehnder, and Michelson, to backscattering-based sensing technology, such as the phase-sensitive optical time domain reflectometer, the polarization optical time domain reflectometer and the optical frequency domain reflectometer, as well as some combinations of interferometric and backscattering-based techniques. Their operation principles are presented and recent research efforts are also included. Finally, the applications of distributed fiber-optic vibration sensors are summarized, which mainly include structural health monitoring, perimeter security, etc. Overall, distributed fiber-optic vibration sensors possess the advantages of large-scale monitoring, good concealment, excellent flexibility, and immunity to electromagnetic interference, and thus show considerable potential for a variety of practical applications.

  2. Operational calibration and validation of landsat data continuity mission (LDCM) sensors using the image assessment system (IAS)

    Science.gov (United States)

    Micijevic, Esad; Morfitt, Ron

    2010-01-01

    Systematic characterization and calibration of the Landsat sensors and the assessment of image data quality are performed using the Image Assessment System (IAS). The IAS was first introduced as an element of the Landsat 7 (L7) Enhanced Thematic Mapper Plus (ETM+) ground segment and recently extended to the Landsat 4 (L4) and 5 (L5) Thematic Mappers (TM) and the Multispectral Scanner (MSS) instruments on board the Landsat 1-5 satellites. In preparation for the Landsat Data Continuity Mission (LDCM), the IAS was developed for the Earth Observing-1 (EO-1) Advanced Land Imager (ALI) with a capability to assess pushbroom sensors. This paper describes the LDCM version of the IAS and how it relates to unique calibration and validation attributes of its on-board imaging sensors. The LDCM IAS system will have to handle a significantly larger number of detectors and the associated database than the previous IAS versions. An additional challenge is that the LDCM IAS must handle data from two sensors, as the LDCM products will combine the Operational Land Imager (OLI) and Thermal Infrared Sensor (TIRS) spectral bands.

  3. Asymptotics of the $s$-perimeter as $s\searrow 0$

    OpenAIRE

    Dipierro, Serena; Figalli, Alessio; Palatucci, Giampiero; Valdinoci, Enrico

    2012-01-01

    We deal with the asymptotic behavior of the $s$-perimeter of a set $E$ inside a domain $\Omega$ as $s \searrow 0$. We prove necessary and sufficient conditions for the existence of such a limit, by also providing an explicit formulation in terms of the Lebesgue measure of $E$ and $\Omega$. Moreover, we construct examples of sets for which the limit does not exist.

  4. Image sensor for testing refractive error of eyes

    Science.gov (United States)

    Li, Xiangning; Chen, Jiabi; Xu, Longyun

    2000-05-01

    It is difficult to detect ametropia and anisometropia in children. An image sensor for testing the refractive error of eyes does not need the cooperation of children and can be used for general surveys of ametropia and anisometropia in children. In our study, photographs are recorded by a CCD element in digital form, which can be directly processed by a computer. In order to process the image accurately by digital techniques, a formula considering the effect of an extended light source and the size of the lens aperture has been deduced, which is more reliable in practice. Computer simulation of the image sensing was performed to verify the validity of the results.

  5. CCD image sensor induced error in PIV applications

    Science.gov (United States)

    Legrand, M.; Nogueira, J.; Vargas, A. A.; Ventas, R.; Rodríguez-Hidalgo, M. C.

    2014-06-01

    The readout procedure of charge-coupled device (CCD) cameras is known to generate some image degradation in different scientific imaging fields, especially in astrophysics. In the particular field of particle image velocimetry (PIV), widely extended in the scientific community, the readout procedure of the interline CCD sensor induces a bias in the registered position of particle images. This work proposes simple procedures to predict the magnitude of the associated measurement error. Generally, there are differences in the position bias for the different images of a certain particle at each PIV frame. This leads to a substantial bias error in the PIV velocity measurement (˜0.1 pixels). This is the order of magnitude that other typical PIV errors such as peak-locking may reach. Based on modern CCD technology and architecture, this work offers a description of the readout phenomenon and proposes a modeling for the CCD readout bias error magnitude. This bias, in turn, generates a velocity measurement bias error when there is an illumination difference between two successive PIV exposures. The model predictions match the experiments performed with two 12-bit-depth interline CCD cameras (MegaPlus ES 4.0/E incorporating the Kodak KAI-4000M CCD sensor with 4 megapixels). For different cameras, only two constant values are needed to fit the proposed calibration model and predict the error from the readout procedure. Tests by different researchers using different cameras would allow verification of the model, that can be used to optimize acquisition setups. Simple procedures to obtain these two calibration values are also described.

  6. CCD image sensor induced error in PIV applications

    International Nuclear Information System (INIS)

    Legrand, M; Nogueira, J; Vargas, A A; Ventas, R; Rodríguez-Hidalgo, M C

    2014-01-01

    The readout procedure of charge-coupled device (CCD) cameras is known to generate some image degradation in different scientific imaging fields, especially in astrophysics. In the particular field of particle image velocimetry (PIV), widely extended in the scientific community, the readout procedure of the interline CCD sensor induces a bias in the registered position of particle images. This work proposes simple procedures to predict the magnitude of the associated measurement error. Generally, there are differences in the position bias for the different images of a certain particle at each PIV frame. This leads to a substantial bias error in the PIV velocity measurement (∼0.1 pixels). This is the order of magnitude that other typical PIV errors such as peak-locking may reach. Based on modern CCD technology and architecture, this work offers a description of the readout phenomenon and proposes a modeling for the CCD readout bias error magnitude. This bias, in turn, generates a velocity measurement bias error when there is an illumination difference between two successive PIV exposures. The model predictions match the experiments performed with two 12-bit-depth interline CCD cameras (MegaPlus ES 4.0/E incorporating the Kodak KAI-4000M CCD sensor with 4 megapixels). For different cameras, only two constant values are needed to fit the proposed calibration model and predict the error from the readout procedure. Tests by different researchers using different cameras would allow verification of the model, that can be used to optimize acquisition setups. Simple procedures to obtain these two calibration values are also described. (paper)

  7. Integration of piezo-capacitive and piezo-electric nanoweb based pressure sensors for imaging of static and dynamic pressure distribution.

    Science.gov (United States)

    Jeong, Y J; Oh, T I; Woo, E J; Kim, K J

    2017-07-01

    Recently, highly flexible and soft pressure distribution imaging sensor is in great demand for tactile sensing, gait analysis, ubiquitous life-care based on activity recognition, and therapeutics. In this study, we integrate the piezo-capacitive and piezo-electric nanowebs with the conductive fabric sheets for detecting static and dynamic pressure distributions on a large sensing area. Electrical impedance tomography (EIT) and electric source imaging are applied for reconstructing pressure distribution images from measured current-voltage data on the boundary of the hybrid fabric sensor. We evaluated the piezo-capacitive nanoweb sensor, piezo-electric nanoweb sensor, and hybrid fabric sensor. The results show the feasibility of static and dynamic pressure distribution imaging from the boundary measurements of the fabric sensors.

  8. Highly sensitive digital optical sensor with large measurement range based on the dual-microring resonator with waveguide-coupled feedback

    International Nuclear Information System (INIS)

    Xiang Xing-Ye; Wang Kui-Ru; Yuan Jin-Hui; Jin Bo-Yuan; Sang Xin-Zhu; Yu Chong-Xiu

    2014-01-01

    We propose a novel high-performance digital optical sensor based on the Mach-Zehnder interference effect and dual-microring resonators with waveguide-coupled feedback. The simulation results show that the sensitivity of the sensor can be orders of magnitude higher than that of a conventional sensor, and a high quality factor is not critical. Moreover, by optimizing the length of the feedback waveguide to be equal to the perimeter of the ring, the measurement range of the proposed sensor is twice that of the conventional sensor in the weak coupling case

  9. Visual Image Sensor Organ Replacement

    Science.gov (United States)

    Maluf, David A.

    2014-01-01

    This innovation is a system that augments human vision through a technique called "Sensing Super-position" using a Visual Instrument Sensory Organ Replacement (VISOR) device. The VISOR device translates visual and other sensors (i.e., thermal) into sounds to enable very difficult sensing tasks. Three-dimensional spatial brightness and multi-spectral maps of a sensed image are processed using real-time image processing techniques (e.g. histogram normalization) and transformed into a two-dimensional map of an audio signal as a function of frequency and time. Because the human hearing system is capable of learning to process and interpret extremely complicated and rapidly changing auditory patterns, the translation of images into sounds reduces the risk of accidentally filtering out important clues. The VISOR device was developed to augment the current state-of-the-art head-mounted (helmet) display systems. It provides the ability to sense beyond the human visible light range, to increase human sensing resolution, to use wider angle visual perception, and to improve the ability to sense distances. It also allows compensation for movement by the human or changes in the scene being viewed.
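
    A conceptual sketch of the image-to-sound mapping described above follows: each image row is assigned a frequency, each column a time slot, and pixel brightness sets the amplitude of the corresponding sinusoid. This illustrates the general principle only, not the VISOR implementation; the sample rate, duration and frequency range are arbitrary assumptions.

```python
# Conceptual image-sonification sketch: brightness of row r at column c
# becomes the amplitude of a tone at the row's frequency during the
# column's time slot.  Returns a mono audio buffer normalised to [-1, 1].
import numpy as np

def sonify(image: np.ndarray, duration=2.0, fs=16000, f_lo=200.0, f_hi=4000.0):
    rows, cols = image.shape
    t = np.linspace(0.0, duration, int(fs * duration), endpoint=False)
    freqs = np.linspace(f_hi, f_lo, rows)              # top of image = high pitch
    col_idx = np.minimum((t / duration * cols).astype(int), cols - 1)
    norm = image.astype(float) / (image.max() + 1e-9)
    audio = np.zeros_like(t)
    for r in range(rows):
        audio += norm[r, col_idx] * np.sin(2 * np.pi * freqs[r] * t)
    return audio / (np.abs(audio).max() + 1e-9)
```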

  10. Refined universal laws for hull volumes and perimeters in large planar maps

    International Nuclear Information System (INIS)

    Guitter, Emmanuel

    2017-01-01

    We consider ensembles of planar maps with two marked vertices at distance k from each other, and look at the closed line separating these vertices and lying at distance d from the first one (d < k). This line divides the map into two components, the hull at distance d which corresponds to the part of the map lying on the same side as the first vertex and its complementary. The number of faces within the hull is called the hull volume, and the length of the separating line the hull perimeter. We study the statistics of the hull volume and perimeter for arbitrary d and k in the limit of infinitely large planar quadrangulations, triangulations and Eulerian triangulations. We consider more precisely situations where both d and k become large with the ratio d/k remaining finite. For infinitely large maps, two regimes may be encountered: either the hull has a finite volume and its complementary is infinitely large, or the hull itself has an infinite volume and its complementary is of finite size. We compute the probability for the map to be in either regime as a function of d/k as well as a number of universal statistical laws for the hull perimeter and volume when maps are conditioned to be in one regime or the other. (paper)

  11. Design of complex bone internal structure using topology optimization with perimeter control.

    Science.gov (United States)

    Park, Jaejong; Sutradhar, Alok; Shah, Jami J; Paulino, Glaucio H

    2018-03-01

    Large facial bone loss usually requires patient-specific bone implants to restore the structural integrity and functionality that also affects the appearance of each patient. Titanium alloys (e.g., Ti-6Al-4V) are typically used in the interfacial porous coatings between the implant and the surrounding bone to promote stability. There exists a property mismatch between the two that in general leads to complications such as stress-shielding. This biomechanical discrepancy is a hurdle in the design of bone replacements. To alleviate the mismatch, the internal structure of the bone replacements should match that of the bone. Topology optimization has proven to be a good technique for designing bone replacements. However, the complex internal structure of the bone is difficult to mimic using conventional topology optimization methods without additional restrictions. In this work, the complex bone internal structure is recovered using a perimeter control based topology optimization approach. By restricting the solution space by means of the perimeter, the intricate design complexity of bones can be achieved. Three different bone regions with well-known physiological loadings are selected to illustrate the method. Additionally, we found that the target perimeter value and the pattern of the initial distribution play a vital role in obtaining the natural curvatures in the bone internal structures as well as avoiding excessive island patterns. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Clock Drawing in Spatial Neglect: A Comprehensive Analysis of Clock Perimeter, Placement, and Accuracy

    Science.gov (United States)

    Chen, Peii; Goedert, Kelly M.

    2012-01-01

    Clock drawings produced by right-brain-damaged (RBD) individuals with spatial neglect often contain an abundance of empty space on the left while numbers and hands are placed on the right. However, the clock perimeter is rarely compromised in neglect patients’ drawings. By analyzing clock drawings produced by 71 RBD and 40 healthy adults, this study investigated whether the geometric characteristics of the clock perimeter reveal novel insights to understanding spatial neglect. Neglect participants drew smaller clocks than either healthy or non-neglect RBD participants. While healthy participants’ clock perimeter was close to circular, RBD participants drew radially extended ellipses. The mechanisms for these phenomena were investigated by examining the relation between clock-drawing characteristics and performance on six subtests of the Behavioral Inattention Test (BIT). The findings indicated that the clock shape was independent of any BIT subtest or the drawing placement on the test sheet and that the clock size was significantly predicted by one BIT subtest: the poorer the figure and shape copying, the smaller the clock perimeter. Further analyses revealed that in all participants, clocks decreased in size as they were placed farther from the center of the paper. However, even when neglect participants placed their clocks towards the center of the page, they were smaller than those produced by healthy or non-neglect RBD participants. These results suggest a neglect-specific reduction in the subjectively available workspace for graphic production from memory, consistent with the hypothesis that neglect patients are impaired in the ability to enlarge the attentional aperture. PMID:22390278

  13. Analysis on the Effect of Sensor Views in Image Reconstruction Produced by Optical Tomography System Using Charge-Coupled Device.

    Science.gov (United States)

    Jamaludin, Juliza; Rahim, Ruzairi Abdul; Fazul Rahiman, Mohd Hafiz; Mohd Rohani, Jemmy

    2018-04-01

    Optical tomography (OPT) is a method to capture a cross-sectional image based on data obtained by sensors distributed around the periphery of the analyzed system. This system is based on the measurement of the final light attenuation or absorption of radiation after crossing the measured objects. The number of sensor views affects the results of image reconstruction: a higher number of sensor views per projection gives higher image quality. This research presents an application of a charge-coupled device linear sensor and a laser diode in an OPT system. Experiments in detecting solid and transparent objects in crystal-clear water were conducted. Two numbers of sensor views, 160 and 320, are evaluated in this research for reconstructing the images. The image reconstruction algorithm used was a filtered linear back-projection algorithm. Analysis comparing the simulated and experimental image results shows that 320 views give a smaller area error than 160 views. This suggests that a higher number of views results in higher-resolution image reconstruction.
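
    For readers unfamiliar with back projection, the sketch below shows an unfiltered linear back projection for an assumed parallel-beam geometry; the study applies a filtered variant, and increasing the number of views (projection angles) sharpens the reconstruction in the way the abstract describes. The geometry, angle convention and variable names are assumptions for illustration only.

```python
# Rough sketch of (unfiltered) linear back projection for a parallel-beam
# geometry.  Each 1-D attenuation profile is smeared across the image and
# rotated to its projection angle, then all smears are averaged.
import numpy as np
from scipy.ndimage import rotate

def back_project(sinogram: np.ndarray, angles_deg: np.ndarray) -> np.ndarray:
    """sinogram: (n_views, n_detectors) attenuation profiles."""
    n = sinogram.shape[1]
    image = np.zeros((n, n))
    for profile, angle in zip(sinogram, angles_deg):
        smear = np.tile(profile, (n, 1))               # smear profile across image
        # angle sign convention depends on how the sinogram was acquired
        image += rotate(smear, angle, reshape=False, order=1)
    return image / len(angles_deg)
```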

  14. Active Sensor for Microwave Tissue Imaging with Bias-Switched Arrays.

    Science.gov (United States)

    Foroutan, Farzad; Nikolova, Natalia K

    2018-05-06

    A prototype of a bias-switched active sensor was developed and measured to establish the achievable dynamic range in a new generation of active arrays for microwave tissue imaging. The sensor integrates a printed slot antenna, a low-noise amplifier (LNA) and an active mixer in a single unit, which is sufficiently small to enable inter-sensor separation distance as small as 12 mm. The sensor’s input covers the bandwidth from 3 GHz to 7.5 GHz. Its output intermediate frequency (IF) is 30 MHz. The sensor is controlled by a simple bias-switching circuit, which switches ON and OFF the bias of the LNA and the mixer simultaneously. It was demonstrated experimentally that the dynamic range of the sensor, as determined by its ON and OFF states, is 109 dB and 118 dB at resolution bandwidths of 1 kHz and 100 Hz, respectively.

  15. Displacement damage effects on CMOS APS image sensors induced by neutron irradiation from a nuclear reactor

    International Nuclear Information System (INIS)

    Wang, Zujun; Huang, Shaoyan; Liu, Minbo; Xiao, Zhigang; He, Baoping; Yao, Zhibin; Sheng, Jiangkun

    2014-01-01

    The experiments of displacement damage effects on CMOS APS image sensors induced by neutron irradiation from a nuclear reactor are presented. The CMOS APS image sensors are manufactured in the standard 0.35 μm CMOS technology. The flux of the neutron beams was about 1.33 × 10⁸ n/(cm²·s). The three samples were exposed to 1 MeV neutron equivalent fluences of 1 × 10¹¹, 5 × 10¹¹, and 1 × 10¹² n/cm², respectively. The mean dark signal (K_D), dark signal spike, dark signal non-uniformity (DSNU), noise (V_N), saturation output signal voltage (V_S), and dynamic range (DR) versus neutron fluence are investigated. The degradation mechanisms of CMOS APS image sensors are analyzed. The mean dark signal increase due to neutron displacement damage appears to be proportional to the displacement damage dose. The dark images from CMOS APS image sensors irradiated by neutrons are presented to investigate the generation of dark signal spikes

  16. Particle detection and classification using commercial off the shelf CMOS image sensors

    Energy Technology Data Exchange (ETDEWEB)

    Pérez, Martín [Instituto Balseiro, Av. Bustillo 9500, Bariloche, 8400 (Argentina); Comisión Nacional de Energía Atómica (CNEA), Centro Atómico Bariloche, Av. Bustillo 9500, Bariloche 8400 (Argentina); Consejo Nacional de Investigaciones Científicas y Técnicas, Centro Atómico Bariloche, Av. Bustillo 9500, 8400 Bariloche (Argentina); Lipovetzky, Jose, E-mail: lipo@cab.cnea.gov.ar [Instituto Balseiro, Av. Bustillo 9500, Bariloche, 8400 (Argentina); Comisión Nacional de Energía Atómica (CNEA), Centro Atómico Bariloche, Av. Bustillo 9500, Bariloche 8400 (Argentina); Consejo Nacional de Investigaciones Científicas y Técnicas, Centro Atómico Bariloche, Av. Bustillo 9500, 8400 Bariloche (Argentina); Sofo Haro, Miguel; Sidelnik, Iván; Blostein, Juan Jerónimo; Alcalde Bessia, Fabricio; Berisso, Mariano Gómez [Instituto Balseiro, Av. Bustillo 9500, Bariloche, 8400 (Argentina); Consejo Nacional de Investigaciones Científicas y Técnicas, Centro Atómico Bariloche, Av. Bustillo 9500, 8400 Bariloche (Argentina)

    2016-08-11

    In this paper we analyse the response of two different commercial off-the-shelf CMOS image sensors as particle detectors. The sensors were irradiated using X-ray photons, gamma photons, beta particles and alpha particles from diverse sources. The amount of charge produced by the different particles and the size of the spot registered on the sensor are compared and analysed by an algorithm to classify them. For a known incident energy spectrum, the employed sensors provide a dose resolution below one microgray, showing their potential in radioprotection, area monitoring, and medical applications.
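
    The classification algorithm itself is not detailed in the abstract; the sketch below illustrates one common approach of this kind: label bright clusters in a dark frame and separate them by spot size and summed signal, since heavy charged particles typically produce larger, brighter spots than X-ray photons. The threshold values and class names are placeholders, not the authors' calibrated cuts.

```python
# Illustrative spot analysis: threshold a dark frame, label connected
# clusters, and classify each cluster by its pixel area and summed charge.
import numpy as np
from scipy import ndimage

def classify_spots(frame: np.ndarray, threshold: float, area_cut: int = 20):
    mask = frame > threshold
    labels, n = ndimage.label(mask)
    spots = []
    for i in range(1, n + 1):
        area = int((labels == i).sum())
        charge = float(frame[labels == i].sum())
        kind = "heavy/alpha-like" if area >= area_cut else "photon-like"
        spots.append({"area": area, "charge": charge, "class": kind})
    return spots
```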

  17. A widefield fluorescence microscope with a linear image sensor for image cytometry of biospecimens: Considerations for image quality optimization

    Energy Technology Data Exchange (ETDEWEB)

    Hutcheson, Joshua A.; Majid, Aneeka A.; Powless, Amy J.; Muldoon, Timothy J., E-mail: tmuldoon@uark.edu [Department of Biomedical Engineering, University of Arkansas, 120 Engineering Hall, Fayetteville, Arkansas 72701 (United States)

    2015-09-15

    Linear image sensors have been widely used in numerous research and industry applications to provide continuous imaging of moving objects. Here, we present a widefield fluorescence microscope with a linear image sensor used to image translating objects for image cytometry. First, a calibration curve was characterized for a custom microfluidic chamber over a span of volumetric pump rates. Image data were also acquired using 15 μm fluorescent polystyrene spheres on a slide with a motorized translation stage in order to match linear translation speed with line exposure periods to preserve the image aspect ratio. Aspect ratios were then calculated after imaging to ensure quality control of image data. Fluorescent beads were imaged in suspension flowing through the microfluidic chamber, pumped by a mechanical syringe pump at 16 μl min⁻¹ with a line exposure period of 150 μs. The line period was selected to acquire images of fluorescent beads with a 40 dB signal-to-background ratio. A motorized translation stage was then used to transport conventional glass slides of stained cellular biospecimens. Whole blood collected from healthy volunteers was stained with 0.02% (w/v) proflavine hemisulfate and imaged to highlight leukocyte morphology with a 1.56 mm × 1.28 mm field of view (1540 ms total acquisition time). Oral squamous cells were also collected from healthy volunteers and stained with 0.01% (w/v) proflavine hemisulfate to demonstrate quantifiable subcellular features and an average nuclear-to-cytoplasmic ratio of 0.03 (n = 75), with a resolution of 0.31 μm per pixel.

  18. Towards Rural Land Use: Challenges for Oversizing Urban Perimeters in Shrinking Towns

    Science.gov (United States)

    Sá, João; Virtudes, Ana

    2017-12-01

    This article, based on a literature review, aims to study the challenges of urban dispersion and the oversizing of urban perimeters in cases where towns are shrinking or spreading into rural land use. It focuses on the case of Portugal, where during the last decades there was an exodus to the big cities along the (Atlantic and Mediterranean) seashore. In the interior part of the country, near the border with Spain, several towns are shrinking despite the huge urban perimeters proposed by the municipal master plans since the mid-nineties. Consequently, these urban perimeters are nowadays oversized, with empty buildings and non-urbanized areas. At the same time, the social patterns of occupation of this territory have changed significantly, moving from a society with signs of rurality to an urban realm, understood not only in territorial terms but also in terms of the current lifestyle. This profound change has occurred not only in urbanistic terms but also in the economic, cultural and social organization of the country, in a movement that corresponds to the decline of small urban settlements in rural areas, far away from the cosmopolitan strip of land near the sea between the capital city, Lisbon, and the second one, Oporto. These transformations were not driven by any significant public land-use policy. On the contrary, the production of urban areas supporting the new model of economic and social development was largely left to the initiative of private economic and social agents and landowners. These agents were primarily responsible for the new urban developments and housing. In this sense, this research aims to present some short-term strategies regarding the devolution of urban areas to rural land use, as the next steps of spatial planning policies, under the role of local authorities (the 308 municipalities including Madeira and Azores islands, plus the

  19. Nanoimprinted distributed feedback dye laser sensor for real-time imaging of small molecule diffusion

    DEFF Research Database (Denmark)

    Vannahme, Christoph; Dufva, Martin; Kristensen, Anders

    2014-01-01

    Label-free imaging is a promising tool for the study of biological processes such as cell adhesion and small-molecule signaling. In order to image in two spatial dimensions, current solutions require motorized stages, which results in low imaging frame rates. Here, a highly sensitive distributed feedback (DFB) dye laser sensor for real-time label-free imaging without any moving parts, enabling a frame rate of 12 Hz, is presented. The presence of molecules on the laser surface results in a wavelength shift, which is used as the sensor signal. The unique DFB laser structure comprises several areas...

  20. Spatial and Temporal Patterns of Unburned Areas within Fire Perimeters in the Northwestern United States from 1984 to 2014

    Science.gov (United States)

    Meddens, A. J.; Kolden, C.; Lutz, J. A.; Abatzoglou, J. T.; Hudak, A. T.

    2016-12-01

    Recently, there has been concern about increasing extent and severity of wildfires across the globe given rapid climate change. Areas that do not burn within fire perimeters can act as fire refugia, providing (1) protection from the detrimental effects of the fire, (2) seed sources, and (3) post-fire habitat on the landscape. However, recent studies have mainly focused on the higher end of the burn severity spectrum whereas the lower end of the burn severity spectrum has been largely ignored. We developed a spatially explicit database for 2,200 fires across the inland northwestern USA, delineating unburned areas within fire perimeters from 1984 to 2014. We used 1,600 Landsat scenes with one or two scenes before and one or two scenes after the fires to capture the unburned proportion of the fire. Subsequently, we characterized the spatial and temporal patterns of unburned areas and related the unburned proportion to interannual climate variability. The overall classification accuracy detecting unburned locations was 89.2% using a 10-fold cross-validation classification tree approach in combination with 719 randomly located field plots. The unburned proportion ranged from 2% to 58% with an average of 19% for a select number of fires. We find that using both an immediate post-fire image and a one-year post fire image improves classification accuracy of unburned islands over using just a single post-fire image. The spatial characteristics of the unburned islands differ between forested and non-forested regions with a larger amount of unburned area within non-forest. In addition, we show trends of unburned proportion related primarily to concurrent climatic drought conditions across the entire region. This database is important for subsequent analyses of fire refugia prioritization, vegetation recovery studies, ecosystem resilience, and forest management to facilitate unburned islands through fuels breaks, prescribed burning, and fire suppression strategies.

  1. PCA-based spatially adaptive denoising of CFA images for single-sensor digital cameras.

    Science.gov (United States)

    Zheng, Lei; Lukac, Rastislav; Wu, Xiaolin; Zhang, David

    2009-04-01

    Single-sensor digital color cameras use a process called color demosaicking to produce full color images from the data captured by a color filter array (CFA). The quality of demosaicked images is degraded by the sensor noise introduced during the image acquisition process. The conventional solution to combating CFA sensor noise is demosaicking first, followed by a separate denoising process. This strategy generates many noise-caused color artifacts in the demosaicking step, which are hard to remove in the denoising step. Few denoising schemes that work directly on the CFA images have been presented because of the difficulties arising from the red, green and blue interlaced mosaic pattern, yet a well-designed "denoising first and demosaicking later" scheme can have advantages such as fewer noise-caused color artifacts and cost-effective implementation. This paper presents a principal component analysis (PCA)-based spatially adaptive denoising algorithm, which works directly on the CFA data using a supporting window to analyze the local image statistics. By exploiting the spatial and spectral correlations existing in the CFA image, the proposed method can effectively suppress noise while preserving color edges and details. Experiments using both simulated and real CFA images indicate that the proposed scheme outperforms many existing approaches, including sophisticated demosaicking and denoising schemes, in terms of both objective measurement and visual evaluation.
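
    As a simplified illustration of the underlying idea (not the published spatially adaptive algorithm), the sketch below denoises non-overlapping patches of a single-channel mosaic by projecting them onto their principal components and discarding the low-variance components that are dominated by noise. The patch size and number of retained components are arbitrary assumptions.

```python
# Simplified patch-PCA denoising sketch: vectorise non-overlapping patches,
# compute principal components via SVD, zero out the weakest components,
# and reassemble the image.
import numpy as np

def pca_denoise(img: np.ndarray, patch: int = 8, keep: int = 12) -> np.ndarray:
    h, w = (img.shape[0] // patch) * patch, (img.shape[1] // patch) * patch
    x = img[:h, :w].reshape(h // patch, patch, w // patch, patch)
    x = x.transpose(0, 2, 1, 3).reshape(-1, patch * patch).astype(float)
    mean = x.mean(axis=0)
    _, _, vt = np.linalg.svd(x - mean, full_matrices=False)
    coeffs = (x - mean) @ vt.T
    coeffs[:, keep:] = 0.0                              # drop noisy components
    y = coeffs @ vt + mean
    y = y.reshape(h // patch, w // patch, patch, patch).transpose(0, 2, 1, 3)
    return y.reshape(h, w)
```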

  2. Planoconcave optical microresonator sensors for photoacoustic imaging: pushing the limits of sensitivity (Conference Presentation)

    Science.gov (United States)

    Guggenheim, James A.; Zhang, Edward Z.; Beard, Paul C.

    2016-03-01

    Most photoacoustic scanners use piezoelectric detectors but these have two key limitations. Firstly, they are optically opaque, inhibiting backward mode operation. Secondly, it is difficult to achieve adequate detection sensitivity with the small element sizes needed to provide near-omnidirectional response as required for tomographic imaging. Planar Fabry-Perot (FP) ultrasound sensing etalons can overcome both of these limitations and have proved extremely effective for superficial imaging. However, beam walk-off due to the divergence of the interrogation beam fundamentally limits the etalon finesse and thus sensitivity - in essence, the problem is one of insufficient optical confinement. To overcome this, novel planoconcave micro-resonator sensors have been fabricated using precision ink-jet printed polymer domes with curvatures matching that of the laser wavefront. By providing near-perfect beam confinement, we show that it is possible to approach the maximum theoretical limit for finesse (f) imposed by the etalon mirror reflectivities (e.g. f = 400 for R = 99.2%, in contrast to the much lower values achievable with a typical planar sensor). In the absence of beam walk-off, viable sensors can be made with significantly greater thickness than planar FP sensors. This provides an additional sensitivity gain for deep tissue imaging applications such as breast imaging where detection bandwidths in the low MHz can be tolerated. For example, for a 250 μm thick planoconcave sensor with a -3 dB bandwidth of 5 MHz, the measured NEP was 4 Pa. This NEP is comparable to that provided by mm-scale piezoelectric detectors used for breast imaging applications but with more uniform frequency response characteristics and an order-of-magnitude smaller element size. Following previous proof-of-concept work, several important advances towards practical application have been made. A family of sensors with bandwidths ranging from 3 MHz to 20 MHz have been fabricated and characterised. A novel interrogation scheme based on

  3. Class Energy Image Analysis for Video Sensor-Based Gait Recognition: A Review

    Directory of Open Access Journals (Sweden)

    Zhuowen Lv

    2015-01-01

    Full Text Available Gait is a unique biometric feature perceptible at larger distances, and the gait representation approach plays a key role in a video sensor-based gait recognition system. Class Energy Image is one of the most important appearance-based gait representation methods and has received a great deal of attention. In this paper, we reviewed the expressions and meanings of various Class Energy Image approaches, and analyzed the information in the Class Energy Images. Furthermore, the effectiveness and robustness of these approaches were compared on benchmark gait databases. We outlined the research challenges and provided promising future directions for the field. To the best of our knowledge, this is the first review that focuses on Class Energy Image. It can provide a useful reference in the literature on video sensor-based gait representation approaches.

  4. Applications of image analysis in the characterization of Streptomyces olindensis in submerged culture

    Directory of Open Access Journals (Sweden)

    Pamboukian Celso R. Denser

    2002-01-01

    Full Text Available The morphology of Streptomyces olindensis (producer of retamycin, an antitumor antibiotic) grown in submerged culture was assessed by image analysis. The morphology was differentiated into four classes: pellets, clumps (or entangled filaments), branched and unbranched free filaments. Four morphological parameters were initially considered (area, convex area, perimeter, and convex perimeter), but only two parameters (perimeter and convex perimeter) were chosen to automatically classify the cells into the four morphological classes, using histogram analysis. Each morphological class was evaluated during growth carried out in liquid media in a fermenter or shaker. It was found that pellets and clumps dominated in the early growth stages in the fermenter (due to the inoculum coming from a shaker cultivation) and that during cultivation the breakage of pellets and clumps caused an increase in the percentage of free filaments. The proposed criteria for morphological classification by image analysis were useful for quantifying the percentage of each morphological class during fermentations and may help to establish correlations between antibiotic production and microorganism morphology.
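
    In the same spirit, the hedged sketch below classifies labelled objects by their perimeter and convex-hull perimeter using simple placeholder thresholds; the actual class boundaries in the paper were derived from histogram analysis of measured data, so the cut values and class logic here are illustrative assumptions only.

```python
# Hedged sketch of a perimeter-based morphology classification: compute the
# perimeter and convex-hull perimeter of each object and split the
# population with placeholder thresholds.
import numpy as np
from skimage import measure

def classify_morphology(binary_mask: np.ndarray, pellet_cut=500.0, clump_ratio=1.5):
    classes = []
    for region in measure.regionprops(measure.label(binary_mask)):
        perim = region.perimeter
        hull_perim = measure.perimeter(region.convex_image)
        ratio = perim / max(hull_perim, 1e-9)      # >1 means a ragged outline
        if ratio >= clump_ratio:
            classes.append("clump/entangled")
        elif perim > pellet_cut:
            classes.append("pellet")
        else:
            classes.append("free filament")
    return classes
```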

  5. Near-IR Two-Photon Fluorescent Sensor for K(+) Imaging in Live Cells.

    Science.gov (United States)

    Sui, Binglin; Yue, Xiling; Kim, Bosung; Belfield, Kevin D

    2015-08-19

    A new two-photon excited fluorescent K(+) sensor is reported. The sensor comprises three moieties, a highly selective K(+) chelator as the K(+) recognition unit, a boron-dipyrromethene (BODIPY) derivative modified with phenylethynyl groups as the fluorophore, and two polyethylene glycol chains to afford water solubility. The sensor displays very high selectivity (>52-fold) in detecting K(+) over other physiological metal cations. Upon binding K(+), the sensor switches from nonfluorescent to highly fluorescent, emitting red to near-IR (NIR) fluorescence. The sensor exhibited a good two-photon absorption cross section, 500 GM at 940 nm. Moreover, it is not sensitive to pH in the physiological pH range. Time-dependent cell imaging studies via both one- and two-photon fluorescence microscopy demonstrate that the sensor is suitable for dynamic K(+) sensing in living cells.

  6. Special Sensor Microwave Imager/Sounder (SSMIS) Temperature Data Record (TDR) in netCDF

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Special Sensor Microwave Imager/Sounder (SSMIS) is a series of passive microwave conically scanning imagers and sounders onboard the DMSP satellites beginning...

  7. Real-time classification of humans versus animals using profiling sensors and hidden Markov tree model

    Science.gov (United States)

    Hossen, Jakir; Jacobs, Eddie L.; Chari, Srikant

    2015-07-01

    Linear pyroelectric array sensors have enabled useful classifications of objects such as humans and animals to be performed with relatively low-cost hardware in border and perimeter security applications. Ongoing research has sought to improve the performance of these sensors through signal processing algorithms. In the research presented here, we introduce the use of hidden Markov tree (HMT) models for object recognition in images generated by linear pyroelectric sensors. HMTs are trained to statistically model the wavelet features of individual objects through an expectation-maximization learning process. Human versus animal classification for a test object is made by evaluating its wavelet features against the trained HMTs using the maximum-likelihood criterion. The classification performance of this approach is compared to two other techniques; a texture, shape, and spectral component features (TSSF) based classifier and a speeded-up robust feature (SURF) classifier. The evaluation indicates that among the three techniques, the wavelet-based HMT model works well, is robust, and has improved classification performance compared to a SURF-based algorithm in equivalent computation time. When compared to the TSSF-based classifier, the HMT model has a slightly degraded performance but almost an order of magnitude improvement in computation time enabling real-time implementation.

  8. CMOS image sensor for detection of interferon gamma protein interaction as a point-of-care approach.

    Science.gov (United States)

    Marimuthu, Mohana; Kandasamy, Karthikeyan; Ahn, Chang Geun; Sung, Gun Yong; Kim, Min-Gon; Kim, Sanghyo

    2011-09-01

    Complementary metal oxide semiconductor (CMOS)-based image sensors have received increased attention owing to the possibility of incorporating them into portable diagnostic devices. The present research examined the efficiency and sensitivity of a CMOS image sensor for the detection of antigen-antibody interactions involving interferon gamma protein without the aid of expensive instruments. The highest detection sensitivity of about 1 fg/ml primary antibody was achieved simply by a transmission mechanism. When photons are prevented from hitting the sensor surface, a reduction in digital output occurs in which the number of photons hitting the sensor surface is approximately proportional to the digital number. Nanoscale variation in substrate thickness after protein binding can be detected with high sensitivity by the CMOS image sensor. Therefore, this technique can be easily applied to smartphones or any clinical diagnostic devices for the detection of several biological entities, with high impact on the development of point-of-care applications.

  9. New amorphous-silicon image sensor for x-ray diagnostic medical imaging applications

    Science.gov (United States)

    Weisfield, Richard L.; Hartney, Mark A.; Street, Robert A.; Apte, Raj B.

    1998-07-01

    This paper introduces new high-resolution amorphous silicon (a-Si) image sensors specifically configured for demonstrating film-quality medical x-ray imaging capabilities. The device utilizes an x-ray phosphor screen coupled to an array of a-Si photodiodes for detecting visible light, and a-Si thin-film transistors (TFTs) for connecting the photodiodes to external readout electronics. We have developed imagers based on a pixel size of 127 μm × 127 μm with an approximately page-size imaging area of 244 mm × 195 mm, and an array size of 1,536 data lines by 1,920 gate lines, for a total of 2.95 million pixels. More recently, we have developed a much larger imager based on the same pixel pattern, which covers an area of approximately 406 mm × 293 mm, with 2,304 data lines by 3,200 gate lines, for a total of nearly 7.4 million pixels. This is very likely the largest image sensor array and highest pixel-count detector fabricated on a single substrate. Both imagers connect to a standard PC and are capable of taking an image in a few seconds. Through design rule optimization we have achieved a light-sensitive area of 57% and optimized quantum efficiency for x-ray phosphor output in the green part of the spectrum, yielding an average quantum efficiency between 500 and 600 nm of approximately 70%. At the same time, we have managed to reduce extraneous leakage currents on these devices to a few fA per pixel, which allows a very high dynamic range to be achieved. We have characterized leakage currents as a function of photodiode bias, time and temperature to demonstrate high stability over these large arrays. At the electronics level, we have adopted a new generation of low-noise, charge-sensitive amplifiers coupled to 12-bit A/D converters. Considerable attention was given to reducing electronic noise in order to demonstrate a large dynamic range (over 4,000:1) for medical imaging applications. Through a combination of low data line capacitance

  10. The Dynamic Photometric Stereo Method Using a Multi-Tap CMOS Image Sensor.

    Science.gov (United States)

    Yoda, Takuya; Nagahara, Hajime; Taniguchi, Rin-Ichiro; Kagawa, Keiichiro; Yasutomi, Keita; Kawahito, Shoji

    2018-03-05

    The photometric stereo method enables estimation of surface normals from images that have been captured using different but known lighting directions. The classical photometric stereo method requires at least three images to determine the normals in a given scene. However, this method cannot be applied to dynamic scenes because it is assumed that the scene remains static while the required images are captured. In this work, we present a dynamic photometric stereo method for estimation of the surface normals in a dynamic scene. We use a multi-tap complementary metal-oxide-semiconductor (CMOS) image sensor to capture the input images required for the proposed photometric stereo method. This image sensor can divide the electrons from the photodiode from a single pixel into the different taps of the exposures and can thus capture multiple images under different lighting conditions with almost identical timing. We implemented a camera lighting system and created a software application to enable estimation of the normal map in real time. We also evaluated the accuracy of the estimated surface normals and demonstrated that our proposed method can estimate the surface normals of dynamic scenes.
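
    The classical three-image photometric stereo recovery that this abstract builds on can be written compactly: per-pixel intensities under three known lighting directions are inverted to obtain an albedo-scaled normal. The sketch below is a generic illustration of that classical step (not the authors' multi-tap implementation); variable names are illustrative.

```python
# Classical three-image photometric stereo: solve L * g = i per pixel,
# where L stacks the three known light directions and g is the
# albedo-scaled surface normal.
import numpy as np


def photometric_stereo(images, light_dirs):
    """images: list of three (H, W) grayscale arrays; light_dirs: (3, 3) unit vectors."""
    L = np.asarray(light_dirs, dtype=float)          # 3 x 3 lighting matrix
    I = np.stack([im.reshape(-1) for im in images])  # 3 x (H*W) intensities
    G = np.linalg.solve(L, I)                        # albedo-scaled normals
    albedo = np.linalg.norm(G, axis=0)
    normals = G / np.maximum(albedo, 1e-12)          # unit surface normals
    H, W = images[0].shape
    return normals.T.reshape(H, W, 3), albedo.reshape(H, W)
```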

  11. Miniscrew-assisted rapid palatal expansion for managing arch perimeter in an adult patient

    Directory of Open Access Journals (Sweden)

    Amanda Carneiro da Cunha

    Full Text Available ABSTRACT Introduction: The etiology of dental crowding may be related to arch constriction in several dimensions, and appropriate manipulation of the arch perimeter by addressing basal bone discrepancies may be key to crowding relief, especially when incisor movement is limited by underlying pathology, periodontal issues or restrictions related to the soft tissue profile. Objectives: This case report describes a 24-year-old woman with maxillary transverse deficiency, upper and lower arch crowding, a Class II, division 1, subdivision right relationship, a previous traumatic episode involving the upper incisors, and a straight profile. A non-surgical and non-extraction treatment approach was feasible owing to the miniscrew-assisted rapid palatal expansion (MARPE) technique. Methods: The MARPE appliance consisted of a conventional Hyrax expander supported by four orthodontic miniscrews. A slow expansion protocol was adopted, with a total of 40 days of activation and a 3-month retention period. Intrusive traction miniscrew-anchored mechanics were used to correct the Class II subdivision relationship and to manage the lower arch perimeter and midline deviation before including the upper central incisors. Results: Post-treatment records show an intermolar width increase of 5 mm, bilateral Class I molar and canine relationships, resolution of upper and lower crowding, coincident dental midlines and proper intercuspation. Conclusions: MARPE is an effective treatment approach for managing arch-perimeter deficiencies related to maxillary transverse discrepancies in adult patients.

  12. Change Detection with GRASS GIS – Comparison of images taken by different sensors

    Directory of Open Access Journals (Sweden)

    Michael Fuchs

    2009-04-01

    Full Text Available Images from American military reconnaissance satellites of the 1960s (CORONA), in combination with modern sensors (SPOT, QuickBird), were used for the detection of changes in land use. The pilot area was located about 40 km northwest of Yemen’s capital Sana’a and covered approximately 100 km². To produce comparable layers from images of distinctly different sources, the moving window technique was applied, using the diversity parameter. The resulting difference layers reveal plausible and interpretable change patterns, particularly in areas where urban sprawl occurs. The comparison of CORONA images with images taken by modern sensors proved to be an additional tool to visualize and quantify major changes in land use. The results should serve as additional basic data, e.g. in regional planning. The computation sequence was executed in GRASS GIS.
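
    The moving-window "diversity" measure mentioned above counts the number of distinct classes inside each window (this is, by assumption, analogous to GRASS GIS r.neighbors with method=diversity; the exact GRASS workflow is the authors' own). A plain-numpy illustration:

```python
# Moving-window diversity: number of distinct cell values in a size x size
# neighbourhood around each cell. Purely illustrative, not the GRASS code.
import numpy as np


def window_diversity(raster, size=3):
    pad = size // 2
    padded = np.pad(raster, pad, mode="edge")
    out = np.zeros(raster.shape, dtype=int)
    for i in range(raster.shape[0]):
        for j in range(raster.shape[1]):
            window = padded[i:i + size, j:j + size]
            out[i, j] = len(np.unique(window))
    return out

# Change detection could then difference the diversity layers of two epochs,
# e.g. change = window_diversity(corona) - window_diversity(quickbird)
```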

  13. Modeling of Potential Distribution of Electrical Capacitance Tomography Sensor for Multiphase Flow Image

    Directory of Open Access Journals (Sweden)

    S. Sathiyamoorthy

    2007-09-01

    Full Text Available Electrical Capacitance Tomography (ECT) was used to develop images of various multiphase gas-liquid-solid flows in a closed pipe. The principal difficulties in obtaining real-time images from an ECT sensor are that the relationship between the permittivity distribution across the sensing plane and the measured capacitance is nonlinear, that the electric field is distorted by the material present, and that the measurements are sensitive to errors and noise. This work presents a detailed description of the method employed for image reconstruction from the capacitance measurements. A discretization and iterative algorithm is developed to improve the predictions with minimum error. The authors analyze an eight-electrode square-sensor ECT system with two-phase water-gas and solid-gas flows.
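
    The iterative reconstruction idea can be sketched with a Landweber-style update, which is a commonly used linearized ECT algorithm; it is shown here only as an illustration under that assumption, not as the paper's own discretization. S is the normalized sensitivity matrix mapping the permittivity image to capacitances, and all names are illustrative.

```python
# Landweber-style iterative ECT reconstruction (linearized model c = S g).
import numpy as np


def landweber(S, c_meas, n_iter=200, alpha=None):
    """Iteratively refine the permittivity image g from capacitance data c_meas."""
    if alpha is None:
        alpha = 1.0 / np.linalg.norm(S, 2) ** 2   # safe step size (1 / sigma_max^2)
    g = S.T @ c_meas                              # linear back-projection start
    for _ in range(n_iter):
        g = g + alpha * S.T @ (c_meas - S @ g)    # correct toward the measurements
        g = np.clip(g, 0.0, 1.0)                  # keep normalized permittivity
    return g
```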

  14. Computed Tomography Image Origin Identification Based on Original Sensor Pattern Noise and 3-D Image Reconstruction Algorithm Footprints.

    Science.gov (United States)

    Duan, Yuping; Bouslimi, Dalel; Yang, Guanyu; Shu, Huazhong; Coatrieux, Gouenou

    2017-07-01

    In this paper, we focus on the "blind" identification of the computed tomography (CT) scanner that has produced a CT image. To do so, we propose a set of noise features derived from the image acquisition chain which can be used as a CT-scanner footprint. Basically, we propose two approaches. The first one aims at identifying a CT scanner based on an original sensor pattern noise (OSPN) that is intrinsic to the X-ray detectors. The second one identifies an acquisition system based on the way this noise is modified by its three-dimensional (3-D) image reconstruction algorithm. As these reconstruction algorithms are manufacturer dependent and kept secret, our features are used as input to train a support vector machine (SVM) based classifier to discriminate acquisition systems. Experiments conducted on images from 15 different CT-scanner models of 4 distinct manufacturers demonstrate that our system identifies the origin of one CT image with a detection rate of at least 94% and that it achieves better performance than the sensor pattern noise (SPN) based strategy proposed for consumer camera devices.

  15. A CMOS Image Sensor With In-Pixel Buried-Channel Source Follower and Optimized Row Selector

    NARCIS (Netherlands)

    Chen, Y.; Wang, X.; Mierop, A.J.; Theuwissen, A.J.P.

    2009-01-01

    This paper presents a CMOS image sensor with pinned-photodiode 4T active pixels which use in-pixel buried-channel source followers (SFs) and optimized row selectors. The test sensor has been fabricated in a 0.18-μm CMOS process. The sensor characterization was carried out successfully, and the

  16. Examining perimeter gating control of urban traffic networks with locally adaptive traffic signals

    NARCIS (Netherlands)

    Keyvan Ekbatani, M.; Gao, X.; Gayah, V.V.; Knoop, V.L.

    2015-01-01

    Traditionally, urban traffic is controlled by traffic lights. Recent findings of the Macroscopic or Network Fundamental Diagram (MFD or NFD) have led to the development of novel traffic control strategies that can be applied at a networkwide level. One pertinent example is perimeter flow control

  17. A sprayable luminescent pH sensor and its use for wound imaging in vivo.

    Science.gov (United States)

    Schreml, Stephan; Meier, Robert J; Weiß, Katharina T; Cattani, Julia; Flittner, Dagmar; Gehmert, Sebastian; Wolfbeis, Otto S; Landthaler, Michael; Babilas, Philipp

    2012-12-01

    Non-invasive luminescence imaging is of great interest for studying biological parameters in wound healing, tumors and other biomedical fields. Recently, we developed the first method for 2D luminescence imaging of pH in vivo on humans, and a novel method for one-stop-shop visualization of oxygen and pH using the RGB read-out of digital cameras. Both methods make use of semitransparent sensor foils. Here, we describe a sprayable ratiometric luminescent pH sensor, which combines properties of both these methods. Additionally, a major advantage is that the sensor spray is applicable to very uneven tissue surfaces due to its consistency. A digital RGB image of the spray on tissue is taken. The signal of the pH indicator (fluorescein isothiocyanate) is stored in the green channel (G), while that of the reference dye [ruthenium(II)-tris-(4,7-diphenyl-1,10-phenanthroline)] is stored in the red channel (R). Images are processed by rationing luminescence intensities (G/R) to result in pseudocolor pH maps of tissues, e.g. wounds. © 2012 John Wiley & Sons A/S.
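
    The ratiometric read-out described above can be expressed in a few lines: the indicator signal in the green channel is divided by the reference signal in the red channel to give a pseudocolor, pH-related map. The conversion from ratio to absolute pH is sensor-specific calibration and is not reproduced here; the channel ordering assumed below is standard RGB.

```python
# Per-pixel G/R luminescence ratio of an RGB image array.
import numpy as np


def ratio_map(rgb_image, eps=1e-6):
    rgb = rgb_image.astype(float)
    green, red = rgb[..., 1], rgb[..., 0]
    return green / (red + eps)

# A pseudocolor rendering could then be produced with, e.g.,
# matplotlib.pyplot.imshow(ratio_map(img), cmap="viridis").
```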

  18. Finite Element Analysis of Film Stack Architecture for Complementary Metal-Oxide-Semiconductor Image Sensors.

    Science.gov (United States)

    Wu, Kuo-Tsai; Hwang, Sheng-Jye; Lee, Huei-Huang

    2017-05-02

    Image sensors are the core components of computer, communication, and consumer electronic products. Complementary metal oxide semiconductor (CMOS) image sensors have become the mainstay of image-sensing developments, but are prone to leakage current. In this study, we simulate the CMOS image sensor (CIS) film stacking process by finite element analysis. To elucidate the relationship between the leakage current and stack architecture, we compare the simulated and measured leakage currents in the elements. Based on the analysis results, we further improve the performance by optimizing the architecture of the film stacks or changing the thin-film material. The material parameters are then corrected to improve the accuracy of the simulation results. The simulated and experimental results confirm a positive correlation between measured leakage current and stress. This trend is attributed to the structural defects induced by high stress, which generate leakage. Using this relationship, we can change the structure of the thin-film stack to reduce the leakage current and thereby improve the component life and reliability of the CIS components.

  19. The Performance Evaluation of Multi-Image 3d Reconstruction Software with Different Sensors

    Science.gov (United States)

    Mousavi, V.; Khosravi, M.; Ahmadi, M.; Noori, N.; Naveh, A. Hosseini; Varshosaz, M.

    2015-12-01

    Today, multi-image 3D reconstruction is an active research field, and generating three-dimensional models of objects is one of the most discussed issues in Photogrammetry and Computer Vision; it can be accomplished using range-based or image-based methods. The very accurate and dense point clouds generated by range-based methods such as structured light systems and laser scanners have established them as reliable tools in industry. Image-based 3D digitization methodologies offer the option of reconstructing an object from a set of unordered images that depict it from different viewpoints. As their hardware requirements are narrowed down to a digital camera and a computer system, they constitute an attractive 3D digitization approach; consequently, although range-based methods are generally very accurate, image-based methods are low-cost and can easily be used by non-professional users. One of the factors affecting the accuracy of the obtained model in image-based methods is the software and algorithm used to generate the three-dimensional model. These algorithms are provided in the form of commercial software, open-source software and web-based services. Another important factor in the accuracy of the obtained model is the type of sensor used. Due to the availability of mobile sensors to the public, the popularity of professional sensors and the advent of stereo sensors, a comparison of these three sensor types plays an effective role in evaluating and finding the optimal method to generate three-dimensional models. Much research has been conducted to identify suitable software and algorithms to achieve an accurate and complete model; however, little attention has been paid to the type of sensor used and its effect on the quality of the final model. The purpose of this paper is the deliberation and introduction of an appropriate combination of sensor and software to provide a complete model with the highest accuracy. To do this, different software packages used in previous studies were compared and

  20. Differential GPS effectiveness in measuring area and perimeter in forested settings

    International Nuclear Information System (INIS)

    Frank, Jereme; Wing, Michael G

    2013-01-01

    This study quantifies area and perimeter measurement errors, traverse times, recording intervals, and overall time and cost effectiveness for using a mapping-grade differential Global Positioning System (GPS) receiver in forested settings. We compared two configurations: one that maximized data collection productivity (position dilution of precision (PDOP) 20, signal to noise ratio (SNR) 33, and minimum elevation mask 5°) and a second that involved traditional receiver settings designed to improve accuracies (PDOP 6, SNR 39, and minimum elevation mask 15°). We determined that averaging 30 positions and using the settings that maximized productivity was the most time-effective combination of recording interval and settings. This combination of recording interval and settings proved slightly more cost effective than other traditional surveying methods such as a laser with digital compass and string box. Average absolute per cent area errors when averaging 30 positions and using maximum settings were 2.6%, and average absolute per cent perimeter errors were 2.0%. These results should help forest resource professionals more effectively evaluate GPS techniques and receiver configurations. (paper)

  1. A Portable Colloidal Gold Strip Sensor for Clenbuterol and Ractopamine Using Image Processing Technology

    Directory of Open Access Journals (Sweden)

    Yi Guo

    2013-01-01

    Full Text Available A portable colloidal gold strip sensor for detecting clenbuterol and ractopamine has been developed using image processing technology, together with a novel strip reader built around this imaging sensor. Colloidal gold strips for clenbuterol and ractopamine, based on an immunoreaction, serve as the primary sensing element. Three minutes after the target sample is applied, the colour developed on the test (T) line reflects the analyte content. The reader performs automatic acquisition of the coloured strip image, quantitative analysis of the control and test lines, and data storage and transfer to a computer. The system integrates image collection, pattern recognition and real-time quantitative colloidal gold measurement. In experiments, clenbuterol and ractopamine standards with concentrations from 0 ppb to 10 ppb were prepared and tested; the results show that the standard solutions of clenbuterol and ractopamine follow a good second-order fit against colour intensity (R2 up to 0.99 and 0.98, respectively). In addition, spiking standards into negative samples gave good recoveries of up to 98%. Overall, the optical sensor for colloidal strip measurement is capable of determining the content of clenbuterol and ractopamine, and it is likely to be applicable to the quantitative identification of similar colloidal gold strip reactions.
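
    The second-order calibration fit mentioned above (test-line colour intensity versus analyte concentration, with R² reported) can be sketched as follows. The concentration and intensity values below are placeholders, not the paper's data.

```python
# Quadratic calibration fit of T-line colour intensity vs. concentration,
# with the coefficient of determination R^2. Data values are hypothetical.
import numpy as np

conc = np.array([0.0, 1.0, 2.5, 5.0, 10.0])        # ppb, hypothetical standards
signal = np.array([1.00, 0.82, 0.66, 0.45, 0.20])  # relative T-line intensity

coeffs = np.polyfit(conc, signal, deg=2)            # second-order (quadratic) fit
fitted = np.polyval(coeffs, conc)
ss_res = np.sum((signal - fitted) ** 2)
ss_tot = np.sum((signal - signal.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot                   # compare with the ~0.99 reported
```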

  2. Maintained functionality of an implantable radiotelemetric blood pressure and heart rate sensor after magnetic resonance imaging in rats

    International Nuclear Information System (INIS)

    Nölte, I; Boll, H; Figueiredo, G; Groden, C; Brockmann, M A; Gorbey, S; Lemmer, B

    2011-01-01

    Radiotelemetric sensors for in vivo assessment of blood pressure and heart rate are widely used in animal research. MRI with implanted sensors is regarded as contraindicated, as transmitter malfunction and injury of the animal may be caused. Moreover, artefacts are expected to compromise image evaluation. In vitro, the function of a radiotelemetric sensor (TA11PA-C10, Data Sciences International) after exposure to MRI up to 9.4 T was assessed. The magnetic force of the electromagnetic field on the sensor as well as radiofrequency (RF)-induced sensor heating was analysed. Finally, MRI with an implanted sensor was performed in a rat. Imaging artefacts were analysed at 3.0 and 9.4 T ex vivo and in vivo. Transmitted 24 h blood pressure and heart rate were compared before and after MRI to verify the integrity of the telemetric sensor. The function of the sensor was not altered by MRI up to 9.4 T. The maximum force exerted on the sensor was 273 ± 50 mN. RF-induced heating was ruled out. Artefacts impeded the assessment of the abdomen and thorax in a dead rat, but not of the head and neck. MRI with implanted radiotelemetric sensors is feasible in principle. The tested sensor maintains functionality up to 9.4 T. Artefacts hampered abdominal and thoracic imaging in rats, while assessment of the head and neck is possible.

  3. Dark current spectroscopy of space and nuclear environment induced displacement damage defects in pinned photodiode based CMOS image sensors

    International Nuclear Information System (INIS)

    Belloir, Jean-Marc

    2016-01-01

    CMOS image sensors are envisioned for an increasing number of high-end scientific imaging applications such as space imaging or nuclear experiments. Indeed, the performance of high-end CMOS image sensors has dramatically increased in the past years thanks to the unceasing improvements of microelectronics, and these image sensors have substantial advantages over CCDs which make them great candidates to replace CCDs in future space missions. However, in space and nuclear environments, CMOS image sensors must face harsh radiation which can rapidly degrade their electro-optical performances. In particular, the protons, electrons and ions travelling in space or the fusion neutrons from nuclear experiments can displace silicon atoms in the pixels and break the crystalline structure. These displacement damage effects lead to the formation of stable defects and to the introduction of states in the forbidden bandgap of silicon, which can allow the thermal generation of electron-hole pairs. Consequently, non ionizing radiation leads to a permanent increase of the dark current of the pixels and thus a decrease of the image sensor sensitivity and dynamic range. The aim of the present work is to extend the understanding of the effect of displacement damage on the dark current increase of CMOS image sensors. In particular, this work focuses on the shape of the dark current distribution depending on the particle type, energy and fluence but also on the image sensor physical parameters. Thanks to the many conditions tested, an empirical model for the prediction of the dark current distribution induced by displacement damage in nuclear or space environments is experimentally validated and physically justified. Another central part of this work consists in using the dark current spectroscopy technique for the first time on irradiated CMOS image sensors to detect and characterize radiation-induced silicon bulk defects. Many types of defects are detected and two of them are identified

  4. X-ray CCD image sensor with a thick depletion region

    International Nuclear Information System (INIS)

    Saito, Hirobumi; Watabe, Hiroshi.

    1984-01-01

    To develop a solid-state image sensor for high-energy X-rays above 1–2 keV, basic studies have been made on CCDs (charge coupled devices) with a thick depletion region. A method of superimposing a high DC bias voltage on low-voltage signal pulses was newly proposed. The characteristics of both SCCD and BCCD were investigated, and their ability as X-ray sensors was compared. It was found that a depletion region 60 μm thick can be obtained with an ordinary doping density of 10²⁰/m³, and that an even thicker depletion region of over 1 mm can be obtained with a doping density of about 10¹⁸/m³ and a high bias voltage above 1 kV applied. It is suggested that CCD image sensors for 8 keV or 24 keV X-rays can be realized, since the absorption lengths of these X-rays in Si are about 60 μm and 1 mm, respectively. As for characteristics other than the depletion thickness, the BCCD is preferable to the SCCD for the present purpose because of its lower noise and dark current. As for the transfer method, the frame-transfer method is recommended. (Aoki, K.)

  5. Photoresponse analysis of the CMOS photodiodes for CMOS x-ray image sensor

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Soo; Ha, Jang Ho; Kim, Han Soo; Yeo, Sun Mok [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2012-11-15

    Although in the short term CMOS active pixel sensors (APSs) cannot compete with the conventionally used charge coupled devices (CCDs) for high quality scientific imaging, recent developments in CMOS APSs indicate that they are approaching the performance level of CCDs in several domains. CMOS APSs possess a number of advantages, such as simpler driving requirements and low power operation. CMOS image sensors can be processed in standard CMOS technologies, and the potential of on-chip integration of analog and digital circuitry makes them more suitable for several vision systems where system cost is of importance. Moreover, CMOS imagers can directly benefit from on-going technological progress in the field of CMOS technologies. Due to these advantages, CMOS APSs are currently being investigated actively for various applications such as star trackers, navigation cameras and X-ray imaging. In most detection systems, it is thought that the sensor is most important, since it determines the signal and noise level. So, in CMOS APSs, the pixel is very important compared to other functional blocks. In order to predict the performance of such an image sensor, a detailed understanding of the photocurrent generation in the photodiodes that comprise the CMOS APS is required. In this work, we developed an analytical model that can calculate the photocurrent generated in the CMOS photodiodes comprising CMOS APSs. The photocurrent calculations and photoresponse simulations with respect to the wavelength of the incident photon were performed using this model for four types of photodiodes that can be fabricated in a standard CMOS process. n+/p-sub and n+/p-epi/p-sub photodiodes show better performance compared to n-well/p-sub and n-well/p-epi/p-sub due to the wider depletion width. Comparing the n+/p-sub and n+/p-epi/p-sub photodiodes, n+/p-sub has higher photo-responsivity at longer wavelengths because of

  6. Photoresponse analysis of the CMOS photodiodes for CMOS x-ray image sensor

    International Nuclear Information System (INIS)

    Kim, Young Soo; Ha, Jang Ho; Kim, Han Soo; Yeo, Sun Mok

    2012-01-01

    Although in the short term CMOS active pixel sensors (APSs) cannot compete with the conventionally used charge coupled devices (CCDs) for high quality scientific imaging, recent developments in CMOS APSs indicate that they are approaching the performance level of CCDs in several domains. CMOS APSs possess a number of advantages, such as simpler driving requirements and low power operation. CMOS image sensors can be processed in standard CMOS technologies, and the potential of on-chip integration of analog and digital circuitry makes them more suitable for several vision systems where system cost is of importance. Moreover, CMOS imagers can directly benefit from on-going technological progress in the field of CMOS technologies. Due to these advantages, CMOS APSs are currently being investigated actively for various applications such as star trackers, navigation cameras and X-ray imaging. In most detection systems, it is thought that the sensor is most important, since it determines the signal and noise level. So, in CMOS APSs, the pixel is very important compared to other functional blocks. In order to predict the performance of such an image sensor, a detailed understanding of the photocurrent generation in the photodiodes that comprise the CMOS APS is required. In this work, we developed an analytical model that can calculate the photocurrent generated in the CMOS photodiodes comprising CMOS APSs. The photocurrent calculations and photoresponse simulations with respect to the wavelength of the incident photon were performed using this model for four types of photodiodes that can be fabricated in a standard CMOS process. n+/p-sub and n+/p-epi/p-sub photodiodes show better performance compared to n-well/p-sub and n-well/p-epi/p-sub due to the wider depletion width. Comparing the n+/p-sub and n+/p-epi/p-sub photodiodes, n+/p-sub has higher photo-responsivity at longer wavelengths because of the higher electron diffusion current

  7. Imaging intracellular pH in live cells with a genetically encoded red fluorescent protein sensor.

    Science.gov (United States)

    Tantama, Mathew; Hung, Yin Pun; Yellen, Gary

    2011-07-06

    Intracellular pH affects protein structure and function, and proton gradients underlie the function of organelles such as lysosomes and mitochondria. We engineered a genetically encoded pH sensor by mutagenesis of the red fluorescent protein mKeima, providing a new tool to image intracellular pH in live cells. This sensor, named pHRed, is the first ratiometric, single-protein red fluorescent sensor of pH. Fluorescence emission of pHRed peaks at 610 nm while exhibiting dual excitation peaks at 440 and 585 nm that can be used for ratiometric imaging. The intensity ratio responds with an apparent pK(a) of 6.6 and a >10-fold dynamic range. Furthermore, pHRed has a pH-responsive fluorescence lifetime that changes by ~0.4 ns over physiological pH values and can be monitored with single-wavelength two-photon excitation. After characterizing the sensor, we tested pHRed's ability to monitor intracellular pH by imaging energy-dependent changes in cytosolic and mitochondrial pH.
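
    A hedged sketch of converting a ratiometric read-out like the one above into pH: a standard single-site titration model R(pH) is assumed and inverted around the reported apparent pKa of 6.6. The Rmin/Rmax calibration constants (ratios at the acidic and basic limits) are instrument- and sensor-specific and are shown here only as placeholders.

```python
# Invert R = Rmin + (Rmax - Rmin) / (1 + 10**(pKa - pH)) for pH.
import numpy as np


def ratio_to_pH(R, pKa=6.6, Rmin=0.2, Rmax=2.0):
    """Ratiometric read-out -> pH under an assumed single-site titration model."""
    frac = (R - Rmin) / (Rmax - R)
    return pKa + np.log10(frac)
```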

  8. MULTI-TEMPORAL AND MULTI-SENSOR IMAGE MATCHING BASED ON LOCAL FREQUENCY INFORMATION

    Directory of Open Access Journals (Sweden)

    X. Liu

    2012-08-01

    Full Text Available Image Matching is often one of the first tasks in many Photogrammetry and Remote Sensing applications. This paper presents an efficient approach to automated multi-temporal and multi-sensor image matching based on local frequency information. Two new independent image representations, Local Average Phase (LAP) and Local Weighted Amplitude (LWA), are presented to emphasize the common scene information, while suppressing the non-common illumination and sensor-dependent information. In order to obtain the two representations, local frequency information is first obtained from a Log-Gabor wavelet transformation, which is similar to that of the human visual system; then the outputs of odd and even symmetric filters are used to construct the LAP and LWA. The LAP and LWA emphasize the phase and amplitude information respectively. As these two representations are both derivative-free and threshold-free, they are robust to noise and can keep as much of the image details as possible. A new Compositional Similarity Measure (CSM) is also presented to combine the LAP and LWA with the same weight for measuring the similarity of multi-temporal and multi-sensor images. The CSM can make the LAP and LWA compensate for each other and can make full use of the amplitude and phase of local frequency information. In many image matching applications, the template is usually selected without consideration of its matching robustness and accuracy. In order to overcome this problem, a local best matching point detection is presented to detect the best matching template. In the detection method, we employ self-similarity analysis to identify the template with the highest matching robustness and accuracy. Experimental results using some real images and simulation images demonstrate that the presented approach is effective for matching image pairs with significant scene and illumination changes and that it has advantages over other state-of-the-art approaches, which include: the

  9. Introduction to sensors for ranging and imaging

    CERN Document Server

    Brooker, Graham

    2009-01-01

    ""This comprehensive text-reference provides a solid background in active sensing technology. It is concerned with active sensing, starting with the basics of time-of-flight sensors (operational principles, components), and going through the derivation of the radar range equation and the detection of echo signals, both fundamental to the understanding of radar, sonar and lidar imaging. Several chapters cover signal propagation of both electromagnetic and acoustic energy, target characteristics, stealth, and clutter. The remainder of the book introduces the range measurement process, active ima

  10. High-Resolution Spin-on-Patterning of Perovskite Thin Films for a Multiplexed Image Sensor Array.

    Science.gov (United States)

    Lee, Woongchan; Lee, Jongha; Yun, Huiwon; Kim, Joonsoo; Park, Jinhong; Choi, Changsoon; Kim, Dong Chan; Seo, Hyunseon; Lee, Hakyong; Yu, Ji Woong; Lee, Won Bo; Kim, Dae-Hyeong

    2017-10-01

    Inorganic-organic hybrid perovskite thin films have attracted significant attention as an alternative to silicon in photon-absorbing devices mainly because of their superb optoelectronic properties. However, high-definition patterning of perovskite thin films, which is important for fabrication of the image sensor array, is hardly accomplished owing to their extreme instability in general photolithographic solvents. Here, a novel patterning process for perovskite thin films is described: the high-resolution spin-on-patterning (SoP) process. This fast and facile process is compatible with a variety of spin-coated perovskite materials and perovskite deposition techniques. The SoP process is successfully applied to develop a high-performance, ultrathin, and deformable perovskite-on-silicon multiplexed image sensor array, paving the road toward next-generation image sensor arrays. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. The Dynamic Photometric Stereo Method Using a Multi-Tap CMOS Image Sensor

    Science.gov (United States)

    Yoda, Takuya; Nagahara, Hajime; Taniguchi, Rin-ichiro; Kagawa, Keiichiro; Yasutomi, Keita; Kawahito, Shoji

    2018-01-01

    The photometric stereo method enables estimation of surface normals from images that have been captured using different but known lighting directions. The classical photometric stereo method requires at least three images to determine the normals in a given scene. However, this method cannot be applied to dynamic scenes because it is assumed that the scene remains static while the required images are captured. In this work, we present a dynamic photometric stereo method for estimation of the surface normals in a dynamic scene. We use a multi-tap complementary metal-oxide-semiconductor (CMOS) image sensor to capture the input images required for the proposed photometric stereo method. This image sensor can divide the electrons from the photodiode from a single pixel into the different taps of the exposures and can thus capture multiple images under different lighting conditions with almost identical timing. We implemented a camera lighting system and created a software application to enable estimation of the normal map in real time. We also evaluated the accuracy of the estimated surface normals and demonstrated that our proposed method can estimate the surface normals of dynamic scenes. PMID:29510599

  12. Hyperspectral Imaging Sensors and the Marine Coastal Zone

    Science.gov (United States)

    Richardson, Laurie L.

    2000-01-01

    Hyperspectral imaging sensors greatly expand the potential of remote sensing to assess, map, and monitor marine coastal zones. Each pixel in a hyperspectral image contains an entire spectrum of information. As a result, hyperspectral image data can be processed in two very different ways: by image classification techniques, to produce mapped outputs of features in the image on a regional scale; and by spectral analysis of the data embedded within each pixel of the image. The latter is particularly useful in marine coastal zones because of the spectral complexity of suspended as well as benthic features found in these environments. Spectral-based analysis of hyperspectral (AVIRIS) imagery was carried out to investigate a marine coastal zone of South Florida, USA. Florida Bay is a phytoplankton-rich estuary characterized by taxonomically distinct phytoplankton assemblages and extensive seagrass beds. End-member spectra were extracted from AVIRIS image data corresponding to ground-truth sample stations and well-known field sites. Spectral libraries were constructed from the AVIRIS end-member spectra and used to classify images using the Spectral Angle Mapper (SAM) algorithm, a spectral-based approach that compares the spectrum in each pixel of an image with each spectrum in a spectral library. Using this approach, different phytoplankton assemblages containing diatoms, cyanobacteria, and green microalgae, as well as benthic communities (seagrasses), were mapped.
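
    The Spectral Angle Mapper step described above is compact enough to show directly: the angle between a pixel spectrum and each library end-member spectrum is computed, and the pixel is assigned to the end-member with the smallest angle. Names below are illustrative.

```python
# Spectral Angle Mapper (SAM) classification against a spectral library.
import numpy as np


def spectral_angle(pixel, reference):
    """Angle (radians) between two spectra treated as vectors."""
    cosang = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cosang, -1.0, 1.0))


def sam_classify(pixel, library):
    """library: dict mapping class name -> end-member spectrum (1-D array)."""
    return min(library, key=lambda name: spectral_angle(pixel, library[name]))
```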

  13. Swivel arm perimeter for visual field testing in different body positions.

    Science.gov (United States)

    Flammer, J; Hendrickson, P; Lietz, A; Stümpfig, D

    1993-01-01

    To investigate the influence of body position on visual field results, a 'swivel arm perimeter' was built, based on a modified Octopus 1-2-3. Here, the measuring unit was detached from the control unit and mounted on a swivel arm, allowing its movement in all directions. The first results obtained with this device have indicated that its development was worthwhile.

  14. Column-Parallel Single Slope ADC with Digital Correlated Multiple Sampling for Low Noise CMOS Image Sensors

    NARCIS (Netherlands)

    Chen, Y.; Theuwissen, A.J.P.; Chae, Y.

    2011-01-01

    This paper presents a low noise CMOS image sensor (CIS) using 10/12 bit configurable column-parallel single slope ADCs (SS-ADCs) and digital correlated multiple sampling (CMS). The sensor used is a conventional 4T active pixel with a pinned-photodiode as photon detector. The test sensor was

  15. Mapping crop based on phenological characteristics using time-series NDVI of operational land imager data in Tadla irrigated perimeter, Morocco

    Science.gov (United States)

    Ouzemou, Jamal-eddine; El Harti, Abderrazak; EL Moujahid, Ali; Bouch, Naima; El Ouazzani, Rabii; Lhissou, Rachid; Bachaoui, El Mostafa

    2015-10-01

    Morocco is a primarily arid to semi-arid country. These climatic conditions make irrigation an imperative and inevitable technique, and agriculture is of paramount importance for the national economy. Retrieving crop types, their locations and their spatial extent provides useful information for agricultural planning and better management of irrigation water resources. Remote sensing technology is often used in management and agricultural research. Indeed, it allows crop extraction and mapping based on phenological characteristics, as well as yield estimation. The study area of this work is the Tadla irrigated perimeter, which is characterized by heterogeneous areas and extremely small fields. Our principal objectives are: (1) the delimitation of the major crops for good water management, and (2) the isolation of sugar beet parcels for modeling their yields. To achieve these goals, we used Landsat-8 OLI (Operational Land Imager) data pan-sharpened to 15 m. Spectral Angle Mapper (SAM) and Support Vector Machine (SVM) classifications were applied to the Normalized Difference Vegetation Index (NDVI) time series of 10 periods. Classifications were calculated for a site of more than 124,000 ha. This site was divided into two parts: the first part for selecting training datasets and the second for validating the classification results. The SVM and SAM methods classified the principal crops with overall accuracies of 85.27% and 57.17%, respectively, and kappa coefficients of 80% and 43%, respectively. The study showed the potential of using time-series OLI NDVI data for mapping different crops in irrigated, heterogeneous and undersized parcels in arid and semi-arid environments.
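
    The NDVI underlying the time series used above is a standard index computed from the red and near-infrared reflectances (for Landsat-8 OLI these are bands 4 and 5). The sketch below only shows that index; how the authors stacked and classified the ten dates is their own pipeline, and the array inputs are assumptions.

```python
# Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
import numpy as np


def ndvi(red, nir, eps=1e-6):
    red = red.astype(float)
    nir = nir.astype(float)
    return (nir - red) / (nir + red + eps)

# Stacking ndvi() over the 10 acquisition dates gives the per-pixel feature
# vector on which SVM and SAM classifiers can then operate.
```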

  16. An ultra-low power CMOS image sensor with on-chip energy harvesting and power management capability.

    Science.gov (United States)

    Cevik, Ismail; Huang, Xiwei; Yu, Hao; Yan, Mei; Ay, Suat U

    2015-03-06

    An ultra-low power CMOS image sensor with on-chip energy harvesting and power management capability is introduced in this paper. The photodiode pixel array can not only capture images but also harvest solar energy. As such, the CMOS image sensor chip is able to switch between imaging and harvesting modes towards self-power operation. Moreover, an on-chip maximum power point tracking (MPPT)-based power management system (PMS) is designed for the dual-mode image sensor to further improve the energy efficiency. A new isolated P-well energy harvesting and imaging (EHI) pixel with very high fill factor is introduced. Several ultra-low power design techniques such as reset and select boosting techniques have been utilized to maintain a wide pixel dynamic range. The chip was designed and fabricated in a 1.8 V, 1P6M 0.18 µm CMOS process. Total power consumption of the imager is 6.53 µW for a 96 × 96 pixel array with 1 V supply and 5 fps frame rate. Up to 30 μW of power could be generated by the new EHI pixels. The PMS is capable of providing 3× the power required during imaging mode with 50% efficiency allowing energy autonomous operation with a 72.5% duty cycle.

  17. Advanced data visualization and sensor fusion: Conversion of techniques from medical imaging to Earth science

    Science.gov (United States)

    Savage, Richard C.; Chen, Chin-Tu; Pelizzari, Charles; Ramanathan, Veerabhadran

    1993-01-01

    Hughes Aircraft Company and the University of Chicago propose to transfer existing medical imaging registration algorithms to the area of multi-sensor data fusion. The University of Chicago's algorithms have been successfully demonstrated to provide pixel-by-pixel comparison capability for medical sensors with different characteristics. The research will attempt to fuse GOES (Geostationary Operational Environmental Satellite), AVHRR (Advanced Very High Resolution Radiometer), and SSM/I (Special Sensor Microwave Imager) sensor data, which will benefit a wide range of researchers. The algorithms will utilize data visualization and algorithm development tools created by Hughes in its EOSDIS (Earth Observation System Data/Information System) prototyping. This will maximize the work on the fusion algorithms since support software (e.g. input/output routines) will already exist. The research will produce a portable software library with documentation for use by other researchers.

  18. Multiocular image sensor with on-chip beam-splitter and inner meta-micro-lens for single-main-lens stereo camera.

    Science.gov (United States)

    Koyama, Shinzo; Onozawa, Kazutoshi; Tanaka, Keisuke; Saito, Shigeru; Kourkouss, Sahim Mohamed; Kato, Yoshihisa

    2016-08-08

    We developed multiocular 1/3-inch 2.75-μm-pixel-size 2.1M-pixel image sensors by co-design of both an on-chip beam-splitter and a 100-nm-width, 800-nm-depth patterned inner meta-micro-lens for single-main-lens stereo camera systems. A camera with the multiocular image sensor can capture a horizontally one-dimensional light field, with the on-chip beam-splitter horizontally dividing rays according to incident angle and the inner meta-micro-lens collecting the divided rays into pixels with small optical loss. Cross-talk between adjacent light field images of a fabricated binocular image sensor and of a quad-ocular image sensor is as low as 6% and 7%, respectively. With the selection of two images from the one-dimensional light field images, a selective baseline for stereo vision is realized to view close objects with a single main lens. In addition, by adding multiple light field images with different ratios, the baseline distance can be tuned within the aperture of the main lens. We suggest this electrically selectable or tunable baseline stereo vision to reduce the 3D fatigue of viewers.

  19. Processing Infrared Images For Fire Management Applications

    Science.gov (United States)

    Warren, John R.; Pratt, William K.

    1981-12-01

    The USDA Forest Service has used airborne infrared systems for forest fire detection and mapping for many years. The transfer of the images from plane to ground and the transposition of fire spots and perimeters to maps have been performed manually. A new system has been developed which uses digital image processing, transmission, and storage. Interactive graphics, high resolution color display, calculations, and computer model compatibility are featured in the system. Images are acquired by an IR line scanner and converted to 1024 x 1024 x 8 bit frames for transmission to the ground at a 1.544 Mbit/s rate over a 14.7 GHz carrier. Individual frames are received and stored, then transferred to a solid state memory to refresh the display at a conventional 30 frames per second rate. Line length and area calculations, false color assignment, X-Y scaling, and image enhancement are available. Fire spread can be calculated for display and fire perimeters plotted on maps. The performance requirements, basic system, and image processing will be described.

  20. Radiometric inter-sensor cross-calibration uncertainty using a traceable high accuracy reference hyperspectral imager

    Science.gov (United States)

    Gorroño, Javier; Banks, Andrew C.; Fox, Nigel P.; Underwood, Craig

    2017-08-01

    Optical earth observation (EO) satellite sensors generally suffer from drifts and biases relative to their pre-launch calibration, caused by launch and/or time in the space environment. This places a severe limitation on the fundamental reliability and accuracy that can be assigned to satellite-derived information, and is particularly critical for long time base studies for climate change and for enabling interoperability and Analysis Ready Data. The proposed TRUTHS (Traceable Radiometry Underpinning Terrestrial and Helio-Studies) mission is explicitly designed to address this issue by re-calibrating itself directly to a primary standard of the international system of units (SI) in orbit and then extending this SI traceability to other sensors through in-flight cross-calibration using a selection of Committee on Earth Observation Satellites (CEOS) recommended test sites. Where the characteristics of the sensor under test allow, this will result in a significant improvement in accuracy. This paper describes a set of tools, algorithms and methodologies that have been developed and used in order to estimate the radiometric uncertainty achievable for an indicative target sensor through in-flight cross-calibration using a well-calibrated hyperspectral SI-traceable reference sensor with observational characteristics such as those of TRUTHS. In this study, the Multi-Spectral Imager (MSI) of Sentinel-2 and the Landsat-8 Operational Land Imager (OLI) are evaluated as examples; however, the analysis is readily translatable to larger-footprint sensors such as the Sentinel-3 Ocean and Land Colour Instrument (OLCI) and the Visible Infrared Imaging Radiometer Suite (VIIRS). This study considers the criticality of the instrumental and observational characteristics on pixel-level reflectance factors, within a defined spatial region of interest (ROI) within the target site. It quantifies the main uncertainty contributors in the spectral, spatial, and temporal domains. The resultant tool

  1. Pervasive surveillance-agent system based on wireless sensor networks: design and deployment

    International Nuclear Information System (INIS)

    Martínez, José F; Bravo, Sury; García, Ana B; Corredor, Iván; Familiar, Miguel S; López, Lourdes; Hernández, Vicente; Da Silva, Antonio

    2010-01-01

    Nowadays, the proliferation of embedded systems is enhancing the possibilities of gathering information by using wireless sensor networks (WSNs). Flexibility and ease of installation make these kinds of pervasive networks suitable for security and surveillance environments. Moreover, the risk of humans being exposed to these duties is minimized when using these networks. In this paper, a virtual perimeter surveillance agent, which has been designed to detect any person crossing an invisible barrier around a marked perimeter and send an alarm notification to the security staff, is presented. This agent works in a state of 'low power consumption' until there is a crossing of the perimeter. In our approach, the 'intelligence' of the agent has been distributed by using mobile nodes in order to discern the cause of the presence event. This feature contributes to saving both processing resources and power consumption, since only the code required to detect presence is installed. The research work described in this paper illustrates our experience in the development of a surveillance system using WSNs for a practical application, as well as its evaluation in real-world deployments. This mechanism plays an important role in providing confidence in ensuring the safety of our environment.

  2. Landsat 8 Operational Land Imager (OLI)_Thermal Infared Sensor (TIRS) V1

    Data.gov (United States)

    National Aeronautics and Space Administration — Abstract: The Operational Land Imager (OLI) and Thermal Infrared Sensor (TIRS) are instruments onboard the Landsat 8 satellite, which was launched in February of...

  3. Low-voltage 96 dB snapshot CMOS image sensor with 4.5 nW power dissipation per pixel.

    Science.gov (United States)

    Spivak, Arthur; Teman, Adam; Belenky, Alexander; Yadid-Pecht, Orly; Fish, Alexander

    2012-01-01

    Modern "smart" CMOS sensors have penetrated into various applications, such as surveillance systems, bio-medical applications, digital cameras, cellular phones and many others. Reducing the power of these sensors continuously challenges designers. In this paper, a low power global shutter CMOS image sensor with Wide Dynamic Range (WDR) ability is presented. This sensor features several power reduction techniques, including a dual voltage supply, a selective power down, transistors with different threshold voltages, a non-rationed logic, and a low voltage static memory. A combination of all these approaches has enabled the design of the low voltage "smart" image sensor, which is capable of reaching a remarkable dynamic range, while consuming very low power. The proposed power-saving solutions have allowed the maintenance of the standard architecture of the sensor, reducing both the time and the cost of the design. In order to maintain the image quality, a relation between the sensor performance and power has been analyzed and a mathematical model, describing the sensor Signal to Noise Ratio (SNR) and Dynamic Range (DR) as a function of the power supplies, is proposed. The described sensor was implemented in a 0.18 um CMOS process and successfully tested in the laboratory. An SNR of 48 dB and DR of 96 dB were achieved with a power dissipation of 4.5 nW per pixel.

  4. Ultrahigh sensitivity endoscopic camera using a new CMOS image sensor: providing with clear images under low illumination in addition to fluorescent images.

    Science.gov (United States)

    Aoki, Hisae; Yamashita, Hiromasa; Mori, Toshiyuki; Fukuyo, Tsuneo; Chiba, Toshio

    2014-11-01

    We developed a new ultrahigh-sensitive CMOS camera using a specific sensor that has a wide range of spectral sensitivity characteristics. The objective of this study is to present our updated endoscopic technology, which has successfully integrated two innovative functions: ultrasensitive imaging as well as advanced fluorescent viewing. Two different experiments were conducted. One was carried out to evaluate the function of the ultrahigh-sensitive camera. The other was to test the availability of the newly developed sensor and its performance as a fluorescence endoscope. In both studies, the distance from the endoscopic tip to the target was varied and the endoscopic images in each setting were taken for further comparison. In the first experiment, the 3-CCD camera failed to display clear images under low illumination, and the target was hardly visible. In contrast, the CMOS camera was able to display the targets regardless of the camera-target distance under low illumination. Under high illumination, the imaging quality given by both cameras was quite similar. In the second experiment, as a fluorescence endoscope, the CMOS camera was capable of clearly showing the fluorescent-activated organs. The ultrahigh sensitivity CMOS HD endoscopic camera is expected to provide us with clear images under low illumination in addition to fluorescent images under high illumination in the field of laparoscopic surgery.

  5. A simple and low-cost biofilm quantification method using LED and CMOS image sensor.

    Science.gov (United States)

    Kwak, Yeon Hwa; Lee, Junhee; Lee, Junghoon; Kwak, Soo Hwan; Oh, Sangwoo; Paek, Se-Hwan; Ha, Un-Hwan; Seo, Sungkyu

    2014-12-01

    A novel biofilm detection platform, which consists of a cost-effective red, green, and blue light-emitting diode (RGB LED) as a light source and a lens-free CMOS image sensor as a detector, is designed. This system can measure the diffraction patterns of cells from their shadow images, and gather light absorbance information according to the concentration of biofilms through a simple image processing procedure. Compared to a bulky and expensive commercial spectrophotometer, this platform can provide accurate and reproducible biofilm concentration detection and is simple, compact, and inexpensive. Biofilms originating from various bacterial strains, including Pseudomonas aeruginosa (P. aeruginosa), were tested to demonstrate the efficacy of this new biofilm detection approach. The results were compared with the results obtained from a commercial spectrophotometer. To utilize a cost-effective light source (i.e., an LED) for biofilm detection, the illumination conditions were optimized. For accurate and reproducible biofilm detection, a simple, custom-coded image processing algorithm was developed and applied to a five-megapixel CMOS image sensor, which is a cost-effective detector. The concentration of biofilms formed by P. aeruginosa was detected and quantified by varying the indole concentration, and the results were compared with the results obtained from a commercial spectrophotometer. The correlation value of the results from the two systems was 0.981 (N = 9) for the LED and CMOS image-sensor platform. Copyright © 2014 Elsevier B.V. All rights reserved.
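
    The absorbance read-out implied above can be sketched in the usual Beer-Lambert form, using mean shadow-image intensity through the sample relative to a blank. Region-of-interest selection and calibration against the spectrophotometer are omitted, and the function names are illustrative.

```python
# Optical density from image intensities: A = -log10(I / I0).
import numpy as np


def absorbance(sample_image, blank_image):
    """Mean pixel intensity through the sample (I) vs. the blank (I0)."""
    I = float(np.mean(sample_image))
    I0 = float(np.mean(blank_image))
    return -np.log10(I / I0)
```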

  6. Characterisation of a novel reverse-biased PPD CMOS image sensor

    Science.gov (United States)

    Stefanov, K. D.; Clarke, A. S.; Ivory, J.; Holland, A. D.

    2017-11-01

    A new pinned photodiode (PPD) CMOS image sensor (CIS) has been developed and characterised. The sensor can be fully depleted by means of a reverse bias applied to the substrate, and the principle of operation is applicable to very thick sensitive volumes. Additional n-type implants under the pixel p-wells, called the Deep Depletion Extension (DDE), have been added in order to eliminate the large parasitic substrate current that would otherwise be present in a normal device. The first prototype has been manufactured on 18 μm thick, 1000 Ω·cm epitaxial silicon wafers using a 180 nm PPD image sensor process at TowerJazz Semiconductor. The chip contains arrays of 10 μm and 5.4 μm pixels, with variations of the shape, size and depth of the DDE implant. Back-side illuminated (BSI) devices were manufactured in collaboration with Teledyne e2v, and characterised together with the front-side illuminated (FSI) variants. The presented results show that the devices could be reverse-biased without parasitic leakage currents, in good agreement with simulations. The new 10 μm pixels in both BSI and FSI variants exhibit nearly identical photo response to the reference non-modified pixels, as characterised with the photon transfer curve. Different techniques were used to measure the depletion depth in FSI and BSI chips, and the results are consistent with the expected full depletion.

  7. Overview of CMOS process and design options for image sensor dedicated to space applications

    Science.gov (United States)

    Martin-Gonthier, P.; Magnan, P.; Corbiere, F.

    2005-10-01

    With the growth of huge-volume markets (mobile phones, digital cameras...), CMOS technologies for image sensors have improved significantly. New process flows appear in order to optimize parameters such as quantum efficiency, dark current, and conversion gain. Space applications can of course benefit from these improvements. To illustrate this evolution, this paper reports results from three technologies that have been evaluated with test vehicles composed of several sub-arrays designed with some space applications as targets. These three technologies are standard, improved and sensor-optimized CMOS processes in the 0.35 μm generation. Measurements are focussed on quantum efficiency, dark current, conversion gain and noise. Other measurements such as Modulation Transfer Function (MTF) and crosstalk are reported in [1]. A comparison between the results has been made, and three categories of CMOS process for image sensors have been identified. Radiation tolerance has also been studied for the improved CMOS process, with the imager hardened by design. Results at 4, 15, 25 and 50 krad demonstrate good ionizing-dose radiation tolerance when specific hardening techniques are applied.

  8. Low-complex energy-aware image communication in visual sensor networks

    Science.gov (United States)

    Phamila, Yesudhas Asnath Victy; Amutha, Ramachandran

    2013-10-01

    A low-complexity, low bit rate, energy-efficient image compression algorithm explicitly designed for resource-constrained visual sensor networks used for surveillance, battlefield and habitat monitoring, etc. is presented, where voluminous amounts of image data have to be communicated over a bandwidth-limited wireless medium. The proposed method overcomes the energy limitation of individual nodes and is investigated in terms of image quality, entropy, processing time, overall energy consumption, and system lifetime. This algorithm is highly energy efficient and extremely fast since it applies an energy-aware zonal binary discrete cosine transform (DCT) that computes only the few required significant coefficients and codes them using an enhanced complementary Golomb-Rice code without using any floating point operations. Experiments are performed using the Atmel Atmega128 and MSP430 processors to measure the resultant energy savings. Simulation results show that the proposed energy-aware fast zonal transform consumes only 0.3% of the energy needed by a conventional DCT. This algorithm consumes only 6% of the energy needed by the Independent JPEG Group (fast) version, and it suits embedded systems requiring low power consumption. The proposed scheme is unique since it significantly enhances the lifetime of the camera sensor node and the network without any need for the distributed processing that was traditionally required in existing algorithms.
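
    The zonal-DCT idea described above is to evaluate only a small triangle of low-frequency coefficients per block rather than the full transform. The sketch below illustrates exactly that selection directly from the DCT-II definition; it is floating-point and does not reproduce the paper's binary/integer DCT or the Golomb-Rice entropy coding, and the zone size is an assumption.

```python
# Zonal DCT of an 8x8 block: compute only coefficients C[u, v] with u + v < zone,
# straight from the orthonormal DCT-II definition, so higher-frequency
# coefficients are never evaluated.
import numpy as np


def zonal_dct_block(block, zone=3):
    N = block.shape[0]
    n = np.arange(N)
    coeffs = {}
    for u in range(zone):
        for v in range(zone - u):
            basis = np.outer(np.cos((2 * n + 1) * u * np.pi / (2 * N)),
                             np.cos((2 * n + 1) * v * np.pi / (2 * N)))
            scale = (np.sqrt(1 / N) if u == 0 else np.sqrt(2 / N)) * \
                    (np.sqrt(1 / N) if v == 0 else np.sqrt(2 / N))
            coeffs[(u, v)] = scale * np.sum(block * basis)
    return coeffs
```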

  9. Real-time DNA Amplification and Detection System Based on a CMOS Image Sensor.

    Science.gov (United States)

    Wang, Tiantian; Devadhasan, Jasmine Pramila; Lee, Do Young; Kim, Sanghyo

    2016-01-01

    In the present study, we developed a polypropylene well-integrated complementary metal oxide semiconductor (CMOS) platform to perform the loop-mediated isothermal amplification (LAMP) technique for real-time DNA amplification and detection simultaneously. The amplification-coupled detection system directly measures changes in photon number based on the generation of magnesium pyrophosphate and the accompanying color changes. The photon number decreases during the amplification process. The CMOS image sensor observes the photons and converts them into digital units with the aid of an analog-to-digital converter (ADC). In addition, UV-spectral studies, optical color intensity detection, pH analysis, and electrophoresis detection were carried out to prove the efficiency of the CMOS sensor-based LAMP system. Moreover, Clostridium perfringens was utilized as a proof-of-concept target for the new system. We anticipate that this CMOS image sensor-based LAMP method will enable the creation of cost-effective, label-free, optical, real-time and portable molecular diagnostic devices.

  10. A Full Parallel Event Driven Readout Technique for Area Array SPAD FLIM Image Sensors

    Directory of Open Access Journals (Sweden)

    Kaiming Nie

    2016-01-01

    Full Text Available This paper presents a full parallel event-driven readout method implemented in an area array single-photon avalanche diode (SPAD) image sensor for high-speed fluorescence lifetime imaging microscopy (FLIM). The sensor records and reads out only effective time and position information by adopting the full parallel event-driven readout method, aiming at reducing the amount of data. The image sensor includes four 8 × 8 pixel arrays. In each array, four time-to-digital converters (TDCs) are used to quantize the arrival time of photons, and two address record modules are used to record the column and row information. In this work, Monte Carlo simulations were performed in Matlab to evaluate the pile-up effect induced by the readout method. The sensor's resolution is 16 × 16. The time resolution of the TDCs is 97.6 ps and the quantization range is 100 ns. The readout frame rate is 10 Mfps, and the maximum imaging frame rate is 100 fps. The chip's output bandwidth is 720 MHz with an average power of 15 mW. The lifetime resolvability range is 5–20 ns, and the average error of the estimated fluorescence lifetimes is below 1% when the center-of-mass method (CMM) is employed to estimate lifetimes.
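
    Reading CMM as the centre-of-mass method, the following sketch shows how a mono-exponential lifetime could be estimated from photon arrival times within the 100 ns quantization window; the first-order truncation correction and the simulated photon stream are our own illustrative assumptions, not the chip's actual processing.

```python
import numpy as np

def cmm_lifetime(arrival_times_ns, window_ns=100.0):
    """Centre-of-mass (CMM) lifetime estimate from photon arrival times,
    with a first-order correction for truncation at the window edge."""
    t = np.asarray(arrival_times_ns, dtype=float)
    tau0 = t.mean()                                     # raw centre of mass
    return tau0 + window_ns / (np.exp(window_ns / tau0) - 1.0)

# quick check with simulated photons from an assumed mono-exponential decay
rng = np.random.default_rng(0)
photons = rng.exponential(10.0, 20000)                  # true lifetime 10 ns
photons = photons[photons < 100.0]                      # keep events inside the window
print(cmm_lifetime(photons))                            # close to 10 ns
```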

  11. An Ultra-Low Power CMOS Image Sensor with On-Chip Energy Harvesting and Power Management Capability

    Directory of Open Access Journals (Sweden)

    Ismail Cevik

    2015-03-01

    Full Text Available An ultra-low power CMOS image sensor with on-chip energy harvesting and power management capability is introduced in this paper. The photodiode pixel array can not only capture images but also harvest solar energy. As such, the CMOS image sensor chip is able to switch between imaging and harvesting modes towards self-power operation. Moreover, an on-chip maximum power point tracking (MPPT-based power management system (PMS is designed for the dual-mode image sensor to further improve the energy efficiency. A new isolated P-well energy harvesting and imaging (EHI pixel with very high fill factor is introduced. Several ultra-low power design techniques such as reset and select boosting techniques have been utilized to maintain a wide pixel dynamic range. The chip was designed and fabricated in a 1.8 V, 1P6M 0.18 µm CMOS process. Total power consumption of the imager is 6.53 µW for a 96 × 96 pixel array with 1 V supply and 5 fps frame rate. Up to 30 μW of power could be generated by the new EHI pixels. The PMS is capable of providing 3× the power required during imaging mode with 50% efficiency allowing energy autonomous operation with a 72.5% duty cycle.
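
    The abstract does not state which MPPT algorithm the on-chip power management system implements; purely as an illustration of what maximum power point tracking does, the sketch below runs a conventional perturb-and-observe loop against a made-up harvester power curve (the measure_power/set_voltage callables and all numbers are placeholders).

```python
def perturb_and_observe(measure_power, set_voltage, v_start=0.3,
                        step=0.01, iterations=50):
    """Toy perturb-and-observe MPPT loop: nudge the operating voltage and
    keep moving in whichever direction increases the harvested power."""
    v = v_start
    direction = +1
    p_prev = measure_power(v)
    for _ in range(iterations):
        v += direction * step
        set_voltage(v)                 # stands in for the on-chip regulator
        p = measure_power(v)
        if p < p_prev:                 # power dropped: reverse the perturbation
            direction = -direction
        p_prev = p
    return v

# made-up photodiode power curve peaking near 0.42 V
curve = lambda v: max(0.0, 30e-6 * (1.0 - ((v - 0.42) / 0.25) ** 2))
print(perturb_and_observe(curve, lambda v: None))       # settles around 0.42 V
```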

  12. Multi-Sensor Fusion of Infrared and Electro-Optic Signals for High Resolution Night Images

    Directory of Open Access Journals (Sweden)

    Victor Lawrence

    2012-07-01

    Full Text Available Electro-optic (EO) image sensors exhibit high resolution and a low noise level in daytime, but they do not work in dark environments. Infrared (IR) image sensors exhibit poor resolution and cannot separate objects with similar temperatures. Therefore, we propose a novel framework for IR image enhancement based on information (e.g., edges) from EO images, which improves the resolution of IR images and helps us distinguish objects at night. Our framework improves resolution by superimposing/blending the edges of the EO image onto the corresponding transformed IR image. In this framework, we adopt the theoretical point spread function (PSF) proposed by Hardie et al. for the IR image, which has the modulation transfer function (MTF) of a uniform detector array and the incoherent optical transfer function (OTF) of diffraction-limited optics. In addition, we design an inverse filter for the proposed PSF and use it for the IR image transformation. The framework requires four main steps: (1) inverse filter-based IR image transformation; (2) EO image edge detection; (3) registration; and (4) blending/superimposing of the obtained image pair. Simulation results show both blended and superimposed IR images, and demonstrate that blended IR images have better quality than the superimposed images. Additionally, based on the same steps, simulation results show a blended IR image of better quality when only the original IR image is available.
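
    A compressed sketch of steps (2) and (4) of the pipeline above, assuming step (1) (inverse filtering of the IR image with the Hardie-style PSF) and step (3) (registration) have already been performed; the Sobel edge detector and the blending weight are illustrative choices rather than the authors' exact ones.

```python
import numpy as np
from scipy import ndimage

def fuse_eo_ir(ir, eo, alpha=0.6):
    """Blend EO edge detail into an aligned IR image (steps 2 and 4 above).
    Both inputs are same-sized float arrays scaled to [0, 1]."""
    gx = ndimage.sobel(eo, axis=1)            # step 2: EO edge map
    gy = ndimage.sobel(eo, axis=0)
    edges = np.hypot(gx, gy)
    edges /= edges.max() + 1e-12
    # step 4: weighted blending of edge detail into the IR image
    return np.clip(ir + alpha * edges * (1.0 - ir), 0.0, 1.0)
```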

  13. Low-Voltage 96 dB Snapshot CMOS Image Sensor with 4.5 nW Power Dissipation per Pixel

    Directory of Open Access Journals (Sweden)

    Orly Yadid-Pecht

    2012-07-01

    Full Text Available Modern “smart” CMOS sensors have penetrated into various applications, such as surveillance systems, bio-medical applications, digital cameras, cellular phones and many others. Reducing the power of these sensors continuously challenges designers. In this paper, a low-power global shutter CMOS image sensor with Wide Dynamic Range (WDR) ability is presented. This sensor features several power reduction techniques, including a dual voltage supply, a selective power down, transistors with different threshold voltages, non-rationed logic, and a low-voltage static memory. A combination of all these approaches has enabled the design of the low-voltage “smart” image sensor, which is capable of reaching a remarkable dynamic range while consuming very low power. The proposed power-saving solutions have allowed the standard architecture of the sensor to be maintained, reducing both the time and the cost of the design. In order to maintain the image quality, the relation between sensor performance and power has been analyzed and a mathematical model, describing the sensor Signal-to-Noise Ratio (SNR) and Dynamic Range (DR) as a function of the power supplies, is proposed. The described sensor was implemented in a 0.18 µm CMOS process and successfully tested in the laboratory. An SNR of 48 dB and a DR of 96 dB were achieved with a power dissipation of 4.5 nW per pixel.
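
    The abstract quotes 48 dB SNR and 96 dB DR; the standard definitions below reproduce numbers of that order for an assumed full-well capacity of about 60 ke− and a 1 e− noise floor, purely to illustrate how the two metrics relate (the paper's actual model ties them to the supply voltages and is not reproduced here).

```python
import numpy as np

def dynamic_range_db(full_well_e, read_noise_e):
    """Dynamic range in dB: largest to smallest resolvable signal."""
    return 20.0 * np.log10(full_well_e / read_noise_e)

def peak_snr_db(full_well_e):
    """Shot-noise-limited peak SNR in dB (signal over sqrt(signal))."""
    return 20.0 * np.log10(np.sqrt(full_well_e))

# illustrative values only: ~60 ke- full well, 1 e- noise floor
print(dynamic_range_db(60000, 1.0))   # ~96 dB
print(peak_snr_db(60000))             # ~48 dB
```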

  14. The assessment of multi-sensor image fusion using wavelet transforms for mapping the Brazilian Savanna

    NARCIS (Netherlands)

    Weimar Acerbi, F.; Clevers, J.G.P.W.; Schaepman, M.E.

    2006-01-01

    Multi-sensor image fusion using the wavelet approach provides a conceptual framework for the improvement of the spatial resolution with minimal distortion of the spectral content of the source image. This paper assesses whether images with a large ratio of spatial resolution can be fused, and
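
    A generic sketch of wavelet-based fusion using PyWavelets: the low-resolution band keeps its approximation coefficients while detail coefficients are injected from the high-resolution image. The db2 wavelet, the two-level decomposition and the simple substitution rule are our assumptions, not necessarily the scheme assessed in the paper.

```python
import pywt

def wavelet_fuse(pan, ms_band, wavelet="db2", level=2):
    """Keep the multispectral band's approximation coefficients and inject
    the panchromatic image's detail coefficients, then reconstruct.
    Inputs are aligned 2-D arrays of identical shape."""
    c_pan = pywt.wavedec2(pan, wavelet, level=level)
    c_ms = pywt.wavedec2(ms_band, wavelet, level=level)
    fused = [c_ms[0]] + [tuple(d) for d in c_pan[1:]]
    return pywt.waverec2(fused, wavelet)
```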

  15. Perimeter generating functions for the mean-squared radius of gyration of convex polygons

    International Nuclear Information System (INIS)

    Jensen, Iwan

    2005-01-01

    We have derived long series expansions for the perimeter generating functions of the radius of gyration of various polygons with a convexity constraint. Using the series we numerically find simple (algebraic) exact solutions for the generating functions. In all cases the size exponent ν = 1. (letter to the editor)

  16. Real time three-dimensional space video rate sensors for millimeter waves imaging based very inexpensive plasma LED lamps

    Science.gov (United States)

    Levanon, Assaf; Yitzhaky, Yitzhak; Kopeika, Natan S.; Rozban, Daniel; Abramovich, Amir

    2014-10-01

    In recent years, much effort has been invested in developing inexpensive but sensitive Millimeter Wave (MMW) detectors that can be used in focal plane arrays (FPAs) in order to implement real-time MMW imaging. Real-time MMW imaging systems are required for many varied applications in fields such as homeland security, medicine, communications, military products and space technology, mainly because this radiation penetrates well through dust storms, fog, heavy rain, dielectric materials, biological tissue, and diverse other materials. Moreover, the atmospheric attenuation in this range of the spectrum is relatively low and the scattering is also low compared to NIR and VIS. The lack of inexpensive room-temperature imaging systems makes it difficult to provide a suitable MMW system for many of the above applications. In the last few years we have advanced the research and development of sensors using very inexpensive (30-50 cents) Glow Discharge Detector (GDD) plasma indicator lamps as MMW detectors. This paper presents three kinds of GDD-lamp-based sensor Focal Plane Arrays (FPAs). The three cameras differ in the number of detectors, the scanning operation, and the detection method. The 1st and 2nd generations are an 8 × 8 pixel array and an 18 × 2 mono-rail scanner array, respectively, both of them for direct detection and limited to fixed imaging. The last designed sensor is a 16 × 16 GDD FPA with a multiplexed frame rate. It permits real-time video-rate imaging at 30 frames/sec and comprehensive 3D MMW imaging. The principle of detection in this sensor is a frequency modulated continuous wave (FMCW) system in which each of the 16 GDD pixel lines is sampled simultaneously. Direct detection is also possible and can be done with a friendly user interface. This FPA sensor is built from 256 commercial GDD lamps, with 3 mm diameter International Light, Inc., Peabody, MA model 527 Ne indicator lamps as pixel detectors. All three sensors are fully supported

  17. Test of the Practicality and Feasibility of EDoF-Empowered Image Sensors for Long-Range Biometrics.

    Science.gov (United States)

    Hsieh, Sheng-Hsun; Li, Yung-Hui; Tien, Chung-Hao

    2016-11-25

    For many practical applications of image sensors, how to extend the depth-of-field (DoF) is an important research topic; if successfully implemented, it could be beneficial in various applications, from photography to biometrics. In this work, we want to examine the feasibility and practicability of a well-known "extended DoF" (EDoF) technique, or "wavefront coding," by building real-time long-range iris recognition and performing large-scale iris recognition. The key to the success of long-range iris recognition includes long DoF and image quality invariance toward various object distance, which is strict and harsh enough to test the practicality and feasibility of EDoF-empowered image sensors. Besides image sensor modification, we also explored the possibility of varying enrollment/testing pairs. With 512 iris images from 32 Asian people as the database, 400-mm focal length and F/6.3 optics over 3 m working distance, our results prove that a sophisticated coding design scheme plus homogeneous enrollment/testing setups can effectively overcome the blurring caused by phase modulation and omit Wiener-based restoration. In our experiments, which are based on 3328 iris images in total, the EDoF factor can achieve a result 3.71 times better than the original system without a loss of recognition accuracy.

  18. Development of High Resolution Eddy Current Imaging Using an Electro-Mechanical Sensor (Preprint)

    Science.gov (United States)

    2011-11-01

    [Record excerpt consists of fragmented reference citations, including: Primdahl, F., 1979, “The Fluxgate Magnetometer,” J. Phys. E: Sci. Instrum., Vol. 12: 241-253; Ripka, P., 1992, “Review of Fluxgate Sensors,” Sensors and Actuators A 33, Elsevier Sequoia: 129-141; and A. Abedi, J. J. Fellenstein, A. J. Lucas, and J. P. Wikswo, Jr., “A superconducting quantum interference device magnetometer system for quantitative analysis and imaging of hidden corrosion activity in aircraft aluminum.”]

  19. Simulation and measurement of total ionizing dose radiation induced image lag increase in pinned photodiode CMOS image sensors

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Jing [School of Materials Science and Engineering, Xiangtan University, Hunan (China); State Key Laboratory of Intense Pulsed Irradiation Simulation and Effect, Northwest Institute of Nuclear Technology, P.O.Box 69-10, Xi’an (China); Chen, Wei, E-mail: chenwei@nint.ac.cn [State Key Laboratory of Intense Pulsed Irradiation Simulation and Effect, Northwest Institute of Nuclear Technology, P.O.Box 69-10, Xi’an (China); Wang, Zujun, E-mail: wangzujun@nint.ac.cn [State Key Laboratory of Intense Pulsed Irradiation Simulation and Effect, Northwest Institute of Nuclear Technology, P.O.Box 69-10, Xi’an (China); Xue, Yuanyuan; Yao, Zhibin; He, Baoping; Ma, Wuying; Jin, Junshan; Sheng, Jiangkun; Dong, Guantao [State Key Laboratory of Intense Pulsed Irradiation Simulation and Effect, Northwest Institute of Nuclear Technology, P.O.Box 69-10, Xi’an (China)

    2017-06-01

    This paper presents an investigation of total ionizing dose (TID) induced image lag sources in pinned photodiode (PPD) CMOS image sensors based on radiation experiments and TCAD simulation. The radiation experiments were carried out at a cobalt-60 gamma-ray source. The experimental results show that the image lag degradation becomes increasingly severe with increasing TID. Combined with the TCAD simulation results, we can confirm that the junction between the PPD and the transfer gate (TG) is an important region for the formation of image lag during irradiation. These simulations demonstrate that TID can generate a potential pocket leading to incomplete charge transfer.

  20. Computer-based image studies on tumor nests mathematical features of breast cancer and their clinical prognostic value.

    Science.gov (United States)

    Wang, Lin-Wei; Qu, Ai-Ping; Yuan, Jing-Ping; Chen, Chuang; Sun, Sheng-Rong; Hu, Ming-Bai; Liu, Juan; Li, Yan

    2013-01-01

    The expanding and invasive features of tumor nests could reflect the malignant biological behavior of breast invasive ductal carcinoma. Useful information on cancer invasiveness hidden within tumor nests could be extracted and analyzed by computer image processing and big-data analysis. Tissue microarrays from invasive ductal carcinoma (n = 202) were first stained for cytokeratin by an immunohistochemical method to clearly demarcate the tumor nests. Then an expert-aided computer analysis system was developed to study the mathematical and geometrical features of the tumor nests. A computer recognition system and image analysis software extracted tumor nest information, and mathematical features of the tumor nests were calculated. The relationship between the tumor nest mathematical parameters and patients' 5-year disease-free survival was studied. There were 8 mathematical parameters extracted by the expert-aided computer analysis system. Three mathematical parameters (number, circularity and total perimeter) with area under curve >0.5 and 4 mathematical parameters (average area, average perimeter, total area/total perimeter, average (area/perimeter)) with area under curve nests could be a useful parameter to predict the prognosis of early-stage breast invasive ductal carcinoma.

  1. A Method for Application of Classification Tree Models to Map Aquatic Vegetation Using Remotely Sensed Images from Different Sensors and Dates

    Directory of Open Access Journals (Sweden)

    Ying Cai

    2012-09-01

    Full Text Available In previous attempts to identify aquatic vegetation from remotely-sensed images using classification trees (CT), the images used to apply CT models to different times or locations necessarily originated from the same satellite sensor as that from which the original images used in model development came, greatly limiting the application of CT. We have developed an effective normalization method to improve the robustness of CT models when applied to images originating from different sensors and dates. A total of 965 ground-truth samples of aquatic vegetation types were obtained in 2009 and 2010 in Taihu Lake, China. Using relevant spectral indices (SI) as classifiers, we manually developed a stable CT model structure and then applied a standard CT algorithm to obtain quantitative (optimal) thresholds from 2009 ground-truth data and images from Landsat7-ETM+, HJ-1B-CCD, Landsat5-TM and ALOS-AVNIR-2 sensors. Optimal CT thresholds produced average classification accuracies of 78.1%, 84.7% and 74.0% for emergent vegetation, floating-leaf vegetation and submerged vegetation, respectively. However, the optimal CT thresholds for different sensor images differed from each other, with an average relative variation (RV) of 6.40%. We developed and evaluated three new approaches to normalizing the images. The best-performing method (Method of 0.1% index scaling) normalized the SI images using tailored percentages of extreme pixel values. Using the images normalized by the Method of 0.1% index scaling, CT models for a particular sensor in which thresholds were replaced by those from the models developed for images originating from other sensors provided average classification accuracies of 76.0%, 82.8% and 68.9% for emergent vegetation, floating-leaf vegetation and submerged vegetation, respectively. Applying the CT models developed for normalized 2009 images to 2010 images resulted in high classification (78.0%–93.3%) and overall (92.0%–93.1%) accuracies. Our
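
    The abstract does not define the "Method of 0.1% index scaling" precisely; one plausible reading, sketched below, is to rescale each spectral-index image between its 0.1th and 99.9th percentile values so that classification-tree thresholds learned on one sensor or date transfer to another.

```python
import numpy as np

def percentile_scale(si_image, low_pct=0.1, high_pct=99.9):
    """Rescale a spectral-index image to [0, 1] using its 0.1% extreme
    pixel values, so thresholds become comparable across sensors/dates."""
    lo, hi = np.percentile(si_image, [low_pct, high_pct])
    return np.clip((si_image - lo) / (hi - lo + 1e-12), 0.0, 1.0)
```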

  2. Proximity gettering technology for advanced CMOS image sensors using carbon cluster ion-implantation technique. A review

    Energy Technology Data Exchange (ETDEWEB)

    Kurita, Kazunari; Kadono, Takeshi; Okuyama, Ryousuke; Shigemastu, Satoshi; Hirose, Ryo; Onaka-Masada, Ayumi; Koga, Yoshihiro; Okuda, Hidehiko [SUMCO Corporation, Saga (Japan)

    2017-07-15

    A new technique is described for manufacturing advanced silicon wafers with the highest capability yet reported for gettering transition-metal, oxygen, and hydrogen impurities in CMOS image sensor fabrication processes. Carbon and hydrogen elements are localized in the projection range of the silicon wafer by implantation of ion clusters from a hydrocarbon molecular gas source. Furthermore, these wafers can getter oxygen impurities, out-diffused from a Czochralski-grown silicon wafer substrate toward the device active regions, into the carbon cluster ion projection range during heat treatment. Therefore, they can reduce the formation of transition-metal and oxygen-related defects in the device active regions and improve electrical performance characteristics, such as the dark current, white spot defects, pn-junction leakage current, and image lag characteristics. The new technique enables the formation of high-gettering-capability sinks for transition-metal, oxygen, and hydrogen impurities under the device active regions of CMOS image sensors. Wafers formed by this technique have the potential to significantly improve electrical device performance characteristics in advanced CMOS image sensors. (copyright 2017 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  3. Selection of bi-level image compression method for reduction of communication energy in wireless visual sensor networks

    Science.gov (United States)

    Khursheed, Khursheed; Imran, Muhammad; Ahmad, Naeem; O'Nils, Mattias

    2012-06-01

    A Wireless Visual Sensor Network (WVSN) is an emerging field which combines image sensors, on-board computation units, communication components and energy sources. Compared to the traditional wireless sensor network, which operates on one-dimensional data such as temperature or pressure values, a WVSN operates on two-dimensional data (images), which requires higher processing power and communication bandwidth. Normally, WVSNs are deployed in areas where installation of wired solutions is not feasible. The energy budget in these networks is limited to the batteries because of the wireless nature of the application. Due to the limited availability of energy, the processing at Visual Sensor Nodes (VSN) and the communication from VSN to server should consume as little energy as possible. Transmitting raw images wirelessly consumes a lot of energy and requires higher communication bandwidth. Data compression methods reduce data efficiently and hence are effective in reducing the communication cost in a WVSN. In this paper, we have compared the compression efficiency and complexity of six well-known bi-level image compression methods. The focus is to determine the compression algorithms which can efficiently compress bi-level images and whose computational complexity is suitable for the computational platforms used in WVSNs. These results can be used as a road map for the selection of compression methods for different sets of constraints in WVSN.
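
    A back-of-the-envelope sketch of why compression pays off in a WVSN: transmission energy scales with the number of bits sent, so the saving from a given compression ratio can be weighed against the processing energy the algorithm itself costs. The per-bit radio energy and the 8:1 ratio below are placeholders, not figures from the paper.

```python
def transmission_energy(n_bits, e_tx_per_bit=230e-9):
    """Radio energy (J) for sending n_bits at an assumed per-bit cost."""
    return n_bits * e_tx_per_bit

raw_bits = 128 * 128                # one bi-level image at 1 bit per pixel
compressed_bits = raw_bits // 8     # assumed 8:1 compression ratio
saved = transmission_energy(raw_bits) - transmission_energy(compressed_bits)
print(f"energy saved per image: {saved * 1e6:.0f} uJ")
```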

  4. Development of a solid-state multi-sensor array camera for real time imaging of magnetic fields

    International Nuclear Information System (INIS)

    Benitez, D; Gaydecki, P; Quek, S; Torres, V

    2007-01-01

    The development of a real-time magnetic field imaging camera based on solid-state sensors is described. The final laboratory prototype comprises a 2D array of 33 x 33 solid-state, tri-axial magneto-inductive sensors and is located within a large current-carrying coil. This may be excited to produce either a steady or a time-varying magnetic field. Outputs from several rows of sensors are routed to a sub-master controller and all sub-masters route to a master controller responsible for data coordination and signal pre-processing. The data are finally streamed to a host computer via a USB interface and the image is generated and displayed at a rate of several frames per second. Accurate image generation is predicated on knowledge of the sensor response, magnetic field perturbations and the nature of the target with respect to permeability and conductivity. To this end, the development of the instrumentation has been complemented by extensive numerical modelling of field distribution patterns using boundary element methods. Although it was originally intended for deployment in the nondestructive evaluation (NDE) of reinforced concrete, it was soon realised during the course of the work that the magnetic field imaging system had many potential applications, for example, in medicine, security screening, quality assurance (such as in the food industry), other areas of nondestructive evaluation (NDE), designs associated with magnetic fields, teaching and research

  5. Development of a solid-state multi-sensor array camera for real time imaging of magnetic fields

    Science.gov (United States)

    Benitez, D.; Gaydecki, P.; Quek, S.; Torres, V.

    2007-07-01

    The development of a real-time magnetic field imaging camera based on solid-state sensors is described. The final laboratory prototype comprises a 2D array of 33 x 33 solid-state, tri-axial magneto-inductive sensors and is located within a large current-carrying coil. This may be excited to produce either a steady or a time-varying magnetic field. Outputs from several rows of sensors are routed to a sub-master controller and all sub-masters route to a master controller responsible for data coordination and signal pre-processing. The data are finally streamed to a host computer via a USB interface and the image is generated and displayed at a rate of several frames per second. Accurate image generation is predicated on knowledge of the sensor response, magnetic field perturbations and the nature of the target with respect to permeability and conductivity. To this end, the development of the instrumentation has been complemented by extensive numerical modelling of field distribution patterns using boundary element methods. Although it was originally intended for deployment in the nondestructive evaluation (NDE) of reinforced concrete, it was soon realised during the course of the work that the magnetic field imaging system had many potential applications, for example, in medicine, security screening, quality assurance (such as in the food industry), other areas of nondestructive evaluation (NDE), designs associated with magnetic fields, teaching and research.

  6. Noise analysis of a novel hybrid active-passive pixel sensor for medical X-ray imaging

    International Nuclear Information System (INIS)

    Safavian, N.; Izadi, M.H.; Sultana, A.; Wu, D.; Karim, K.S.; Nathan, A.; Rowlands, J.A.

    2009-01-01

    The passive pixel sensor (PPS) is one of the most widely used architectures in large-area amorphous silicon (a-Si) flat panel imagers. It consists of a detector and a thin film transistor (TFT) acting as a readout switch. While the PPS is advantageous in terms of providing a simple and small architecture suitable for high-resolution imaging, it directly exposes the signal to the noise of the data line and the external readout electronics, causing a significant increase in the minimum readable sensor input signal. In this work we present the operation and noise performance of a hybrid 3-TFT current-programmed, current-output active pixel sensor (APS) suitable for real-time X-ray imaging. The pixel circuit extends the application of the a-Si TFT from a conventional switching element to an on-pixel amplifier for enhanced signal-to-noise ratio and higher imager dynamic range. The capability of operating in both passive and active modes, as well as being able to compensate for the inherent instabilities of the TFTs, makes the architecture a good candidate for X-ray imaging modalities with a wide range of incoming X-ray intensities. Measurements and theoretical calculations reveal an input-referred noise below the 1000-electron noise limit for real-time fluoroscopy. (copyright 2009 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)

  7. Development of High Resolution Eddy Current Imaging Using an Electro-Mechanical Sensor (Postprint)

    Science.gov (United States)

    2011-08-01

    [Record excerpt consists of fragmented reference citations, including: Primdahl, F., 1979, “The Fluxgate Magnetometer,” J. Phys. E: Sci. Instrum., Vol. 12: 241-253; Ripka, P., 1992, “Review of Fluxgate Sensors,” Sensors and Actuators A 33, Elsevier Sequoia: 129-141; and A. Abedi, J. J. Fellenstein, A. J. Lucas, and J. P. Wikswo, Jr., “A superconducting quantum interference device magnetometer system for quantitative analysis and imaging of hidden corrosion activity in aircraft aluminum.”]

  8. Evaluation of high-perimeter electrode designs for deep brain stimulation

    Science.gov (United States)

    Howell, Bryan; Grill, Warren M.

    2014-08-01

    Objective. Deep brain stimulation (DBS) is an effective treatment for movement disorders and a promising therapy for treating epilepsy and psychiatric disorders. Despite its clinical success, complications including infections and mis-programming following surgical replacement of the battery-powered implantable pulse generator adversely impact the safety profile of this therapy. We sought to decrease power consumption and extend battery life by modifying the electrode geometry to increase stimulation efficiency. The specific goal of this study was to determine whether electrode contact perimeter or area had a greater effect on increasing stimulation efficiency. Approach. Finite-element method (FEM) models of eight prototype electrode designs were used to calculate the electrode access resistance, and the FEM models were coupled with cable models of passing axons to quantify stimulation efficiency. We also measured in vitro the electrical properties of the prototype electrode designs and measured in vivo the stimulation efficiency following acute implantation in anesthetized cats. Main results. Area had a greater effect than perimeter on altering the electrode access resistance; electrode (access or dynamic) resistance alone did not predict stimulation efficiency because efficiency was dependent on the shape of the potential distribution in the tissue; and, quantitative assessment of stimulation efficiency required consideration of the effects of the electrode-tissue interface impedance. Significance. These results advance understanding of the features of electrode geometry that are important for designing the next generation of efficient DBS electrodes.

  9. Real-time, wide-area hyperspectral imaging sensors for standoff detection of explosives and chemical warfare agents

    Science.gov (United States)

    Gomer, Nathaniel R.; Tazik, Shawna; Gardner, Charles W.; Nelson, Matthew P.

    2017-05-01

    Hyperspectral imaging (HSI) is a valuable tool for the detection and analysis of targets located within complex backgrounds. HSI can detect threat materials on environmental surfaces, where the concentration of the target of interest is often very low and is typically found within complex scenery. Unfortunately, current generation HSI systems have size, weight, and power limitations that prohibit their use for field-portable and/or real-time applications. Current generation systems commonly provide an inefficient area search rate, require close proximity to the target for screening, and/or are not capable of making real-time measurements. ChemImage Sensor Systems (CISS) is developing a variety of real-time, wide-field hyperspectral imaging systems that utilize shortwave infrared (SWIR) absorption and Raman spectroscopy. SWIR HSI sensors provide wide-area imagery with at or near real time detection speeds. Raman HSI sensors are being developed to overcome two obstacles present in standard Raman detection systems: slow area search rate (due to small laser spot sizes) and lack of eye-safety. SWIR HSI sensors have been integrated into mobile, robot based platforms and handheld variants for the detection of explosives and chemical warfare agents (CWAs). In addition, the fusion of these two technologies into a single system has shown the feasibility of using both techniques concurrently to provide higher probability of detection and lower false alarm rates. This paper will provide background on Raman and SWIR HSI, discuss the applications for these techniques, and provide an overview of novel CISS HSI sensors focusing on sensor design and detection results.

  10. The effect of split pixel HDR image sensor technology on MTF measurements

    Science.gov (United States)

    Deegan, Brian M.

    2014-03-01

    Split-pixel HDR sensor technology is particularly advantageous in automotive applications, because the images are captured simultaneously rather than sequentially, thereby reducing motion blur. However, split pixel technology introduces artifacts in MTF measurement. To achieve a HDR image, raw images are captured from both large and small sub-pixels, and combined to make the HDR output. In some cases, a large sub-pixel is used for long exposure captures, and a small sub-pixel for short exposures, to extend the dynamic range. The relative size of the photosensitive area of the pixel (fill factor) plays a very significant role in the output MTF measurement. Given an identical scene, the MTF will be significantly different, depending on whether you use the large or small sub-pixels i.e. a smaller fill factor (e.g. in the short exposure sub-pixel) will result in higher MTF scores, but significantly greater aliasing. Simulations of split-pixel sensors revealed that, when raw images from both sub-pixels are combined, there is a significant difference in rising edge (i.e. black-to-white transition) and falling edge (white-to-black) reproduction. Experimental results showed a difference of ~50% in measured MTF50 between the falling and rising edges of a slanted edge test chart.
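
    A simplified sketch of how an MTF curve and MTF50 can be derived from a one-dimensional edge profile (differentiate, window, FFT); a full ISO 12233 slanted-edge analysis with sub-pixel binning is not reproduced here. Running it separately on rising and falling edge profiles would expose the asymmetry the abstract reports.

```python
import numpy as np

def mtf_from_edge(esf):
    """MTF from a 1-D edge-spread function: differentiate to the
    line-spread function, apply a window, FFT and normalise."""
    lsf = np.diff(esf) * np.hanning(len(esf) - 1)
    mtf = np.abs(np.fft.rfft(lsf))
    mtf /= mtf[0] + 1e-12
    freqs = np.fft.rfftfreq(lsf.size, d=1.0)   # cycles per pixel
    return freqs, mtf

def mtf50(freqs, mtf):
    """First frequency where the MTF drops below 0.5 (assumes it does)."""
    idx = int(np.argmax(mtf < 0.5))
    f0, f1, m0, m1 = freqs[idx - 1], freqs[idx], mtf[idx - 1], mtf[idx]
    return f0 + (0.5 - m0) * (f1 - f0) / (m1 - m0)
```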

  11. High frame rate multi-resonance imaging refractometry with distributed feedback dye laser sensor

    DEFF Research Database (Denmark)

    Vannahme, Christoph; Dufva, Martin; Kristensen, Anders

    2015-01-01

    imaging refractometry without moving parts is presented. DFB dye lasers are low-cost and highly sensitive refractive index sensors. The unique multi-wavelength DFB laser structure presented here comprises several areas with different grating periods. Imaging in two dimensions of space is enabled...... by analyzing laser light from all areas in parallel with an imaging spectrometer. With this multi-resonance imaging refractometry method, the spatial position in one direction is identified from the horizontal, i.e., spectral position of the multiple laser lines which is obtained from the spectrometer charged...

  12. Laser beam welding quality monitoring system based in high-speed (10 kHz) uncooled MWIR imaging sensors

    Science.gov (United States)

    Linares, Rodrigo; Vergara, German; Gutiérrez, Raúl; Fernández, Carlos; Villamayor, Víctor; Gómez, Luis; González-Camino, Maria; Baldasano, Arturo; Castro, G.; Arias, R.; Lapido, Y.; Rodríguez, J.; Romero, Pablo

    2015-05-01

    The combination of flexibility, productivity, precision and zero-defect manufacturing in future laser-based equipment is a major challenge facing this enabling technology. New sensors for online monitoring and real-time control of laser-based processes are necessary for improving product quality and increasing manufacturing yields. New approaches to fully automate processes towards zero-defect manufacturing demand smarter heads in which lasers, optics, actuators, sensors and electronics are integrated into a single compact and affordable device. Many defects arising in laser-based manufacturing processes come from instabilities in the dynamics of the laser process. Temperature and heat dynamics are key parameters to be monitored. Low-cost infrared imagers with a high speed of response will constitute the next generation of sensors to be implemented in future monitoring and control systems for laser-based processes, capable of providing simultaneous information about heat dynamics and spatial distribution. This work describes the results of using an innovative low-cost high-speed infrared imager based on the first quantum infrared imager on the market monolithically integrated with a Si-CMOS ROIC. The sensor is able to provide low-resolution images at frame rates up to 10 kHz in uncooled operation at the same cost as traditional infrared spot detectors. In order to demonstrate the capabilities of the new sensor technology, a low-cost camera was assembled on a standard production laser welding head, allowing melting pool images to be registered at frame rates of 10 kHz. In addition, specific software was developed for defect detection and classification. Multiple laser welding processes were recorded with the aim of studying the performance of the system and its application to the real-time monitoring of laser welding processes. During the experiments, different types of defects were produced and monitored. The classifier was fed with the experimental images obtained. Self

  13. Test of the Practicality and Feasibility of EDoF-Empowered Image Sensors for Long-Range Biometrics

    Directory of Open Access Journals (Sweden)

    Sheng-Hsun Hsieh

    2016-11-01

    Full Text Available For many practical applications of image sensors, how to extend the depth-of-field (DoF is an important research topic; if successfully implemented, it could be beneficial in various applications, from photography to biometrics. In this work, we want to examine the feasibility and practicability of a well-known “extended DoF” (EDoF technique, or “wavefront coding,” by building real-time long-range iris recognition and performing large-scale iris recognition. The key to the success of long-range iris recognition includes long DoF and image quality invariance toward various object distance, which is strict and harsh enough to test the practicality and feasibility of EDoF-empowered image sensors. Besides image sensor modification, we also explored the possibility of varying enrollment/testing pairs. With 512 iris images from 32 Asian people as the database, 400-mm focal length and F/6.3 optics over 3 m working distance, our results prove that a sophisticated coding design scheme plus homogeneous enrollment/testing setups can effectively overcome the blurring caused by phase modulation and omit Wiener-based restoration. In our experiments, which are based on 3328 iris images in total, the EDoF factor can achieve a result 3.71 times better than the original system without a loss of recognition accuracy.

  14. Third-generation imaging sensor system concepts

    Science.gov (United States)

    Reago, Donald A.; Horn, Stuart B.; Campbell, James, Jr.; Vollmerhausen, Richard H.

    1999-07-01

    Second-generation forward looking infrared sensors, based on either parallel-scanning, long-wave (8 - 12 um) time delay and integration HgCdTe detectors or mid-wave (3 - 5 um), medium-format staring (640 X 480 pixels) InSb detectors, are being fielded. The science and technology community is now turning its attention toward the definition of a future third generation of FLIR sensors, based on emerging research and development efforts. Modeled third-generation sensor performance demonstrates a significant improvement over second generation, resulting in enhanced lethality and survivability on the future battlefield. In this paper we present the current thinking on what third-generation sensor systems will be and the resulting requirements for third-generation focal plane array detectors. Three classes of sensors have been identified. The high-performance sensor will contain a megapixel or larger array with at least two colors. Higher operating temperatures will also be a goal here so that power and weight can be reduced. A high-performance uncooled sensor is also envisioned that will perform somewhere between first- and second-generation cooled detectors, but at significantly lower cost, weight, and power. The final third-generation sensor is a very low cost micro sensor. This sensor can open up a whole new IR market because of its small size, weight, and cost. Future unattended throwaway sensors, micro UAVs, and helmet-mounted IR cameras will be the result of this new class.

  15. Application Of FA Sensor 2

    International Nuclear Information System (INIS)

    Park, Seon Ho

    1993-03-01

    This book introduces FA (factory automation) sensors, from the basics to building systems, covering light sensors such as photodiodes and phototransistors, photoelectric sensors, CCD-type image sensors, MOS-type image sensors, color sensors, CdS cells, and fiber-optic scopes. It also deals with position sensors such as proximity switches, differential-motion sensors, photoelectric linear scales and magnetic scales; rotary sensors, with a summary of rotary encoders and their types and applications; flow sensors; and sensing technology.

  16. Study of CMOS Image Sensors for the Alignment System of the CMS Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Virto, A. L.; Vila, I.; Rodrigo, T.; Matorras, F.; Figueroa, C. F.; Calvo, E.; Calderon, A.; Arce, P.; Oller, J. C.; Molinero, A.; Josa, M. I.; Fuentes, J.; Ferrando, A.; Fernandez, M. G.; Barcala, J. M.

    2002-07-01

    We report on an in-depth study made on commercial CMOS image sensors in order to determine their feasibility for beam light position detection in the CMS multipoint alignment scheme. (Author) 21 refs.

  17. Gimbal Integration to Small Format, Airborne, MWIR and LWIR Imaging Sensors, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed innovation is for enhanced sensor performance and high resolution imaging for Long Wave InfraRed (LWIR) and Medium Wave IR (MWIR) camera systems used in...

  18. Imaging Intracellular pH in Live Cells with a Genetically-Encoded Red Fluorescent Protein Sensor

    OpenAIRE

    Tantama, Mathew; Hung, Yin Pun; Yellen, Gary

    2011-01-01

    Intracellular pH affects protein structure and function, and proton gradients underlie the function of organelles such as lysosomes and mitochondria. We engineered a genetically-encoded pH sensor by mutagenesis of the red fluorescent protein mKeima, providing a new tool to image intracellular pH in live cells. This sensor, named pHRed, is the first ratiometric, single-protein red fluorescent sensor of pH. Fluorescence emission of pHRed peaks at 610 nm while exhibiting dual excitation peaks at...

  19. A radiographic imaging system based upon a 2-D silicon microstrip sensor

    CERN Document Server

    Papanestis, A; Corrin, E; Raymond, M; Hall, G; Triantis, F A; Manthos, N; Evagelou, I; Van den Stelt, P; Tarrant, T; Speller, R D; Royle, G F

    2000-01-01

    A high resolution, direct-digital detector system based upon a 2-D silicon microstrip sensor has been designed, built and is undergoing evaluation for applications in dentistry and mammography. The sensor parameters and image requirements were selected using Monte Carlo simulations. Sensors selected for evaluation have a strip pitch of 50 μm on the p-side and 80 μm on the n-side. Front-end electronics and data acquisition are based on the APV6 chip and were adapted from systems used at CERN for high-energy physics experiments. The APV6 chip is not self-triggering so data acquisition is done at a fixed trigger rate. This paper describes the mammographic evaluation of the double sided microstrip sensor. Raw data correction procedures were implemented to remove the effects of dead strips and non-uniform response. Standard test objects (TORMAX) were used to determine limiting spatial resolution and detectability. MTFs were determined using the edge response. The results indicate that the spatial resolution of the...

  20. Reduction of CMOS Image Sensor Read Noise to Enable Photon Counting.

    Science.gov (United States)

    Guidash, Michael; Ma, Jiaju; Vogelsang, Thomas; Endsley, Jay

    2016-04-09

    Recent activity in photon counting CMOS image sensors (CIS) has been directed to reduction of read noise. Many approaches and methods have been reported. This work is focused on providing sub-1 e− read noise by design and operation of the binary and small signal readout of photon counting CIS. Compensation of transfer gate feed-through was used to provide substantially reduced CDS time and source follower (SF) bandwidth. SF read noise was reduced by a factor of 3 with this method. This method can be applied broadly to CIS devices to reduce the read noise for small signals to enable use as a photon counting sensor.

  1. Optical fiber sensors for image formation in radiodiagnostic - preliminary essays

    International Nuclear Information System (INIS)

    Carvalho, Cesar C. de; Werneck, Marcelo M.

    1998-01-01

    This work describes preliminary experiments that will provide the basis for analyzing the capability to implement a system able to capture radiological images with a new sensor system, comprising an FO scanning process and an I-CCD camera. The main objective of these experiments is to analyze the optical response of the FO bundle, with several types of scintillators associated with it, when it is submitted to medical X-ray exposure. (author)

  2. State-of-The-Art and Applications of 3D Imaging Sensors in Industry, Cultural Heritage, Medicine, and Criminal Investigation.

    Science.gov (United States)

    Sansoni, Giovanna; Trebeschi, Marco; Docchio, Franco

    2009-01-01

    3D imaging sensors for the acquisition of three-dimensional (3D) shapes have created, in recent years, a considerable degree of interest for a number of applications. The miniaturization and integration of the optical and electronic components used to build them have played a crucial role in the achievement of compactness, robustness and flexibility of the sensors. Today, several 3D sensors are available on the market, even in combination with other sensors in a "sensor fusion" approach. Of equal importance to physical miniaturization is the portability of the measurements, via suitable interfaces, into software environments designed for their elaboration, e.g., CAD-CAM systems, virtual renderers, and rapid prototyping tools. In this paper, following an overview of the state of the art of 3D imaging sensors, a number of significant examples of their use are presented, with particular reference to industry, heritage, medicine, and criminal investigation applications.

  3. An ultrasensitive method of real time pH monitoring with complementary metal oxide semiconductor image sensor.

    Science.gov (United States)

    Devadhasan, Jasmine Pramila; Kim, Sanghyo

    2015-02-09

    CMOS sensors are becoming a powerful tool in the biological and chemical fields. In this work, we introduce a new approach to quantifying various pH solutions with a CMOS image sensor. The CMOS image sensor based pH measurement produces high-accuracy analysis, making it a truly portable and user-friendly system. A pH-indicator-blended hydrogel matrix was fabricated as a thin film for accurate color development. A distinct color change in red, green and blue (RGB) develops in the hydrogel film on applying various pH solutions (pH 1-14). A semi-quantitative pH estimate was acquired by visual readout. Further, the CMOS image sensor captures the RGB color intensity of the film, and the hue value is converted into digital numbers with the aid of an analog-to-digital converter (ADC) to determine the pH range of the solutions. A chromaticity diagram and Euclidean distance represent the RGB color space and the differentiation of pH ranges, respectively. This technique is applicable to sensing various toxic chemicals and chemical vapors in situ. Ultimately, the entire approach can be integrated into a smartphone and operated in a user-friendly manner. Copyright © 2014 Elsevier B.V. All rights reserved.
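
    As a rough illustration of the colour-based read-out, the sketch below maps a mean film colour to the nearest entry of a calibration table using Euclidean RGB distance and also reports the hue; the reference colours are hypothetical, not the paper's calibration data.

```python
import colorsys
import numpy as np

def classify_ph(rgb, reference_colors):
    """Return the nearest reference pH (Euclidean RGB distance) and the
    hue of the measured film colour (R, G, B given in 0-255)."""
    hue = colorsys.rgb_to_hsv(*(c / 255.0 for c in rgb))[0]
    nearest = min(reference_colors,
                  key=lambda ph: np.linalg.norm(np.subtract(rgb, reference_colors[ph])))
    return nearest, hue

# hypothetical calibration colours for three pH levels
refs = {1: (200, 60, 60), 7: (90, 160, 90), 14: (70, 70, 200)}
print(classify_ph((95, 150, 95), refs))     # -> (7, <hue>)
```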

  4. Design and Implementation of a Novel Compatible Encoding Scheme in the Time Domain for Image Sensor Communication

    Directory of Open Access Journals (Sweden)

    Trang Nguyen

    2016-05-01

    Full Text Available This paper presents a modulation scheme in the time domain based on On-Off-Keying and proposes various compatible supports for different types of image sensors. The content of this article is a sub-proposal to the IEEE 802.15.7r1 Task Group (TG7r1) aimed at Optical Wireless Communication (OWC) using an image sensor as the receiver. The compatibility support is indispensable for Image Sensor Communications (ISC) because the rolling shutter image sensors currently available have different frame rates, shutter speeds, sampling rates, and resolutions. However, focusing on unidirectional communications (i.e., data broadcasting, beacons), an asynchronous communication prototype is also discussed in the paper. Due to the physical limitations associated with typical image sensors (including low and varying frame rates, long exposures, and low shutter speeds), the link speed performance is critically considered. Based on the practical measurement of camera response to modulated light, an operating frequency range is suggested along with the similar system architecture, decoding procedure, and algorithms. A significant feature of our novel data frame structure is that it can support both typical frame rate cameras (in the oversampling mode) as well as very low frame rate cameras (in the error detection mode, for a camera whose frame rate is lower than the transmission packet rate). A high frame rate camera, i.e., no less than 20 fps, is supported in an oversampling mode in which a majority voting scheme for decoding data is applied. A low frame rate camera, i.e., when the frame rate drops to less than 20 fps at some certain time, is supported by an error detection mode in which any missing data sub-packet is detected in decoding and later corrected by external code. Numerical results and valuable analysis are also included to indicate the capability of the proposed schemes.
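
    The oversampling mode's majority-voting rule is simple enough to show directly; the sketch below assumes each transmitted bit has already been sampled in several consecutive frames (three here, purely for illustration).

```python
from collections import Counter

def majority_vote(frames_per_bit):
    """Decode each transmitted bit from the values observed in several
    consecutive camera frames by taking the most common value."""
    return [Counter(obs).most_common(1)[0][0] for obs in frames_per_bit]

print(majority_vote([(1, 1, 0), (0, 0, 0), (1, 0, 1)]))   # -> [1, 0, 1]
```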

  5. A Support Vector Machine Approach for Truncated Fingerprint Image Detection from Sweeping Fingerprint Sensors

    Science.gov (United States)

    Chen, Chi-Jim; Pai, Tun-Wen; Cheng, Mox

    2015-01-01

    A sweeping fingerprint sensor converts fingerprints on a row-by-row basis through image reconstruction techniques. However, a reconstructed fingerprint image might appear truncated and distorted when the finger was swept across the fingerprint sensor at a non-linear speed. If the truncated fingerprint images were enrolled as reference targets and collected by any automated fingerprint identification system (AFIS), successful prediction rates for fingerprint matching applications would decrease significantly. In this paper, a novel and effective methodology with low computational time complexity was developed for detecting truncated fingerprints in real time. Several filtering rules were implemented to validate the existence of truncated fingerprints. In addition, a machine learning method, the support vector machine (SVM), based on the principle of structural risk minimization, was applied to reject pseudo-truncated fingerprints containing characteristics similar to truncated ones. The experimental results have shown that an accuracy rate of 90.7% was achieved by successfully identifying truncated fingerprint images from testing images before AFIS enrollment procedures. The proposed effective and efficient methodology can be extensively applied to all existing fingerprint matching systems as a preliminary quality control prior to the construction of fingerprint templates. PMID:25835186
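
    A minimal sketch of the SVM rejection stage using scikit-learn; the two per-image features and the tiny training set are hypothetical stand-ins for the paper's filtering-rule outputs, shown only to illustrate how a structural-risk-minimizing classifier would be trained and applied.

```python
import numpy as np
from sklearn import svm

# hypothetical per-image features (e.g. row-correlation statistic, aspect
# ratio); label 1 = truncated capture, 0 = normal capture
X_train = np.array([[0.20, 1.8], [0.25, 1.7], [0.80, 1.1], [0.85, 1.0]])
y_train = np.array([1, 1, 0, 0])

clf = svm.SVC(kernel="rbf", gamma="scale")   # structural-risk-minimising classifier
clf.fit(X_train, y_train)

print(clf.predict([[0.30, 1.6]]))            # flags a likely-truncated capture
```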

  6. A Support Vector Machine Approach for Truncated Fingerprint Image Detection from Sweeping Fingerprint Sensors

    Directory of Open Access Journals (Sweden)

    Chi-Jim Chen

    2015-03-01

    Full Text Available A sweeping fingerprint sensor converts fingerprints on a row-by-row basis through image reconstruction techniques. However, a reconstructed fingerprint image might appear truncated and distorted when the finger was swept across the fingerprint sensor at a non-linear speed. If the truncated fingerprint images were enrolled as reference targets and collected by any automated fingerprint identification system (AFIS), successful prediction rates for fingerprint matching applications would decrease significantly. In this paper, a novel and effective methodology with low computational time complexity was developed for detecting truncated fingerprints in real time. Several filtering rules were implemented to validate the existence of truncated fingerprints. In addition, a machine learning method, the support vector machine (SVM), based on the principle of structural risk minimization, was applied to reject pseudo-truncated fingerprints containing characteristics similar to truncated ones. The experimental results have shown that an accuracy rate of 90.7% was achieved by successfully identifying truncated fingerprint images from testing images before AFIS enrollment procedures. The proposed effective and efficient methodology can be extensively applied to all existing fingerprint matching systems as a preliminary quality control prior to the construction of fingerprint templates.

  7. Nanosecond-laser induced crosstalk of CMOS image sensor

    Science.gov (United States)

    Zhu, Rongzhen; Wang, Yanbin; Chen, Qianrong; Zhou, Xuanfeng; Ren, Guangsen; Cui, Longfei; Li, Hua; Hao, Daoliang

    2018-02-01

    The CMOS image sensor (CIS) is a photoelectric imaging device that integrates the photosensitive array, amplifier, A/D converter, storage, DSP, and computer interface circuitry on the same silicon substrate [1]. It offers low power consumption, high integration, and low cost. With progress in large-scale integrated circuit technology, the noise suppression of CIS is continuously enhanced, and its image quality keeps improving. It has been widely used in security monitoring, biometrics, detection and imaging, and even military reconnaissance. A CIS is easily disturbed and damaged when irradiated by a laser, so studying the effects of laser irradiation is of great significance both for optoelectronic countermeasures and for hardening devices against lasers. Some researchers have studied the laser-induced disturbance and damage of CIS. They focused on saturation and supersaturation effects, and observed different effects such as unsaturation, saturation, supersaturation, full saturation and pixel flipping. This paper investigates the interference effect of a 1064 nm laser on a typical front-illuminated CMOS sensor, observing saturated crosstalk and half crosstalk lines. The formation mechanism of the crosstalk-line phenomenon is analyzed from the perspective of the CMOS device's working principle and signal detection method.

  8. High speed global shutter image sensors for professional applications

    Science.gov (United States)

    Wu, Xu; Meynants, Guy

    2015-04-01

    Global shutter imagers eliminate the motion artifacts seen in rolling shutter imagers and expand use to miscellaneous applications such as machine vision, 3D imaging, medical imaging, and space. A low-noise global shutter pixel requires more than one non-light-sensitive memory to reduce the read noise, but a larger memory area reduces the fill factor of the pixels. Modern micro-lens technology can compensate for this fill-factor loss. Backside illumination (BSI) is another popular technique to improve the pixel fill factor, but some pixel architectures may not reach sufficient shutter efficiency with backside illumination. Non-light-sensitive memory elements make fabrication with BSI possible. Machine vision (such as fast inspection systems), medical imaging (such as 3D medical imaging) and scientific applications always ask for high frame rate global shutter image sensors. Thanks to CMOS technology, fast analog-to-digital converters (ADCs) can be integrated on chip. On-chip ADCs with dual correlated double sampling (CDS) and a high digital interface data rate reduce the read noise and allow more on-chip operation control. As a result, a global shutter imager with a digital interface is a very popular solution for applications with high performance and high frame rate requirements. In this paper we review the global shutter architectures developed at CMOSIS, discuss their optimization process and compare their performance after fabrication.

  9. Area-efficient readout with 14-bit SAR-ADC for CMOS image sensors

    Directory of Open Access Journals (Sweden)

    Aziza Sassi Ben

    2016-01-01

    Full Text Available This paper proposes a readout design for CMOS image sensors. It has been squeezed into a 7.5 µm pitch in a 0.28 µm 1P3M technology. The ADC performs one 14-bit conversion in only 1.5 µs and targets a theoretical DNL of about +1.3/−1 at 14-bit accuracy. Correlated Double Sampling (CDS) is performed both in the analog and digital domains to preserve the image quality.

  10. Image accuracy and representational enhancement through low-level, multi-sensor integration techniques

    International Nuclear Information System (INIS)

    Baker, J.E.

    1993-05-01

    Multi-Sensor Integration (MSI) is the combining of data and information from more than one source in order to generate a more reliable and consistent representation of the environment. The need for MSI derives largely from basic ambiguities inherent in our current sensor imaging technologies. These ambiguities exist as long as the mapping from reality to image is not 1-to-1. That is, if different "realities" lead to identical images, a single image cannot reveal the particular reality which was the truth. MSI techniques can be divided into three categories based on the relative information content of the original images with that of the desired representation: (1) "detail enhancement," wherein the relative information content of the original images is less rich than the desired representation; (2) "data enhancement," wherein the MSI techniques are concerned with improving the accuracy of the data rather than either increasing or decreasing the level of detail; and (3) "conceptual enhancement," wherein the image contains more detail than is desired, making it difficult to easily recognize objects of interest. In conceptual enhancement one must group pixels corresponding to the same conceptual object and thereby reduce the level of extraneous detail. This research focuses on data and conceptual enhancement algorithms. To be useful in many real-world applications, e.g., autonomous or teleoperated robotics, real-time feedback is critical. But many MSI/image processing algorithms require significant processing time. This is especially true of feature extraction, object isolation, and object recognition algorithms due to their typical reliance on global or large-neighborhood information. This research attempts to exploit the speed currently available in state-of-the-art digitizers and highly parallel processing systems by developing MSI algorithms based on pixel- rather than global-level features

  11. Design and testing of a perimeter of increment threshold by projection

    OpenAIRE

    García Domene, María del Carmen; Luque Cobija, María José; Fez Saiz, Dolores de

    2017-01-01

    In the present study, we have designed and tested a perimeter for the detection of damage in the chromatic mechanisms using a video projector. To this purpose, we have characterized pixel to pixel a video projector, to account for the inhomogeneities in the projection. We have measured the tristimulus values of the projector primaries as a function of digital level, at 49 locations of the projection screen and, from them, we have arrived to a characterization model which reduces the color dif...

  12. Researchers develop CCD image sensor with 20ns per row parallel readout time

    CERN Multimedia

    Bush, S

    2004-01-01

    "Scientists at the Rutherford Appleton Laboratory (RAL) in Oxfordshire have developed what they claim is the fastest CCD (charge-coupled device) image sensor, with a readout time which is 20ns per row" (1/2 page)

  13. Monitoring Pest Insect Traps by Means of Low-Power Image Sensor Technologies

    Directory of Open Access Journals (Sweden)

    Juan J. Serrano

    2012-11-01

    Monitoring pest insect populations is currently a key issue in agriculture and forestry protection. At the farm level, human operators typically must perform periodical surveys of the traps disseminated through the field. This is a labor-, time- and cost-consuming activity, in particular for large plantations or large forestry areas, so it would be of great advantage to have an affordable system capable of doing this task automatically in an accurate and more efficient way. This paper proposes an autonomous monitoring system based on a low-cost image sensor that is able to capture and send images of the trap contents to a remote control station with the periodicity demanded by the trapping application. Our autonomous monitoring system will be able to cover large areas with very low energy consumption. This is the key point of our study, since the operational life of the overall monitoring system should extend to months of continuous operation without any kind of maintenance (i.e., battery replacement). The images delivered by the image sensors would be time-stamped and processed in the control station to get the number of individuals found at each trap. All the information would be conveniently stored at the control station, and accessible via the Internet by means of the network services available at the control station (WiFi, WiMax, 3G/4G, etc.).

  14. Monitoring Pest Insect Traps by Means of Low-Power Image Sensor Technologies

    Science.gov (United States)

    López, Otoniel; Rach, Miguel Martinez; Migallon, Hector; Malumbres, Manuel P.; Bonastre, Alberto; Serrano, Juan J.

    2012-01-01

    Monitoring pest insect populations is currently a key issue in agriculture and forestry protection. At the farm level, human operators typically must perform periodical surveys of the traps disseminated through the field. This is a labor-, time- and cost-consuming activity, in particular for large plantations or large forestry areas, so it would be of great advantage to have an affordable system capable of doing this task automatically in an accurate and more efficient way. This paper proposes an autonomous monitoring system based on a low-cost image sensor that is able to capture and send images of the trap contents to a remote control station with the periodicity demanded by the trapping application. Our autonomous monitoring system will be able to cover large areas with very low energy consumption. This is the key point of our study, since the operational life of the overall monitoring system should extend to months of continuous operation without any kind of maintenance (i.e., battery replacement). The images delivered by the image sensors would be time-stamped and processed in the control station to get the number of individuals found at each trap. All the information would be conveniently stored at the control station, and accessible via the Internet by means of the network services available at the control station (WiFi, WiMax, 3G/4G, etc.). PMID:23202232

  15. Highly sensitive and area-efficient CMOS image sensor using a PMOSFET-type photodetector with a built-in transfer gate

    Science.gov (United States)

    Seo, Sang-Ho; Kim, Kyoung-Do; Kong, Jae-Sung; Shin, Jang-Kyoo; Choi, Pyung

    2007-02-01

    In this paper, a new CMOS image sensor is presented, which uses a PMOSFET-type photodetector with a transfer gate and has a high and variable sensitivity. The proposed CMOS image sensor has been fabricated using a 0.35 μm 2-poly 4-metal standard CMOS technology and is composed of a 256 × 256 array of 7.05 μm × 7.10 μm pixels. The unit pixel has the configuration of a pseudo 3-transistor active pixel sensor (APS) with the PMOSFET-type photodetector with a transfer gate, which provides the function of a conventional 4-transistor APS. The generated photocurrent is controlled by the transfer gate of the PMOSFET-type photodetector. The maximum responsivity of the photodetector is larger than 1.0 × 10³ A/W without any optical lens. The fabricated 256 × 256 CMOS image sensor exhibits a good response to low-level illumination as low as 5 lux.

  16. Microwave Sensors for Breast Cancer Detection.

    Science.gov (United States)

    Wang, Lulu

    2018-02-23

    Breast cancer is the leading cause of death among females; early diagnostic methods with suitable treatments improve the 5-year survival rates significantly. Microwave breast imaging has been reported as having the greatest potential to become an alternative or additional tool to the current gold standard, X-ray mammography, for detecting breast cancer. Microwave breast image quality is affected by the microwave sensor, the sensor array, the number of sensors in the array and the size of the sensors. In fact, the microwave sensor and sensor array play an important role in the microwave breast imaging system. Numerous microwave biosensors have been developed for biomedical applications, with particular focus on breast tumor detection. Compared to conventional medical imaging and biosensor techniques, these microwave sensors not only enable better cancer detection and improve the image resolution, but also provide attractive features such as label-free detection. This paper aims to provide an overview of recent important achievements in microwave sensors for biomedical imaging applications, with particular focus on breast cancer detection. The electric properties of biological tissues in the microwave spectrum, microwave imaging approaches, microwave biosensors, current challenges and future work are also discussed.

  17. State-of-The-Art and Applications of 3D Imaging Sensors in Industry, Cultural Heritage, Medicine, and Criminal Investigation

    Directory of Open Access Journals (Sweden)

    Giovanna Sansoni

    2009-01-01

    3D imaging sensors for the acquisition of three-dimensional (3D) shapes have created, in recent years, a considerable degree of interest for a number of applications. The miniaturization and integration of the optical and electronic components used to build them have played a crucial role in achieving compactness, robustness and flexibility of the sensors. Today, several 3D sensors are available on the market, even in combination with other sensors in a "sensor fusion" approach. Of equal importance to physical miniaturization is the portability of the measurements, via suitable interfaces, into software environments designed for their elaboration, e.g., CAD-CAM systems, virtual renderers, and rapid prototyping tools. In this paper, following an overview of the state of the art of 3D imaging sensors, a number of significant examples of their use are presented, with particular reference to industry, heritage, medicine, and criminal investigation applications.

  18. [Irrigated perimeters as a geopolitical strategy for the development of the semi-arid region and its implications for health, labor and the environment].

    Science.gov (United States)

    Pontes, Andrezza Graziella Veríssimo; Gadelha, Diego; Freitas, Bernadete Maria Coêlho; Rigotto, Raquel Maria; Ferreira, Marcelo José Monteiro

    2013-11-01

    An analysis was made of irrigated perimeters as a geopolitical strategy for expanding Brazilian agricultural frontiers and the "development" of the northeastern semi-arid region, with respect to social determinants of health in rural communities. Research was conducted in the Chapada do Apodi in the states of Ceará and Rio Grande do Norte between 2007 and 2011. Various research techniques and tools were adopted, such as action research, ethnographic studies, questionnaires and laboratory exams, water contamination analyses, social cartography and focus groups. In the context of agribusiness expansion, it was revealed that public irrigation policies have had consequences for health, labor and the environment with the implementation of the Jaguaribe-Apodi Irrigated Perimeter in Ceará. The social and environmental conflict and resistance in the phase prior to the installation of the Santa Cruz do Apodi Irrigated Perimeter in Rio Grande do Norte were significant, as they had consequences for the health-disease process in rural communities. It is important for the evaluation of public irrigation policies to consider the impacts of the perimeters on the lifestyle, labor, health and environment of the affected territories.

  19. Integration of computer imaging and sensor data for structural health monitoring of bridges

    International Nuclear Information System (INIS)

    Zaurin, R; Catbas, F N

    2010-01-01

    The condition of civil infrastructure systems (CIS) changes over their life cycle for different reasons such as damage, overloading, severe environmental inputs, and ageing due to normal continued use. The structural performance often decreases as a result of the change in condition. Objective condition assessment and performance evaluation are challenging activities since they require some type of monitoring to track the response over a period of time. In this paper, the integrated use of video images and sensor data in the context of structural health monitoring is demonstrated as a promising technology for the safety of civil structures in general and bridges in particular. First, the challenges and possible solutions to using video images and computer vision techniques for structural health monitoring are presented. Then, the synchronized image and sensing data are analyzed to obtain the unit influence line (UIL) as an index for monitoring bridge behavior under identified loading conditions. Subsequently, the UCF 4-span bridge model is used to demonstrate the integration and implementation of imaging devices and traditional sensing technology with UIL for evaluating and tracking the bridge behavior. It is shown that video images and computer vision techniques can be used to detect, classify and track different vehicles with synchronized sensor measurements to establish an input–output relationship and determine the normalized response of the bridge

  20. Automatic Welding System of Aluminum Pipe by Monitoring Backside Image of Molten Pool Using Vision Sensor

    Science.gov (United States)

    Baskoro, Ario Sunar; Kabutomori, Masashi; Suga, Yasuo

    An automatic welding system using Tungsten Inert Gas (TIG) welding with a vision sensor for welding of aluminum pipe was constructed. This research studies the intelligent welding process of aluminum alloy pipe 6063S-T5 in a fixed position with a moving welding torch and an AC welding machine. The monitoring system consists of a vision sensor using a charge-coupled device (CCD) camera to monitor the backside image of the molten pool. The captured image was processed to recognize the edge of the molten pool by an image processing algorithm. A neural network model for welding speed control was constructed to perform the process automatically. The experimental results show the effectiveness of the control system, confirmed by good detection of the molten pool and sound welds.
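    The abstract does not give the image-processing details, so the following OpenCV sketch only illustrates one plausible way to recognise the molten-pool edge in a backside image and extract a width feature that could feed a speed controller; the blur kernel, Canny thresholds and largest-contour assumption are all placeholders.

```python
import cv2

def molten_pool_width(gray_frame, thresh1=50, thresh2=150):
    """Rough sketch: find the molten-pool contour in a grayscale backside image
    and return its bounding-box width in pixels (assumed pool-size feature)."""
    blurred = cv2.GaussianBlur(gray_frame, (5, 5), 0)        # suppress arc/spatter noise
    edges = cv2.Canny(blurred, thresh1, thresh2)             # edge map of the pool boundary
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)  # OpenCV >= 4 return signature
    if not contours:
        return 0.0
    pool = max(contours, key=cv2.contourArea)                # assume the largest blob is the pool
    _, _, width, _ = cv2.boundingRect(pool)
    return float(width)
```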

  1. CMOS image sensor with contour enhancement

    Science.gov (United States)

    Meng, Liya; Lai, Xiaofeng; Chen, Kun; Yuan, Xianghui

    2010-10-01

    Imitating the signal acquisition and processing of the vertebrate retina, a CMOS image sensor with a bionic pre-processing circuit is designed. Integrating signal-processing circuits on-chip can reduce the bandwidth and precision required of the subsequent interface circuit, and simplify the design of the computer-vision system. This signal pre-processing circuit consists of an adaptive photoreceptor, a spatial filtering resistive network and an Op-Amp calculation circuit. The adaptive photoreceptor unit, with a dynamic range of approximately 100 dB, adapts well to transient changes in light intensity rather than to the intensity level itself. The spatial low-pass filtering resistive network, used to mimic the function of horizontal cells, is composed of the horizontal resistor (HRES) circuit and an OTA (Operational Transconductance Amplifier) circuit. The HRES circuit, imitating the dendrite of the neuron cell, comprises two series MOS transistors operated in the weak inversion region. Appending two diode-connected n-channel transistors to a simple transconductance amplifier forms the OTA Op-Amp circuit, which provides a stable bias voltage for the gates of the MOS transistors in the HRES circuit, while serving as an OTA voltage follower to provide the input voltage for the network nodes. The Op-Amp calculation circuit, with a simple two-stage Op-Amp, achieves image contour enhancement. By adjusting the bias voltage of the resistive network, the smoothing effect can be tuned to change the degree of contour enhancement. Simulations of the cell circuit and a 16×16 2D circuit array are implemented using the CSMC 0.5 μm DPTM CMOS process.

  2. An Image Compression Scheme in Wireless Multimedia Sensor Networks Based on NMF

    Directory of Open Access Journals (Sweden)

    Shikang Kong

    2017-02-01

    With the goal of addressing the issue of image compression in wireless multimedia sensor networks with high recovered quality and low energy consumption, an image compression and transmission scheme based on non-negative matrix factorization (NMF) is proposed in this paper. First, the NMF algorithm theory is studied. Then, a collaborative mechanism of image capture, blocking, compression and transmission is established. Camera nodes capture images and send them to ordinary nodes, which use an NMF algorithm for image compression. Compressed images are received from the ordinary nodes by the cluster head node and transmitted to the station, which performs the image restoration. Simulation results show that, compared with the JPEG2000 and singular value decomposition (SVD) compression schemes, the proposed scheme achieves higher quality of the recovered images and lower total node energy consumption. It is beneficial for reducing the burden of energy consumption and prolonging the life of the whole network system, which has great significance for practical applications of WMSNs.
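    As a minimal sketch of the factorisation step only (the clustering, blocking and radio-transmission parts of the scheme are not reproduced), the code below compresses a non-negative grayscale block with scikit-learn's NMF; the rank is an assumed compression parameter.

```python
import numpy as np
from sklearn.decomposition import NMF

def nmf_compress(block, rank=8):
    """Factorise a non-negative image block into W (basis) and H (coefficients);
    transmitting W and H instead of the full block gives the compression."""
    model = NMF(n_components=rank, init='nndsvda', max_iter=400)
    W = model.fit_transform(block)
    H = model.components_
    return W, H

def nmf_reconstruct(W, H):
    """Approximate restoration performed at the base station."""
    return W @ H

block = np.random.rand(64, 64)          # stand-in for a captured image block
W, H = nmf_compress(block, rank=8)
recovered = nmf_reconstruct(W, H)
```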

  3. Edgeless silicon sensors for Medipix-based large-area X-ray imaging detectors

    International Nuclear Information System (INIS)

    Bosma, M J; Visser, J; Koffeman, E N; Evrard, O; De Moor, P; De Munck, K; Tezcan, D Sabuncuoglu

    2011-01-01

    Some X-ray imaging applications demand sensitive areas exceeding the active area of a single sensor. This requires a seamless tessellation of multiple detector modules with edgeless sensors. Our research is aimed at minimising the insensitive periphery that isolates the active area from the edge. Reduction of the edge-defect-induced charge injection, caused by the deleterious effects of dicing, is an important step. We report on the electrical characterisation of 300 μm thick edgeless silicon p⁺-ν-n⁺ diodes, diced using deep reactive ion etching. Sensors with both n-type and p-type stop rings were fabricated in various edge topologies. Leakage currents in the active area are compared with those of sensors with a conventional design. As expected, we observe an inverse correlation between leakage-current density and both the edge distance and stop-ring width. From this correlation we determine a minimum acceptable edge distance of 50 μm. We also conclude that structures with a p-type stop ring show lower leakage currents and higher breakdown voltages than the ones with an n-type stop ring.

  4. Coseismic displacements from SAR image offsets between different satellite sensors: Application to the 2001 Bhuj (India) earthquake

    KAUST Repository

    Wang, Teng

    2015-09-05

    Synthetic aperture radar (SAR) image offset tracking is increasingly being used for measuring ground displacements, e.g., due to earthquakes and landslide movement. However, this technique has been applied only to images acquired by the same or identical satellites. Here we propose a novel approach for determining offsets between images acquired by different satellite sensors, extending the usability of existing SAR image archives. The offsets are measured between two multi-image reflectivity maps obtained from different SAR data sets, which provide significantly better results than single pre-event and post-event images. Application to the 2001 Mw 7.6 Bhuj earthquake reveals, for the first time, its near-field deformation using multiple pre-earthquake ERS and post-earthquake Envisat images. The rupture model estimated from these cross-sensor offsets and teleseismic waveforms shows a compact fault slip pattern with fairly short rise times (<3 s) and a large stress drop (20 MPa), explaining the intense shaking observed in the earthquake.

  5. MHz rate X-Ray imaging with GaAs:Cr sensors using the LPD detector system

    Science.gov (United States)

    Veale, M. C.; Booker, P.; Cline, B.; Coughlan, J.; Hart, M.; Nicholls, T.; Schneider, A.; Seller, P.; Pape, I.; Sawhney, K.; Lozinskaya, A. D.; Novikov, V. A.; Tolbanov, O. P.; Tyazhev, A.; Zarubin, A. N.

    2017-02-01

    The STFC Rutherford Appleton Laboratory (U.K.) and Tomsk State University (Russia) have been working together to develop and characterise detector systems based on chromium-compensated gallium arsenide (GaAs:Cr) semiconductor material for high frame rate X-ray imaging. Previous work has demonstrated the spectroscopic performance of the material and its resistance to damage induced by high fluxes of X-rays. In this paper, recent results from experiments at the Diamond Light Source synchrotron demonstrate X-ray imaging with GaAs:Cr sensors at a frame rate of 3.7 MHz using the Large Pixel Detector (LPD) ASIC, developed by STFC for the European XFEL. Measurements have been made using a monochromatic 20 keV X-ray beam delivered in a single hybrid pulse with an instantaneous flux of up to ~1 × 10¹⁰ photons s⁻¹ mm⁻². The response of 500 μm GaAs:Cr sensors is compared to that of the standard 500 μm thick LPD Si sensors.

  6. Thin-Film Quantum Dot Photodiode for Monolithic Infrared Image Sensors.

    Science.gov (United States)

    Malinowski, Pawel E; Georgitzikis, Epimitheas; Maes, Jorick; Vamvaka, Ioanna; Frazzica, Fortunato; Van Olmen, Jan; De Moor, Piet; Heremans, Paul; Hens, Zeger; Cheyns, David

    2017-12-10

    Imaging in the infrared wavelength range has been fundamental in scientific, military and surveillance applications. Currently, it is a crucial enabler of new industries such as autonomous mobility (for obstacle detection), augmented reality (for eye tracking) and biometrics. Ubiquitous deployment of infrared cameras (on a scale similar to visible cameras) is however prevented by the high manufacturing cost and low resolution related to the need for image sensors based on flip-chip hybridization. One way to enable monolithic integration is to replace expensive, small-scale III-V-based detector chips with narrow-bandgap thin films compatible with 8- and 12-inch full-wafer processing. This work describes a CMOS-compatible pixel stack based on lead sulfide quantum dots (PbS QD) with a tunable absorption peak. A photodiode with a 150-nm-thick absorber in an inverted architecture shows a dark current of 10⁻⁶ A/cm² at −2 V reverse bias and an EQE above 20% at 1440 nm wavelength. Optical modeling for a top-illumination architecture can improve the contact transparency to 70%. Additional cooling (193 K) can improve the sensitivity to 60 dB. This stack can be integrated on a CMOS ROIC, enabling an order-of-magnitude cost reduction for infrared sensors.

  7. NRT Lightning Imaging Sensor (LIS) on International Space Station (ISS) Science Data Vb0

    Data.gov (United States)

    National Aeronautics and Space Administration — The NRT Lightning Imaging Sensor (LIS) on International Space Station (ISS) Science Data were collected by the LIS instrument on the ISS used to detect the...

  8. Time-of-flight camera via a single-pixel correlation image sensor

    Science.gov (United States)

    Mao, Tianyi; Chen, Qian; He, Weiji; Dai, Huidong; Ye, Ling; Gu, Guohua

    2018-04-01

    A time-of-flight imager based on single-pixel correlation image sensors is proposed for noise-free depth-map acquisition in the presence of ambient light. A digital micro-mirror device and a time-modulated IR laser provide spatially and temporally modulated illumination of the unknown object. Compressed sensing and the 'four bucket principle' are combined to reconstruct the depth map from a sequence of measurements at a low sampling rate. A second-order correlation transform is also introduced to reduce the noise from the detector itself and from direct ambient light. Computer simulations are presented to validate the computational models and the improvement of the reconstructions.
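    The 'four bucket principle' itself is a standard ToF phase estimate; the sketch below shows that step for correlation samples taken at 0°, 90°, 180° and 270° of the modulation period. The modulation frequency is an example value, and the paper's compressed-sensing reconstruction and second-order correlation transform are not reproduced.

```python
import numpy as np

def depth_from_four_buckets(c0, c1, c2, c3, f_mod=20e6, c_light=3.0e8):
    """Classical four-bucket depth estimate for a correlation ToF pixel.

    c0..c3 are the four correlation samples; the unambiguous range is
    c_light / (2 * f_mod).
    """
    phase = np.arctan2(c3 - c1, c0 - c2)          # modulation phase shift
    phase = np.mod(phase, 2.0 * np.pi)            # fold into one modulation period
    return c_light * phase / (4.0 * np.pi * f_mod)
```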

  9. Performance of a novel wafer scale CMOS active pixel sensor for bio-medical imaging

    International Nuclear Information System (INIS)

    Esposito, M; Evans, P M; Wells, K; Anaxagoras, T; Konstantinidis, A C; Zheng, Y; Speller, R D; Allinson, N M

    2014-01-01

    Recently CMOS active pixel sensors (APSs) have become a valuable alternative to amorphous silicon and selenium flat panel imagers (FPIs) in bio-medical imaging applications. CMOS APSs can now be scaled up to the standard 20 cm diameter wafer size by means of a reticle stitching block process. However, despite wafer-scale CMOS APSs being monolithic, sources of non-uniformity of response and regional variations can persist, representing a significant challenge for wafer-scale sensor response. Non-uniformity of stitched sensors can arise from a number of factors related to the manufacturing process, including variation of amplification, variation between readout components, wafer defects and process variations across the wafer. This paper reports on an investigation into the spatial non-uniformity and regional variations of a wafer-scale stitched CMOS APS. For the first time a per-pixel analysis of the electro-optical performance of a wafer CMOS APS is presented, to address inhomogeneity issues arising from the stitching techniques used to manufacture wafer-scale sensors. A complete model of the signal generation in the pixel array has been provided and proved capable of accounting for noise and gain variations across the pixel array. This novel analysis leads to readout noise and conversion gain being evaluated at pixel level, stitching-block level and in regions of interest, resulting in a coefficient of variation ⩽1.9%. The uniformity of the image quality performance has been further investigated in a typical x-ray application, i.e. mammography, showing a CNR uniformity among the highest when compared with mammography detectors commonly used in clinical practice. Finally, in order to compare the detection capability of this novel APS with the technology currently used (i.e. FPIs), a theoretical evaluation of the detection quantum efficiency (DQE) at zero frequency has been performed, resulting in a higher DQE for this

  10. Performance of a novel wafer scale CMOS active pixel sensor for bio-medical imaging.

    Science.gov (United States)

    Esposito, M; Anaxagoras, T; Konstantinidis, A C; Zheng, Y; Speller, R D; Evans, P M; Allinson, N M; Wells, K

    2014-07-07

    Recently CMOS active pixel sensors (APSs) have become a valuable alternative to amorphous silicon and selenium flat panel imagers (FPIs) in bio-medical imaging applications. CMOS APSs can now be scaled up to the standard 20 cm diameter wafer size by means of a reticle stitching block process. However, despite wafer-scale CMOS APSs being monolithic, sources of non-uniformity of response and regional variations can persist, representing a significant challenge for wafer-scale sensor response. Non-uniformity of stitched sensors can arise from a number of factors related to the manufacturing process, including variation of amplification, variation between readout components, wafer defects and process variations across the wafer. This paper reports on an investigation into the spatial non-uniformity and regional variations of a wafer-scale stitched CMOS APS. For the first time a per-pixel analysis of the electro-optical performance of a wafer CMOS APS is presented, to address inhomogeneity issues arising from the stitching techniques used to manufacture wafer-scale sensors. A complete model of the signal generation in the pixel array has been provided and proved capable of accounting for noise and gain variations across the pixel array. This novel analysis leads to readout noise and conversion gain being evaluated at pixel level, stitching-block level and in regions of interest, resulting in a coefficient of variation ⩽1.9%. The uniformity of the image quality performance has been further investigated in a typical x-ray application, i.e. mammography, showing a CNR uniformity among the highest when compared with mammography detectors commonly used in clinical practice. Finally, in order to compare the detection capability of this novel APS with the technology currently used (i.e. FPIs), a theoretical evaluation of the detection quantum efficiency (DQE) at zero frequency has been performed, resulting in a higher DQE for this

  11. Visual Sensor Based Image Segmentation by Fuzzy Classification and Subregion Merge

    Directory of Open Access Journals (Sweden)

    Huidong He

    2017-01-01

    The extraction and tracking of targets in an image shot by visual sensors have been studied extensively. The technology of image segmentation plays an important role in such tracking systems. This paper presents a new approach to color image segmentation based on a fuzzy color extractor (FCE). Different from many existing methods, the proposed approach provides a new classification of pixels in a source color image, which typically assigns an individual pixel to several subimages by means of fuzzy sets. This approach exploits two features, spatial proximity and color similarity, and mainly consists of two algorithms: CreateSubImage and MergeSubImage. We apply the FCE to segment colors of the test images from the database at UC Berkeley in three different color spaces: RGB, HSV, and YUV. The comparative studies show that the FCE applied in the RGB space is superior to the HSV and YUV spaces. Finally, we compare the segmentation results with the Canny and LoG edge detection algorithms. The results show that the FCE-based approach performs best in color image segmentation.
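    The abstract does not state the FCE membership functions, so the snippet below only illustrates the general idea of fuzzy colour membership with an assumed Gaussian similarity to a seed colour; the CreateSubImage/MergeSubImage algorithms themselves are not reproduced.

```python
import numpy as np

def fuzzy_membership(image_rgb, seed_rgb, sigma=25.0):
    """Membership in [0, 1] of every pixel with respect to a seed colour
    (Gaussian similarity is an assumption, not the paper's definition)."""
    diff = image_rgb.astype(float) - np.asarray(seed_rgb, dtype=float)
    dist2 = np.sum(diff ** 2, axis=-1)                 # squared colour distance per pixel
    return np.exp(-dist2 / (2.0 * sigma ** 2))

def extract_subimage(image_rgb, seed_rgb, threshold=0.5):
    """Boolean mask of the fuzzy sub-image associated with the seed colour."""
    return fuzzy_membership(image_rgb, seed_rgb) >= threshold
```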

  12. IR sensitivity enhancement of CMOS Image Sensor with diffractive light trapping pixels.

    Science.gov (United States)

    Yokogawa, Sozo; Oshiyama, Itaru; Ikeda, Harumi; Ebiko, Yoshiki; Hirano, Tomoyuki; Saito, Suguru; Oinoue, Takashi; Hagimoto, Yoshiya; Iwamoto, Hayato

    2017-06-19

    We report on the IR sensitivity enhancement of a back-illuminated CMOS image sensor (BI-CIS) with a 2-dimensional diffractive inverted pyramid array structure (IPA) on crystalline silicon (c-Si) and deep trench isolation (DTI). FDTD simulations of semi-infinitely thick c-Si having 2D IPAs on its surface with pitches over 400 nm show more than a 30% improvement of light absorption at λ = 850 nm, and a maximum enhancement of 43% is confirmed for the 540 nm pitch at that wavelength. A prototype BI-CIS sample with a pixel size of 1.2 μm square containing 400 nm pitch IPAs shows 80% sensitivity enhancement at λ = 850 nm compared to a reference sample with a flat surface. This is due to diffraction by the IPA and total reflection at the pixel boundary. The NIR images taken by a demo camera equipped with a C-mount lens show 75% sensitivity enhancement in the λ = 700-1200 nm wavelength range with negligible spatial-resolution degradation. Light-trapping CIS pixel technology promises to improve NIR sensitivity and appears to be applicable to many different image sensor applications including security cameras, personal authentication, and range-finding time-of-flight cameras with IR illumination.

  13. Image sensor pixel with on-chip high extinction ratio polarizer based on 65-nm standard CMOS technology.

    Science.gov (United States)

    Sasagawa, Kiyotaka; Shishido, Sanshiro; Ando, Keisuke; Matsuoka, Hitoshi; Noda, Toshihiko; Tokuda, Takashi; Kakiuchi, Kiyomi; Ohta, Jun

    2013-05-06

    In this study, we demonstrate a polarization-sensitive pixel for a complementary metal-oxide-semiconductor (CMOS) image sensor based on 65-nm standard CMOS technology. Using such a deep-submicron CMOS technology, it is possible to design fine metal patterns smaller than the wavelengths of visible light by using a metal wire layer. We designed and fabricated a metal wire grid polarizer on a 20 × 20 μm² pixel for an image sensor. An extinction ratio of 19.7 dB was observed at a wavelength of 750 nm.

  14. Extended Special Sensor Microwave Imager (SSM/I) Temperature Data Record (TDR) in netCDF

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Special Sensor Microwave Imager (SSM/I) is a seven-channel linearly polarized passive microwave radiometer that operates at frequencies of 19.36 (vertically and...

  15. NOAA JPSS Visible Infrared Imaging Radiometer Suite (VIIRS) Sensor Data Record (SDR) from IDPS

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Sensor Data Records (SDRs), or Level 1b data, from the Visible Infrared Imaging Radiometer Suite (VIIRS) are the calibrated and geolocated radiance and reflectance...

  16. Determination of tire cross-sectional geometric characteristics from a digitally scanned image

    Science.gov (United States)

    Danielson, Kent T.

    1995-08-01

    A semi-automated procedure is described for the accurate determination of geometrical characteristics using a scanned image of the tire cross-section. The procedure can be useful for cases when CAD drawings are not available or when a description of the actual cured tire is desired. Curves representing the perimeter of the tire cross-section are determined by an edge tracing scheme, and the plyline and cord-end positions are determined by locations of color intensities. The procedure provides an accurate description of the perimeter of the tire cross-section and the locations of plylines and cord-ends. The position, normals, and curvatures of the cross-sectional surface are included in this description. The locations of the plylines provide the necessary information for determining the ply thicknesses and relative position to a reference surface. Finally, the locations of the cord-ends provide a means to calculate the cord-ends per inch (epi). Menu driven software has been developed to facilitate the procedure using the commercial code, PV-Wave by Visual Numerics, Inc., to display the images. From a single user interface, separate modules are executed for image enhancement, curve fitting the edge trace of the cross-sectional perimeter, and determining the plyline and cord-end locations. The code can run on SUN or SGI workstations and requires the use of a mouse to specify options or identify items on the scanned image.

  17. Low Computational-Cost Footprint Deformities Diagnosis Sensor through Angles, Dimensions Analysis and Image Processing Techniques

    Directory of Open Access Journals (Sweden)

    J. Rodolfo Maestre-Rendon

    2017-11-01

    Manual measurements of foot anthropometry can lead to errors since this task involves the experience of the specialist who performs them, resulting in different subjective measures from the same footprint. Moreover, some of the diagnoses that are given to classify a footprint deformity are based on a qualitative interpretation by the physician; there is no quantitative interpretation of the footprint. The importance of providing a correct and accurate diagnosis lies in the need to ensure that an appropriate treatment is provided for the improvement of the patient without risking his or her health. Therefore, this article presents a smart sensor that integrates the capture of the footprint, a low computational-cost analysis of the image and the interpretation of the results through a quantitative evaluation. The smart sensor required the use of a camera (Logitech C920) connected to a Raspberry Pi 3, where a graphical interface was made for the capture and processing of the image, and it was adapted to a podoscope conventionally used by specialists such as orthopedists, physiotherapists and podiatrists. The footprint diagnosis smart sensor (FPDSS) has proven to be robust to different types of deformity, precise, sensitive and correlated at 0.99 with measurements from the digitalized image of the ink mat.

  18. Engineering workstation: Sensor modeling

    Science.gov (United States)

    Pavel, M; Sweet, B.

    1993-01-01

    The purpose of the engineering workstation is to provide an environment for rapid prototyping and evaluation of fusion and image processing algorithms. Ideally, the algorithms are designed to optimize the extraction of information that is useful to a pilot for all phases of flight operations. Successful design of effective fusion algorithms depends on the ability to characterize both the information available from the sensors and the information useful to a pilot. The workstation comprises subsystems for simulation of sensor-generated images, image processing, image enhancement, and fusion algorithms. As such, the workstation can be used to implement and evaluate both short-term and long-term solutions. The short-term solutions are being developed to enhance a pilot's situational awareness by providing information in addition to his direct vision. The long-term solutions are aimed at the development of complete synthetic vision systems. One of the important functions of the engineering workstation is to simulate the images that would be generated by the sensors. The simulation system is designed to use the graphics modeling and rendering capabilities of various workstations manufactured by Silicon Graphics Inc. The workstation simulates various aspects of the sensor-generated images arising from the phenomenology of the sensors. In addition, the workstation can be used to simulate a variety of impairments due to mechanical limitations of the sensor placement and due to the motion of the airplane. Although the simulation is currently not performed in real time, sequences of individual frames can be processed, stored, and recorded in a video format. In that way, it is possible to examine the appearance of different dynamic sensor-generated and fused images.

  19. AROSICS: An Automated and Robust Open-Source Image Co-Registration Software for Multi-Sensor Satellite Data

    Directory of Open Access Journals (Sweden)

    Daniel Scheffler

    2017-07-01

    Geospatial co-registration is a mandatory prerequisite when dealing with remote sensing data. Inter- or intra-sensoral misregistration will negatively affect any subsequent image analysis, specifically when processing multi-sensoral or multi-temporal data. In recent decades, many algorithms have been developed to enable manual, semi- or fully automatic displacement correction. Especially in the context of big data processing and the development of automated processing chains that aim to be applicable to different remote sensing systems, there is a strong need for efficient, accurate and generally usable co-registration. Here, we present AROSICS (Automated and Robust Open-Source Image Co-Registration Software), a Python-based open-source software package including an easy-to-use user interface for automatic detection and correction of sub-pixel misalignments between various remote sensing datasets. It is independent of spatial or spectral characteristics and robust against high degrees of cloud coverage and spectral and temporal land cover dynamics. The co-registration is based on phase correlation for sub-pixel shift estimation in the frequency domain, utilizing the Fourier shift theorem in a moving-window manner. A dense grid of spatial shift vectors can be created and automatically filtered by combining various validation and quality estimation metrics. Additionally, the software supports the masking of, e.g., clouds and cloud shadows to exclude such areas from spatial shift detection. The software has been tested on more than 9000 satellite images acquired by different sensors. The results are evaluated exemplarily for two inter-sensoral and two intra-sensoral use cases and show registration results in the sub-pixel range with root mean square errors around 0.3 pixels or better.
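    The core of the shift estimation, phase correlation via the Fourier shift theorem, can be sketched in a few lines of NumPy; AROSICS additionally refines the peak to sub-pixel accuracy, works in moving windows and filters a dense shift-vector grid, none of which is shown here.

```python
import numpy as np

def phase_correlation_shift(ref, target):
    """Integer-pixel shift (row, col) between two same-sized image windows,
    estimated from the peak of the normalized cross-power spectrum."""
    F_ref = np.fft.fft2(ref)
    F_tgt = np.fft.fft2(target)
    cross_power = F_ref * np.conj(F_tgt)
    cross_power /= np.abs(cross_power) + 1e-12      # keep phase information only
    corr = np.fft.ifft2(cross_power).real           # correlation surface
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    shift = np.array(peak, dtype=float)
    for axis, size in enumerate(corr.shape):        # wrap shifts beyond half the window
        if shift[axis] > size // 2:
            shift[axis] -= size
    return shift
```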

  20. Simultaneous live cell imaging using dual FRET sensors with a single excitation light.

    Directory of Open Access Journals (Sweden)

    Yusuke Niino

    Fluorescence resonance energy transfer (FRET) between fluorescent proteins is a powerful tool for visualization of signal transduction in living cells, and recently, some strategies for imaging of dual FRET pairs in a single cell have been reported. However, these necessitate alternating the excitation light between two different wavelengths to avoid spectral overlap, resulting in sequential detection with a lag time. Thus, to follow fast signal dynamics or signal changes in highly motile cells, a single-excitation dual-FRET method is required. Here we report such a method, using four-color imaging with a single excitation light and subsequent linear unmixing to distinguish the fluorescent proteins. We constructed new FRET sensors with Sapphire/RFP to combine with CFP/YFP, and accomplished simultaneous imaging of cAMP and cGMP in single cells. We confirmed that the signal amplitude of our dual FRET measurement is comparable to that of a conventional single FRET measurement. Finally, we demonstrated monitoring of both intracellular Ca²⁺ and cAMP in highly motile cardiac myocytes. By cancelling out artifacts caused by the movement of the cell, this method expands the applicability of the combined use of dual FRET sensors to cell samples with high motility.

  1. A 256×256 low-light-level CMOS imaging sensor with digital CDS

    Science.gov (United States)

    Zou, Mei; Chen, Nan; Zhong, Shengyou; Li, Zhengfen; Zhang, Jicun; Yao, Li-bin

    2016-10-01

    In order to achieve high sensitivity for low-light-level CMOS image sensors (CIS), a capacitive transimpedance amplifier (CTIA) pixel circuit with a small integration capacitor is used. As the pixel and column areas are highly constrained, it is difficult to implement analog correlated double sampling (CDS) to remove the noise in a low-light-level CIS, so digital CDS is adopted, which performs the subtraction between the reset signal and the pixel signal off-chip. The pixel reset noise and part of the column fixed-pattern noise (FPN) can be greatly reduced. A 256×256 CIS with a CTIA array and digital CDS is implemented in a 0.35 μm CMOS technology. The chip size is 7.7 mm × 6.75 mm, and the pixel size is 15 μm × 15 μm with a fill factor of 20.6%. The measured pixel noise is 24 LSB (RMS) with digital CDS under dark conditions, a 7.8× reduction compared to the image sensor without digital CDS. Running at 7 fps, this low-light-level CIS can capture recognizable images at illumination levels down to 0.1 lux.
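    The off-chip subtraction itself is simple; the sketch below shows digital CDS on two raw ADC frames, with the bit depth used for clamping chosen as an assumption rather than taken from the paper.

```python
import numpy as np

def digital_cds(reset_frame, signal_frame, bits=12):
    """Off-chip digital correlated double sampling: subtracting the reset-level
    frame removes pixel reset (kTC) noise and part of the column FPN."""
    diff = signal_frame.astype(np.int32) - reset_frame.astype(np.int32)
    return np.clip(diff, 0, (1 << bits) - 1).astype(np.uint16)
```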

  2. A Low Power Digital Accumulation Technique for Digital-Domain CMOS TDI Image Sensor.

    Science.gov (United States)

    Yu, Changwei; Nie, Kaiming; Xu, Jiangtao; Gao, Jing

    2016-09-23

    In this paper, an accumulation technique suitable for digital-domain CMOS time delay integration (TDI) image sensors is proposed to reduce power consumption without degrading the imaging rate. Because the quantization codes vary only slightly among the different pixel exposures of the same object, the pixel array is divided into two groups: one for coarse quantization of the high bits only, and the other for fine quantization of the low bits. The complete quantization codes are then composed from the results of both the coarse and the fine quantization. This equivalent operation considerably reduces the total number of bits that must be quantized. In a 0.18 µm CMOS process, two versions of 16-stage digital-domain CMOS TDI image sensor chains based on a 10-bit successive approximation register (SAR) analog-to-digital converter (ADC), with and without the proposed technique, are designed. The simulation results show that the average power consumption per slice of the two versions is 6.47 × 10⁻⁸ J/line and 7.4 × 10⁻⁸ J/line, respectively. Meanwhile, the linearity of the two versions is 99.74% and 99.99%, respectively.
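    A minimal sketch of composing the complete code from the two pixel groups is shown below; the 5-bit split between coarse and fine parts is an assumed example, not the partition used in the paper.

```python
def compose_tdi_code(coarse_code, fine_code, low_bits=5):
    """Combine a coarse (high-bit) conversion with a fine (low-bit) conversion
    into one complete quantization code."""
    return (coarse_code << low_bits) | (fine_code & ((1 << low_bits) - 1))

# e.g. coarse result 0b01101 for the high bits and fine result 0b10011 for the low bits
full_code = compose_tdi_code(0b01101, 0b10011, low_bits=5)
```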

  3. Imaging Voltage in Genetically Defined Neuronal Subpopulations with a Cre Recombinase-Targeted Hybrid Voltage Sensor.

    Science.gov (United States)

    Bayguinov, Peter O; Ma, Yihe; Gao, Yu; Zhao, Xinyu; Jackson, Meyer B

    2017-09-20

    Genetically encoded voltage indicators create an opportunity to monitor electrical activity in defined sets of neurons as they participate in the complex patterns of coordinated electrical activity that underlie nervous system function. Taking full advantage of genetically encoded voltage indicators requires a generalized strategy for targeting the probe to genetically defined populations of cells. To this end, we have generated a mouse line with an optimized hybrid voltage sensor (hVOS) probe within a locus designed for efficient Cre recombinase-dependent expression. Crossing this mouse with Cre drivers generated double transgenics expressing hVOS probe in GABAergic, parvalbumin, and calretinin interneurons, as well as hilar mossy cells, new adult-born neurons, and recently active neurons. In each case, imaging in brain slices from male or female animals revealed electrically evoked optical signals from multiple individual neurons in single trials. These imaging experiments revealed action potentials, dynamic aspects of dendritic integration, and trial-to-trial fluctuations in response latency. The rapid time response of hVOS imaging revealed action potentials with high temporal fidelity, and enabled accurate measurements of spike half-widths characteristic of each cell type. Simultaneous recording of rapid voltage changes in multiple neurons with a common genetic signature offers a powerful approach to the study of neural circuit function and the investigation of how neural networks encode, process, and store information. SIGNIFICANCE STATEMENT Genetically encoded voltage indicators hold great promise in the study of neural circuitry, but realizing their full potential depends on targeting the sensor to distinct cell types. Here we present a new mouse line that expresses a hybrid optical voltage sensor under the control of Cre recombinase. Crossing this line with Cre drivers generated double-transgenic mice, which express this sensor in targeted cell types. In

  4. A generalized logarithmic image processing model based on the gigavision sensor model.

    Science.gov (United States)

    Deng, Guang

    2012-03-01

    The logarithmic image processing (LIP) model is a mathematical theory providing generalized linear operations for image processing. The gigavision sensor (GVS) is a new imaging device that can be described by a statistical model. In this paper, by studying these two seemingly unrelated models, we develop a generalized LIP (GLIP) model. With the LIP model being its special case, the GLIP model not only provides new insights into the LIP model but also defines new image representations and operations for solving general image processing problems that are not necessarily related to the GVS. A new parametric LIP model is also developed. To illustrate the application of the new scalar multiplication operation, we propose an energy-preserving algorithm for tone mapping, which is a necessary step in image dehazing. By comparing with results using two state-of-the-art algorithms, we show that the new scalar multiplication operation is an effective tool for tone mapping.
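    For readers unfamiliar with LIP arithmetic, the classical operations are easy to state in code; the sketch below uses the standard LIP addition and scalar multiplication on an assumed gray-tone range M, and does not reproduce the generalized (GLIP) operations or the GVS-derived model introduced in the paper.

```python
import numpy as np

M = 256.0  # assumed gray-tone range for 8-bit images

def lip_add(f, g):
    """Classical LIP addition: f (+) g = f + g - f*g/M."""
    return f + g - f * g / M

def lip_scalar_mul(lam, f):
    """Classical LIP scalar multiplication: lam (x) f = M - M*(1 - f/M)**lam."""
    return M - M * np.power(1.0 - np.asarray(f, dtype=float) / M, lam)

tones = np.linspace(0.0, 250.0, 6)
curve = lip_scalar_mul(0.5, tones)   # lam != 1 applies a LIP tone-mapping curve
```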

  5. A 75-ps Gated CMOS Image Sensor with Low Parasitic Light Sensitivity.

    Science.gov (United States)

    Zhang, Fan; Niu, Hanben

    2016-06-29

    In this study, a 40 × 48 pixel global shutter complementary metal-oxide-semiconductor (CMOS) image sensor with an adjustable shutter time as low as 75 ps was implemented using a 0.5-μm mixed-signal CMOS process. The implementation consisted of a continuous contact ring around each p+/n-well photodiode in the pixel array in order to apply sufficient light shielding. The parasitic light sensitivity of the in-pixel storage node was measured to be 1/8.5 × 10⁷ when illuminated by a 405-nm diode laser and 1/1.4 × 10⁴ when illuminated by a 650-nm diode laser. The pixel pitch was 24 μm, the size of the square p+/n-well photodiode in each pixel was 7 μm per side, the measured random readout noise was 217 e⁻ rms, and the measured dynamic range of the pixel of the designed chip was 5500:1. The type of gated CMOS image sensor (CIS) proposed here can be used in ultra-fast framing cameras to observe non-repeatable fast-evolving phenomena.

  6. Design and Performance of a Pinned Photodiode CMOS Image Sensor Using Reverse Substrate Bias.

    Science.gov (United States)

    Stefanov, Konstantin D; Clarke, Andrew S; Ivory, James; Holland, Andrew D

    2018-01-03

    A new pinned photodiode (PPD) CMOS image sensor with a reverse-biased p-type substrate has been developed and characterized. The sensor uses traditional PPDs with one additional deep implantation step to suppress the parasitic reverse currents, and can be fully depleted. The first prototypes have been manufactured on 18 µm thick, 1000 Ω·cm epitaxial silicon wafers using a 180 nm PPD image sensor process. Both front-side illuminated (FSI) and back-side illuminated (BSI) devices were manufactured in collaboration with Teledyne e2v. The characterization results from a number of arrays of 10 µm and 5.4 µm PPD pixels, with different shapes, sizes and depths of the new implant, are in good agreement with device simulations. The new pixels could be reverse-biased without parasitic leakage currents well beyond full depletion, and demonstrate nearly identical optical response to the reference non-modified pixels. The observed excessive charge sharing in some pixel variants is shown not to be a limiting factor in operation. This development promises to realize monolithic PPD CIS with a large depleted thickness and correspondingly high quantum efficiency at near-infrared and soft X-ray wavelengths.

  7. Imaging properties of small-pixel spectroscopic x-ray detectors based on cadmium telluride sensors

    International Nuclear Information System (INIS)

    Koenig, Thomas; Schulze, Julia; Zuber, Marcus; Rink, Kristian; Oelfke, Uwe; Butzer, Jochen; Hamann, Elias; Cecilia, Angelica; Zwerger, Andreas; Fauler, Alex; Fiederle, Michael

    2012-01-01

    Spectroscopic x-ray imaging by means of photon counting detectors has received growing interest in recent years. Critical to the image quality of such devices are their pixel pitch and the sensor material employed. This paper describes the imaging properties of Medipix2 MXR multi-chip assemblies bump-bonded to 1 mm thick CdTe sensors. Two systems were investigated, with pixel pitches of 110 and 165 μm, which are of the order of the mean free path lengths of the characteristic x-rays produced in their sensors. Peak widths were found to be almost constant across the energy range of 10 to 60 keV, with values of 2.3 and 2.2 keV (FWHM) for the two pixel pitches. The average numbers of pixels responding to a single incoming photon are about 1.85 and 1.45 at 60 keV, amounting to detective quantum efficiencies of 0.77 and 0.84 at a spatial frequency of zero. Energy-selective CT acquisitions are presented, and the two pixel pitches' abilities to discriminate between iodine and gadolinium contrast agents are examined. It is shown that the choice of the pixel pitch translates into a minimum contrast agent concentration for which material discrimination is still possible. We finally investigate saturation effects at high x-ray fluxes and conclude with the finding that higher maximum count rates come at the cost of a reduced energy resolution.

  8. Smart sensor systems for outdoor intrusion detection

    International Nuclear Information System (INIS)

    Lynn, J.K.

    1988-01-01

    A major improvement in outdoor perimeter security system probability of detection (PD) and a reduction in false alarm rate (FAR) and nuisance alarm rate (NAR) may be obtained by analyzing the indications immediately preceding an event which might be interpreted as an intrusion. Existing systems go into alarm after crossing a threshold. Very slow changes, which accumulate until the threshold is reached, may be falsely assessed as an intrusion. A hierarchical program has begun at Stellar to develop a modular, expandable Smart Sensor system which may be interfaced to most types of sensors and alarm reporting systems. A major upgrade to the SSI Test Site is in progress so that intrusions may be simulated in a controlled and repeatable manner. A test platform is being constructed which will operate in conjunction with a mobile instrumentation center with CCTV, lighting control, weather and data monitoring, and remote control of the test platform and intrusion simulators. Additional testing was contracted with an independent test facility to assess the effects of severe winter weather conditions

  9. WE-AB-BRA-11: Improved Imaging of Permanent Prostate Brachytherapy Seed Implants by Combining an Endorectal X-Ray Sensor with a CT Scanner

    International Nuclear Information System (INIS)

    Steiner, J; Matthews, K; Jia, G

    2016-01-01

    Purpose: To test the feasibility of using a digital endorectal x-ray sensor for improved image resolution of permanent brachytherapy seed implants compared to conventional CT. Methods: Two phantoms simulating the male pelvic region were used to test the capabilities of a digital endorectal x-ray sensor for imaging permanent brachytherapy seed implants. Phantom 1 was constructed from acrylic plastic with cavities milled in the locations of the prostate and the rectum. The prostate cavity was filled with a Styrofoam plug implanted with 10 training seeds. Phantom 2 was constructed from tissue-equivalent gelatins and contained a prostate phantom implanted with 18 strands of training seeds. For both phantoms, an intraoral digital dental x-ray sensor was placed in the rectum within 2 cm of the seed implants. Scout scans were taken of the phantoms over a limited arc angle using a CT scanner (80 kV, 120–200 mA). The dental sensor was then removed from the phantoms, and normal helical CT and scout (0 degree) scans using typical parameters for pelvic CT (120 kV, auto-mA) were collected. A shift-and-add tomosynthesis algorithm was developed to localize the seed plane location normal to the detector face. Results: The endorectal sensor produced images with improved resolution compared to CT scans. Seed clusters and individual seed geometry were more discernible using the endorectal sensor. Seed 3D locations, including seeds that were not located in every projection image, were discernible using the shift-and-add algorithm. Conclusion: This work shows that digital endorectal x-ray sensors are a feasible method for improving imaging of permanent brachytherapy seed implants. Future work will consist of optimizing the tomosynthesis technique to produce higher-resolution, lower-dose images of 1) permanent brachytherapy seed implants for post-implant dosimetry and 2) fine anatomic details for imaging and managing prostatic disease compared to CT images. Funding: LSU Faculty Start-up Funding
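    As a rough illustration of the reconstruction step (not the authors' implementation), the sketch below shift-and-adds a set of limited-arc projections for one candidate seed plane; the per-view shifts would come from the acquisition geometry, which is not modelled here.

```python
import numpy as np

def shift_and_add(projections, shifts_px):
    """Average the projections after shifting each by its (row, col) offset for
    the chosen plane: seeds in that plane add coherently, everything else blurs."""
    plane = np.zeros_like(projections[0], dtype=float)
    for proj, (d_row, d_col) in zip(projections, shifts_px):
        plane += np.roll(np.roll(proj, int(d_row), axis=0), int(d_col), axis=1)
    return plane / len(projections)
```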

  10. WE-AB-BRA-11: Improved Imaging of Permanent Prostate Brachytherapy Seed Implants by Combining an Endorectal X-Ray Sensor with a CT Scanner

    Energy Technology Data Exchange (ETDEWEB)

    Steiner, J; Matthews, K; Jia, G [Louisiana State University, Baton Rouge, LA (United States)

    2016-06-15

    Purpose: To test the feasibility of using a digital endorectal x-ray sensor for improved image resolution of permanent brachytherapy seed implants compared to conventional CT. Methods: Two phantoms simulating the male pelvic region were used to test the capabilities of a digital endorectal x-ray sensor for imaging permanent brachytherapy seed implants. Phantom 1 was constructed from acrylic plastic with cavities milled in the locations of the prostate and the rectum. The prostate cavity was filled with a Styrofoam plug implanted with 10 training seeds. Phantom 2 was constructed from tissue-equivalent gelatins and contained a prostate phantom implanted with 18 strands of training seeds. For both phantoms, an intraoral digital dental x-ray sensor was placed in the rectum within 2 cm of the seed implants. Scout scans were taken of the phantoms over a limited arc angle using a CT scanner (80 kV, 120–200 mA). The dental sensor was then removed from the phantoms, and normal helical CT and scout (0 degree) scans using typical parameters for pelvic CT (120 kV, auto-mA) were collected. A shift-and-add tomosynthesis algorithm was developed to localize the seed plane location normal to the detector face. Results: The endorectal sensor produced images with improved resolution compared to CT scans. Seed clusters and individual seed geometry were more discernible using the endorectal sensor. Seed 3D locations, including seeds that were not located in every projection image, were discernible using the shift-and-add algorithm. Conclusion: This work shows that digital endorectal x-ray sensors are a feasible method for improving imaging of permanent brachytherapy seed implants. Future work will consist of optimizing the tomosynthesis technique to produce higher-resolution, lower-dose images of 1) permanent brachytherapy seed implants for post-implant dosimetry and 2) fine anatomic details for imaging and managing prostatic disease compared to CT images. Funding: LSU Faculty Start-up Funding

  11. NRT Lightning Imaging Sensor (LIS) on International Space Station (ISS) Provisional Science Data Vp0

    Data.gov (United States)

    National Aeronautics and Space Administration — The International Space Station (ISS) Lightning Imaging Sensor (LIS) datasets were collected by the LIS instrument on the ISS used to detect the distribution and...

  12. Analysis of the Socioeconomic and Environmental Impacts of Irrigated Agriculture in the Irrigated Perimeter of Pau dos Ferros (RN)

    Directory of Open Access Journals (Sweden)

    José Jobson Garcia de Almeida

    2014-07-01

    The Brazilian Government implemented irrigated perimeters to ameliorate problems of drought and poverty in the Northeast. In this sense, the objective of this work was to analyze the social, economic and environmental impacts generated by the practice of irrigated agriculture in the municipality of Pau dos Ferros-RN. Data were obtained from references on the topic, on-site visits and interviews with producers in the perimeter. Negative impacts were observed in the area, such as water waste, contamination and salinisation, soil compaction and erosion, deforestation caused by the removal of native vegetation, high energy consumption and public health problems.

  13. High capacity fiber optic sensor networks using hybrid multiplexing techniques and their applications

    Science.gov (United States)

    Sun, Qizhen; Li, Xiaolei; Zhang, Manliang; Liu, Qi; Liu, Hai; Liu, Deming

    2013-12-01

    Fiber optic sensor networks are the development trend of fiber sensor technologies and industries. In this paper, I discuss recent research progress on high-capacity fiber sensor networks with hybrid multiplexing techniques and their applications in the fields of security monitoring, environment monitoring, Smart eHome, etc. First, I present the architecture of the hybrid multiplexing sensor passive optical network (HSPON), and the key technologies for integrated access and intelligent management of massive numbers of fiber sensor units. Two typical hybrid WDM/TDM fiber sensor networks for perimeter intrusion monitoring and cultural relics security are introduced. Second, we propose the concept of the "Microstructure-Optical X Domain Reflector (M-OXDR)" for fiber sensor network expansion. By fabricating smart micro-structures with multidimensional encoding capability and low insertion loss along the fiber, a fiber sensor network of simple structure and a capacity of more than one thousand nodes could be achieved. Assisted by WDM/TDM and WDM/FDM decoding methods respectively, we built verification systems for long-haul and real-time temperature sensing. Finally, I show a high-capacity and flexible fiber sensor network with IPv6-protocol-based hybrid fiber/wireless access. By developing fiber optic sensors with an embedded IPv6 protocol conversion module and an IPv6 router, huge numbers of fiber optic sensor nodes can be uniquely addressed. Meanwhile, various kinds of sensing information can be integrated and accessed through the Next Generation Internet.

  14. Quantum dots in imaging, drug delivery and sensor applications.

    Science.gov (United States)

    Matea, Cristian T; Mocan, Teodora; Tabaran, Flaviu; Pop, Teodora; Mosteanu, Ofelia; Puia, Cosmin; Iancu, Cornel; Mocan, Lucian

    2017-01-01

    Quantum dots (QDs), also known as nanoscale semiconductor crystals, are nanoparticles with unique optical and electronic properties such as bright and intense fluorescence. Since most conventional organic label dyes do not offer the possibility of near-infrared (>650 nm) emission, QDs, with their tunable optical properties, have gained a lot of interest. They possess characteristics such as good chemical and photo-stability, high quantum yield and size-tunable light emission. Different types of QDs can be excited with the same light wavelength, and their narrow emission bands can be detected simultaneously for multiple assays. There is an increasing interest in the development of nano-theranostic platforms for simultaneous sensing, imaging and therapy. QDs have great potential for such applications, with notable results already published in the fields of sensors, drug delivery and biomedical imaging. This review summarizes the latest developments available in the literature regarding the use of QDs for medical applications.

  15. Comparison of Three Non-Imaging Angle-Diversity Receivers as Input Sensors of Nodes for Indoor Infrared Wireless Sensor Networks: Theory and Simulation

    Directory of Open Access Journals (Sweden)

    Beatriz R. Mendoza

    2016-07-01

    Full Text Available In general, the use of angle-diversity receivers makes it possible to reduce the impact of ambient light noise, path loss and multipath distortion, in part by exploiting the fact that they often receive the desired signal from different directions. Angle-diversity detection can be performed using a composite receiver with multiple detector elements looking in different directions. These are called non-imaging angle-diversity receivers. In this paper, a comparison of three non-imaging angle-diversity receivers as input sensors of nodes for an indoor infrared (IR) wireless sensor network is presented. The receivers considered are the conventional angle-diversity receiver (CDR), the sectored angle-diversity receiver (SDR), and the self-orienting receiver (SOR), which have been proposed or studied by research groups in Spain. To this end, the effective signal-collection area of the three receivers is modelled and a Monte-Carlo-based ray-tracing algorithm is implemented which allows us to investigate the effect on the signal-to-noise ratio and main IR channel parameters, such as path loss and rms delay spread, of using the three receivers in conjunction with different combination techniques in IR links operating at low bit rates. Based on the results of the simulations, we show that the use of a conventional angle-diversity receiver in conjunction with the equal-gain combining technique provides the solution with the best signal-to-noise ratio, the lowest computational capacity and the lowest transmitted power requirements, which comprise the main limitations for sensor nodes in an indoor infrared wireless sensor network.
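
    As a rough illustration of the equal-gain combining step highlighted above (not the authors' ray-tracing simulator), the branch signals of an angle-diversity receiver are summed with unit weights and the combined electrical SNR evaluated; the independent additive Gaussian noise model and the example numbers are assumptions.

        import numpy as np

        def equal_gain_combine(branch_signals):
            # Equal-gain combining: sum all detector-branch signals with unit weight.
            return np.sum(branch_signals, axis=0)

        def egc_snr_db(signal_amplitudes, noise_variances):
            # With independent noise per branch, signal amplitudes add coherently
            # while noise powers add, giving SNR = (sum A_i)^2 / (sum sigma_i^2).
            s = np.sum(signal_amplitudes) ** 2
            n = np.sum(noise_variances)
            return 10.0 * np.log10(s / n)

        # Illustrative three-branch receiver (arbitrary units)
        print(egc_snr_db([1.0, 0.6, 0.3], [0.01, 0.01, 0.01]))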

  16. Automatic extraction of irrigated areas in the Gharb region using object-based image analysis of Landsat 8 images

    Directory of Open Access Journals (Sweden)

    B. E. LAMHAMEDI

    2017-05-01

    Full Text Available Morocco has made enormous efforts in the field of irrigation. However, the lack of information regarding the precise location and delimitation of irrigated areas increasingly hampers the monitoring and control of farmers' practices. The main objective of this study is to evaluate the contribution of the object-based image analysis approach for the automatic extraction of irrigated areas, especially those outside the official irrigated perimeters. We worked on two Landsat 8 images of the Gharb region. Three classification methods were carried out and evaluated according to the object-based approach. For the first two methods, we created and implemented a series of membership rules that aim, through various attributes (NDVI and temperature), to discriminate the objects of interest. For the third method, we used the nearest neighbor method to establish a land cover map of the study area. The results of this study are very promising. The irrigated areas were successfully extracted with an accuracy of 88.5%. The superposition of the extracted areas with the boundary of the official irrigated perimeter made it possible to detect and locate irrigated areas outside this perimeter in the Gharb region. This will contribute to better monitoring and control of the irrigation methods and the water consumption of the farmers in this region.
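
    A simplified sketch of the kind of NDVI/temperature membership rule used to flag irrigated objects, assuming Landsat 8 red and near-infrared reflectance plus a land-surface temperature layer; the thresholds are placeholders, not the rule set calibrated in the study.

        import numpy as np

        def ndvi(nir, red):
            # Normalized Difference Vegetation Index from NIR and red reflectance.
            return (nir - red) / (nir + red + 1e-9)

        def irrigated_mask(nir, red, lst_kelvin, ndvi_min=0.4, lst_max=305.0):
            # Illustrative membership rule: dense vegetation (high NDVI) that stays
            # relatively cool (low surface temperature) in the dry season is labelled
            # as irrigated.
            return (ndvi(nir, red) >= ndvi_min) & (lst_kelvin <= lst_max)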

  17. Transition-edge sensor imaging arrays for astrophysics applications

    Science.gov (United States)

    Burney, Jennifer Anne

    Many interesting objects in our universe currently elude observation in the optical band: they are too faint or they vary rapidly and thus any structure in their radiation is lost over the period of an exposure. Conventional photon detectors cannot simultaneously provide energy resolution and time-stamping of individual photons at fast rates. Superconducting detectors have recently made the possibility of simultaneous photon counting, imaging, and energy resolution a reality. Our research group has pioneered the use of one such detector, the Transition-Edge Sensor (TES). TES physics is simple and elegant. A thin superconducting film, biased at its critical temperature, can act as a particle detector: an incident particle deposits energy and drives the film into its superconducting-normal transition. By inductively coupling the detector to a SQUID amplifier circuit, this resistance change can be read out as a current pulse, and its energy deduced by integrating over the pulse. TESs can be used to accurately time-stamp (to 0.1 μs) and energy-resolve (0.15 eV at 1.6 eV) near-IR/visible/near-UV photons at rates of 30 kHz. The first astronomical observations using fiber-coupled detectors were made at the Stanford Student Observatory 0.6 m telescope in 1999. Further observations of the Crab Pulsar from the 107" telescope at the University of Texas McDonald Observatory showed rapid phase variations over the near-IR/visible/near-UV band. These preliminary observations provided a glimpse into a new realm of observations of pulsars, binary systems, and accreting black holes promised by TES arrays. This thesis describes the development, characterization, and preliminary use of the first camera system based on Transition-Edge Sensors. While single-device operation is relatively well-understood, the operation of a full imaging array poses significant challenges. This thesis addresses all aspects related to the creation and characterization of this cryogenic imaging

  18. Advanced microlens and color filter process technology for the high-efficiency CMOS and CCD image sensors

    Science.gov (United States)

    Fan, Yang-Tung; Peng, Chiou-Shian; Chu, Cheng-Yu

    2000-12-01

    New markets are emerging for digital electronic image devices, especially in visual communications, PC cameras, mobile/cell phones, security systems, toys, vehicle imaging systems and computer peripherals for document capture. A one-chip image system, in which the image sensor has a full digital interface, can bring image capture devices into our daily lives. Adding a color filter to such an image sensor, in a pattern of mosaic pixels or wide stripes, can make images more real and colorful; one can say that the color filter makes life more colorful. A color filter passes only the light whose wavelength and transmittance match the filter itself, blocking the rest of the image light source. The color filter process consists of coating and patterning green, red and blue (or cyan, magenta and yellow) mosaic resists onto the matched pixels of the image sensing array. According to the signal caught by each pixel, the image of the environment can be reconstructed. The wide use of digital electronic cameras and multimedia applications today makes the color filter feature increasingly important. Although challenging, the color filter process is well worth developing, providing shorter cycle times, excellent color quality, and high and stable yield. The key issues of the advanced color process that have to be solved and implemented are planarization and micro-lens technology. Further key points of color filter process technology that have to be considered are also described in this paper.

  19. Using polynomials to simplify fixed pattern noise and photometric correction of logarithmic CMOS image sensors.

    Science.gov (United States)

    Li, Jing; Mahmoodi, Alireza; Joseph, Dileepan

    2015-10-16

    An important class of complementary metal-oxide-semiconductor (CMOS) image sensors is that in which pixel responses are monotonic nonlinear functions of light stimuli. This class includes various logarithmic architectures, which are easily capable of wide dynamic range imaging, at video rates, but which are vulnerable to image quality issues. To minimize fixed pattern noise (FPN) and maximize photometric accuracy, pixel responses must be calibrated and corrected for the mismatch and process variation introduced during fabrication. Unlike literature approaches, which employ circuit-based models of varying complexity, this paper introduces a novel approach based on low-degree polynomials. Although each pixel may have a highly nonlinear response, an approximately-linear FPN calibration is possible by exploiting the monotonic nature of imaging. Moreover, FPN correction requires only arithmetic, and an optimal fixed-point implementation is readily derived, subject to a user-specified number of bits per pixel. Using a monotonic spline, involving cubic polynomials, photometric calibration is also possible without a circuit-based model, and fixed-point photometric correction requires only a look-up table. The approach is experimentally validated with a logarithmic CMOS image sensor and is compared to a leading approach from the literature. The novel approach proves effective and efficient.
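
    A minimal sketch of a low-degree polynomial FPN correction in the spirit of the approach above: per-pixel polynomial coefficients are fitted offline from flat-field frames and then applied with arithmetic only. The calibration data, degree and fitting routine are assumptions for illustration, not the paper's exact procedure.

        import numpy as np

        def fit_fpn_polynomials(raw_frames, reference, degree=2):
            # raw_frames: (n_levels, H, W) flat-field responses at several light levels
            # reference:  (n_levels,) target response, e.g. the per-level array mean
            n, h, w = raw_frames.shape
            x = raw_frames.reshape(n, h * w)
            coeffs = np.empty((h * w, degree + 1))
            for p in range(h * w):          # per-pixel fit (slow but clear)
                coeffs[p] = np.polyfit(x[:, p], reference, degree)
            return coeffs.reshape(h, w, degree + 1)

        def correct_frame(frame, coeffs):
            # Evaluate each pixel's polynomial on its raw response (Horner scheme),
            # mapping all pixels onto the common reference response.
            out = np.zeros(frame.shape, dtype=float)
            for d in range(coeffs.shape[2]):
                out = out * frame + coeffs[:, :, d]
            return out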

  20. Fast responsive fluorescence turn-on sensor for Cu2+ and its application in live cell imaging

    International Nuclear Information System (INIS)

    Wang Jiaoliang; Li Hao; Long Liping; Xiao Guqing; Xie Dan

    2012-01-01

    A new effective fluorescent sensor based on rhodamine was synthesized, which was induced by Cu2+ in aqueous media to produce turn-on fluorescence. The new sensor 1 exhibited good selectivity for Cu2+ over other heavy and transition metal (HTM) ions in H2O/CH3CN (7:3, v/v). Upon addition of Cu2+, a remarkable color change from colorless to pink was easily observed by the naked eye, and the dramatic fluorescence turn-on was corroborated. Furthermore, kinetic assay indicates that sensor 1 could be used for real-time tracking of Cu2+ in cells and organisms. In addition, the turn-on fluorescent change upon the addition of Cu2+ was also applied in bioimaging. - Highlights: ► A new effective fluorescent sensor based on rhodamine was developed to detect Cu2+. ► The sensor exhibited fast response and good selectivity at physiological pH. ► The sensor was an effective intracellular Cu2+ ion imaging agent.

  1. Scintillator high-gain avalanche rushing photoconductor active-matrix flat panel imager: zero-spatial frequency x-ray imaging properties of the solid-state SHARP sensor structure.

    Science.gov (United States)

    Wronski, M; Zhao, W; Tanioka, K; Decrescenzo, G; Rowlands, J A

    2012-11-01

    The authors are investigating the feasibility of a new type of solid-state x-ray imaging sensor with programmable avalanche gain: scintillator high-gain avalanche rushing photoconductor active matrix flat panel imager (SHARP-AMFPI). The purpose of the present work is to investigate the inherent x-ray detection properties of SHARP and demonstrate its wide dynamic range through programmable gain. A distributed resistive layer (DRL) was developed to maintain stable avalanche gain operation in a solid-state HARP. The signal and noise properties of the HARP-DRL for optical photon detection were investigated as a function of avalanche gain both theoretically and experimentally, and the results were compared with HARP tube (with electron beam readout) used in previous investigations of zero spatial frequency performance of SHARP. For this new investigation, a solid-state SHARP x-ray image sensor was formed by direct optical coupling of the HARP-DRL with a structured cesium iodide (CsI) scintillator. The x-ray sensitivity of this sensor was measured as a function of avalanche gain and the results were compared with the sensitivity of HARP-DRL measured optically. The dynamic range of HARP-DRL with variable avalanche gain was investigated for the entire exposure range encountered in radiography/fluoroscopy (R/F) applications. The signal from HARP-DRL as a function of electric field showed stable avalanche gain, and the noise associated with the avalanche process agrees well with theory and previous measurements from a HARP tube. This result indicates that when coupled with CsI for x-ray detection, the additional noise associated with avalanche gain in HARP-DRL is negligible. The x-ray sensitivity measurements using the SHARP sensor produced identical avalanche gain dependence on electric field as the optical measurements with HARP-DRL. Adjusting the avalanche multiplication gain in HARP-DRL enabled a very wide dynamic range which encompassed all clinically relevant

  2. Scintillator high-gain avalanche rushing photoconductor active-matrix flat panel imager: Zero-spatial frequency x-ray imaging properties of the solid-state SHARP sensor structure

    International Nuclear Information System (INIS)

    Wronski, M.; Zhao, W.; Tanioka, K.; DeCrescenzo, G.; Rowlands, J. A.

    2012-01-01

    Purpose: The authors are investigating the feasibility of a new type of solid-state x-ray imaging sensor with programmable avalanche gain: scintillator high-gain avalanche rushing photoconductor active matrix flat panel imager (SHARP-AMFPI). The purpose of the present work is to investigate the inherent x-ray detection properties of SHARP and demonstrate its wide dynamic range through programmable gain. Methods: A distributed resistive layer (DRL) was developed to maintain stable avalanche gain operation in a solid-state HARP. The signal and noise properties of the HARP-DRL for optical photon detection were investigated as a function of avalanche gain both theoretically and experimentally, and the results were compared with HARP tube (with electron beam readout) used in previous investigations of zero spatial frequency performance of SHARP. For this new investigation, a solid-state SHARP x-ray image sensor was formed by direct optical coupling of the HARP-DRL with a structured cesium iodide (CsI) scintillator. The x-ray sensitivity of this sensor was measured as a function of avalanche gain and the results were compared with the sensitivity of HARP-DRL measured optically. The dynamic range of HARP-DRL with variable avalanche gain was investigated for the entire exposure range encountered in radiography/fluoroscopy (R/F) applications. Results: The signal from HARP-DRL as a function of electric field showed stable avalanche gain, and the noise associated with the avalanche process agrees well with theory and previous measurements from a HARP tube. This result indicates that when coupled with CsI for x-ray detection, the additional noise associated with avalanche gain in HARP-DRL is negligible. The x-ray sensitivity measurements using the SHARP sensor produced identical avalanche gain dependence on electric field as the optical measurements with HARP-DRL. Adjusting the avalanche multiplication gain in HARP-DRL enabled a very wide dynamic range which encompassed all

  3. Radiometric, geometric, and image quality assessment of ALOS AVNIR-2 and PRISM sensors

    Science.gov (United States)

    Saunier, S.; Goryl, P.; Chander, G.; Santer, R.; Bouvet, M.; Collet, B.; Mambimba, A.; Kocaman, Aksakal S.

    2010-01-01

    The Advanced Land Observing Satellite (ALOS) was launched on January 24, 2006, by a Japan Aerospace Exploration Agency (JAXA) H-IIA launcher. It carries three remote-sensing sensors: 1) the Advanced Visible and Near-Infrared Radiometer type 2 (AVNIR-2); 2) the Panchromatic Remote-Sensing Instrument for Stereo Mapping (PRISM); and 3) the Phased-Array type L-band Synthetic Aperture Radar (PALSAR). Within the framework of ALOS Data European Node, as part of the European Space Agency (ESA), the European Space Research Institute worked alongside JAXA to provide contributions to the ALOS commissioning phase plan. This paper summarizes the strategy that was adopted by ESA to define and implement a data verification plan for missions operated by external agencies; these missions are classified by the ESA as third-party missions. The ESA was supported in the design and execution of this plan by GAEL Consultant. The verification of ALOS optical data from PRISM and AVNIR-2 sensors was initiated 4 months after satellite launch, and a team of principal investigators assembled to provide technical expertise. This paper includes a description of the verification plan and summarizes the methodologies that were used for radiometric, geometric, and image quality assessment. The successful completion of the commissioning phase has led to the sensors being declared fit for operations. The consolidated measurements indicate that the radiometric calibration of the AVNIR-2 sensor is stable and agrees with the Landsat-7 Enhanced Thematic Mapper Plus and the Envisat MEdium-Resolution Imaging Spectrometer calibration. The geometrical accuracy of PRISM and AVNIR-2 products improved significantly and remains under control. The PRISM modulation transfer function is monitored for improved characterization.

  4. Range-Measuring Video Sensors

    Science.gov (United States)

    Howard, Richard T.; Briscoe, Jeri M.; Corder, Eric L.; Broderick, David

    2006-01-01

    Optoelectronic sensors of a proposed type would perform the functions of both electronic cameras and triangulation-type laser range finders. That is to say, these sensors would both (1) generate ordinary video or snapshot digital images and (2) measure the distances to selected spots in the images. These sensors would be well suited to use on robots that are required to measure distances to targets in their work spaces. In addition, these sensors could be used for all the purposes for which electronic cameras have been used heretofore. The simplest sensor of this type, illustrated schematically in the upper part of the figure, would include a laser, an electronic camera (either video or snapshot), a frame-grabber/image-capturing circuit, an image-data-storage memory circuit, and an image-data processor. There would be no moving parts. The laser would be positioned at a lateral distance d to one side of the camera and would be aimed parallel to the optical axis of the camera. When the range of a target in the field of view of the camera was required, the laser would be turned on and an image of the target would be stored and preprocessed to locate the angle (a) between the optical axis and the line of sight to the centroid of the laser spot.
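
    For the simple configuration described above (laser offset d, beam parallel to the optical axis), the range follows directly from the measured angle to the laser-spot centroid. This is a generic triangulation sketch under those assumptions, with the pixel-to-angle conversion treated as a pinhole camera model.

        import math

        def spot_angle_rad(spot_col_px, principal_col_px, focal_length_px):
            # Angle between the optical axis and the line of sight to the laser-spot
            # centroid, assuming a pinhole camera model.
            return math.atan((spot_col_px - principal_col_px) / focal_length_px)

        def triangulated_range(baseline_d_m, alpha_rad):
            # With the laser parallel to the optical axis at lateral offset d,
            # similar triangles give range R = d / tan(alpha).
            return baseline_d_m / math.tan(alpha_rad)

        # Illustrative numbers: d = 0.10 m, spot centroid 85 px off-axis, f = 1200 px
        alpha = spot_angle_rad(725.0, 640.0, 1200.0)
        print(triangulated_range(0.10, alpha))   # about 1.4 m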

  5. Handheld and mobile hyperspectral imaging sensors for wide-area standoff detection of explosives and chemical warfare agents

    Science.gov (United States)

    Gomer, Nathaniel R.; Gardner, Charles W.; Nelson, Matthew P.

    2016-05-01

    Hyperspectral imaging (HSI) is a valuable tool for the investigation and analysis of targets in complex background with a high degree of autonomy. HSI is beneficial for the detection of threat materials on environmental surfaces, where the concentration of the target of interest is often very low and is typically found within complex scenery. Two HSI techniques that have proven to be valuable are Raman and shortwave infrared (SWIR) HSI. Unfortunately, current generation HSI systems have numerous size, weight, and power (SWaP) limitations that make their potential integration onto a handheld or field portable platform difficult. The systems that are field-portable do so by sacrificing system performance, typically by providing an inefficient area search rate, requiring close proximity to the target for screening, and/or eliminating the potential to conduct real-time measurements. To address these shortcomings, ChemImage Sensor Systems (CISS) is developing a variety of wide-field hyperspectral imaging systems. Raman HSI sensors are being developed to overcome two obstacles present in standard Raman detection systems: slow area search rate (due to small laser spot sizes) and lack of eye-safety. SWIR HSI sensors have been integrated into mobile, robot based platforms and handheld variants for the detection of explosives and chemical warfare agents (CWAs). In addition, the fusion of these two technologies into a single system has shown the feasibility of using both techniques concurrently to provide higher probability of detection and lower false alarm rates. This paper will provide background on Raman and SWIR HSI, discuss the applications for these techniques, and provide an overview of novel CISS HSI sensors focused on sensor design and detection results.

  6. High-speed particle tracking in microscopy using SPAD image sensors

    Science.gov (United States)

    Gyongy, Istvan; Davies, Amy; Miguelez Crespo, Allende; Green, Andrew; Dutton, Neale A. W.; Duncan, Rory R.; Rickman, Colin; Henderson, Robert K.; Dalgarno, Paul A.

    2018-02-01

    Single photon avalanche diodes (SPADs) are used in a wide range of applications, from fluorescence lifetime imaging microscopy (FLIM) to time-of-flight (ToF) 3D imaging. SPAD arrays are becoming increasingly established, combining the unique properties of SPADs with widefield camera configurations. Traditionally, the photosensitive area (fill factor) of SPAD arrays has been limited by the in-pixel digital electronics. However, recent designs have demonstrated that by replacing the complex digital pixel logic with simple binary pixels and external frame summation, the fill factor can be increased considerably. A significant advantage of such binary SPAD arrays is the high frame rates offered by the sensors (>100kFPS), which opens up new possibilities for capturing ultra-fast temporal dynamics in, for example, life science cellular imaging. In this work we consider the use of novel binary SPAD arrays in high-speed particle tracking in microscopy. We demonstrate the tracking of fluorescent microspheres undergoing Brownian motion, and in intra-cellular vesicle dynamics, at high frame rates. We thereby show how binary SPAD arrays can offer an important advance in live cell imaging in such fields as intercellular communication, cell trafficking and cell signaling.

  7. THE IMPACT ASSESSMENT OF THE ABANDONED URANIUM MINING EXPLOITATIONS ON ROCKS AND SOILS - ZIMBRU PERIMETER, ARAD COUNTY

    Directory of Open Access Journals (Sweden)

    DIANA M. BANU

    2016-10-01

    Full Text Available Mining exploration and exploitation, and especially the exploration and exploitation of uranium mineralization, has a negative impact on the environment through alteration of the landscape and degradation of the quality of the environmental factors. The principal environmental factors that could be affected by mining operations resulting from uranium exploitation are: water, air, soil, population, fauna, and flora. The aim of this study is, first, to identify the sources of pollution (natural radionuclides of the natural radioactive series of uranium, radium, thorium and potassium, and the heavy metals that accompany the mineralizations) for two of the most important environmental factors, rocks and soils, and, second, to assess the pollution impact on those two environmental factors. In order to identify these pollutants and assess their impact, an abandoned uranium mining perimeter, the Zimbru perimeter, located in Arad County, Romania, was selected as a case study.

  8. Redox-Active Star Molecules Incorporating the 4-Benzoylpyridinium Cation - Implications for the Charge Transfer Along Branches vs. Across the Perimeter in Dendrimers

    Science.gov (United States)

    Leventis, Nicholas; Yang, Jinua; Fabrizio, Even F.; Rawashdeh, Abdel-Monem M.; Oh, Woon Su; Sotiriou-Leventis, Chariklia

    2004-01-01

    Dendrimers are self-repeating globular branched star molecules, whose fractal structure continues to fascinate, challenge, and inspire. Functional dendrimers may incorporate redox centers, and potential applications include antennae molecules for light harvesting, sensors, mediators, and artificial biomolecules. We report the synthesis and redox properties of four star systems incorporating the 4-benzoyl-N-alkylpyridinium cation; the redox potential varies along the branches but remains constant at fixed radii. Bulk electrolysis shows that at a semi-infinite time scale all redox centers are electrochemically accessible. However, voltammetric analysis (cyclic voltammetry and differential pulse voltammetry) shows that only two of the three redox-active centers in the perimeter are electrochemically accessible during potential sweeps as slow as 20 mV/s and as fast as 10 V/s. On the contrary, both redox centers along branches are accessible electrochemically within the same time frame. These results are explained in terms of slow through-space charge transfer and the globular 3-D folding of the molecules and are discussed in terms of their implications for the design of efficient redox functional dendrimers.

  9. FDTD-based optical simulations methodology for CMOS image sensors pixels architecture and process optimization

    Science.gov (United States)

    Hirigoyen, Flavien; Crocherie, Axel; Vaillant, Jérôme M.; Cazaux, Yvon

    2008-02-01

    This paper presents a new FDTD-based optical simulation model dedicated to describing the optical performance of CMOS image sensors, taking into account diffraction effects. Following market trends and industrialization constraints, CMOS image sensors must be easily embedded into ever smaller packages, which are now equipped with auto-focus and, in the near term, zoom systems. Due to miniaturization, the ray-tracing models used to evaluate pixel optical performance are no longer accurate enough to describe the light propagation inside the sensor, because of diffraction effects. Thus we adopt a more fundamental description to take these diffraction effects into account: we chose to use modeling based on Maxwell's equations to compute the propagation of light, and to use software with an FDTD-based (Finite Difference Time Domain) engine to solve this propagation. We present in this article the complete methodology of this modeling: on one hand, incoherent plane waves are propagated to approximate a product-use diffuse-like source; on the other hand, we use periodic conditions to limit the size of the simulated model and both memory and computation time. After having presented the correlation of the model with measurements, we will illustrate its use in the case of the optimization of a 1.75 μm pixel.

  10. The influence of the oblique incident X-ray that affected the image quality of the X-ray CCD sensor

    International Nuclear Information System (INIS)

    Suzuki, Yosuke; Matsumoto, Nobue; Morita, Hiroshi; Ohkawa, Hiromitsu

    1998-01-01

    The influence of oblique incident X-rays on the image quality of an X-ray CCD sensor was examined and its correction was investigated. A CDR system was used in this study, and image quality was evaluated by measuring the MTF. Oblique projection was clinically permissible up to an oblique incident angle of about 40 degrees, although it exerts an influence on the magnifying power and density. Estimation of the oblique entrance direction and the oblique incident angle was made possible by developing an oblique incidence correction marker. When an oblique incident angle of θ degrees is measured, a correction is possible by compressing the image by a factor of cos(θ) perpendicular to the rotational axis of the CCD sensor. There was only a small decline in MTF in the image corrected for the influence of oblique incidence. Observation of the digital subtraction of the corrected oblique-projection image and a normal-incidence image showed close resemblance between the two, indicating that this correction method is reasonable. (author)
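
    A small sketch of the correction described above: the image is compressed by a factor of cos(θ) along the direction perpendicular to the rotational axis of the CCD sensor. The choice of axis and the use of SciPy's zoom as the resampler are illustrative assumptions.

        import numpy as np
        from scipy.ndimage import zoom

        def correct_oblique_incidence(image, theta_deg, axis=1):
            # Undo the geometric stretch caused by an oblique incident angle of
            # theta degrees by compressing the image by cos(theta) along `axis`
            # (assumed perpendicular to the CCD rotational axis).
            factor = float(np.cos(np.deg2rad(theta_deg)))
            scale = [1.0, 1.0]
            scale[axis] = factor
            return zoom(image, scale, order=1)   # bilinear resampling

        corrected = correct_oblique_incidence(np.random.rand(256, 256), 40.0)
        print(corrected.shape)                    # (256, 196) since cos(40 deg) ~ 0.766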

  11. Applications of the Integrated High-Performance CMOS Image Sensor to Range Finders — from Optical Triangulation to the Automotive Field

    Directory of Open Access Journals (Sweden)

    Joe-Air Jiang

    2008-03-01

    Full Text Available With their significant features, the applications of complementary metal-oxide-semiconductor (CMOS) image sensors cover a very extensive range, from industrial automation to traffic applications such as aiming systems, blind guidance, active/passive range finders, etc. In this paper, CMOS image sensor-based active and passive range finders are presented. The measurement scheme of the proposed active/passive range finders is based on a simple triangulation method. The designed range finders chiefly consist of a CMOS image sensor and some light sources such as lasers or LEDs. The implementation cost of our range finders is quite low. Image processing software to adjust the exposure time (ET) of the CMOS image sensor to enhance the performance of triangulation-based range finders was also developed. An extensive series of experiments were conducted to evaluate the performance of the designed range finders. From the experimental results, the distance measurement resolutions achieved by the active range finder and the passive range finder can be better than 0.6% and 0.25% within the measurement ranges of 1 to 8 m and 5 to 45 m, respectively. Feasibility tests on applications of the developed CMOS image sensor-based range finders to the automotive field were also conducted. The experimental results demonstrated that our range finders are well-suited for distance measurements in this field.

  12. Laser Imaging Video Camera Sees Through Fire, Fog, Smoke

    Science.gov (United States)

    2015-01-01

    Under a series of SBIR contracts with Langley Research Center, inventor Richard Billmers refined a prototype for a laser imaging camera capable of seeing through fire, fog, smoke, and other obscurants. Now, Canton, Ohio-based Laser Imaging through Obscurants (LITO) Technologies Inc. is demonstrating the technology as a perimeter security system at Glenn Research Center and planning its future use in aviation, shipping, emergency response, and other fields.

  13. Control Design and Digital Implementation of a Fast 2-Degree-of-Freedom Translational Optical Image Stabilizer for Image Sensors in Mobile Camera Phones.

    Science.gov (United States)

    Wang, Jeremy H-S; Qiu, Kang-Fu; Chao, Paul C-P

    2017-10-13

    This study presents the design, digital implementation and performance validation of a lead-lag controller for a 2-degree-of-freedom (DOF) translational optical image stabilizer (OIS) installed with a digital image sensor in mobile camera phones. Nowadays, OIS is an important feature of modern commercial mobile camera phones, which aims to mechanically reduce the image blur caused by hand shaking while shooting photos. The OIS developed in this study is able to move the imaging lens by actuating its voice coil motors (VCMs) at the required speed to the position that significantly compensates for imaging blur caused by hand shaking. The proposed compensation is made possible by first establishing the exact, nonlinear equations of motion (EOMs) for the OIS, followed by designing a simple lead-lag controller based on the established nonlinear EOMs for simple digital computation via a field-programmable gate array (FPGA) board in order to achieve fast response. Finally, experimental validation is conducted to show the favorable performance of the designed OIS; i.e., it is able to stabilize the lens holder to the desired position within 0.02 s, which is much less than previously reported times of around 0.1 s. Also, the resulting residual vibration is less than 2.2-2.5 μm, which is commensurate with the very small pixel size found in most commercial image sensors, thus significantly minimizing image blur caused by hand shaking.
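
    To make the control step concrete, here is a generic discrete lead-lag compensator of the form C(s) = K(s + z)/(s + p), discretized with the bilinear (Tustin) transform for execution at a fixed sample rate; the gain, pole/zero locations and sample time are placeholders, not the tuned FPGA design reported in the paper.

        def make_lead_lag(K, z, p, Ts):
            # Discretize C(s) = K (s + z) / (s + p) with s -> (2/Ts)(1 - q)/(1 + q)
            # and return a step function implementing the difference equation
            # u[k] = b0*e[k] + b1*e[k-1] - a1*u[k-1].
            a = 2.0 / Ts
            b0 = K * (a + z) / (a + p)
            b1 = K * (z - a) / (a + p)
            a1 = (p - a) / (a + p)
            state = {"e": 0.0, "u": 0.0}

            def step(error):
                u = b0 * error + b1 * state["e"] - a1 * state["u"]
                state["e"], state["u"] = error, u
                return u

            return step

        # Illustrative 10 kHz loop: position error in metres -> VCM command (arbitrary units)
        ctrl = make_lead_lag(K=2.0, z=200.0, p=2000.0, Ts=1e-4)
        command = ctrl(5e-6)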

  14. Spatial filtering self-velocimeter for vehicle application using a CMOS linear image sensor

    Science.gov (United States)

    He, Xin; Zhou, Jian; Nie, Xiaoming; Long, Xingwu

    2015-03-01

    The idea of using a spatial filtering velocimeter (SFV) to measure the velocity of a vehicle for an inertial navigation system is put forward. The presented SFV is based on a CMOS linear image sensor with a high-speed data rate, large pixel size, and built-in timing generator. These advantages make the image sensor suitable to measure vehicle velocity. The power spectrum of the output signal is obtained by fast Fourier transform and is corrected by a frequency spectrum correction algorithm. This velocimeter was used to measure the velocity of a conveyor belt driven by a rotary table and the measurement uncertainty is ~0.54%. Furthermore, it was also installed on a vehicle together with a laser Doppler velocimeter (LDV) to measure self-velocity. The measurement result of the designed SFV is compared with that of the LDV. It is shown that the measurement result of the SFV is coincident with that of the LDV. Therefore, the designed SFV is suitable for a vehicle self-contained inertial navigation system.
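
    A compact sketch of the frequency-to-velocity step of a spatial filtering velocimeter, assuming a signal sampled at rate fs, a spatial-filter pitch p and an optical magnification M; the three-point parabolic peak interpolation stands in for the paper's spectrum-correction algorithm, which is not reproduced here.

        import numpy as np

        def sfv_velocity(signal, fs, pitch_m, magnification):
            # Velocity from the peak of the power spectrum of the spatial filter
            # output, using the standard SFV relation v = f_peak * pitch / M
            # (stated here as an assumption about the optical geometry).
            sig = signal - np.mean(signal)
            spec = np.abs(np.fft.rfft(sig)) ** 2
            freqs = np.fft.rfftfreq(len(sig), d=1.0 / fs)
            k = int(np.argmax(spec[1:]) + 1)              # skip the DC bin
            if 1 <= k < len(spec) - 1:                    # refine with a parabola
                y0, y1, y2 = spec[k - 1], spec[k], spec[k + 1]
                delta = 0.5 * (y0 - y2) / (y0 - 2.0 * y1 + y2)
                f_peak = freqs[k] + delta * (freqs[1] - freqs[0])
            else:
                f_peak = freqs[k]
            return f_peak * pitch_m / magnification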

  15. Degradation of CMOS image sensors in deep-submicron technology due to γ-irradiation

    Science.gov (United States)

    Rao, Padmakumar R.; Wang, Xinyang; Theuwissen, Albert J. P.

    2008-09-01

    In this work, radiation-induced damage mechanisms in deep-submicron technology are resolved using finger gated-diodes (FGDs) as a radiation-sensitive tool. These are found to be simple yet efficient structures for resolving radiation-induced damage in advanced CMOS processes. The degradation of CMOS image sensors in deep-submicron technology due to γ-ray irradiation is studied by developing a model for the spectral response of the sensor and also by examining the dark-signal degradation as a function of STI (shallow-trench isolation) parameters. It is found that threshold shifts at the gate-oxide/silicon interface as well as minority carrier lifetime variations in the silicon bulk are minimal. The top-layer material properties and the photodiode Si-SiO2 interface quality are degraded due to γ-ray irradiation. The results further suggest that p-well passivated structures are indispensable for radiation-hard designs. It was found that high electric fields in submicron technologies pose a threat to high quality imaging in harsh environments.

  16. A dual pH and temperature responsive polymeric fluorescent sensor and its imaging application in living cells.

    Science.gov (United States)

    Yin, Liyan; He, Chunsheng; Huang, Chusen; Zhu, Weiping; Wang, Xin; Xu, Yufang; Qian, Xuhong

    2012-05-11

    A polymeric fluorescent sensor, PNME, consisting of A4 and N-isopropylacrylamide (NIPAM) units, was synthesized. PNME exhibited dual responses to pH and temperature, and could be used as an intracellular pH sensor for lysosome imaging. Moreover, it could also sense temperature changes in living cells at 25 and 37 °C.

  17. Exploiting the speckle-correlation scattering matrix for a compact reference-free holographic image sensor.

    Science.gov (United States)

    Lee, KyeoReh; Park, YongKeun

    2016-10-31

    The word 'holography' means a drawing that contains all of the information for light, both amplitude and wavefront. However, because of the insufficient bandwidth of current electronics, the direct measurement of the wavefront of light has not yet been achieved. Though reference-field-assisted interferometric methods have been utilized in numerous applications, introducing a reference field raises several fundamental and practical issues. Here we demonstrate a reference-free holographic image sensor. To achieve this, we propose a speckle-correlation scattering matrix approach; light-field information passing through a thin disordered layer is recorded and retrieved from a single-shot recording of speckle intensity patterns. Self-interference via diffusive scattering enables access to impinging light-field information, when light transport in the diffusive layer is precisely calibrated. As a proof-of-concept, we demonstrate direct holographic measurements of three-dimensional optical fields using a compact device consisting of a regular image sensor and a diffusor.

  18. A Single-Transistor Active Pixel CMOS Image Sensor Architecture

    International Nuclear Information System (INIS)

    Zhang Guo-An; He Jin; Zhang Dong-Wei; Su Yan-Mei; Wang Cheng; Chen Qin; Liang Hai-Lang; Ye Yun

    2012-01-01

    A single-transistor CMOS active pixel image sensor (1T CMOS APS) architecture is proposed. By switching the photosensing pinned diode, resetting and selecting can be achieved by diode pull-up and capacitive-coupling pull-down of the source follower. Thus, the reset and select transistors can be removed. In addition, the reset and select signal lines can be shared to reduce the number of metal signal lines, leading to a very high fill factor. The pixel design and operation principles are discussed in detail. The functionality of the proposed 1T CMOS APS architecture has been experimentally verified using a chip fabricated in a standard 0.35 μm CMOS AMIS technology

  19. Surgical neuro navigator guided by preoperative magnetic resonance images, based on a magnetic position sensor

    International Nuclear Information System (INIS)

    Perini, Ana Paula; Siqueira, Rogerio Bulha; Carneiro, Antonio Adilton Oliveira; Oliveira, Lucas Ferrari de; Machado, Helio Rubens

    2009-01-01

    Image-guided neurosurgery enables the neurosurgeon to navigate inside the patient's brain during surgery, using pre-operative images as a guide and a tracking system. Following a calibration procedure, the three-dimensional position and orientation of surgical instruments can be transmitted to a computer. The spatial information is used to access a region of interest in the pre-operative images and display it to the neurosurgeon during the surgical procedure. However, when a craniotomy is involved and the lesion is removed, movements of brain tissue can be a significant source of error in these conventional navigation systems. The architecture implemented in this work is aimed at the development of a system for surgical planning and orientation guided by ultrasound images. For surgical orientation, the software developed allows the extraction of slices from the volume of magnetic resonance images (MRI) with the orientation supplied by a magnetic position sensor (Polhemus). The slices extracted with this software are important because they show the cerebral area that the neurosurgeon is observing during the surgery, and they can also be correlated with the intra-operative ultrasound images to detect and correct the deformation of brain tissue during the surgery. In addition, a tool for per-operative navigation was developed, providing three orthogonal planes through the image volume. For the software implementation, the Python programming language and the Visualization Toolkit (VTK) graphics library were used. The program used to extract slices from the MRI volume allowed the application of transformations to the volume, using coordinates supplied by the position sensor. (author)

  20. Non-Quality Controlled Lightning Imaging Sensor (LIS) on International Space Station (ISS) Science Data Vb0

    Data.gov (United States)

    National Aeronautics and Space Administration — The Non-Quality Controlled Lightning Imaging Sensor (LIS) on International Space Station (ISS) Science Data were collected by the LIS instrument on the ISS used to...

  1. New optical sensor systems for high-resolution satellite, airborne and terrestrial imaging systems

    Science.gov (United States)

    Eckardt, Andreas; Börner, Anko; Lehmann, Frank

    2007-10-01

    The department of Optical Information Systems (OS) at the Institute of Robotics and Mechatronics of the German Aerospace Center (DLR) has more than 25 years of experience with high-resolution imaging technology. Technology changes in the development of detectors, together with significant changes in manufacturing accuracy and ongoing engineering research, define the next generation of spaceborne sensor systems focusing on Earth observation and remote sensing. The combination of large TDI lines, intelligent synchronization control, fast-readable sensors and new focal-plane concepts opens the door to new remote-sensing instruments. This class of instruments enables high-resolution sensor systems with respect to geometry and radiometry and their data products, such as 3D virtual reality. Systemic approaches are essential for the design of such complex sensor systems for dedicated tasks. System theory applied to the instrument inside a simulated environment is the starting point of the optimization process for the optical, mechanical and electrical designs. Single modules and the entire system have to be calibrated and verified. Suitable procedures must be defined at component, module and system level for the assembly, test and verification process. This kind of development strategy allows hardware-in-the-loop design. The paper gives an overview of the current activities at DLR in the field of innovative sensor systems for photogrammetric and remote sensing purposes.

  2. Fingerprint enhancement using a multispectral sensor

    Science.gov (United States)

    Rowe, Robert K.; Nixon, Kristin A.

    2005-03-01

    The level of performance of a biometric fingerprint sensor is critically dependent on the quality of the fingerprint images. One of the most common types of optical fingerprint sensors relies on the phenomenon of total internal reflectance (TIR) to generate an image. Under ideal conditions, a TIR fingerprint sensor can produce high-contrast fingerprint images with excellent feature definition. However, images produced by the same sensor under conditions that include dry skin, dirt on the skin, and marginal contact between the finger and the sensor, are likely to be severely degraded. This paper discusses the use of multispectral sensing as a means to collect additional images with new information about the fingerprint that can significantly augment the system performance under both normal and adverse sample conditions. In the context of this paper, "multispectral sensing" is used to broadly denote a collection of images taken under different illumination conditions: different polarizations, different illumination/detection configurations, as well as different wavelength illumination. Results from three small studies using an early-stage prototype of the multispectral-TIR (MTIR) sensor are presented along with results from the corresponding TIR data. The first experiment produced data from 9 people, 4 fingers from each person and 3 measurements per finger under "normal" conditions. The second experiment provided results from a study performed to test the relative performance of TIR and MTIR images when taken under extreme dry and dirty conditions. The third experiment examined the case where the area of contact between the finger and sensor is greatly reduced.

  3. Use of capacitive sensors with the instantaneous profile method to determine hydraulic conductivity

    Directory of Open Access Journals (Sweden)

    Eurileny Lucas de Almeida

    Full Text Available Due to the need to monitor soil water tension continuously, the instantaneous profile method is considered laborious, requiring a lot of time, and especially manpower, to set up and maintain. The aim of this work was to evaluate the possibility of using capacitive sensors in place of tensiometers with the instantaneous profile method in an area of the Lower Acaraú Irrigated Perimeter. The experiment was carried out in a Eutrophic Red-Yellow Argisol. The sensors were installed at 15, 30, 45 and 60 cm from the surface, and powered by photovoltaic panels, using a power manager to charge the battery and to supply power at night. Records from the capacitive sensors were collected every five minutes and stored on a data acquisition board. With the simultaneous measurement of soil moisture obtained by the sensors, and the total soil water potential from the soil water retention curve, it was possible to determine the hydraulic conductivity as a function of the volumetric water content for each period using the Richards equation. At the end of the experiment, the advantage of using capacitive sensors with the instantaneous profile method was confirmed as an alternative to using tensiometers. The main advantages of using capacitive sensors were that they make the method less laborious and allow moisture readings at higher tensions in soils of a sandy texture.
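
    A schematic sketch of the instantaneous profile calculation implied above: the flux through a given depth is taken from the change in water stored above that depth, and the Darcy-Buckingham form of the Richards equation gives K(θ) as flux divided by the total-potential gradient. Array shapes, sign conventions and the finite differences are illustrative assumptions.

        import numpy as np

        def hydraulic_conductivity(theta, H, z_m, dt_s, i_depth):
            # theta: (2, n_depths) volumetric water content at two times
            # H:     (2, n_depths) total soil water potential (m) at the same times
            # z_m:   (n_depths,)   sensor depths (m, positive downward)
            # dt_s:  time between the two profiles (s)
            # Flux density q at the chosen depth: loss of storage above it per unit time
            storage = [np.trapz(theta[t, : i_depth + 1], z_m[: i_depth + 1]) for t in (0, 1)]
            q = (storage[0] - storage[1]) / dt_s          # m/s, drainage taken positive
            # Time-averaged total potential gradient dH/dz at that depth
            dHdz = np.gradient(H.mean(axis=0), z_m)[i_depth]
            return q / dHdz                               # K(theta) in m/s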

  4. Digital CDS for image sensors with dominant white and 1/f noise

    International Nuclear Information System (INIS)

    Stefanov, K.D.

    2015-01-01

    This paper investigates the performance of digital correlated double sampling (DCDS) for processing of image sensor signals in the presence of white and 1/f noise. The DCDS is compared with the dual slope integrator, which is the optimal analogue processing technique when only white noise is present. Based on the concept of matched filters, the paper derives and explores the optimal signal processing algorithms for signals with dominant 1/f noise, resulting in the highest achievable signal-to-noise ratio (SNR). Experimental results based on optimal DCDS on artificially generated 1/f noise signals are presented and discussed, together with the limitations of the method for more realistic sensor signals. It is shown that the noise level of the optimal DCDS can get close to the theoretical minimum
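
    As a plain illustration of digital CDS (not the paper's optimal matched-filter weighting), the reset and signal levels are each averaged over N samples and differenced; with white noise only, this flat-weight DCDS reduces read noise by roughly sqrt(N), and a matched-filter weight vector can replace the flat average when 1/f noise dominates. Function and variable names are assumptions.

        import numpy as np

        def dcds(reset_samples, signal_samples, weights=None):
            # Digital CDS: weighted average of the reset samples minus weighted
            # average of the signal samples. weights=None gives the flat average
            # (optimal for white noise); pass a weight vector for other noise shapes.
            reset_samples = np.asarray(reset_samples, dtype=float)
            signal_samples = np.asarray(signal_samples, dtype=float)
            n = reset_samples.shape[-1]
            if weights is None:
                weights = np.full(n, 1.0 / n)
            return reset_samples @ weights - signal_samples @ weights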

  5. Indoor and Outdoor Depth Imaging of Leaves With Time-of-Flight and Stereo Vision Sensors

    DEFF Research Database (Denmark)

    Kazmi, Wajahat; Foix, Sergi; Alenya, Guilliem

    2014-01-01

    In this article we analyze the response of Time-of-Flight (ToF) cameras (active sensors) for close range imaging under three different illumination conditions and compare the results with stereo vision (passive) sensors. ToF cameras are sensitive to ambient light and have low resolution but deliver...... poorly under sunlight. Stereo vision is comparatively more robust to ambient illumination and provides high resolution depth data but is constrained by texture of the object along with computational efficiency. Graph cut based stereo correspondence algorithm can better retrieve the shape of the leaves...

  6. Analysis of Morphological Features of Benign and Malignant Breast Cell Extracted From FNAC Microscopic Image Using the Pearsonian System of Curves.

    Science.gov (United States)

    Rajbongshi, Nijara; Bora, Kangkana; Nath, Dilip C; Das, Anup K; Mahanta, Lipi B

    2018-01-01

    Cytological changes in terms of the shape and size of nuclei are among the common morphometric features used to study breast cancer, and can be observed by careful screening of fine needle aspiration cytology (FNAC) images. This study attempts to categorize a collection of FNAC microscopic images into benign and malignant classes, based on families of probability distributions fitted to some morphometric features of cell nuclei. For this study, the features area, perimeter, eccentricity, compactness, and circularity of cell nuclei were extracted from FNAC images of both benign and malignant samples using an image processing technique. All experiments were performed on a generated FNAC image database containing 564 malignant (cancerous) and 693 benign (noncancerous) cell level images. The five extracted features were reduced to three (area, perimeter, and circularity) based on the mean statistic. Finally, the data were fitted to the generalized Pearsonian system of frequency curves, so that the resulting distribution can be used as a statistical model. The Pearsonian system is a family of distributions in which kappa (κ), the selection criterion, is computed as a function of the first four central moments. For the benign group, kappa (κ) corresponding to area, perimeter, and circularity was -0.00004, 0.0000, and 0.04155, and for the malignant group it was 1016942, 0.01464, and -0.3213, respectively. Thus, the families of distributions related to these features for the benign and malignant groups were different, and therefore the characterization of their probability curves will also be different.
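
    A small sketch of the Pearson selection criterion referred to above: the skewness and kurtosis measures beta1 and beta2 are formed from the central moments m2..m4 and combined into kappa, whose value selects the member of the Pearson family; the feature extraction itself is not reproduced.

        import numpy as np

        def pearson_kappa(x):
            # kappa = b1*(b2 + 3)^2 / (4*(4*b2 - 3*b1)*(2*b2 - 3*b1 - 6)),
            # with b1 = m3^2 / m2^3 and b2 = m4 / m2^2 from central moments.
            x = np.asarray(x, dtype=float)
            d = x - x.mean()
            m2, m3, m4 = (np.mean(d ** k) for k in (2, 3, 4))
            b1 = m3 ** 2 / m2 ** 3
            b2 = m4 / m2 ** 2
            return b1 * (b2 + 3) ** 2 / (4 * (4 * b2 - 3 * b1) * (2 * b2 - 3 * b1 - 6))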

  7. Toward High Altitude Airship Ground-Based Boresight Calibration of Hyperspectral Pushbroom Imaging Sensors

    Directory of Open Access Journals (Sweden)

    Aiwu Zhang

    2015-12-01

    Full Text Available Single-line hyperspectral pushbroom imaging from a high altitude airship (HAA) without a three-axis stabilized platform is much more complex than spaceborne or airborne pushbroom imaging. Due to the effects of air pressure, temperature and airflow, large pitch and roll angles tend to appear frequently, creating pushbroom images with severe geometric distortions. Thus, the in-flight calibration procedure is not appropriate for single linear pushbroom sensors on an HAA having no three-axis stabilized platform. In order to address this problem, a new ground-based boresight calibration method is proposed. Firstly, a coordinate transformation model is developed for direct georeferencing (DG) of the linear imaging sensor, and then the linear error equation is derived from it by using the Taylor expansion formula. Secondly, the boresight misalignments are worked out by using an iterative least squares method with a few ground control points (GCPs) and ground-based side-scanning experiments. The proposed method is demonstrated by three sets of experiments: (i) the stability and reliability of the method is verified through simulation-based experiments; (ii) the boresight calibration is performed using ground-based experiments; and (iii) the validation is done by applying it to the orthorectification of real hyperspectral pushbroom images from an HAA Earth observation payload system developed by our research team, "LanTianHao". The test results show that the proposed boresight calibration approach significantly improves the quality of georeferencing by reducing the geometric distortions caused by boresight misalignments to a minimum level.

  8. Application Development: An Interactive, Non-Technical Perspective of the Geology and Geomorphology of the Ouray Perimeter Trail, CO.

    Science.gov (United States)

    Allen, H. M.; Giardino, J. R.

    2015-12-01

    Each year people seek respite from their busy lifestyles by traveling to state or national parks, national forests or wilderness areas. The majority of these parks were established in order to help preserve our natural heritage, including wildlife, forests, and the beauty of landscapes formed by thousands of years of geologic and geomorphologic processes. While enjoying the tranquility of nature, tourists without a geologic background are nonetheless deprived of a more in-depth experience. One such location that attracts a large number of summer tourists is the Perimeter Trail in Ouray, Colorado. Located in the southwestern portion of Colorado, Ouray is situated in the beautiful San Juan Mountain range along the "Million Dollar Highway." The Perimeter Trail is a six-mile loop that circles the city of Ouray. The city is a very popular place for summertime tourism because of its unparalleled scenery. Ouray is situated in an area that is riddled with textbook angular unconformities and metasedimentary, sedimentary, and volcanic rocks. In the study area, the San Juans have been beautifully sculpted by an array of major faulting events, glacial activity and volcanism. With the understanding that technology is ever expanding, we think there is no better way to experience the Perimeter Trail than through an interactive application that is both educational and interesting. This application is a non-technical way of looking at the geology and geomorphology of the Perimeter Trail. Additionally, a paper brochure shows the most noteworthy points of interest. The brochure contains a brief geologic history of the San Juan Mountains accompanied by annotated photographs to illustrate the complex geology/geomorphology encountered on the trail. The application is based on an interactive three-dimensional map, which can be zoomed to various scales. The app hosts a locational service that uses the phone's GPS to communicate the location of the hiker

  9. A Multi-Resolution Mode CMOS Image Sensor with a Novel Two-Step Single-Slope ADC for Intelligent Surveillance Systems

    Directory of Open Access Journals (Sweden)

    Daehyeok Kim

    2017-06-01

    Full Text Available In this paper, we present a multi-resolution mode CMOS image sensor (CIS) for intelligent surveillance system (ISS) applications. A low column fixed-pattern noise (CFPN) comparator is proposed in an 8-bit two-step single-slope analog-to-digital converter (TSSS ADC) for the CIS that supports normal, 1/2, 1/4, 1/8, 1/16, 1/32, and 1/64 modes of pixel resolution. We show that the scaled-resolution images enable the CIS to reduce total power consumption while images hold steady without events. A prototype sensor of 176 × 144 pixels has been fabricated with a 0.18 μm 1-poly 4-metal CMOS process. The area of the 4-shared 4T-active pixel sensor (APS) is 4.4 μm × 4.4 μm and the total chip size is 2.35 mm × 2.35 mm. The maximum power consumption is 10 mW (with full resolution) with supply voltages of 3.3 V (analog) and 1.8 V (digital), and the frame rate is 14 frames/s.

  10. A Multi-Resolution Mode CMOS Image Sensor with a Novel Two-Step Single-Slope ADC for Intelligent Surveillance Systems.

    Science.gov (United States)

    Kim, Daehyeok; Song, Minkyu; Choe, Byeongseong; Kim, Soo Youn

    2017-06-25

    In this paper, we present a multi-resolution mode CMOS image sensor (CIS) for intelligent surveillance system (ISS) applications. A low column fixed-pattern noise (CFPN) comparator is proposed in an 8-bit two-step single-slope analog-to-digital converter (TSSS ADC) for the CIS that supports normal, 1/2, 1/4, 1/8, 1/16, 1/32, and 1/64 modes of pixel resolution. We show that the scaled-resolution images enable the CIS to reduce total power consumption while images hold steady without events. A prototype sensor of 176 × 144 pixels has been fabricated with a 0.18 μm 1-poly 4-metal CMOS process. The area of the 4-shared 4T-active pixel sensor (APS) is 4.4 μm × 4.4 μm and the total chip size is 2.35 mm × 2.35 mm. The maximum power consumption is 10 mW (with full resolution) with supply voltages of 3.3 V (analog) and 1.8 V (digital), and the frame rate is 14 frames/s.

  11. Sensor assembly method using silicon interposer with trenches for three-dimensional binocular range sensors

    Science.gov (United States)

    Nakajima, Kazuhiro; Yamamoto, Yuji; Arima, Yutaka

    2018-04-01

    To easily assemble a three-dimensional binocular range sensor, we devised an alignment method for two image sensors using a silicon interposer with trenches. The trenches were formed using deep reactive ion etching (RIE) equipment. We produced a three-dimensional (3D) range sensor using the method and experimentally confirmed that sufficient alignment accuracy was realized. It was confirmed that the alignment accuracy of the two image sensors when using the proposed method is more than twice that of the alignment assembly method on a conventional board. In addition, as a result of evaluating the deterioration of the detection performance caused by the alignment accuracy, it was confirmed that the vertical deviation between the corresponding pixels in the two image sensors is substantially proportional to the decrease in detection performance. Therefore, we confirmed that the proposed method can realize more than twice the detection performance of the conventional method. Through these evaluations, the effectiveness of the 3D binocular range sensor aligned by the silicon interposer with the trenches was confirmed.

  12. Noise Reduction Effect of Multiple-Sampling-Based Signal-Readout Circuits for Ultra-Low Noise CMOS Image Sensors

    Directory of Open Access Journals (Sweden)

    Shoji Kawahito

    2016-11-01

    Full Text Available This paper discusses the noise reduction effect of multiple-sampling-based signal readout circuits for implementing ultra-low-noise image sensors. The correlated multiple sampling (CMS) technique has recently become an important technology for high-gain column readout circuits in low-noise CMOS image sensors (CISs). This paper reveals how the column CMS circuits, together with a pixel having a high-conversion-gain charge detector and low-noise transistor, realize deep sub-electron read noise levels based on the analysis of noise components in the signal readout chain from a pixel to the column analog-to-digital converter (ADC). The noise measurement results of experimental CISs are compared with the noise analysis and the effect of noise reduction to the sampling number is discussed at the deep sub-electron level. Images taken with three CMS gains of two, 16, and 128 show a distinct advantage of image contrast for the gain of 128 (noise (median): 0.29 e−rms) when compared with the CMS gain of two (2.4 e−rms) or 16 (1.1 e−rms).
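
    The trend behind those numbers can be illustrated with a minimal simulation, assuming purely white single-sample read noise (which is why it follows 1/sqrt(M) exactly, unlike the measured values that also contain 1/f noise from the readout chain); all numbers below are illustrative only.

        import numpy as np

        rng = np.random.default_rng(0)
        signal_e = 1.0                      # true pixel signal, in electrons (illustrative)
        read_noise_e = 2.4                  # single-sample read noise (e- rms), assumed white
        n_pixels = 100_000

        for m in (2, 16, 128):              # CMS gains (number of sample pairs averaged)
            # each read-out value is the average of m noisy correlated double samples
            samples = signal_e + read_noise_e * rng.standard_normal((n_pixels, m))
            cms_out = samples.mean(axis=1)
            print(f"M={m:4d}: output noise ~ {cms_out.std():.2f} e- rms")
        # White noise falls as 1/sqrt(M); the measured 0.29 e- at M=128 falls off more
        # slowly because 1/f noise does not average down as fast.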

  13. Photoacoustic imaging of blood vessels with a double-ring sensor featuring a narrow angular aperture

    NARCIS (Netherlands)

    Kolkman, R.G.M.; Hondebrink, Erwin; Steenbergen, Wiendelt; van Leeuwen, Ton; de Mul, F.F.M.

    2004-01-01

    A photoacoustic double-ring sensor, featuring a narrow angular aperture, is developed for laser-induced photoacoustic imaging of blood vessels. An integrated optical fiber enables reflection-mode detection of ultrasonic waves. By using the cross-correlation between the signals detected by the two

  14. New definitions of pointing stability - ac and dc effects. [constant and time-dependent pointing error effects on image sensor performance

    Science.gov (United States)

    Lucke, Robert L.; Sirlin, Samuel W.; San Martin, A. M.

    1992-01-01

    For most imaging sensors, a constant (dc) pointing error is unimportant (unless large), but time-dependent (ac) errors degrade performance by either distorting or smearing the image. When properly quantified, the separation of the root-mean-square effects of random line-of-sight motions into dc and ac components can be used to obtain the minimum necessary line-of-sight stability specifications. The relation between stability requirements and sensor resolution is discussed, with a view to improving communication between the data analyst and the control systems engineer.
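
    In the notation the abstract implies (one common way to write it, not necessarily the authors' exact formulation), the separation reads

        \sigma_{\mathrm{LOS,rms}}^{2} \;=\; \mu^{2} \;+\; \sigma_{\mathrm{ac}}^{2}

    where \mu is the constant (dc) pointing offset over the exposure, which merely shifts the image, and \sigma_{\mathrm{ac}} is the rms of the time-varying (ac) residual, which smears it; the stability specification therefore only needs to bound \sigma_{\mathrm{ac}} relative to the sensor resolution.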

  15. Adaptive Sensor Optimization and Cognitive Image Processing Using Autonomous Optical Neuroprocessors; TOPICAL

    International Nuclear Information System (INIS)

    CAMERON, STEWART M.

    2001-01-01

    Measurement and signal intelligence demands have created new requirements for information management and interoperability as they affect surveillance and situational awareness. Integration of on-board autonomous learning and adaptive control structures within a remote sensing platform architecture would substantially improve the utility of intelligence collection by facilitating real-time optimization of measurement parameters for variable field conditions. A problem faced by conventional digital implementations of intelligent systems is the conflict between a distributed parallel structure and a sequential serial interface, which functionally degrades bandwidth and response time. In contrast, optically designed networks exhibit the massive parallelism and interconnect density needed to perform complex cognitive functions within a dynamic asynchronous environment. Recently, all-optical self-organizing neural networks exhibiting emergent collective behavior that mimics perception, recognition, association, and contemplative learning have been realized using photorefractive holography in combination with sensory systems for feature maps, threshold decomposition, image enhancement, and nonlinear matched filters. Such hybrid information processors depart from the classical computational paradigm based on analytic rules-based algorithms and instead utilize unsupervised generalization and perceptron-like exploratory or improvisational behaviors to evolve toward optimized solutions. These systems are robust to instrumental systematics or corrupting noise and can enrich knowledge structures by allowing competition between multiple hypotheses. This property enables them to rapidly adapt or self-compensate for dynamic or imprecise conditions which would be unstable using conventional linear control models. By incorporating an intelligent optical neuroprocessor in the back plane of an imaging sensor, a broad class of high-level cognitive image analysis problems including geometric

  16. Network compensation for missing sensors

    Science.gov (United States)

    Ahumada, Albert J., Jr.; Mulligan, Jeffrey B.

    1991-01-01

    A network learning translation invariance algorithm to compute interpolation functions is presented. This algorithm with one fixed receptive field can construct a linear transformation compensating for gain changes, sensor position jitter, and sensor loss when there are enough remaining sensors to adequately sample the input images. However, when the images are undersampled and complete compensation is not possible, the algorithm needs to be modified. For moderate sensor losses, the algorithm works if the transformation weight adjustment is restricted to the weights to output units affected by the loss.

  17. Testing the effects of perimeter fencing and elephant exclosures on lion predation patterns in a Kenyan wildlife conservancy.

    Science.gov (United States)

    Dupuis-Desormeaux, Marc; Davidson, Zeke; Pratt, Laura; Mwololo, Mary; MacDonald, Suzanne E

    2016-01-01

    The use of fences to segregate wildlife can change predator and prey behaviour. Predators can learn to incorporate fencing into their hunting strategies and prey can learn to avoid foraging near fences. A twelve-strand electric predator-proof fence surrounds our study site. There are also porous one-strand electric fences used to create exclosures where elephant (and giraffe) cannot enter in order to protect blocs of browse vegetation for two critically endangered species, the black rhinoceros (Diceros bicornis) and the Grevy's zebra (Equus grevyi). The denser vegetation in these exclosures attracts both browsing prey and ambush predators. In this study we examined if lion predation patterns differed near the perimeter fencing and inside the elephant exclosures by mapping the location of kills. We used a spatial analysis to compare the predation patterns near the perimeter fencing and inside the exclosures to predation in the rest of the conservancy. Predation was not over-represented near the perimeter fence but the pattern of predation near the fence suggests that fences may be a contributing factor to predation success. Overall, we found that predation was over-represented inside and within 50 m of the exclosures. However, by examining individual exclosures in greater detail using a hot spot analysis, we found that only a few exclosures contained lion predation hot spots. Although some exclosures provide good hunting grounds for lions, we concluded that exclosures did not necessarily create prey-traps per se and that managers could continue to use this type of exclusionary fencing to protect stands of dense vegetation.

  18. Testing the effects of perimeter fencing and elephant exclosures on lion predation patterns in a Kenyan wildlife conservancy

    Directory of Open Access Journals (Sweden)

    Marc Dupuis-Desormeaux

    2016-02-01

    Full Text Available The use of fences to segregate wildlife can change predator and prey behaviour. Predators can learn to incorporate fencing into their hunting strategies and prey can learn to avoid foraging near fences. A twelve-strand electric predator-proof fence surrounds our study site. There are also porous one-strand electric fences used to create exclosures where elephant (and giraffe) cannot enter in order to protect blocs of browse vegetation for two critically endangered species, the black rhinoceros (Diceros bicornis) and the Grevy’s zebra (Equus grevyi). The denser vegetation in these exclosures attracts both browsing prey and ambush predators. In this study we examined if lion predation patterns differed near the perimeter fencing and inside the elephant exclosures by mapping the location of kills. We used a spatial analysis to compare the predation patterns near the perimeter fencing and inside the exclosures to predation in the rest of the conservancy. Predation was not over-represented near the perimeter fence but the pattern of predation near the fence suggests that fences may be a contributing factor to predation success. Overall, we found that predation was over-represented inside and within 50 m of the exclosures. However, by examining individual exclosures in greater detail using a hot spot analysis, we found that only a few exclosures contained lion predation hot spots. Although some exclosures provide good hunting grounds for lions, we concluded that exclosures did not necessarily create prey-traps per se and that managers could continue to use this type of exclusionary fencing to protect stands of dense vegetation.

  19. High Dynamic Range Imaging at the Quantum Limit with Single Photon Avalanche Diode-Based Image Sensors

    Science.gov (United States)

    Mattioli Della Rocca, Francescopaolo

    2018-01-01

    This paper examines methods to best exploit the High Dynamic Range (HDR) of the single photon avalanche diode (SPAD) in a high fill-factor HDR photon counting pixel that is scalable to megapixel arrays. The proposed method combines multi-exposure HDR with temporal oversampling in-pixel. We present a silicon demonstration IC with 96 × 40 array of 8.25 µm pitch 66% fill-factor SPAD-based pixels achieving >100 dB dynamic range with 3 back-to-back exposures (short, mid, long). Each pixel sums 15 bit-planes or binary field images internally to constitute one frame providing 3.75× data compression, hence the 1k frames per second (FPS) output off-chip represents 45,000 individual field images per second on chip. Two future projections of this work are described: scaling SPAD-based image sensors to HDR 1 MPixel formats and shrinking the pixel pitch to 1–3 µm. PMID:29641479
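
    The 3.75× figure is simple arithmetic: 15 one-bit field images summed in-pixel become a 4-bit count (0–15), so 15 bits of raw fields are carried by 4 bits per pixel. A minimal sketch of that accumulation (array size and detection probability are placeholders, not the chip's parameters):

        import numpy as np

        rng = np.random.default_rng(1)
        h, w, n_fields = 40, 96, 15                      # array size and bit-planes per frame

        # 15 binary field images (1 = at least one photon detected in that field)
        fields = rng.random((n_fields, h, w)) < 0.3      # hypothetical detection probability
        frame = fields.sum(axis=0, dtype=np.uint8)       # in-pixel sum, values 0..15 (4 bits)

        bits_uncompressed = n_fields * 1                 # 15 bits per pixel if fields were sent raw
        bits_summed = 4                                  # 4-bit count per pixel after summation
        print(frame.max(), bits_uncompressed / bits_summed)   # -> <=15, 3.75x compression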

  20. A novel CMOS image sensor system for quantitative loop-mediated isothermal amplification assays to detect food-borne pathogens.

    Science.gov (United States)

    Wang, Tiantian; Kim, Sanghyo; An, Jeong Ho

    2017-02-01

    Loop-mediated isothermal amplification (LAMP) is considered one of the alternatives to conventional PCR, and it is an inexpensive portable diagnostic system with minimal power consumption. The present work describes the application of LAMP in real-time photon detection and quantitative analysis of nucleic acids integrated with a disposable complementary-metal-oxide semiconductor (CMOS) image sensor. This novel system works as an amplification-coupled detection platform, relying on a CMOS image sensor with the aid of a computerized circuitry controller for the temperature and light sources. The CMOS image sensor captures the light passing through the sensor surface and converts it into digital units using an analog-to-digital converter (ADC). This new system monitors the real-time photon variation caused by the color changes during amplification. Escherichia coli O157 was used as a proof-of-concept target for quantitative analysis, and compared with the results for Staphylococcus aureus and Salmonella enterica to confirm the efficiency of the system. The system detected various DNA concentrations of E. coli O157 in a short time (45 min), with a detection limit of 10 fg/μL. The low-cost, simple, and compact design, with low power consumption, represents a significant advance in the development of a portable, sensitive, user-friendly, real-time, and quantitative analytic tool for point-of-care diagnosis. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Bone age assessment by digital images

    International Nuclear Information System (INIS)

    Silva, Ana Maria Marques da

    1996-01-01

    An algorithm which allows bone age assessment by digital radiological images was developed. For geometric parameter extraction, the phalangeal and metacarpal regions of interest are enhanced and segmented through spatial and morphological filtering. This study is based on perimeter, length and area, from distal to proximal portions. The quantification of these parameters makes possible the comparison between chronological and skeletal age, using growth standard tables
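
    A minimal sketch of the kind of measurement involved, assuming the region of interest has already been segmented to a binary mask (pixel-counting only; the original algorithm is not reproduced here):

        import numpy as np

        def area_perimeter(mask):
            """Estimate area (pixel count) and perimeter (boundary pixel count, 4-connectivity)
            of a binary region; a crude stand-in for the geometric parameters described."""
            mask = mask.astype(bool)
            area = int(mask.sum())
            padded = np.pad(mask, 1)
            interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                        padded[1:-1, :-2] & padded[1:-1, 2:])
            perimeter = int((mask & ~interior).sum())
            return area, perimeter

        # Example: a filled disc of radius 20 px
        yy, xx = np.mgrid[:64, :64]
        disk = (yy - 32) ** 2 + (xx - 32) ** 2 <= 20 ** 2
        print(area_perimeter(disk))   # area close to pi*r^2 (~1257); boundary count on the order of 2*pi*r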

  2. Robust Automated Image Co-Registration of Optical Multi-Sensor Time Series Data: Database Generation for Multi-Temporal Landslide Detection

    Directory of Open Access Journals (Sweden)

    Robert Behling

    2014-03-01

    Full Text Available Reliable multi-temporal landslide detection over longer periods of time requires multi-sensor time series data characterized by high internal geometric stability, as well as high relative and absolute accuracy. For this purpose, a new methodology for fully automated co-registration has been developed allowing efficient and robust spatial alignment of standard orthorectified data products originating from a multitude of optical satellite remote sensing data of varying spatial resolution. Correlation-based co-registration uses world-wide available terrain corrected Landsat Level 1T time series data as the spatial reference, ensuring global applicability. The developed approach has been applied to a multi-sensor time series of 592 remote sensing datasets covering an approximately 12,000 km2 area in Southern Kyrgyzstan (Central Asia) strongly affected by landslides. The database contains images acquired during the last 26 years by Landsat (ETM), ASTER, SPOT and RapidEye sensors. Analysis of the spatial shifts obtained from co-registration has revealed sensor-specific alignments ranging between 5 m and more than 400 m. Overall accuracy assessment of these alignments has resulted in a high relative image-to-image accuracy of 17 m (RMSE) and a high absolute accuracy of 23 m (RMSE) for the whole co-registered database, making it suitable for multi-temporal landslide detection at a regional scale in Southern Kyrgyzstan.
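
    Correlation-based shift estimation of the kind used for such co-registration can be sketched with FFT phase correlation (integer-pixel only; the function names are ours, and the published workflow includes tie-point filtering and warping that is not shown):

        import numpy as np

        def estimate_shift(reference, image):
            """Integer-pixel (dy, dx) shift of `image` relative to `reference`,
            estimated by FFT-based phase correlation."""
            F_ref, F_img = np.fft.fft2(reference), np.fft.fft2(image)
            cross = np.conj(F_ref) * F_img
            cross /= np.abs(cross) + 1e-12                      # keep phase only (whitening)
            corr = np.fft.ifft2(cross).real
            dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
            if dy > reference.shape[0] // 2: dy -= reference.shape[0]   # wrap to signed shifts
            if dx > reference.shape[1] // 2: dx -= reference.shape[1]
            return dy, dx

        # Example: a chip rolled by (5, -3) pixels is recovered exactly.
        ref = np.random.rand(128, 128)
        img = np.roll(ref, (5, -3), axis=(0, 1))
        print(estimate_shift(ref, img))                          # -> (5, -3)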

  3. Illumination adaptation with rapid-response color sensors

    Science.gov (United States)

    Zhang, Xinchi; Wang, Quan; Boyer, Kim L.

    2014-09-01

    Smart lighting solutions based on imaging sensors such as webcams or time-of-flight sensors suffer from rising privacy concerns. In this work, we use low-cost non-imaging color sensors to measure local luminous flux of different colors in an indoor space. These sensors have a much higher data acquisition rate and are much cheaper than many off-the-shelf commercial products. We have developed several applications with these sensors, including illumination feedback control and occupancy-driven lighting.

  4. Non-Quality Controlled Lightning Imaging Sensor (LIS) on International Space Station (ISS) Provisional Science Data Vp0

    Data.gov (United States)

    National Aeronautics and Space Administration — The International Space Station (ISS) Lightning Imaging Sensor (LIS) datasets were collected by the LIS instrument on the ISS used to detect the distribution and...

  5. A report on digital image processing and analysis

    International Nuclear Information System (INIS)

    Singh, B.; Alex, J.; Haridasan, G.

    1989-01-01

    This report presents developments in software connected with digital image processing and analysis in the Centre. In image processing, one resorts either to alteration of grey level values so as to enhance features in the image or to transform domain operations for restoration or filtering. Typical transform domain operations like Karhunen-Loeve transforms are statistical in nature and are used for a good registration of images or template matching. Image analysis procedures segment grey level images into images contained within selectable windows, for the purpose of estimating geometrical features in the image, like area, perimeter, projections, etc. In short, in image processing both the input and output are images, whereas in image analysis the input is an image and the output is a set of numbers and graphs. (author). 19 refs

  6. A Passive Learning Sensor Architecture for Multimodal Image Labeling: An Application for Social Robots

    Directory of Open Access Journals (Sweden)

    Marco A. Gutiérrez

    2017-02-01

    Full Text Available Object detection and classification have countless applications in human–robot interacting systems. It is a necessary skill for autonomous robots that perform tasks in household scenarios. Despite the great advances in deep learning and computer vision, social robots performing non-trivial tasks usually spend most of their time finding and modeling objects. Working in real scenarios means dealing with constant environment changes and relatively low-quality sensor data due to the distance at which objects are often found. Ambient intelligence systems equipped with different sensors can also benefit from the ability to find objects, enabling them to inform humans about their location. For these applications to succeed, systems need to detect the objects that may potentially contain other objects, working with relatively low-resolution sensor data. A passive learning architecture for sensors has been designed in order to take advantage of multimodal information, obtained using an RGB-D camera and trained semantic language models. The main contribution of the architecture lies in the improvement of the performance of the sensor under conditions of low resolution and high light variations using a combination of image labeling and word semantics. The tests performed on each of the stages of the architecture compare this solution with current research labeling techniques for the application of an autonomous social robot working in an apartment. The results obtained demonstrate that the proposed sensor architecture outperforms state-of-the-art approaches.

  7. Features Extraction of Flotation Froth Images and BP Neural Network Soft-Sensor Model of Concentrate Grade Optimized by Shuffled Cuckoo Searching Algorithm

    Directory of Open Access Journals (Sweden)

    Jie-sheng Wang

    2014-01-01

    Full Text Available For meeting the forecasting target of key technology indicators in the flotation process, a BP neural network soft-sensor model based on features extraction of flotation froth images and optimized by shuffled cuckoo search algorithm is proposed. Based on the digital image processing technique, the color features in HSI color space, the visual features based on the gray level cooccurrence matrix, and the shape characteristics based on the geometric theory of flotation froth images are extracted, respectively, as the input variables of the proposed soft-sensor model. Then the isometric mapping method is used to reduce the input dimension, the network size, and learning time of BP neural network. Finally, a shuffled cuckoo search algorithm is adopted to optimize the BP neural network soft-sensor model. Simulation results show that the model has better generalization results and prediction accuracy.
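
    As an illustration of one of the feature families mentioned, the sketch below computes two gray-level co-occurrence statistics (contrast and homogeneity) from a grayscale froth image with plain NumPy; it is a stand-in, not the authors' exact feature set or parameters:

        import numpy as np

        def glcm_features(gray, levels=16):
            """Contrast and homogeneity from a horizontal-neighbour gray-level
            co-occurrence matrix (GLCM); a minimal stand-in for the texture features."""
            g = gray.astype(float)
            q = np.floor(g / (g.max() + 1e-9) * levels).astype(int).clip(0, levels - 1)
            glcm = np.zeros((levels, levels))
            np.add.at(glcm, (q[:, :-1].ravel(), q[:, 1:].ravel()), 1)   # horizontal pixel pairs
            glcm /= glcm.sum()
            i, j = np.indices(glcm.shape)
            contrast = float(((i - j) ** 2 * glcm).sum())
            homogeneity = float((glcm / (1.0 + np.abs(i - j))).sum())
            return contrast, homogeneity

        # Real inputs would be grayscale froth frames; these two numbers would join the
        # colour and shape features as inputs to the BP soft-sensor model.
        img = np.random.rand(256, 256) * 255
        print(glcm_features(img))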

  8. Imaging Sensor Flight and Test Equipment Software

    Science.gov (United States)

    Freestone, Kathleen; Simeone, Louis; Robertson, Byran; Frankford, Maytha; Trice, David; Wallace, Kevin; Wilkerson, DeLisa

    2007-01-01

    The Lightning Imaging Sensor (LIS) is one of the components onboard the Tropical Rainfall Measuring Mission (TRMM) satellite, and was designed to detect and locate lightning over the tropics. The LIS flight code was developed to run on a single onboard digital signal processor, and has operated the LIS instrument since 1997 when the TRMM satellite was launched. The software provides controller functions to the LIS Real-Time Event Processor (RTEP) and onboard heaters, collects the lightning event data from the RTEP, compresses and formats the data for downlink to the satellite, collects housekeeping data and formats the data for downlink to the satellite, provides command processing and interface to the spacecraft communications and data bus, and provides watchdog functions for error detection. The Special Test Equipment (STE) software was designed to operate specific test equipment used to support the LIS hardware through development, calibration, qualification, and integration with the TRMM spacecraft. The STE software provides the capability to control instrument activation, commanding (including both data formatting and user interfacing), data collection, decompression, and display and image simulation. The LIS STE code was developed for the DOS operating system in the C programming language. Because of the many unique data formats implemented by the flight instrument, the STE software was required to comprehend the same formats, and translate them for the test operator. The hardware interfaces to the LIS instrument using both commercial and custom computer boards, requiring that the STE code integrate this variety into a working system. In addition, the requirement to provide RTEP test capability dictated the need to provide simulations of background image data with short-duration lightning transients superimposed. This led to the development of unique code used to control the location, intensity, and variation above background for simulated lightning strikes

  9. Contactless respiratory monitoring system for magnetic resonance imaging applications using a laser range sensor

    Directory of Open Access Journals (Sweden)

    Krug Johannes W.

    2016-09-01

    Full Text Available During a magnetic resonance imaging (MRI) exam, a respiratory signal can be required for different purposes, e.g. for patient monitoring, motion compensation or for research studies such as in functional MRI. In addition, respiratory information can be used as a biofeedback for the patient in order to control breath holds or shallow breathing. To reduce patient preparation time or distortions of the MR imaging system, we propose the use of a contactless approach for gathering information about the respiratory activity. An experimental setup based on a commercially available laser range sensor was used to detect respiratory induced motion of the chest or abdomen. This setup was tested using a motion phantom and different human subjects in an MRI scanner. A nasal airflow sensor served as a reference. For both, the phantom as well as the different human subjects, the motion frequency was precisely measured. These results show that a low cost, contactless, laser-based approach can be used to obtain information about the respiratory motion during an MRI exam.
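
    A minimal sketch of how a respiratory rate could be read off such a chest-distance signal (FFT peak in an assumed 0.1–1 Hz band; the published setup and whatever filtering it applies are not reproduced):

        import numpy as np

        def breathing_rate(range_mm, fs):
            """Dominant respiratory frequency (Hz) of a chest-distance signal sampled
            at fs Hz, taken as the FFT magnitude peak in a plausible breathing band."""
            x = range_mm - np.mean(range_mm)                 # remove the static chest distance
            freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
            spectrum = np.abs(np.fft.rfft(x))
            band = (freqs > 0.1) & (freqs < 1.0)             # ~6 to 60 breaths per minute
            return freqs[band][np.argmax(spectrum[band])]

        # Example: 0.25 Hz (15 breaths/min) chest motion sampled at 50 Hz for 60 s
        fs = 50.0
        t = np.arange(0, 60, 1 / fs)
        sig = 3.0 * np.sin(2 * np.pi * 0.25 * t) + 0.2 * np.random.randn(t.size)
        print(breathing_rate(sig, fs))                       # ~0.25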

  10. Fast responsive fluorescence turn-on sensor for Cu{sup 2+} and its application in live cell imaging

    Energy Technology Data Exchange (ETDEWEB)

    Wang Jiaoliang, E-mail: wangjiaoliang@126.com [College of Chemistry and Environment Engineering, Hunan City University, Yiyang 413000 (China); Li Hao; Long Liping; Xiao Guqing; Xie Dan [College of Chemistry and Environment Engineering, Hunan City University, Yiyang 413000 (China)

    2012-09-15

    A new effective fluorescent sensor based on rhodamine was synthesized, which was induced by Cu{sup 2+} in aqueous media to produce turn-on fluorescence. The new sensor 1 exhibited good selectivity for Cu{sup 2+} over other heavy and transition metal (HTM) ions in H{sub 2}O/CH{sub 3}CN (7:3, v/v). Upon addition of Cu{sup 2+}, a remarkable color change from colorless to pink was easily observed by the naked eye, and the dramatic fluorescence turn-on was corroborated. Furthermore, kinetic assay indicates that sensor 1 could be used for real-time tracking of Cu{sup 2+} in cells and organisms. In addition, the turn-on fluorescent change upon the addition of Cu{sup 2+} was also applied in bioimaging. - Highlights: • A new effective fluorescent sensor based on rhodamine was developed to detect Cu{sup 2+}. • The sensor exhibited fast response, good selectivity at physiological pH condition. • The sensor was an effective intracellular Cu{sup 2+} ion imaging agent.

  11. Portable reconfigurable line sensor (PRLS) and technology transfer

    International Nuclear Information System (INIS)

    MacKenzie, D.P.; Buckle, T.H.; Blattman, D.A.

    1993-01-01

    The Portable Reconfigurable Line Sensor (PRLS) is a bistatic, pulsed-Doppler, microwave intrusion detection system developed at Sandia National Laboratories for the US Air Force. The PRLS is rapidly and easily deployed, and can detect intruders ranging from a slow creeping intruder to a high speed vehicle. The system has a sharply defined detection zone and will not falsely alarm on nearby traffic. Unlike most microwave sensors, the PRLS requires no alignment or calibration. Its portability, battery operation, ease of setup, and RF alarm reporting capability make it an excellent choice for perimeter, portal, and gap-filler applications in the important new field of rapidly-deployable sensor systems. In October 1992, the US Air Force and Racon, Inc., entered into a Cooperative Research and Development Agreement (CRADA) to commercialize the PRLS, jointly sharing government and industry resources. The Air Force brings the user's perspective and requirements to the cooperative effort. Sandia, serving as the technical arm of the Air Force, adds the actual PRLS technology to the joint effort, and provides security systems and radar development expertise. Racon puts the Air Force requirements and Sandia technology together into a commercial product, making the system meet important commercial manufacturing constraints. The result is a true ''win-win'' situation, with reduced government investment during the commercial development of the PRLS, and industry access to technology not otherwise available

  12. Infrared sensors and sensor fusion; Proceedings of the Meeting, Orlando, FL, May 19-21, 1987

    International Nuclear Information System (INIS)

    Buser, R.G.; Warren, F.B.

    1987-01-01

    The present conference discusses topics in the fields of IR sensor multifunctional design; image modeling, simulation, and detection; IR sensor configurations and components; thermal sensor arrays; silicide-based IR sensors; and IR focal plane array utilization. Attention is given to the fusion of lidar and FLIR for target segmentation and enhancement, the synergetic integration of thermal and visual images for computer vision, the 'Falcon Eye' FLIR system, multifunctional electrooptics and multiaperture sensors for precision-guided munitions, and AI approaches to data integration. Also discussed are the comparative performance of Ir silicide and Pt silicide photodiodes, high fill-factor silicide monolithic arrays, and the characterization of noise in staring IR focal plane arrays

  13. Chip-scale fluorescence microscope based on a silo-filter complementary metal-oxide semiconductor image sensor.

    Science.gov (United States)

    Ah Lee, Seung; Ou, Xiaoze; Lee, J Eugene; Yang, Changhuei

    2013-06-01

    We demonstrate a silo-filter (SF) complementary metal-oxide semiconductor (CMOS) image sensor for a chip-scale fluorescence microscope. The extruded pixel design with metal walls between neighboring pixels guides fluorescence emission through the thick absorptive filter to the photodiode of a pixel. Our prototype device achieves 13 μm resolution over a wide field of view (4.8 mm × 4.4 mm). We demonstrate bright-field and fluorescence longitudinal imaging of living cells in a compact, low-cost configuration.

  14. A contest of sensors in close range 3D imaging: performance evaluation with a new metric test object

    Directory of Open Access Journals (Sweden)

    M. Hess

    2014-06-01

    Full Text Available An independent means of 3D image quality assessment is introduced, addressing non-professional users of sensors and freeware, which is largely characterized as closed-sourced and by the absence of quality metrics for processing steps, such as alignment. A performance evaluation of commercially available, state-of-the-art close range 3D imaging technologies is demonstrated with the help of a newly developed Portable Metric Test Artefact. The use of this test object provides quality control by a quantitative assessment of 3D imaging sensors. It will enable users to give precise specifications which spatial resolution and geometry recording they expect as outcome from their 3D digitizing process. This will lead to the creation of high-quality 3D digital surrogates and 3D digital assets. The paper is presented in the form of a competition of teams, and a possible winner will emerge.

  15. Concept Study of Multi Sensor Detection Imaging and Explosive Confirmation of Mines

    Science.gov (United States)

    1998-03-20

    surface feature removal can be achieved in LMR images. Small Business Technology Transfer (STTR) Solicitation Topic 97T006 Multi-Sensor Detection... divided by the applied voltage. This is mathematically given by Y = G + jB, where G = the input conductance... of detector operation that are incorporated into a mathematical algorithm to convert detector impedance characteristics into recognizable indicators

  16. Atomic-Scale Nuclear Spin Imaging Using Quantum-Assisted Sensors in Diamond

    Directory of Open Access Journals (Sweden)

    A. Ajoy

    2015-01-01

    Full Text Available Nuclear spin imaging at the atomic level is essential for the understanding of fundamental biological phenomena and for applications such as drug discovery. The advent of novel nanoscale sensors promises to achieve the long-standing goal of single-protein, high spatial-resolution structure determination under ambient conditions. In particular, quantum sensors based on the spin-dependent photoluminescence of nitrogen-vacancy (NV) centers in diamond have recently been used to detect nanoscale ensembles of external nuclear spins. While NV sensitivity is approaching single-spin levels, extracting relevant information from a very complex structure is a further challenge since it requires not only the ability to sense the magnetic field of an isolated nuclear spin but also to achieve atomic-scale spatial resolution. Here, we propose a method that, by exploiting the coupling of the NV center to an intrinsic quantum memory associated with the nitrogen nuclear spin, can reach a tenfold improvement in spatial resolution, down to atomic scales. The spatial resolution enhancement is achieved through coherent control of the sensor spin, which creates a dynamic frequency filter selecting only a few nuclear spins at a time. We propose and analyze a protocol that would allow not only sensing individual spins in a complex biomolecule, but also unraveling couplings among them, thus elucidating local characteristics of the molecule structure.

  17. Physiological intracellular crowdedness is defined by perimeter to area ratio of subcellular compartments

    Directory of Open Access Journals (Sweden)

    Noriko Hiroi

    2012-07-01

    Full Text Available The intracellular environment is known to be a crowded and inhomogeneous space. Such an in vivo environment differs from a well-diluted, homogeneous environment for biochemical reactions. However, the effects of both crowdedness and the inhomogeneity of environment on the behavior of a mobile particle have not yet been investigated sufficiently. As described in this paper, we constructed artificial reaction spaces with fractal models, which are assumed to be non-reactive solid obstacles in a reaction space with crevices that function as operating ranges for mobile particles threading the space. Because of the homogeneity of the structures of artificial reaction spaces, the models succeeded in reproducing the physiological fractal dimension of solid structures with a smaller number of non-reactive obstacles than in the physiological condition. This incomplete compatibility was mitigated when we chose a suitable condition of a perimeter-to-area ratio of the operating range to our model. Our results also show that a simulation space is partitioned into convenient reaction compartments as an in vivo environment with the exact amount of solid structures estimated from TEM images. The characteristics of these compartments engender larger mean square displacement of a mobile particle than that of particles in smaller compartments. Subsequently, the particles start to show confined particle-like behavior. These results are compatible with our previously presented results, which predicted that a physiological environment would produce quick-response and slow-exhaustion reactions.

  18. Use of LST images from MODIS/AQUA sensor as an indication of frost occurrence in RS

    Directory of Open Access Journals (Sweden)

    Débora de S. Simões

    2015-10-01

    Full Text Available Although frost occurrence causes severe losses in agriculture, especially in the south of Brazil, the data of minimum air temperature (Tmin) currently available for monitoring and predicting frosts show insufficient spatial distribution. This study aimed to evaluate the MDY11A1 (LST – Land Surface Temperature) product, from the MODIS sensor on board the AQUA satellite, as an estimator of frost occurrence in the southeast of the state of Rio Grande do Sul, Brazil. LST images from the nighttime overpass of the MODIS/AQUA sensor for the months of June, July and August from 2006 to 2012, and data from three conventional weather stations of the National Institute of Meteorology (INMET), were used. Consistency was observed between Tmin data measured in weather stations and LST data obtained from the MODIS sensor. According to the results, LSTs below 3 ºC recorded by the MODIS/AQUA sensor are an indication of a favorable scenario to frost occurrence.

  19. Study on polarized optical flow algorithm for imaging bionic polarization navigation micro sensor

    Science.gov (United States)

    Guan, Le; Liu, Sheng; Li, Shi-qi; Lin, Wei; Zhai, Li-yuan; Chu, Jin-kui

    2018-05-01

    At present, both point-source and imaging polarization navigation devices can only output angle information, which means that the velocity of the carrier cannot be extracted directly from the polarization field pattern. Optical flow is an image-based method for calculating the velocity of pixel movement in an image. For ordinary optical flow, however, the pixel-value differences, and with them the calculation accuracy, are reduced in weak light. Polarization imaging technology can improve both the detection accuracy and the recognition probability of a target because it acquires extra, multi-dimensional polarization information about the target's radiation or reflection. In this paper, combining the polarization imaging technique with the traditional optical flow algorithm, a polarized optical flow algorithm is proposed, and it is verified that this algorithm adapts well to weak light and extends the application range of polarization navigation sensors. This research lays the foundation for day-and-night, all-weather polarization navigation applications in the future.
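
    The "traditional optical flow" ingredient can be sketched as a single-window, gradient-based (Lucas-Kanade-style) estimate; in the proposed method the inputs would be polarization-derived images rather than ordinary intensity frames, which is not modeled here:

        import numpy as np

        def lucas_kanade_window(prev, curr):
            """Single-window Lucas-Kanade estimate of the (vx, vy) translation between
            two small grayscale patches; a schematic version of gradient-based optical flow."""
            prev = prev.astype(float)
            curr = curr.astype(float)
            Iy, Ix = np.gradient(prev)                   # spatial gradients (rows = y, cols = x)
            It = curr - prev                             # temporal gradient
            A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
            b = -It.ravel()                              # Ix*vx + Iy*vy + It = 0
            (vx, vy), *_ = np.linalg.lstsq(A, b, rcond=None)
            return vx, vy

        # Example: a smooth pattern translated by roughly (0.4, 0.1) pixels
        yy, xx = np.mgrid[:32, :32].astype(float)
        prev = np.sin(0.3 * xx) + np.cos(0.2 * yy)
        curr = np.sin(0.3 * (xx - 0.4)) + np.cos(0.2 * (yy - 0.1))
        print(lucas_kanade_window(prev, curr))           # approximately (0.4, 0.1)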

  20. Optical fiber sensors for image formation in radiodiagnosis - preliminary tests; Sensores a fibra optica para formacao de imagens em radiodiagnostico - ensaios preliminares

    Energy Technology Data Exchange (ETDEWEB)

    Carvalho, Cesar C. de; Werneck, Marcelo M. [Universidade Federal, Rio de Janeiro, RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia. Programa de Engenharia Biomedica

    1998-07-01

    This work describes preliminary experiments that provide groundwork for assessing the feasibility of a system able to capture radiological images with a new sensor arrangement, comprising a scanned optical-fiber (FO) bundle and an I-CCD camera. The main objective of these experiments is to analyze the optical response of the FO bundle, with several types of scintillators coupled to it, when it is exposed to medical x-rays. (author)

  1. A Sensitive Dynamic and Active Pixel Vision Sensor for Color or Neural Imaging Applications.

    Science.gov (United States)

    Moeys, Diederik Paul; Corradi, Federico; Li, Chenghan; Bamford, Simeon A; Longinotti, Luca; Voigt, Fabian F; Berry, Stewart; Taverni, Gemma; Helmchen, Fritjof; Delbruck, Tobi

    2018-02-01

    Applications requiring detection of small visual contrast require high sensitivity. Event cameras can provide higher dynamic range (DR) and reduce data rate and latency, but most existing event cameras have limited sensitivity. This paper presents the results of a 180-nm Towerjazz CIS process vision sensor called SDAVIS192. It outputs temporal contrast dynamic vision sensor (DVS) events and conventional active pixel sensor frames. The SDAVIS192 improves on previous DAVIS sensors with higher sensitivity for temporal contrast. The temporal contrast thresholds can be set down to 1% for negative changes in logarithmic intensity (OFF events) and down to 3.5% for positive changes (ON events). The achievement is possible through the adoption of an in-pixel preamplification stage. This preamplifier reduces the effective intrascene DR of the sensor (70 dB for OFF and 50 dB for ON), but an automated operating region control allows up to at least 110-dB DR for OFF events. A second contribution of this paper is the development of a characterization methodology for measuring DVS event detection thresholds by incorporating a measure of signal-to-noise ratio (SNR). At an average SNR of 30 dB, the DVS temporal contrast threshold fixed pattern noise is measured to be 0.3%-0.8% temporal contrast. Results comparing monochrome and RGBW color filter array DVS events are presented. The higher sensitivity of SDAVIS192 makes this sensor potentially useful for calcium imaging, as shown in a recording from cultured neurons expressing the calcium-sensitive green fluorescent protein GCaMP6f.

  2. Modeling the dark current histogram induced by gold contamination in complementary-metal-oxide-semiconductor image sensors

    Energy Technology Data Exchange (ETDEWEB)

    Domengie, F., E-mail: florian.domengie@st.com; Morin, P. [STMicroelectronics Crolles 2 (SAS), 850 Rue Jean Monnet, 38926 Crolles Cedex (France); Bauza, D. [CNRS, IMEP-LAHC - Grenoble INP, Minatec: 3, rue Parvis Louis Néel, CS 50257, 38016 Grenoble Cedex 1 (France)

    2015-07-14

    We propose a model for dark current induced by metallic contamination in a CMOS image sensor. Based on Shockley-Read-Hall kinetics, the expression of dark current proposed accounts for the electric field enhanced emission factor due to the Poole-Frenkel barrier lowering and phonon-assisted tunneling mechanisms. To that aim, we considered the distribution of the electric field magnitude and metal atoms in the depth of the pixel. Poisson statistics were used to estimate the random distribution of metal atoms in each pixel for a given contamination dose. Then, we performed a Monte-Carlo-based simulation for each pixel to set the number of metal atoms the pixel contained and the enhancement factor each atom underwent, and obtained a histogram of the number of pixels versus dark current for the full sensor. Excellent agreement with the dark current histogram measured on an ion-implanted gold-contaminated imager has been achieved, in particular, for the description of the distribution tails due to the pixel regions in which the contaminant atoms undergo a large electric field. The agreement remains very good when increasing the temperature by 15 °C. We demonstrated that the amplification of the dark current generated for the typical electric fields encountered in the CMOS image sensors, which depends on the nature of the metal contaminant, may become very large at high electric field. The electron and hole emissions and the resulting enhancement factor are described as a function of the trap characteristics, electric field, and temperature.
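
    The Poisson/Monte-Carlo construction of the histogram can be sketched as follows; every numerical value (contamination dose, base emission rate, the shape of the field-enhancement distribution) is a placeholder, not the paper's calibration:

        import numpy as np

        rng = np.random.default_rng(42)
        n_pixels = 50_000
        mean_atoms_per_pixel = 0.5        # set by contamination dose x pixel volume (assumed)
        base_rate_e_per_s = 50.0          # SRH emission per atom at low field (assumed)

        dark = np.zeros(n_pixels)
        n_atoms = rng.poisson(mean_atoms_per_pixel, n_pixels)     # metal atoms per pixel
        for i in np.nonzero(n_atoms)[0]:
            # field-enhancement factor drawn per atom from a heavy-tailed distribution,
            # standing in for Poole-Frenkel lowering and phonon-assisted tunneling
            gamma = rng.lognormal(mean=0.0, sigma=1.0, size=n_atoms[i])
            dark[i] = (base_rate_e_per_s * gamma).sum()

        hist, edges = np.histogram(dark, bins=200)
        # `hist` versus `edges` reproduces the qualitative shape of such a dark-current
        # histogram: a main peak from clean pixels plus a tail from pixels whose
        # contaminant atoms sit in high-field regions.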

  3. Resistive wall heating due to image current on the beam chamber for a superconducting undulator.

    Energy Technology Data Exchange (ETDEWEB)

    Kim, S. H. (Accelerator Systems Division (APS))

    2012-03-27

    The image-current heating on the resistive beam chamber of a superconducting undulator (SCU) was calculated based on the normal and anomalous skin effects. Using the bulk resistivity of copper for the beam chamber, the heat loads were calculated for the residual resistivity ratios (RRRs) of unity at room temperature to 100 K at a cryogenic temperature as the reference. Then, using the resistivity of the specific aluminum alloy 6053-T5, which will be used for the SCU beam chamber, the heat loads were calculated. An electron beam stored in a storage ring induces an image current on the inner conducting wall, mainly within a skin depth, of the beam chamber. The image current, with opposite charge to the electron beam, travels along the chamber wall in the same direction as the electron beam. The average current in the storage ring consists of a number of bunches. When the pattern of the bunched beam is repeated according to the rf frequency, the beam current may be expressed in terms of a Fourier series. The time structure of the image current is assumed to be the same as that of the beam current. For a given resistivity of the chamber inner wall, the application of the normal or anomalous skin effect will depend on the harmonic numbers of the Fourier series of the beam current and the temperature of the chamber. For a round beam chamber with a radius r much larger than the beam size, one can assume that the image current density, as well as its square, may be uniform around the perimeter 2{pi}r. For the SCU beam chamber, which has a relatively narrow vertical gap compared to the width, the effective perimeter was estimated since the heat load should be proportional to the inverse of the perimeter.
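
    Schematically (up to factors of order unity fixed by the Fourier convention, and not the report's exact formulation), the bunched beam and the wall heating per unit length can be written

        I(t) = I_0 + 2\sum_{n \ge 1} I_n \cos(n\omega_0 t), \qquad
        P' \;\propto\; \frac{1}{C}\sum_{n \ge 1} I_n^{2}\, R_s(n\omega_0), \qquad
        R_s(\omega) = \sqrt{\mu_0 \rho\, \omega / 2},

    where C is the effective chamber perimeter (2\pi r for a round chamber), \rho the wall resistivity and R_s the normal-skin-effect surface resistance; at cryogenic temperature and high harmonic numbers R_s must instead be taken from the anomalous skin effect, which is the regime the calculation above addresses.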

  4. Robust site security using smart seismic array technology and multi-sensor data fusion

    Science.gov (United States)

    Hellickson, Dean; Richards, Paul; Reynolds, Zane; Keener, Joshua

    2010-04-01

    Traditional site security systems are susceptible to high individual sensor nuisance alarm rates that reduce the overall system effectiveness. Visual assessment of intrusions can be intensive and manually difficult as cameras are slewed by the system to non intrusion areas or as operators respond to nuisance alarms. Very little system intrusion performance data are available other than discrete sensor alarm indications that provide no real value. This paper discusses the system architecture, integration and display of a multi-sensor data fused system for wide area surveillance, local site intrusion detection and intrusion classification. The incorporation of a novel seismic array of smart sensors using FK Beamforming processing that greatly enhances the overall system detection and classification performance of the system is discussed. Recent test data demonstrates the performance of the seismic array within several different installations and its ability to classify and track moving targets at significant standoff distances with exceptional immunity to background clutter and noise. Multi-sensor data fusion is applied across a suite of complimentary sensors eliminating almost all nuisance alarms while integrating within a geographical information system to feed a visual-fusion display of the area being secured. Real-time sensor detection and intrusion classification data is presented within a visual-fusion display providing greatly enhanced situational awareness, system performance information and real-time assessment of intrusions and situations of interest with limited security operator involvement. This approach scales from a small local perimeter to very large geographical area and can be used across multiple sites controlled at a single command and control station.

  5. A Solar Position Sensor Based on Image Vision.

    Science.gov (United States)

    Ruelas, Adolfo; Velázquez, Nicolás; Villa-Angulo, Carlos; Acuña, Alexis; Rosales, Pedro; Suastegui, José

    2017-07-29

    Solar collector technologies perform better when the Sun beam direction is normal to the capturing surface, and for that to happen despite the relative movement of the Sun, solar tracking systems are used; there are therefore rules and standards that require a minimum accuracy from the tracking systems used in evaluating solar collectors. Obtaining that accuracy is not an easy job, hence in this document the design, construction and characterization of a sensor based on a visual system, which finds the relative azimuth and elevation error with respect to the solar position of interest, is presented. With these characteristics, the sensor can be used as a reference in control systems and their evaluation. The proposed sensor is based on a microcontroller with a real-time clock, inertial measurement sensors, geolocation and a vision sensor, which obtains the angle of incidence from the sunrays' direction as well as the tilt and sensor position. The sensor's characterization showed that a measurement of a focus error or of the Sun's position can be made with an accuracy of 0.0426° and an uncertainty of 0.986%, and that the design can be modified to reach an accuracy under 0.01°. The sensor was validated by measuring the focus error of one of the best commercial solar tracking systems, a Kipp & Zonen SOLYS 2. To conclude, the vision-based solar tracking sensor meets the Sun detection requirements and the accuracy conditions needed for use in solar tracking systems and their evaluation, or as a tracking and orientation tool in photovoltaic installations and solar collectors.

  6. Amorphous and Polycrystalline Photoconductors for Direct Conversion Flat Panel X-Ray Image Sensors

    Directory of Open Access Journals (Sweden)

    Karim S. Karim

    2011-05-01

    Full Text Available In the last ten to fifteen years there has been much research in using amorphous and polycrystalline semiconductors as x-ray photoconductors in various x-ray image sensor applications, most notably in flat panel x-ray imagers (FPXIs). We first outline the essential requirements for an ideal large area photoconductor for use in a FPXI, and discuss how some of the current amorphous and polycrystalline semiconductors fulfill these requirements. At present, only stabilized amorphous selenium (doped and alloyed a-Se) has been commercialized, and FPXIs based on a-Se are particularly suitable for mammography, operating at the ideal limit of high detective quantum efficiency (DQE). Further, these FPXIs can also be used in real-time, and have already been used in such applications as tomosynthesis. We discuss some of the important attributes of amorphous and polycrystalline x-ray photoconductors such as their large area deposition ability, charge collection efficiency, x-ray sensitivity, DQE, modulation transfer function (MTF) and the importance of the dark current. We show the importance of charge trapping in limiting not only the sensitivity but also the resolution of these detectors. Limitations on the maximum acceptable dark current and the corresponding charge collection efficiency jointly impose a practical constraint that many photoconductors fail to satisfy. We discuss the case of a-Se in which the dark current was brought down by three orders of magnitude by the use of special blocking layers to satisfy the dark current constraint. There are also a number of polycrystalline photoconductors, HgI2 and PbO being good examples, that show potential for commercialization in the same way that multilayer stabilized a-Se x-ray photoconductors were developed for commercial applications. We highlight the unique nature of avalanche multiplication in a-Se and how it has led to the development of the commercial HARP video-tube. An all solid state version of the

  7. Design of a Solar Tracking System Using the Brightest Region in the Sky Image Sensor.

    Science.gov (United States)

    Wei, Ching-Chuan; Song, Yu-Chang; Chang, Chia-Chi; Lin, Chuan-Bi

    2016-11-25

    Solar energy is certainly an energy source worth exploring and utilizing because of the environmental protection it offers. However, the conversion efficiency of solar energy is still low. If the photovoltaic panel perpendicularly tracks the sun, the solar energy conversion efficiency will be improved. In this article, we propose an innovative method to track the sun using an image sensor. In our method, it is logical to assume the points of the brightest region in the sky image represent the location of the sun. Then, the center of the brightest region is assumed to be the solar-center, and is mathematically calculated using an embedded processor (Raspberry Pi). Finally, the location information on the sun center is sent to the embedded processor to control two servo motors that are capable of moving both horizontally and vertically to track the sun. In comparison with the existing sun tracking methods using image sensors, such as the Hough transform method, our method based on the brightest region in the sky image remains accurate under conditions such as a sunny day and building shelter. The practical sun tracking system using our method was implemented and tested. The results reveal that the system successfully captured the real sun center in most weather conditions, and the servo motor system was able to direct the photovoltaic panel perpendicularly to the sun center. In addition, our system can be easily and practically integrated, and can operate in real-time.
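
    The brightest-region centroid itself is a short computation; a sketch assuming a grayscale sky image and an arbitrary 95%-of-maximum threshold (the paper's exact segmentation is not reproduced):

        import numpy as np

        def sun_center(gray):
            """Centroid (row, col) of the brightest region in a sky image:
            threshold near the image maximum and average the selected coordinates."""
            mask = gray >= 0.95 * gray.max()           # brightest-region threshold (assumed 95%)
            ys, xs = np.nonzero(mask)
            return ys.mean(), xs.mean()

        # Example: synthetic sky with a bright disc centered at (120, 300);
        # the returned centroid would drive the two servo motors.
        yy, xx = np.mgrid[:480, :640]
        sky = 30 + 200 * np.exp(-((yy - 120) ** 2 + (xx - 300) ** 2) / (2 * 15 ** 2))
        print(sun_center(sky))                          # ~ (120.0, 300.0)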

  8. Design of a Solar Tracking System Using the Brightest Region in the Sky Image Sensor

    Directory of Open Access Journals (Sweden)

    Ching-Chuan Wei

    2016-11-01

    Full Text Available Solar energy is certainly an energy source worth exploring and utilizing because of the environmental protection it offers. However, the conversion efficiency of solar energy is still low. If the photovoltaic panel perpendicularly tracks the sun, the solar energy conversion efficiency will be improved. In this article, we propose an innovative method to track the sun using an image sensor. In our method, it is logical to assume the points of the brightest region in the sky image represent the location of the sun. Then, the center of the brightest region is assumed to be the solar-center, and is mathematically calculated using an embedded processor (Raspberry Pi). Finally, the location information on the sun center is sent to the embedded processor to control two servo motors that are capable of moving both horizontally and vertically to track the sun. In comparison with the existing sun tracking methods using image sensors, such as the Hough transform method, our method based on the brightest region in the sky image remains accurate under conditions such as a sunny day and building shelter. The practical sun tracking system using our method was implemented and tested. The results reveal that the system successfully captured the real sun center in most weather conditions, and the servo motor system was able to direct the photovoltaic panel perpendicularly to the sun center. In addition, our system can be easily and practically integrated, and can operate in real-time.

  9. Temporal Noise Analysis of Charge-Domain Sampling Readout Circuits for CMOS Image Sensors

    Directory of Open Access Journals (Sweden)

    Xiaoliang Ge

    2018-02-01

    Full Text Available This paper presents a temporal noise analysis of charge-domain sampling readout circuits for Complementary Metal-Oxide Semiconductor (CMOS) image sensors. In order to address the trade-off between the low input-referred noise and high dynamic range, a Gm-cell-based pixel together with a charge-domain correlated-double sampling (CDS) technique has been proposed to provide a way to efficiently embed a tunable conversion gain along the read-out path. Such a readout topology, however, operates in a non-stationary large-signal regime, and the statistical properties of its temporal noise are a function of time. Conventional noise analysis methods for CMOS image sensors are based on steady-state signal models, and therefore cannot be readily applied to Gm-cell-based pixels. In this paper, we develop analysis models for both thermal noise and flicker noise in Gm-cell-based pixels by employing the time-domain linear analysis approach and the non-stationary noise analysis theory, which help to quantitatively evaluate the temporal noise characteristic of Gm-cell-based pixels. Both models were numerically computed in MATLAB using design parameters of a prototype chip, and compared with both simulation and experimental results. The good agreement between the theoretical and measurement results verifies the effectiveness of the proposed noise analysis models.

  10. Two-step single slope/SAR ADC with error correction for CMOS image sensor.

    Science.gov (United States)

    Tang, Fang; Bermak, Amine; Amira, Abbes; Amor Benammar, Mohieddine; He, Debiao; Zhao, Xiaojin

    2014-01-01

    Conventional two-step ADCs for CMOS image sensors require full resolution noise performance in the first stage single slope ADC, leading to high power consumption and large chip area. This paper presents an 11-bit two-step single slope/successive approximation register (SAR) ADC scheme for CMOS image sensor applications. The first stage single slope ADC generates 3-bit data and 1 redundant bit. The redundant bit is combined with the following 8-bit SAR ADC output code using a proposed error correction algorithm. Instead of requiring full resolution noise performance, the first stage single slope circuit of the proposed ADC can tolerate up to 3.125% quantization noise. With the proposed error correction mechanism, the power consumption and chip area of the single slope ADC are significantly reduced. The prototype ADC is fabricated using 0.18 μm CMOS technology. The chip area of the proposed ADC is 7 μm × 500 μm. The measurement results show that the energy efficiency figure-of-merit (FOM) of the proposed ADC core is only 125 pJ/sample under a 1.4 V power supply and the chip area efficiency is 84 k μm² · cycles/sample.
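
    The role of the redundant bit can be illustrated with a generic textbook two-step recombination (not necessarily the authors' exact algorithm or bit allocation): a coarse decision error near a segment boundary is absorbed because the fine range overlaps adjacent coarse segments.

        def two_step_adc(x, coarse_error=0):
            """Generic two-step conversion with inter-stage redundancy: an 11-bit sample
            is split into a 3-bit coarse code (step = 256 LSB) and a fine code of the
            residue. Letting the fine range overlap adjacent coarse segments (the
            redundant bit) allows a coarse decision error near a boundary to be
            corrected purely digitally."""
            coarse = min(max(x // 256 + coarse_error, 0), 7)   # possibly erroneous coarse decision
            residue = x - coarse * 256
            fine = min(max(residue, -128), 383)                # redundant (overlapping) fine range
            return coarse * 256 + fine                         # digital recombination

        # A sample just above a coarse boundary: with and without a coarse error of -1
        # the recombined output is identical, because the overlap absorbs the error.
        print(two_step_adc(1030), two_step_adc(1030, coarse_error=-1))   # 1030 1030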

  11. Two-Step Single Slope/SAR ADC with Error Correction for CMOS Image Sensor

    Directory of Open Access Journals (Sweden)

    Fang Tang

    2014-01-01

    Full Text Available Conventional two-step ADCs for CMOS image sensors require full resolution noise performance in the first stage single slope ADC, leading to high power consumption and large chip area. This paper presents an 11-bit two-step single slope/successive approximation register (SAR) ADC scheme for CMOS image sensor applications. The first stage single slope ADC generates 3-bit data and 1 redundant bit. The redundant bit is combined with the following 8-bit SAR ADC output code using a proposed error correction algorithm. Instead of requiring full resolution noise performance, the first stage single slope circuit of the proposed ADC can tolerate up to 3.125% quantization noise. With the proposed error correction mechanism, the power consumption and chip area of the single slope ADC are significantly reduced. The prototype ADC is fabricated using 0.18 μm CMOS technology. The chip area of the proposed ADC is 7 μm × 500 μm. The measurement results show that the energy efficiency figure-of-merit (FOM) of the proposed ADC core is only 125 pJ/sample under a 1.4 V power supply and the chip area efficiency is 84 k μm²·cycles/sample.

  12. Single Photon Counting Large Format Imaging Sensors with High Spatial and Temporal Resolution

    Science.gov (United States)

    Siegmund, O. H. W.; Ertley, C.; Vallerga, J. V.; Cremer, T.; Craven, C. A.; Lyashenko, A.; Minot, M. J.

    High time resolution astronomical and remote sensing applications have been addressed with sealed-tube, microchannel-plate-based imaging and photon time-tagging detector schemes. These are being realized with the advent of cross strip readout techniques with high performance encoding electronics and atomic layer deposited (ALD) microchannel plate technologies. Sealed tube devices up to 20 cm square have now been successfully implemented with sub-nanosecond timing and imaging. The objective is to provide sensors with large areas (25 cm2 to 400 cm2), high spatial resolution, event rates of 5 MHz, and event timing accuracy of 100 ps. High-performance ASIC versions of these electronics are in development, with improved event rate, power and mass suitable for spaceflight instruments.

  13. Wide dynamic logarithmic InGaAs sensor suitable for eye-safe active imaging

    Science.gov (United States)

    Ni, Yang; Bouvier, Christian; Arion, Bogdan; Noguier, Vincent

    2016-05-01

    In this paper, we present a simple method to analyze the injection efficiency of the photodiode interface circuit under fast shuttering conditions for active imaging applications. This simple model has been inspired by the companion model for reactive elements widely used in CAD. We demonstrate that the traditional CTIA photodiode interface is not adequate for active imaging, where fast and precise shuttering operation is necessary. Afterwards we present a direct-amplification-based photodiode interface which can provide accurate and fast shuttering of the photodiode. These considerations have been used in NIT's newly developed ROIC and corresponding SWIR sensors, both in VGA 15 μm pitch (NSC1201) and in QVGA 25 μm pitch (NSC1401).

  14. An automatic analyzer of solid state nuclear track detectors using an optic RAM as image sensor

    International Nuclear Information System (INIS)

    Staderini, E.M.; Castellano, A.

    1986-01-01

    An optic RAM is a conventional digital random access read/write dynamic memory device featuring a quartz-windowed package and memory cells regularly ordered on the chip. Such a device can be used as an image sensor because each cell retains the data stored in it for a time depending on the intensity of the light incident on the cell itself. The authors have developed a system which uses an optic RAM to acquire and digitize images from electrochemically etched CR39 solid state nuclear track detectors (SSNTD) at track densities up to 5000 cm⁻². On the digital image so obtained, a microprocessor with appropriate software performs image analysis, filtering, track counting and evaluation. (orig.)

  15. Self-amplified CMOS image sensor using a current-mode readout circuit

    Science.gov (United States)

    Santos, Patrick M.; de Lima Monteiro, Davies W.; Pittet, Patrick

    2014-05-01

    The feature size of CMOS processes has decreased during the past few years, and problems such as reduced dynamic range have become more significant in voltage-mode pixels, even though the integration of more functionality inside the pixel has become easier. This work makes a contribution on both sides: the possibility of a high signal excursion range using current-mode circuits, together with added functionality by performing signal amplification inside the pixel. The classic 3T pixel architecture was rebuilt with small modifications to integrate a transconductance amplifier providing a current as an output. The matrix of these new pixels operates as a whole large transistor sourcing an amplified current that is used for signal processing. This current is controlled by the intensity of the light received by the matrix, modulated pixel by pixel. The output current can be controlled by the biasing circuits to achieve a very large range of output signal levels. It can also be controlled with the matrix size, and this permits a very high degree of freedom in the signal level, observing the current densities inside the integrated circuit. In addition, the matrix can operate at very small integration times. Its applications would be those in which fast image processing and high signal amplification are required and low resolution is not a major problem, such as UV image sensors. Simulation results are presented to support the operation, control, design, signal excursion levels and linearity of a matrix of pixels conceived using this new concept of sensor.

  16. Image registration of naval IR images

    Science.gov (United States)

    Rodland, Arne J.

    1996-06-01

    In a real-world application, an image from a stabilized sensor on a moving platform will not be 100 percent stabilized. There will always be a small unknown error in the stabilization due to factors such as dynamic deformations in the structure between the sensor and the reference Inertial Navigation Unit, servo inaccuracies, etc. For a high-resolution imaging sensor this stabilization error causes the image to move several pixels in an unknown direction between frames. To be able to detect and track small moving objects from such a sensor, this unknown movement of the sensor image must be estimated. An algorithm that searches for land contours in the image has been evaluated. The algorithm searches for high-contrast points distributed over the whole image. As long as moving objects in the scene only cover a small area of the scene, most of the points are located on solid ground. By matching the list of points from frame to frame, the movement of the image due to stabilization errors can be estimated and compensated. The point list is searched for points whose movement diverges from the estimated stabilization error. These points are then assumed to be located on moving objects. Points assumed to be located on moving objects are gradually exchanged with new points located in the same area. Most of the processing is performed on the list of points and not on the complete image. The algorithm is therefore very fast and well suited for real-time implementation. The algorithm has been tested on images from an experimental IR scanner. Stabilization errors were added artificially to the images so that the output from the algorithm could be compared with the artificially added stabilization errors.
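
    A minimal NumPy sketch of the point-based registration idea described in this record is given below. The block size, matching window, search range and outlier threshold are illustrative assumptions rather than values from the paper.

```python
import numpy as np

def high_contrast_points(img, block=16, n_points=40):
    """Pick the strongest-gradient pixel in each image block, keep the best n."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    h, w = img.shape
    pts = []
    for y0 in range(0, h - block + 1, block):
        for x0 in range(0, w - block + 1, block):
            patch = mag[y0:y0 + block, x0:x0 + block]
            dy, dx = np.unravel_index(np.argmax(patch), patch.shape)
            pts.append((y0 + dy, x0 + dx, patch[dy, dx]))
    pts.sort(key=lambda p: -p[2])
    return [(y, x) for y, x, _ in pts[:n_points]]

def estimate_stabilization_error(prev, curr, pts, search=5, win=9, thresh=2.0):
    """Match points between frames; the median shift estimates the stabilization
    error, and points whose shift deviates from it are flagged as possibly moving."""
    h, w = prev.shape
    half = win // 2
    margin = half + search
    shifts, kept = [], []
    for y, x in pts:
        if not (margin <= y < h - margin and margin <= x < w - margin):
            continue
        ref = prev[y - half:y + half + 1, x - half:x + half + 1].astype(float)
        best, best_ssd = (0, 0), np.inf
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                cand = curr[y + dy - half:y + dy + half + 1,
                            x + dx - half:x + dx + half + 1].astype(float)
                ssd = np.sum((cand - ref) ** 2)
                if ssd < best_ssd:
                    best_ssd, best = ssd, (dy, dx)
        shifts.append(best)
        kept.append((y, x))
    shifts = np.array(shifts, dtype=float)
    global_shift = np.median(shifts, axis=0)            # estimated image movement
    moving = np.abs(shifts - global_shift).max(axis=1) > thresh
    return global_shift, [p for p, m in zip(kept, moving) if m]
```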

  17. Deep Learning-Based Banknote Fitness Classification Using the Reflection Images by a Visible-Light One-Dimensional Line Image Sensor

    Directory of Open Access Journals (Sweden)

    Tuyen Danh Pham

    2018-02-01

    Full Text Available In automatic paper currency sorting, fitness classification is a technique that assesses the quality of banknotes to determine whether a banknote is suitable for recirculation or should be replaced. Studies on using visible-light reflection images of banknotes for evaluating their usability have been reported. However, most of them were conducted under the assumption that the denomination and input direction of the banknote are predetermined. In other words, a pre-classification of the type of input banknote is required. To address this problem, we proposed a deep learning-based fitness-classification method that recognizes the fitness level of a banknote regardless of the denomination and input direction of the banknote to the system, using the reflection images of banknotes by a visible-light one-dimensional line image sensor and a convolutional neural network (CNN). Experimental results on the banknote image databases of the Korean won (KRW) and the Indian rupee (INR) with three fitness levels, and the United States dollar (USD) with two fitness levels, showed that our method gives better classification accuracy than other methods.

  18. Deep Learning-Based Banknote Fitness Classification Using the Reflection Images by a Visible-Light One-Dimensional Line Image Sensor.

    Science.gov (United States)

    Pham, Tuyen Danh; Nguyen, Dat Tien; Kim, Wan; Park, Sung Ho; Park, Kang Ryoung

    2018-02-06

    In automatic paper currency sorting, fitness classification is a technique that assesses the quality of banknotes to determine whether a banknote is suitable for recirculation or should be replaced. Studies on using visible-light reflection images of banknotes for evaluating their usability have been reported. However, most of them were conducted under the assumption that the denomination and input direction of the banknote are predetermined. In other words, a pre-classification of the type of input banknote is required. To address this problem, we proposed a deep learning-based fitness-classification method that recognizes the fitness level of a banknote regardless of the denomination and input direction of the banknote to the system, using the reflection images of banknotes by a visible-light one-dimensional line image sensor and a convolutional neural network (CNN). Experimental results on the banknote image databases of the Korean won (KRW) and the Indian rupee (INR) with three fitness levels, and the United States dollar (USD) with two fitness levels, showed that our method gives better classification accuracy than other methods.
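
    As an illustration of the kind of pipeline this record describes, the PyTorch sketch below runs one training step of a small CNN on dummy line-sensor images. The layer sizes, input resolution and number of fitness classes are assumptions; the paper's actual network architecture is not reproduced here.

```python
import torch
import torch.nn as nn

class FitnessCNN(nn.Module):
    """Small CNN mapping a banknote reflection image to a fitness class.

    The layer sizes, the 1 x 64 x 128 input resolution and the number of
    classes are illustrative assumptions, not the architecture of the paper.
    """
    def __init__(self, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 8 * 16, 128), nn.ReLU(),
            nn.Linear(128, n_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# One training step on a dummy batch of grayscale reflection images.
model = FitnessCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()
images = torch.randn(8, 1, 64, 128)
labels = torch.randint(0, 3, (8,))           # fitness levels, e.g. fit / normal / unfit
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```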

  19. Preferential Hyperacuity Perimeter (PreView PHP) for detecting choroidal neovascularization study.

    Science.gov (United States)

    Alster, Yair; Bressler, Neil M; Bressler, Susan B; Brimacombe, Judith A; Crompton, R Michael; Duh, Yi-Jing; Gabel, Veit-Peter; Heier, Jeffrey S; Ip, Michael S; Loewenstein, Anat; Packo, Kirk H; Stur, Michael; Toaff, Techiya

    2005-10-01

    To assess the ability of the Preferential Hyperacuity Perimeter (PreView PHP; Carl Zeiss Meditec, Dublin, CA) to detect recent-onset choroidal neovascularization (CNV) resulting from age-related macular degeneration (AMD) and to differentiate it from an intermediate stage of AMD. Prospective, comparative, concurrent, nonrandomized, multicenter study. Eligible participants' study eyes had a corrected visual acuity of 20/160 or better and either untreated CNV from AMD diagnosed within the last 60 days or an intermediate stage of AMD. After obtaining consent, visual acuity with habitual correction, masked PHP testing, stereoscopic color fundus photography, and fluorescein angiography were performed. Photographs and angiograms were evaluated by graders masked to diagnosis and PHP results. The reading center's diagnosis determined if the patient was categorized as having intermediate AMD or neovascular AMD. A successful study outcome was defined a priori as a sensitivity of at least 80% and a specificity of at least 80%. Of 185 patients who gave consent to be enrolled, 11 (6%) had PHP results judged to be unreliable. An additional 52 were not included because they did not meet all eligibility criteria. Of the remaining 122 patients, 57 had an intermediate stage of AMD and 65 had neovascular AMD. The sensitivity to detect newly diagnosed CNV using PHP testing was 82% (95% confidence interval [CI], 70%-90%). The specificity to differentiate newly diagnosed CNV from the intermediate stage of AMD using PHP testing was 88% (95% CI, 76%-95%). Preferential Hyperacuity Perimeter testing can detect recent-onset CNV resulting from AMD and can differentiate it from an intermediate stage of AMD with high sensitivity and specificity. These data suggest that monitoring with PHP should detect most cases of CNV of recent onset with few false-positive results at a stage when treatment usually would be beneficial. Thus, this monitoring should be considered in the management of the
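
    The sensitivity and specificity quoted above come from a simple 2x2 table of reading-center diagnosis versus PHP result. The sketch below shows that computation with exact Clopper-Pearson 95% intervals via SciPy; the individual cell counts are placeholders chosen only to be consistent with the reported group sizes, not data taken from the study.

```python
from scipy.stats import beta

def proportion_ci(k, n, level=0.95):
    """Exact (Clopper-Pearson) confidence interval for a binomial proportion."""
    alpha = 1.0 - level
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

# Placeholder 2x2 counts: rows = reading-center diagnosis, columns = PHP result.
tp, fn = 53, 12    # neovascular AMD eyes: PHP positive / PHP negative
tn, fp = 50, 7     # intermediate AMD eyes: PHP negative / PHP positive

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(sensitivity, proportion_ci(tp, tp + fn))
print(specificity, proportion_ci(tn, tn + fp))
```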

  20. Cost-Efficient Wafer-Level Capping for MEMS and Imaging Sensors by Adhesive Wafer Bonding

    Directory of Open Access Journals (Sweden)

    Simon J. Bleiker

    2016-10-01

    Full Text Available Device encapsulation and packaging often constitutes a substantial part of the fabrication cost of micro-electro-mechanical systems (MEMS) transducers and imaging sensor devices. In this paper, we propose a simple and cost-effective wafer-level capping method that utilizes a limited number of highly standardized process steps as well as low-cost materials. The proposed capping process is based on low-temperature adhesive wafer bonding, which ensures full complementary metal-oxide-semiconductor (CMOS) compatibility. All necessary fabrication steps for the wafer bonding, such as cavity formation and deposition of the adhesive, are performed on the capping substrate. The polymer adhesive is deposited by spray-coating on the capping wafer containing the cavities. Thus, no lithographic patterning of the polymer adhesive is needed, and material waste is minimized. Furthermore, this process does not require any additional fabrication steps on the device wafer, which lowers the process complexity and fabrication costs. We demonstrate the proposed capping method by packaging two different MEMS devices. The two MEMS devices include a vibration sensor and an acceleration switch, which employ two different electrical interconnection schemes. The experimental results show wafer-level capping with excellent bond quality due to the re-flow behavior of the polymer adhesive. No impediment to the functionality of the MEMS devices was observed, which indicates that the encapsulation does not introduce significant tensile or compressive stresses. Thus, we present a highly versatile, robust, and cost-efficient capping method for components such as MEMS and imaging sensors.

  1. Farmers’ willingness to pay for surface water in the West Mitidja irrigated perimeter, northern Algeria

    International Nuclear Information System (INIS)

    Azzi, M.; Calatrava, J.; Bedrani, S.

    2018-01-01

    Algeria is among the most water-stressed countries in the world. Because of its climatic conditions, irrigation is essential for agricultural production. Water prices paid by farmers in public irrigation districts are very low and do not cover the operation and maintenance (O&M) costs of the irrigated perimeters, thus leading to the deterioration of these infrastructures. The objective of this paper is to analyse whether farmers in the West Mitidja irrigation district in northern Algeria would be willing to pay more for surface water in order to maintain the water supply service in its current conditions. We estimated farmers’ willingness to pay (WTP) for water using data from a dichotomous choice contingent valuation survey of 112 randomly selected farmers. Farmers’ responses were modelled using logistic regression techniques. We also analysed which technical, structural, social and economic characteristics of farms and farmers explain the differences in WTP. Our results showed that nearly 80% of the surveyed farmers are willing to pay an extra price for irrigation water. The average WTP was 64% greater than the price currently paid by farmers, suggesting some scope for improving the financial resources of the Mitidja irrigated perimeter, but insufficient to cover all O&M costs. Some of the key identified factors that affect WTP for surface water relate to farm ownership, access to groundwater resources, cropping patterns, farmers’ agricultural training and risk exposure.
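
    The dichotomous-choice analysis described above can be sketched with a standard single-bounded logit on synthetic data, as below. The bid design, the linear-in-bid specification and the mean-WTP formula (-intercept/slope) are textbook choices for this class of model, not details confirmed by the paper.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Synthetic survey: each farmer sees one bid (an extra price) and answers yes/no.
n = 112
bids = rng.choice([10, 20, 40, 80, 160], size=n).astype(float)
true_alpha, true_beta = 3.0, -0.03                 # assumed "true" preference parameters
p_yes = 1 / (1 + np.exp(-(true_alpha + true_beta * bids)))
answers = rng.binomial(1, p_yes)

# Logit of the yes/no response on the bid amount (plus an intercept).
X = sm.add_constant(bids)
model = sm.Logit(answers, X).fit(disp=False)
alpha, beta = model.params

# For a linear-in-bid logit, mean/median WTP = -alpha / beta (Hanemann-style formula).
print("estimated mean WTP:", -alpha / beta)
```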

  2. Image-receptor performance: a comparison of Trophy RVG UI sensor and Kodak Ektaspeed Plus film.

    Science.gov (United States)

    Ludlow, J; Mol, A

    2001-01-01

    Objective. This study compares the physical characteristics of the RVG UI sensor (RVG) with Ektaspeed Plus film. Dose-response curves were generated for film and for each of 6 available RVG modes. An aluminum step-wedge was used to evaluate exposure latitude. Spatial resolution was assessed by using a line-pair test tool. Latitude and resolution were assessed by observers for both modalities. The RVG was further characterized by its modulation transfer function. Exposure latitude was equal for film and RVG in the periodontal mode. Other gray scale modes demonstrated much lower latitude. The average maximum resolution was 15.3 line-pairs per millimeter (lp/mm) for RVG in high-resolution mode, 10.5 lp/mm for RVG in low-resolution mode, and 20 lp/mm for film (P <.0001). Modulation transfer function measurements supported the subjective assessments. In periodontal mode, the RVG UI sensor demonstrates exposure latitude similar to that of Ektaspeed Plus film. Film images exhibit significantly higher spatial resolution than the RVG images acquired in high-resolution mode.

  3. Polymer Optical Fibre Sensors for Endoscopic Opto-Acoustic Imaging

    DEFF Research Database (Denmark)

    Broadway, Christian; Gallego, Daniel; Woyessa, Getinet

    2015-01-01

    in existing publications. A great advantage can be obtained for endoscopy due to a small size and array potential to provide discrete imaging speed improvements. Optical fibre exhibits numerous advantages over conventional piezo-electric transducers, such as immunity from electromagnetic interference...... is the physical size of the device, allowing compatibility with current technology, while governing flexibility of the distal end of the endoscope based on the needs of the sensor. Polymer optical fibre (POF) presents a novel approach for endoscopic applications and has been positively discussed and compared...... and a higher resolution at small sizes. Furthermore, micro structured polymer optical fibres offer over 12 times the sensitivity of silica fibre. We present a polymer fibre Bragg grating ultrasound detector with a core diameter of 125 microns. We discuss the ultrasonic signals received and draw conclusions...

  4. Rigorous derivation of the perimeter generating functions for the mean-squared radius of gyration of rectangular, Ferrers and pyramid polygons

    International Nuclear Information System (INIS)

    Lin, Keh Ying

    2006-01-01

    We have derived rigorously the perimeter generating functions for the mean-squared radius of gyration of rectangular, Ferrers and pyramid polygons. These functions were found by Jensen recently. His nonrigorous results are based on the analysis of the long series expansions. (comment)

  5. Coseismic displacements from SAR image offsets between different satellite sensors: Application to the 2001 Bhuj (India) earthquake

    KAUST Repository

    Wang, Teng; Wei, Shengji; Jonsson, Sigurjon

    2015-01-01

    preearthquake ERS and postearthquake Envisat images. The rupture model estimated from these cross-sensor offsets and teleseismic waveforms shows a compact fault slip pattern with fairly short rise times (<3 s) and a large stress drop (20 MPa), explaining

  6. Starshade mechanical design for the Habitable Exoplanet imaging mission concept (HabEx)

    Science.gov (United States)

    Arya, Manan; Webb, David; McGown, James; Lisman, P. Douglas; Shaklan, Stuart; Bradford, S. Case; Steeves, John; Hilgemann, Evan; Trease, Brian; Thomson, Mark; Warwick, Steve; Freebury, Gregg; Gull, Jamie

    2017-09-01

    An external occulter for starlight suppression - a starshade - flying in formation with the Habitable Exoplanet Imaging Mission Concept (HabEx) space telescope could enable the direct imaging and spectrographic characterization of Earthlike exoplanets in the habitable zone. This starshade is flown between the telescope and the star, and suppresses stellar light sufficiently to allow the imaging of the faint reflected light of the planet. This paper presents a mechanical architecture for this occulter, which must stow in a 5 m-diameter launch fairing and then deploy to a structure about 80 m in diameter. The optical performance of the starshade requires that the edge profile be accurate and stable. The stowage and deployment of the starshade to meet these requirements present unique challenges that are addressed in this proposed architecture. The mechanical architecture consists of a number of petals attached to a deployable perimeter truss, which is connected to a central hub using tensioned spokes. The petals are furled around the stowed perimeter truss for launch. Herein is described a mechanical design solution that supports an 80 m-class starshade for flight as part of HabEx.

  7. Photonic sensor opportunities for distributed and wireless systems in security applications

    Science.gov (United States)

    Krohn, David

    2006-10-01

    There is a broad range of homeland security sensing applications that can be facilitated by distributed fiber optic sensors and photonics-integrated wireless systems. These applications include [1]: Pipeline (Monitoring, Security); Smart structures (Bridges, Tunnels, Dams, Public spaces); Power lines (Monitoring, Security); Transportation security; Chemical/biological detection; Wide area surveillance - perimeter; and Port Security (Underwater surveillance, Cargo container). Many vital assets which cover wide areas, such as pipelines and borders, are under constant threat of being attacked or breached. There is a rapidly emerging need to be able to provide identification of intrusion threats to such vital assets. Similar problems exist for monitoring basic infrastructure such as water supply, power utilities, communications systems as well as transportation. There is a need to develop a coordinated and integrated solution for the detection of threats. From a sensor standpoint, consideration must not be limited to detection, but must extend to how detection leads to intervention and deterrence. Fiber optic sensor technology must be compatible with other surveillance technologies, such as wireless mote technology, to facilitate integration. In addition, the multi-functionality of fiber optic sensors must be expanded to include bio-chemical detection. There have been a number of barriers to the acceptance and broad use of smart fiber optic sensors. Compared to telecommunications, the volume is low. This fact, coupled with proprietary and custom specifications, has kept the price of fiber optic sensors high. There is a general lack of a manufacturing infrastructure and a lack of standards for packaging and reliability. Also, there are several competing technologies; some are photonic based and other approaches are based on conventional non-photonic technologies.

  8. Recognizing Banknote Fitness with a Visible Light One Dimensional Line Image Sensor

    Directory of Open Access Journals (Sweden)

    Tuyen Danh Pham

    2015-08-01

    Full Text Available In general, dirty banknotes that have creases or soiled surfaces should be replaced by new banknotes, whereas clean banknotes should be recirculated. Therefore, the accurate classification of banknote fitness when sorting paper currency is an important and challenging task. Most previous research has focused on sensors that used visible, infrared, and ultraviolet light. Furthermore, there was little previous research on fitness classification for Indian paper currency. Therefore, we propose a new method for classifying the fitness of Indian banknotes with a one-dimensional line image sensor that uses only visible light. The fitness of banknotes is usually determined by various factors such as soiling, creases, and tears, etc., although we consider only banknote soiling in our research. This research is novel in the following four ways: first, there has been little research conducted on fitness classification for the Indian Rupee using visible-light images. Second, the classification is conducted based on the features extracted from the regions of interest (ROIs), which contain little texture. Third, 1-level discrete wavelet transformation (DWT) is used to extract the features for discriminating between fit and unfit banknotes. Fourth, the optimal DWT features that represent the fitness and unfitness of banknotes are selected based on linear regression analysis with ground-truth data measured by a densitometer. In addition, the selected features are used as the inputs to a support vector machine (SVM) for the final classification of banknote fitness. Experimental results showed that our method outperforms other methods.

  9. Recognizing Banknote Fitness with a Visible Light One Dimensional Line Image Sensor.

    Science.gov (United States)

    Pham, Tuyen Danh; Park, Young Ho; Kwon, Seung Yong; Nguyen, Dat Tien; Vokhidov, Husan; Park, Kang Ryoung; Jeong, Dae Sik; Yoon, Sungsoo

    2015-08-27

    In general, dirty banknotes that have creases or soiled surfaces should be replaced by new banknotes, whereas clean banknotes should be recirculated. Therefore, the accurate classification of banknote fitness when sorting paper currency is an important and challenging task. Most previous research has focused on sensors that used visible, infrared, and ultraviolet light. Furthermore, there was little previous research on fitness classification for Indian paper currency. Therefore, we propose a new method for classifying the fitness of Indian banknotes, with a one-dimensional line image sensor that uses only visible light. The fitness of banknotes is usually determined by various factors such as soiling, creases, and tears, etc., although we consider only banknote soiling in our research. This research is novel in the following four ways: first, there has been little research conducted on fitness classification for the Indian Rupee using visible-light images. Second, the classification is conducted based on the features extracted from the regions of interest (ROIs), which contain little texture. Third, 1-level discrete wavelet transformation (DWT) is used to extract the features for discriminating between fit and unfit banknotes. Fourth, the optimal DWT features that represent the fitness and unfitness of banknotes are selected based on linear regression analysis with ground-truth data measured by a densitometer. In addition, the selected features are used as the inputs to a support vector machine (SVM) for the final classification of banknote fitness. Experimental results showed that our method outperforms other methods.
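
    A compact sketch of the DWT-feature-plus-SVM pipeline described in this record, using PyWavelets and scikit-learn on synthetic ROIs, is shown below. The wavelet, the subband statistics used as features and the synthetic data are assumptions; the paper's regression-based feature selection against densitometer ground truth is omitted.

```python
import numpy as np
import pywt
from sklearn.svm import SVC

def dwt_features(roi):
    """1-level 2D DWT of a banknote ROI; summarize each subband by mean magnitude."""
    cA, (cH, cV, cD) = pywt.dwt2(roi.astype(float), 'haar')
    return np.array([np.mean(np.abs(b)) for b in (cA, cH, cV, cD)])

# Synthetic stand-in data: 'unfit' ROIs are darker and noisier than 'fit' ones.
rng = np.random.default_rng(1)
fit_rois = [200 + 5 * rng.standard_normal((32, 64)) for _ in range(50)]
unfit_rois = [150 + 20 * rng.standard_normal((32, 64)) for _ in range(50)]

X = np.array([dwt_features(r) for r in fit_rois + unfit_rois])
y = np.array([0] * 50 + [1] * 50)          # 0 = fit, 1 = unfit

clf = SVC(kernel='rbf').fit(X, y)
print(clf.score(X, y))
```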

  10. High-Performance Motion Estimation for Image Sensors with Video Compression

    Directory of Open Access Journals (Sweden)

    Weizhi Xu

    2015-08-01

    Full Text Available It is important to reduce the time cost of video compression for image sensors in video sensor networks. Motion estimation (ME) is the most time-consuming part of video compression. Previous work on ME exploited intra-frame data reuse in a reference frame to improve the time efficiency but neglected inter-frame data reuse. We propose a novel inter-frame data reuse scheme which can exploit both intra-frame and inter-frame data reuse for ME in video compression (VC-ME). Pixels of reconstructed frames are kept on-chip until they are used by the next current frame to avoid off-chip memory access. On-chip buffers with smart schedules of data access are designed to perform the new data reuse scheme. Three levels of the proposed inter-frame data reuse scheme are presented and analyzed. They give different choices with a tradeoff between off-chip bandwidth requirement and on-chip memory size. All three levels have better data reuse efficiency than their intra-frame counterparts, so off-chip memory traffic is reduced effectively. Comparing the new inter-frame data reuse scheme with the traditional intra-frame data reuse scheme, the memory traffic can be reduced by 50% for VC-ME.
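
    The record's contribution is a hardware buffering schedule, which is hard to show in a few lines; for context, the sketch below implements the plain full-search block-matching kernel whose reference-frame reads such data-reuse schemes are designed to serve from on-chip buffers. Block size and search range are illustrative.

```python
import numpy as np

def block_matching_me(current, reference, block=16, search=8):
    """Full-search block matching: one motion vector per block of the current frame.

    Every candidate block read from `reference` represents the memory traffic
    that intra- and inter-frame data-reuse schemes try to keep on-chip.
    """
    h, w = current.shape
    vectors = {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            cur = current[by:by + block, bx:bx + block].astype(float)
            best, best_sad = (0, 0), np.inf
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if y < 0 or x < 0 or y + block > h or x + block > w:
                        continue
                    ref = reference[y:y + block, x:x + block].astype(float)
                    sad = np.sum(np.abs(cur - ref))
                    if sad < best_sad:
                        best_sad, best = sad, (dy, dx)
            vectors[(by, bx)] = best
    return vectors
```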

  11. The EO-1 hyperion and advanced land imager sensors for use in tundra classification studies within the Upper Kuparuk River Basin, Alaska

    Science.gov (United States)

    Hall-Brown, Mary

    The heterogeneity of Arctic vegetation can make land cover classification very difficult when using medium to small resolution imagery (Schneider et al., 2009; Muller et al., 1999). Using high radiometric and spatial resolution imagery, such as the SPOT 5 and IKONOS satellites, has helped arctic land cover classification accuracies rise into the 80 and 90 percentiles (Allard, 2003; Stine et al., 2010; Muller et al., 1999). However, those increases usually come at a high price. High resolution imagery is very expensive and can often add tens of thousands of dollars onto the cost of the research. The EO-1 satellite launched in 2002 carries two sensors that have high spectral and/or high spatial resolutions and can be an acceptable compromise between the resolution versus cost issues. The Hyperion is a hyperspectral sensor with the capability of collecting 242 spectral bands of information. The Advanced Land Imager (ALI) is an advanced multispectral sensor whose spatial resolution can be sharpened to 10 meters. This dissertation compares the accuracies of arctic land cover classifications produced by the Hyperion and ALI sensors to the classification accuracies produced by the Système Pour l'Observation de la Terre (SPOT), the Landsat Thematic Mapper (TM) and the Landsat Enhanced Thematic Mapper Plus (ETM+) sensors. Hyperion and ALI images from August 2004 were collected over the Upper Kuparuk River Basin, Alaska. Image processing included the stepwise discriminant analysis of pixels that were positively classified from coinciding ground control points, geometric and radiometric correction, and principal component analysis. Finally, stratified random sampling was used to perform accuracy assessments on satellite-derived land cover classifications. Accuracy was estimated from an error matrix (confusion matrix) that provided the overall, producer's and user's accuracies. This research found that while the Hyperion sensor produced classification accuracies that were
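
    The overall, producer's and user's accuracies mentioned above follow directly from the error (confusion) matrix. A short sketch of that computation with an invented 3-class matrix is given below.

```python
import numpy as np

# Rows = reference (ground truth) classes, columns = mapped (classified) classes.
# The counts are invented for illustration only.
error_matrix = np.array([[50,  4,  1],
                         [ 6, 40,  9],
                         [ 2,  5, 48]])

total = error_matrix.sum()
diag = np.diag(error_matrix)

overall_accuracy = diag.sum() / total
producers_accuracy = diag / error_matrix.sum(axis=1)   # per class, reflects omission errors
users_accuracy = diag / error_matrix.sum(axis=0)       # per class, reflects commission errors

print(overall_accuracy, producers_accuracy, users_accuracy)
```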

  12. Adaptive Gain and Analog Wavelet Transform for Low-Power Infrared Image Sensors

    Directory of Open Access Journals (Sweden)

    P. Villard

    2012-01-01

    Full Text Available A decorrelation and analog-to-digital conversion scheme aiming to reduce the power consumption of infrared image sensors is presented in this paper. To exploit both intra-frame redundancy and inherent photon shot noise characteristics, a column-based 1D Haar analog wavelet transform combined with variable-gain amplification prior to A/D conversion is used. This allows the use of only an 11-bit ADC, instead of a 13-bit one, and saves 15% of the data transfer. An 8×16-pixel test circuit demonstrates this functionality.
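
    A numeric sketch of the column-wise 1D Haar split plus variable gain described in this record is shown below. In the chip these operations happen in the analog domain before the ADC; the gain value, bit width and full-scale range here are illustrative assumptions.

```python
import numpy as np

def haar_1d(column):
    """One level of the 1D Haar transform along a pixel column."""
    even, odd = column[0::2].astype(float), column[1::2].astype(float)
    approx = (even + odd) / 2.0          # low-pass: local mean
    detail = (even - odd) / 2.0          # high-pass: small for correlated pixels
    return approx, detail

def encode_column(column, detail_gain=8.0, adc_bits=11, full_scale=4096.0):
    """Amplify the small detail coefficients before quantizing with a shorter ADC."""
    approx, detail = haar_1d(column)
    lsb = full_scale / 2**adc_bits
    q_approx = np.round(approx / lsb)
    q_detail = np.round(detail * detail_gain / lsb)
    return q_approx, q_detail

col = np.array([1000, 1003, 998, 1001, 1500, 1504, 1499, 1502])
print(encode_column(col))
```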

  13. An Intrusion Detection System for the Protection of Railway Assets Using Fiber Bragg Grating Sensors

    Directory of Open Access Journals (Sweden)

    Angelo Catalano

    2014-09-01

    Full Text Available We demonstrate the ability of Fiber Bragg Grating (FBG) sensors to protect large areas from unauthorized activities in railway scenarios such as stations or tunnels. We report on the technological strategy adopted to protect a specific depot, representative of a common scenario for security applications in the railway environment. One of the concerns in the protection of a railway area centers on the presence of rail-tracks, which cannot be obstructed with physical barriers. We propose an integrated optical fiber system composed of FBG strain sensors that can detect human intrusion for protection of the perimeter, combined with FBG accelerometer sensors for protection of rail-track access. Several trials were carried out in indoor and outdoor environments. The results demonstrate that FBG strain sensors bonded under a ribbed rubber mat enable the detection of intruder break-in via the pressure induced on the mat, whereas the FBG accelerometers installed under the rails enable the detection of intruders walking close to the railroad tracks via the acoustic surface waves generated by footsteps. Based on a single enabling technology, this integrated system represents a valuable intrusion detection system for railway security and could be integrated with other sensing functionalities in the railway field using fiber optic technology.

  14. Recce NG: from Recce sensor to image intelligence (IMINT)

    Science.gov (United States)

    Larroque, Serge

    2001-12-01

    Recce NG (Reconnaissance New Generation) is presented as a complete and optimized Tactical Reconnaissance System. Based on a new-generation Pod integrating high-resolution Dual Band sensors, the system has been designed with the operational lessons learnt from the last Peace Keeping Operations in Bosnia and Kosovo. The technical solutions retained as component modules of a full IMINT acquisition system take advantage of the state of the art in the following key technologies: an Advanced Mission Planning System for long-range stand-off Manned Recce, Aircraft and/or Pod tasking, operating sophisticated back-up software tools, high-resolution 3D geo data and improved, combat-proven MMI to reduce planning delays; mature Dual Band sensor technology to achieve the Day and Night Recce Mission, including advanced automatic operational functions such as azimuth and roll tracking capabilities; low risk in Pod integration and in carrier avionics, controls and displays upgrades, to save time in operational turnover and maintenance; a high-rate Imagery Down Link for Real Time or Near Real Time transmission, fully compatible with STANAG 7085 requirements; and an Advanced IMINT Exploitation Ground Segment, combat proven, NATO interoperable (STANAG 7023), integrating high-value software tools for accurate location, improved radiometric image processing and an open link to C4ISR systems. The choice of an industrial Prime contractor mastering all of these key products and technologies across the full system is mandatory for successful delivery in terms of low Cost, Risk and Time Schedule.

  15. Wireless multimedia sensor networks on reconfigurable hardware information reduction techniques

    CERN Document Server

    Ang, Li-minn; Chew, Li Wern; Yeong, Lee Seng; Chia, Wai Chong

    2013-01-01

    Traditional wireless sensor networks (WSNs) capture scalar data such as temperature, vibration, pressure, or humidity. Motivated by the success of WSNs and also with the emergence of new technology in the form of low-cost image sensors, researchers have proposed combining image and audio sensors with WSNs to form wireless multimedia sensor networks (WMSNs).

  16. 1T Pixel Using Floating-Body MOSFET for CMOS Image Sensors

    Directory of Open Access Journals (Sweden)

    Guo-Neng Lu

    2009-01-01

    Full Text Available We present a single-transistor pixel for CMOS image sensors (CIS). It is a floating-body MOSFET structure, which is used as a photo-sensing device and source-follower transistor, and can be controlled to store and evacuate charges. Our investigation into this 1T pixel structure includes modeling to obtain an analytical description of the conversion gain. Model validation has been done by comparing theoretical predictions and experimental results. On the other hand, the 1T pixel structure has been implemented in different configurations, including rectangular-gate and ring-gate designs, and variations of oxidation parameters for the fabrication process. The pixel characteristics are presented and discussed.

  17. 1T Pixel Using Floating-Body MOSFET for CMOS Image Sensors.

    Science.gov (United States)

    Lu, Guo-Neng; Tournier, Arnaud; Roy, François; Deschamps, Benoît

    2009-01-01

    We present a single-transistor pixel for CMOS image sensors (CIS). It is a floating-body MOSFET structure, which is used as a photo-sensing device and source-follower transistor, and can be controlled to store and evacuate charges. Our investigation into this 1T pixel structure includes modeling to obtain an analytical description of the conversion gain. Model validation has been done by comparing theoretical predictions and experimental results. On the other hand, the 1T pixel structure has been implemented in different configurations, including rectangular-gate and ring-gate designs, and variations of oxidation parameters for the fabrication process. The pixel characteristics are presented and discussed.

  18. The environmental impacts of land transformation in the coastal perimeter of the Mar Menor lagoon (Spain)

    OpenAIRE

    GARCÍA-AYLLON VEINTIMILLA, SALVADOR; Miralles García, José Luis

    2014-01-01

    The distinctive environment of the lagoon has long been attractive for visitors. A surge in touristic activities has taken place in the area since the early 1960s, characterised by intense urban development along the lagoon's perimeter to accommodate the growing seasonal population. This phenomenon has particular...

  19. Advances in multi-sensor data fusion: algorithms and applications.

    Science.gov (United States)

    Dong, Jiang; Zhuang, Dafang; Huang, Yaohuan; Fu, Jingying

    2009-01-01

    With the development of satellite and remote sensing techniques, more and more image data from airborne/satellite sensors have become available. Multi-sensor image fusion seeks to combine information from different images to obtain more inferences than can be derived from a single sensor. In image-based application fields, image fusion has emerged as a promising research area since the end of the last century. The paper presents an overview of recent advances in multi-sensor satellite image fusion. Firstly, the most popular existing fusion algorithms are introduced, with emphasis on their recent improvements. Advances in the main application fields in remote sensing, including object identification, classification, change detection and maneuvering target tracking, are described. Both the advantages and limitations of those applications are then discussed. Recommendations are offered, including: (1) improvements of fusion algorithms; (2) development of "algorithm fusion" methods; (3) establishment of an automatic quality assessment scheme.

  20. Dual-Beam Antenna Design for Autonomous Sensor Network Applications

    Directory of Open Access Journals (Sweden)

    Jean-Marie Floc'h

    2012-01-01

    Full Text Available This paper describes our contribution to the ANR project called CAPNET, dedicated to site security (autonomous sensor network). The network is autonomous in terms of energy and is very easy to deploy on site (the time to deploy each node of the network is around 10 minutes). The first demonstrator was deployed in the fire base station of Brest, France, with 10 nodes and a security perimeter of around 1.5 km. Our contribution is in the field of antennas, with the development of two systems: a single-beam antenna reserved for the supervisor or the last node of the network, and a dual-beam antenna dedicated to the nodes in linear configuration. For the design and optimization of the antennas, we use the HFSS CAD software from ANSOFT. The antennas have been designed and successfully measured.

  1. Vapor transport deposition of large-area polycrystalline CdTe for radiation image sensor application

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Keedong; Cha, Bokyung; Heo, Duchang; Jeon, Sungchae [Korea Electrotechnology Research Institute, 111 Hanggaul-ro, Ansan-si, Gyeonggi-do 426-170 (Korea, Republic of)

    2014-07-15

    Vapor transport deposition (VTD) delivers saturated vapor to the substrate, resulting in a high-throughput and scalable process. In addition, VTD can maintain a lower substrate temperature than close-spaced sublimation (CSS). The motivation of this work is to adopt several advantages of VTD for radiation image sensor applications. Polycrystalline CdTe films were obtained on 300 mm × 300 mm indium tin oxide (ITO) coated glass. The polycrystalline CdTe film has a columnar structure with an average grain size of 3 μm to 9 μm, which can be controlled by changing the substrate temperature. In order to analyze electrical and X-ray characteristics, an ITO-CdTe-Al sandwich-structured device was fabricated. The effective resistivity of the polycrystalline CdTe film was ~1.4 × 10⁹ Ω·cm. The device was operated under hole-collection mode. The responsivity and the μτ product were estimated to be 6.8 μC/cm²·R and 5.5 × 10⁻⁷ cm²/V, respectively. VTD can be a process of choice for monolithic integration of CdTe thick films for radiation image sensors and CMOS/TFT circuitry. (copyright 2014 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)

  2. Farmers’ willingness to pay for surface water in the West Mitidja irrigated perimeter, northern Algeria

    Directory of Open Access Journals (Sweden)

    Malika Azzi

    2018-04-01

    Full Text Available Algeria is among the most water-stressed countries in the world. Because of its climatic conditions, irrigation is essential for agricultural production. Water prices paid by farmers in public irrigation districts are very low and do not cover the operation and maintenance (O&M) costs of the irrigated perimeters, thus leading to the deterioration of these infrastructures. The objective of this paper is to analyse whether farmers in the West Mitidja irrigation district in northern Algeria would be willing to pay more for surface water in order to maintain the water supply service in its current conditions. We estimated farmers’ willingness to pay (WTP) for water using data from a dichotomous choice contingent valuation survey of 112 randomly selected farmers. Farmers’ responses were modelled using logistic regression techniques. We also analysed which technical, structural, social and economic characteristics of farms and farmers explain the differences in WTP. Our results showed that nearly 80% of the surveyed farmers are willing to pay an extra price for irrigation water. The average WTP was 64% greater than the price currently paid by farmers, suggesting some scope for improving the financial resources of the Mitidja irrigated perimeter, but insufficient to cover all O&M costs. Some of the key identified factors that affect WTP for surface water relate to farm ownership, access to groundwater resources, cropping patterns, farmers’ agricultural training and risk exposure.

  3. Rapid immuno-analytical system physically integrated with lens-free CMOS image sensor for food-borne pathogens.

    Science.gov (United States)

    Jeon, Jin-Woo; Kim, Jee-Hyun; Lee, Jong-Mook; Lee, Won-Ho; Lee, Do-Young; Paek, Se-Hwan

    2014-02-15

    To realize an inexpensive, pocket-sized immunosensor system, a rapid test device based on cross-flow immunochromatography was physically combined with a lens-free CMOS image sensor (CIS), which was then applied to the detection of the food-borne pathogen Salmonella typhimurium (S. typhimurium). Two CISs, each retaining a 1.3 megapixel array, were mounted on a printed circuit board to fabricate a disposable sensing module connectable to a signal detection system. For the bacterial analysis, a cellulose membrane-based immunosensing platform, ELISA-on-a-chip (EOC), was employed and integrated with the CIS module, and the antigen-antibody reaction sites were aligned with the respective sensors. In this sensor construction, the chemiluminescent signals produced from the EOC are transferred directly into the sensors and converted to electric signals on the detector. The EOC-CIS integrated sensor was capable of detecting a traceable amount of the bacterium (4.22 × 10³ CFU/mL), nearly comparable to that of a sophisticated detector such as a cooled charge-coupled device, while having greatly reduced dimensions and cost. Upon coupling with immuno-magnetic separation, the sensor showed an additional 67-fold enhancement in the detection limit. Furthermore, a real sample test was carried out on fish muscle inoculated with 3.3 CFU of S. typhimurium per 10 g, which could be detected less than 6 h after the onset of pre-enrichment by culture. © 2013 Elsevier B.V. All rights reserved.

  4. Area G perimeter surface-soil and single-stage water sampling: Environmental surveillance for fiscal year 94, Group ESH-19. Progress report

    International Nuclear Information System (INIS)

    Conrad, R.; Childs, M.; Lyons, C.R.; Coriz, F.

    1996-08-01

    ESH-19 personnel collected soil and single-stage water samples around the perimeter of Area G at Los Alamos National Laboratory during FY94 to characterize possible contaminant movement out of Area G through surface-water and sediment runoff. These samples were analyzed for tritium, total uranium, isotopic plutonium, americium-241, and cesium-137. Ten metals were also analyzed on selected soils using analytical laboratory techniques. All radiochemical data are compared with analogous samples collected during FY 93 and reported in LA-12986. Baseline concentrations for future disposal operations were established for metals and radionuclides by a sampling program in the proposed Area G Expansion Area. Considering the amount of radioactive waste that has been disposed at Area G, there is evidence of only low concentrations of radionuclides on perimeter surface soils. Consequently, little radioactivity is leaving the confines of Area G via the surface water runoff pathway

  5. Simple force balance accelerometer/seismometer based on a tuning fork displacement sensor

    International Nuclear Information System (INIS)

    Stuart-Watson, D.; Tapson, J.

    2004-01-01

    Seismometers and microelectromechanical system accelerometers use the force-balance principle to obtain measurements. In these instruments the displacement of a mass object by an unknown force is sensed using a very high-resolution displacement sensor. The position of the object is then stabilized by applying an equal and opposite force to it. The magnitude of the stabilizing force is easily measured, and is assumed to be equivalent to the unknown force. These systems are critically dependent on the displacement sensor. In this article we use a resonant quartz tuning fork as the sensor. The tuning fork is operated so that its oscillation is lightly damped by the proximity of the movable mass object. Changes in the position of the mass object cause changes in the phase of the fork's resonance; this is used as the feedback variable in controlling the mass position. We have developed an acceleration sensor using this principle. The mass object is a piezoelectric bimorph diaphragm which is anchored around its perimeter, allowing direct electronic control of the displacement of its center. The tuning fork is brought very close to the diaphragm center, and is connected into a self-oscillating feedback circuit which has phase and amplitude as outputs. The diaphragm position is adjusted by a feedback loop, using phase as the feedback variable, to keep it in a constant position with respect to the tuning fork. The measured noise for this sensor is approximately 10.0 mg in a bandwidth of 100 Hz, which is substantially better than commercial systems of equivalent cost and size

  6. Novel Hall sensors developed for magnetic field imaging systems

    International Nuclear Information System (INIS)

    Cambel, Vladimir; Karapetrov, Goran; Novosad, Valentyn; Bartolome, Elena; Gregusova, Dagmar; Fedor, Jan; Kudela, Robert; Soltys, Jan

    2007-01-01

    We report here on the fabrication and application of novel planar Hall sensors based on shallow InGaP/AlGaAs/GaAs heterostructure with a two-dimensional electron gas (2DEG) as an active layer. The sensors are developed for two kinds of experiments. In the first one, magnetic samples are placed directly on the Hall sensor. Room temperature experiments of permalloy objects evaporated onto the sensor are presented. In the second experiment, the sensor scans close over a multigranular superconducting sample prepared on a YBCO thin film. Large-area and high-resolution scanning experiments were performed at 4.2 K with the Hall probe scanning system in a liquid helium flow cryostat

  7. Bilevel thresholding of sliced image of sludge floc.

    Science.gov (United States)

    Chu, C P; Lee, D J

    2004-02-15

    This work examined the feasibility of employing various thresholding algorithms to determine the optimal bilevel thresholding value for estimating the geometric parameters of sludge flocs from microtome-sliced images and from confocal laser scanning microscope images. Morphological information extracted from images depends on the bilevel thresholding value. According to the evaluation on the luminescence-inverted images and fractal curves (quadric Koch curve and Sierpinski carpet), Otsu's method yields more stable performance than other histogram-based algorithms and is chosen to obtain the porosity. The maximum convex perimeter method, however, can probe the shapes and spatial distribution of the pores among the biomass granules in real sludge flocs. A combined algorithm is recommended for probing the sludge floc structure.
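
    Otsu's method, identified above as the most stable of the histogram-based algorithms, can be written in a few lines of NumPy. In the sketch below, porosity is taken as the fraction of pixels below the Otsu threshold (pores assumed darker than biomass); this definition is an assumption for illustration rather than the paper's exact formulation.

```python
import numpy as np

def otsu_threshold(img):
    """Return the gray level that maximizes between-class variance (Otsu's method)."""
    hist, _ = np.histogram(img.ravel(), bins=256, range=(0, 256))
    p = hist.astype(float) / hist.sum()
    levels = np.arange(256)
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (levels[:t] * p[:t]).sum() / w0
        mu1 = (levels[t:] * p[t:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def porosity(img):
    """Fraction of pixels classified as pore (below the Otsu threshold)."""
    t = otsu_threshold(img)
    return float(np.mean(img < t))
```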

  8. Fabrication of CMOS-compatible nanopillars for smart bio-mimetic CMOS image sensors

    KAUST Repository

    Saffih, Faycal

    2012-06-01

    In this paper, nanopillars with heights of 1 μm to 5 μm and widths of 250 nm to 500 nm have been fabricated with a near-room-temperature etching process. The nanopillars were achieved with a continuous deep reactive ion etching technique utilizing PMMA (polymethylmethacrylate) and chromium as masking layers. As opposed to the conventional Bosch process, the use of the unswitched deep reactive ion etching technique resulted in nanopillars with smooth sidewalls, with a measured surface roughness of less than 40 nm. Moreover, undercut was nonexistent in the nanopillars. The proposed fabrication method achieves etch rates four times faster than the state of the art, leading to higher throughput and more vertical sidewalls. The fabrication of the nanopillars was carried out keeping the CMOS process in mind to ultimately obtain a CMOS-compatible process. This work serves as an initial step towards the ultimate objective of integrating photo-sensors based on these nanopillars seamlessly along with the controlling transistors to build a complete bio-inspired smart CMOS image sensor on the same wafer. © 2012 IEEE.

  9. Novel 2D-sequential color code system employing Image Sensor Communications for Optical Wireless Communications

    Directory of Open Access Journals (Sweden)

    Trang Nguyen

    2016-06-01

    Full Text Available The IEEE 802.15.7r1 Optical Wireless Communications Task Group (TG7r1), also known as the revision of the IEEE 802.15.7 Visible Light Communication standard targeting the commercial usage of visible light communication systems, is of interest in this paper. The paper is mainly concerned with the Image Sensor Communications (ISC) part of TG7r1; however, the major challenge facing ISC, as addressed in the Technical Consideration Document (TCD) of TG7r1, is image sensor compatibility among the variety of different commercial cameras on the market. One of the most challenging but interesting compatibility requirements is the need to support the verified presence of frame rate variation. This paper proposes a novel design for a 2D-sequential color code. Compared to a QR-code-based sequential transmission, the proposed 2D-sequential code can overcome the above challenge in that it is compatible with different frame rate variations and different shutter operations, and it has the ability to mitigate the rolling effect as well as the rotating effect while effectively minimizing transmission overhead. Practical implementations are demonstrated and a performance comparison is presented.

  10. The enhanced cyan fluorescent protein: a sensitive pH sensor for fluorescence lifetime imaging.

    Science.gov (United States)

    Poëa-Guyon, Sandrine; Pasquier, Hélène; Mérola, Fabienne; Morel, Nicolas; Erard, Marie

    2013-05-01

    pH is an important parameter that affects many functions of live cells, from protein structure or function to several crucial steps of their metabolism. Genetically encoded pH sensors based on pH-sensitive fluorescent proteins have been developed and used to monitor the pH of intracellular compartments. The quantitative analysis of pH variations can be performed either by ratiometric or fluorescence lifetime detection. However, most available genetically encoded pH sensors are based on green and yellow fluorescent proteins and are not compatible with multicolor approaches. Taking advantage of the strong pH sensitivity of enhanced cyan fluorescent protein (ECFP), we demonstrate here its suitability as a sensitive pH sensor using fluorescence lifetime imaging. The intracellular ECFP lifetime undergoes large changes (32 %) in the pH 5 to pH 7 range, which allows accurate pH measurements to better than 0.2 pH units. By fusion of ECFP with the granular chromogranin A, we successfully measured the pH in secretory granules of PC12 cells, and we performed a kinetic analysis of intragranular pH variations in living cells exposed to ammonium chloride.

  11. Cross-comparison of the IRS-P6 AWiFS sensor with the L5 TM, L7 ETM+, & Terra MODIS sensors

    Science.gov (United States)

    Chander, G.; Xiong, X.; Angal, A.; Choi, T.; Malla, R.

    2009-01-01

    As scientists and decision makers increasingly rely on multiple Earth-observing satellites to address urgent global issues, it is imperative that they can rely on the accuracy of Earth-observing data products. This paper focuses on the cross-comparison of the Indian Remote Sensing (IRS-P6) Advanced Wide Field Sensor (AWiFS) with the Landsat 5 (L5) Thematic Mapper (TM), Landsat 7 (L7) Enhanced Thematic Mapper Plus (ETM+), and Terra Moderate Resolution Imaging Spectroradiometer (MODIS) sensors. The cross-comparison was performed using image statistics based on large common areas observed by the sensors within 30 minutes of each other. Because of the limited availability of simultaneous observations between the AWiFS and the Landsat and MODIS sensors, only a few images were analyzed. These initial results are presented. Regression curves and coefficients of determination for the top-of-atmosphere (TOA) trends from these sensors were generated to quantify the uncertainty in these relationships and to provide an assessment of the calibration differences between these sensors. © 2009 SPIE.

  12. Open architecture of smart sensor suites

    Science.gov (United States)

    Müller, Wilmuth; Kuwertz, Achim; Grönwall, Christina; Petersson, Henrik; Dekker, Rob; Reinert, Frank; Ditzel, Maarten

    2017-10-01

    Experiences from recent conflicts show the strong need for smart sensor suites comprising different multi-spectral imaging sensors as core elements as well as additional non-imaging sensors. Smart sensor suites should be part of a smart sensor network - a network of sensors, databases, evaluation stations and user terminals. Its goal is to optimize the use of various information sources for military operations such as situation assessment, intelligence, surveillance, reconnaissance, target recognition and tracking. Such a smart sensor network will enable commanders to achieve higher levels of situational awareness. Within the study at hand, an open system architecture was developed in order to increase the efficiency of sensor suites. The open system architecture for smart sensor suites, based on a system-of-systems approach, enables combining different sensors in multiple physical configurations, such as distributed sensors, co-located sensors combined in a single package, tower-mounted sensors, sensors integrated in a mobile platform, and trigger sensors. The architecture was derived from a set of system requirements and relevant scenarios. Its mode of operation is adaptable to a series of scenarios with respect to relevant objects of interest, activities to be observed, available transmission bandwidth, etc. The presented open architecture is designed in accordance with the NATO Architecture Framework (NAF). The architecture allows smart sensor suites to be part of a surveillance network, linked e.g. to a sensor planning system and a C4ISR center, and to be used in combination with future RPAS (Remotely Piloted Aircraft Systems) for supporting a more flexible dynamic configuration of RPAS payloads.

  13. Pesticide residue quantification analysis by hyperspectral imaging sensors

    Science.gov (United States)

    Liao, Yuan-Hsun; Lo, Wei-Sheng; Guo, Horng-Yuh; Kao, Ching-Hua; Chou, Tau-Meu; Chen, Junne-Jih; Wen, Chia-Hsien; Lin, Chinsu; Chen, Hsian-Min; Ouyang, Yen-Chieh; Wu, Chao-Cheng; Chen, Shih-Yu; Chang, Chein-I.

    2015-05-01

    Pesticide residue detection in agricultural crops is a challenging issue, and it is even more difficult to quantify pesticide residues in agricultural produce and fruits. This paper conducts a series of baseline experiments which are particularly designed for three specific pesticides commonly used in Taiwan. The materials used for the experiments are single leaves of vegetable produce contaminated with various concentrations of pesticides. Two sensors are used to collect data. One is Fourier Transform Infrared (FTIR) spectroscopy. The other is a hyperspectral sensor, called the Geophysical and Environmental Research (GER) 2600 spectroradiometer, which is a battery-operated field-portable spectroradiometer with full real-time data acquisition from 350 nm to 2500 nm. In order to quantify data with different levels of pesticide residue concentration, several measures for spectral discrimination are developed. More specifically, new measures for calculating relative power between two sensors are designed to evaluate the effectiveness of each sensor in quantifying the used pesticide residues. The experimental results show that the GER is a better sensor than FTIR for pesticide residue quantification.

  14. AMA Conferences 2015. SENSOR 2015. 17th international conference on sensors and measurement technology. IRS2 2015. 14th international conference on infrared sensors and systems. Proceedings

    International Nuclear Information System (INIS)

    2015-01-01

    This meeting paper contains presentations of two conferences: SENSOR 2015 and IRS 2 (= International conference on InfraRed Sensors and systems). The first part of SENSOR 2015 contains the following chapters: (A) SENSOR PRINCIPLES: A.1: Mechanical sensors; A.2: Optical sensors; A.3: Ultrasonic sensors; A.4: Microacoustic sensors; A.5: Magnetic sensors; A.6: Impedance sensors; A.7: Gas sensors; A.8: Flow sensors; A.9: Dimensional measurement; A.10: Temperature and humidity sensors; A.11: Chemosensors; A.12: Biosensors; A.13: Embedded sensors; A.14: Sensor-actuator systems; (B) SENSOR TECHNOLOGY: B.1: Sensor design; B.2: Numerical simulation of sensors; B.3: Sensor materials; B.4: MEMS technology; B.5: Micro-Nano-Integration; B.6: Packaging; B.7: Materials; B.8: Thin films; B.9: Sensor production; B.10: Sensor reliability; B.11: Calibration and testing; B.12: Optical fibre sensors. (C) SENSOR ELECTRONICS AND COMMUNICATION: C.1: Sensor electronics; C.2: Sensor networks; C.3: Wireless sensors; C.4: Sensor communication; C.5: Energy harvesting; C.6: Measuring systems; C.7: Embedded systems; C.8: Self-monitoring and diagnosis; (D) APPLICATIONS: D.1: Medical measuring technology; D.2: Ambient assisted living; D.3: Process measuring technology; D.4: Automotive; D.5: Sensors in energy technology; D.6: Production technology; D.7: Security technology; D.8: Smart home; D.9: Household technology. The second part with the contributions of the IRS 2 2015 is structured as follows: (E) INFRARED SENSORS: E.1: Photon detectors; E.2: Thermal detectors; E.3: Cooled detectors; E.4: Uncooled detectors; E.5: Sensor modules; E.6: Sensor packaging. (G) INFRARED SYSTEMS AND APPLICATIONS: G.1: Thermal imaging; G.2: Pyrometry / contactless temperature measurement; G.3: Gas analysis; G.4: Spectroscopy; G.5: Motion control and presence detection; G.6: Security and safety monitoring; G.7: Non-destructive testing; F: INFRARED SYSTEM COMPONENTS: F.1: Infrared optics; F.2: Optical modulators; F.3

  15. Convolutional neural network-based classification system design with compressed wireless sensor network images.

    Science.gov (United States)

    Ahn, Jungmo; Park, JaeYeon; Park, Donghwan; Paek, Jeongyeup; Ko, JeongGil

    2018-01-01

    With the introduction of various advanced deep learning algorithms, initiatives for image classification systems have transitioned from traditional machine learning algorithms (e.g., SVM) to Convolutional Neural Networks (CNNs) using deep learning software tools. A prerequisite for applying CNNs to real-world applications is a system that collects meaningful and useful data. For such purposes, Wireless Image Sensor Networks (WISNs), which are capable of monitoring natural environment phenomena using tiny and low-power cameras on resource-limited embedded devices, can be considered an effective means of data collection. However, with limited battery resources, sending high-resolution raw images to the backend server is a burdensome task that has a direct impact on network lifetime. To address this problem, we propose an energy-efficient pre- and post-processing mechanism using image resizing and color quantization that can significantly reduce the amount of data transferred while maintaining the classification accuracy of the CNN at the backend server. We show that, if well designed, an image in its highly compressed form can be well-classified with a CNN model trained in advance using adequately compressed data. Our evaluation using a real image dataset shows that an embedded device can reduce the amount of transmitted data by ∼71% while maintaining a classification accuracy of ∼98%. Under the same conditions, this process naturally reduces energy consumption by ∼71% compared to a WISN that sends the original uncompressed images.
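
    A minimal sketch of the pre-processing stage described above (spatial resizing followed by color quantization before transmission), with the back-end step that expands the compressed image for CNN inference. The target resolution, palette size and the randomly generated stand-in frame are illustrative assumptions, not the values used in the paper.

    ```python
    import numpy as np
    from PIL import Image

    def compress_for_transmission(img, size=(96, 96), n_colors=32):
        """Resize and colour-quantize an image on the sensor node before sending.
        The target size and palette depth are placeholders, not the paper's settings."""
        img = img.convert("RGB").resize(size)      # spatial down-sampling
        return img.quantize(colors=n_colors)       # palette (colour) quantization

    def restore_for_cnn(img):
        """Back-end post-processing: expand the palettized image to a float RGB
        array so a CNN trained on similarly compressed data can classify it."""
        return np.asarray(img.convert("RGB"), dtype=np.float32) / 255.0

    # Stand-in for a frame captured by the WISN camera node.
    frame = Image.fromarray((np.random.rand(480, 640, 3) * 255).astype(np.uint8))
    compressed = compress_for_transmission(frame)
    x = restore_for_cnn(compressed)
    print(compressed.mode, x.shape)    # 'P', (96, 96, 3)
    ```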

  16. Dual cameras acquisition and display system of retina-like sensor camera and rectangular sensor camera

    Science.gov (United States)

    Cao, Nan; Cao, Fengmei; Lin, Yabin; Bai, Tingzhu; Song, Shengyu

    2015-04-01

    For a new kind of retina-like sensor camera and a traditional rectangular sensor camera, a dual-camera acquisition and display system needs to be built. We introduce the principle and the development of the retina-like sensor. Image coordinate transformation and interpolation based on sub-pixel interpolation need to be realized for the retina-like sensor's special pixel distribution. The hardware platform is composed of the retina-like sensor camera, the rectangular sensor camera, an image grabber and a PC. Combining the MIL and OpenCV libraries, the software is written in VC++ on VS 2010. Experimental results show that the system realizes simultaneous acquisition and display from both cameras.
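
    The abstract does not give the coordinate transformation itself. Below is a minimal sketch of mapping a retina-like image onto a Cartesian grid with bilinear (sub-pixel) interpolation, assuming a log-polar ring/sector layout, which is a common model for retina-like sensors; the layout, array sizes and radii here are assumptions rather than the camera's actual parameters.

    ```python
    import numpy as np

    def logpolar_to_cartesian(rp_img, out_size, r_min, r_max):
        """Resample a retina-like (log-polar) image of shape (n_rings, n_sectors)
        onto a Cartesian grid using bilinear (sub-pixel) interpolation.
        The ring/sector layout is an assumed model of the retina-like sensor."""
        n_rings, n_sectors = rp_img.shape
        h = w = out_size
        ys, xs = np.mgrid[0:h, 0:w]
        cx = cy = (out_size - 1) / 2.0
        dx, dy = xs - cx, ys - cy
        r = np.hypot(dx, dy)
        theta = np.mod(np.arctan2(dy, dx), 2.0 * np.pi)

        # Fractional (ring, sector) coordinates of every Cartesian output pixel.
        ring = np.log(np.maximum(r, 1e-9) / r_min) / np.log(r_max / r_min) * (n_rings - 1)
        sector = theta / (2.0 * np.pi) * n_sectors

        r0 = np.clip(np.floor(ring).astype(int), 0, n_rings - 2)
        s0 = np.floor(sector).astype(int) % n_sectors
        s1 = (s0 + 1) % n_sectors
        fr = np.clip(ring - r0, 0.0, 1.0)
        fs = sector - np.floor(sector)

        # Bilinear blend of the four neighbouring retina pixels.
        top = (1 - fs) * rp_img[r0, s0] + fs * rp_img[r0, s1]
        bot = (1 - fs) * rp_img[r0 + 1, s0] + fs * rp_img[r0 + 1, s1]
        out = (1 - fr) * top + fr * bot
        out[(r < r_min) | (r > r_max)] = 0.0   # outside the sensor's annulus
        return out.astype(np.float32)

    # Hypothetical retina-like frame: 64 rings x 128 sectors.
    frame = np.random.rand(64, 128).astype(np.float32)
    cart = logpolar_to_cartesian(frame, out_size=256, r_min=2.0, r_max=128.0)
    print(cart.shape)
    ```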

  17. Novel Magnetic and Chemical Micro Sensors for In-situ, Real-time, and Unattended Use

    International Nuclear Information System (INIS)

    Datskou, I.

    2001-01-01

    There exists a need to develop novel, advanced, unattended magnetic and chemical micro-sensor systems for successful detection, localization, classification and tracking of time-critical ground targets of interest. Consistent with the underlying long-term objectives of the unattended ground sensors (UGS) development program, the authors have investigated the use of a new planted ground sensor platform based on Micro-Electro-Mechanical Systems (MEMS) that can offer magnetic, chemical and possibly acoustic detection. The envisioned micro-system will be low-power and low-cost and will be built around a single type of microstructure element integrating a monolithic optical system and electronics package. This micro-sensor can also incorporate burst telemetry to transmit the information and a renewable power source, and will be capable of operating under field conditions, with sufficient sensitivity to permit high detection rates and sufficient chemical selectivity to prevent high false alarm rates. Preliminary studies, initial designs, and key predicted performance parameters will be presented. Possible applications of such a system include sensitive perimeter monitoring of minefields and military/nuclear bases, vehicle detection, aircraft navigation systems, and drug enforcement operations. The results of the present work demonstrate that the microcalorimetric spectroscopy technique can be applied to detect and identify chemicals at the ppm level and that the studied microcantilever-based magnetometer can provide sensitivities on the order of 1 μT.

  18. 29 CFR Appendix F to Subpart R of... - Perimeter Columns: Non-Mandatory Guidelines for Complying With § 1926.756(e) To Protect the...

    Science.gov (United States)

    2010-07-01

    ... to Subpart R of Part 1926 Labor Regulations Relating to Labor (Continued) OCCUPATIONAL SAFETY AND... Steel Erection Pt. 1926, Subpt. R, App. F Appendix F to Subpart R of Part 1926—Perimeter Columns: Non...

  19. Ukraine-Japanese-Swedish project: Upgrading of perimeter protection system at Kharkov Institute of Physics and Technology (KIPT)

    International Nuclear Information System (INIS)

    Mikahaylov, V.; Lapshin, V.; Ek, P.; Flyghed, L.; Nilsson, A.; Ooka, N.; Shimizu, K.; Tanuma, K.

    2001-01-01

    Full text: Since Ukraine voluntarily accepted the status of a non-nuclear-weapons state and concluded a Safeguards Agreement with the IAEA, the Kharkov Institute of Physics and Technology (KIPT), as a nuclear facility using category 1 nuclear material, has become a Ukrainian priority object for the international community's efforts to ensure nuclear non-proliferation measures and to bring the existing protection systems up to generally accepted security standards. In March 1996, at a meeting held under the auspices of the IAEA in Kiev, the representatives from Japan, Sweden and the USA agreed to provide technical assistance to improve the nuclear material accountancy and control and physical protection system (MPC and A) available at KIPT. The Technical Secretariat of the Japan-Ukraine Committee for Co-operation on Reducing Nuclear Weapons and the Swedish Nuclear Power Inspectorate undertook to solve the most expensive and labour-consuming task, namely the upgrading of the perimeter protection system at KIPT. This required that the existing perimeter system, comprising several kilometers, be completely replaced. Moreover, the upgrading had to be carried out with the institute in operation; the existing protection system could not be replaced by a new one unless KIPT remained constantly protected. This required the creation of a new protected zone in an area that to a large extent was occupied by communication equipment, buildings, trees and other objects interfering with the work. All these difficulties required very comprehensive development of the project design as well as a great deal of flexibility during the implementation of the project. These problems were all successfully resolved thanks to a well-working project organization, composed of experts from KIPT, JAERI and ANS, involving highly qualified Swedish technical experts who played a leading role. In the framework of implementation of the

  20. A pixel design for X-ray imaging with CdTe sensors

    Energy Technology Data Exchange (ETDEWEB)

    Lambropoulos, C.P.; Zervakis, E.G. [Technological Educational Institute of Halkis, Psahna - Evia (Greece); Loukas, D. [Institute of Nuclear Physics, NCSR Demokritos, Agia Paraskevi - Attiki (Greece)

    2008-07-01

    A readout architecture appropriate for X-ray imaging using charge integration has been designed. Each pixel consists of a capacitive transimpedance amplifier, a sample-and-hold circuit, a comparator and an 8-bit DRAM. Pixel-level A/D conversion and local storage of the digitized signal are performed. The target sensors are 100 μm x 100 μm CdTe pixel detectors, and integration times of 1 ms or less can be achieved. Special measures have been taken to minimize the gain fixed-pattern noise and the reset noise, while purely digital correlated double sampling can be performed. (copyright 2008 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)

  1. A pixel design for X-ray imaging with CdTe sensors

    International Nuclear Information System (INIS)

    Lambropoulos, C.P.; Zervakis, E.G.; Loukas, D.

    2008-01-01

    A readout architecture appropriate for X-ray imaging using charge integration has been designed. Each pixel consists of a capacitive transimpedance amplifier, a sample-and-hold circuit, a comparator and an 8-bit DRAM. Pixel-level A/D conversion and local storage of the digitized signal are performed. The target sensors are 100 μm x 100 μm CdTe pixel detectors, and integration times of 1 ms or less can be achieved. Special measures have been taken to minimize the gain fixed-pattern noise and the reset noise, while purely digital correlated double sampling can be performed. (copyright 2008 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)
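
    A small numerical sketch of the per-pixel signal chain described in the two records above: charge integration on a capacitive transimpedance amplifier, correlated double sampling, and 8-bit quantization. The feedback capacitance, full-scale voltage and signal level below are illustrative assumptions, not figures from the paper.

    ```python
    import numpy as np

    Q_E = 1.602e-19          # electron charge (C)

    def ctia_integrate(photocurrent_A, t_int_s, c_fb_F):
        """Output voltage of a capacitive transimpedance amplifier after
        integrating a (constant) photocurrent for t_int seconds."""
        return photocurrent_A * t_int_s / c_fb_F

    def digitize(v_signal, v_reset, v_full_scale, bits=8):
        """Correlated double sampling (signal minus reset) followed by an
        n-bit quantization, mimicking the per-pixel A/D conversion."""
        v = np.clip(v_signal - v_reset, 0.0, v_full_scale)
        return np.round(v / v_full_scale * (2 ** bits - 1)).astype(np.uint8)

    # Illustrative numbers (not from the paper): 10 fF feedback cap, 1 ms frame.
    c_fb = 10e-15
    t_int = 1e-3
    i_photo = 5000 * Q_E / t_int          # 5000 electrons collected in one frame
    code = digitize(ctia_integrate(i_photo, t_int, c_fb), v_reset=0.0, v_full_scale=1.0)
    print(code)                           # 8-bit value stored in the pixel DRAM
    ```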

  2. Communications for unattended sensor networks

    Science.gov (United States)

    Nemeroff, Jay L.; Angelini, Paul; Orpilla, Mont; Garcia, Luis; DiPierro, Stefano

    2004-07-01

    The future model of the US Army's Future Combat Systems (FCS) and the Future Force reflects a combat force that utilizes lighter armor protection than the current standard. Survival on the future battlefield will be increased by the use of advanced situational awareness provided by unattended tactical and urban sensors that detect, identify, and track enemy targets and threats. Successful implementation of these critical sensor fields requires the development of advanced sensors, sensor and data-fusion processors, and a specialized communications network. To ensure warfighter and asset survivability, the communications must be capable of near real-time dissemination of the sensor data using robust, secure, stealthy, and jam-resistant links so that proper and decisive action can be taken. Communications will be provided to a wide array of mission-specific sensors that are capable of processing data from acoustic, magnetic, seismic, and/or Chemical, Biological, Radiological, and Nuclear (CBRN) sensors. Other, more powerful, sensor node configurations will be capable of fusing sensor data and intelligently collecting and processing image data from infrared or visual imaging cameras. The radio waveform and networking protocols being developed under the Soldier Level Integrated Communications Environment (SLICE) Soldier Radio Waveform (SRW) and the Networked Sensors for the Future Force Advanced Technology Demonstration are part of an effort to develop a common waveform family which will operate across multiple tactical domains including dismounted soldiers, ground sensors, munitions, missiles and robotics. These waveform technologies will ultimately be transitioned to the JTRS library, specifically the Cluster 5 requirement.

  3. Wireless Integrated Network Sensors Next Generation

    National Research Council Canada - National Science Library

    Merrill, William

    2004-01-01

    ..., autonomous networking, and distributed operations for wireless networked sensor systems. Multiple types of sensor systems were developed and provided including capabilities for acoustic, seismic, passive infrared detection, and visual imaging...

  4. High-speed uncooled MWIR hostile fire indication sensor

    Science.gov (United States)

    Zhang, L.; Pantuso, F. P.; Jin, G.; Mazurenko, A.; Erdtmann, M.; Radhakrishnan, S.; Salerno, J.

    2011-06-01

    Hostile fire indication (HFI) systems require high-resolution sensor operation at extremely high speeds to capture hostile fire events, including rocket-propelled grenades, anti-aircraft artillery, heavy machine guns, anti-tank guided missiles and small arms. HFI must also be conducted in a waveband with large available signal and low background clutter, in particular the mid-wavelength infrared (MWIR). The shortcoming of current HFI sensors in the MWIR is that the sensor bandwidth is not sufficient to achieve the required frame rate at high sensor resolution. Furthermore, current HFI sensors require cryogenic cooling that contributes to size, weight, and power (SWAP) in aircraft-mounted applications where these factors are at a premium. Based on its uncooled photomechanical infrared imaging technology, Agiltron has developed a low-SWAP, high-speed MWIR HFI sensor that breaks the bandwidth bottleneck typical of current infrared sensors. This accomplishment is made possible by using a commercial-off-the-shelf, high-performance visible imager as the readout integrated circuit and physically separating this visible imager from the MWIR-optimized photomechanical sensor chip. With this approach, we have achieved high-resolution operation of our MWIR HFI sensor at 1000 fps, which is unprecedented for an uncooled infrared sensor. We have field-tested our MWIR HFI sensor for detecting all hostile fire events mentioned above at several test ranges under a wide range of environmental conditions. The field testing results will be presented.

  5. AMA Conferences 2015. SENSOR 2015. 17th international conference on sensors and measurement technology. IRS² 2015. 14th international conference on infrared sensors and systems. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2015-07-01

    This meeting paper contains presentations of two conferences: SENSOR 2015 and IRS² (= International conference on InfraRed Sensors and systems). The first part of SENSOR 2015 contains the following chapters: (A) SENSOR PRINCIPLES: A.1: Mechanical sensors; A.2: Optical sensors; A.3: Ultrasonic sensors; A.4: Microacoustic sensors; A.5: Magnetic sensors; A.6: Impedance sensors; A.7: Gas sensors; A.8: Flow sensors; A.9: Dimensional measurement; A.10: Temperature and humidity sensors; A.11: Chemosensors; A.12: Biosensors; A.13: Embedded sensors; A.14: Sensor-actuator systems; (B) SENSOR TECHNOLOGY: B.1: Sensor design; B.2: Numerical simulation of sensors; B.3: Sensor materials; B.4: MEMS technology; B.5: Micro-Nano-Integration; B.6: Packaging; B.7: Materials; B.8: Thin films; B.9: Sensor production; B.10: Sensor reliability; B.11: Calibration and testing; B.12: Optical fibre sensors. (C) SENSOR ELECTRONICS AND COMMUNICATION: C.1: Sensor electronics; C.2: Sensor networks; C.3: Wireless sensors; C.4: Sensor communication; C.5: Energy harvesting; C.6: Measuring systems; C.7: Embedded systems; C.8: Self-monitoring and diagnosis; (D) APPLICATIONS: D.1: Medical measuring technology; D.2: Ambient assisted living; D.3: Process measuring technology; D.4: Automotive; D.5: Sensors in energy technology; D.6: Production technology; D.7: Security technology; D.8: Smart home; D.9: Household technology. The second part with the contributions of the IRS² 2015 is structured as follows: (E) INFRARED SENSORS: E.1: Photon detectors; E.2: Thermal detectors; E.3: Cooled detectors; E.4: Uncooled detectors; E.5: Sensor modules; E.6: Sensor packaging. (G) INFRARED SYSTEMS AND APPLICATIONS: G.1: Thermal imaging; G.2: Pyrometry / contactless temperature measurement; G.3: Gas analysis; G.4: Spectroscopy; G.5: Motion control and presence detection; G.6: Security and safety monitoring; G.7: Non-destructive testing; F: INFRARED SYSTEM COMPONENTS: F.1: Infrared optics; F.2: Optical

  6. Optical sensors for earth observation. Chikyu kansokuyo kogaku sensor

    Energy Technology Data Exchange (ETDEWEB)

    Ono, A [National Research Laboratory of Metrology, Tsukuba (Japan)

    1991-10-10

    An optical imager (ASTER), used mainly to collect images of land areas, and an infrared sounder (IMG), used to measure the vertical air temperature distribution and the vertical concentration distribution of specific gases, are being developed as satellite-mounted sensors for earth observation. All sensor characteristics of ASTER, which comprises a visible and near-infrared radiometer, a shortwave infrared radiometer and a thermal infrared radiometer, are required to be measured, evaluated and assured at the required accuracies during the entire lifetime. A problem to be solved is how to combine the on-ground calibration prior to launch, on-satellite calibration, and calibration between the test site and the sensors. The IMG is a Fourier-transform spectroscopic infrared sounder, which is required to provide high spectral resolution over extended periods of time as well as a high radiation measuring capability. Also required are improved analysis algorithms to solve inverse problems from the observed radiation spectra, and a high-accuracy database. 19 refs., 4 figs., 4 tabs.

  7. Sensor Pods: Multi-Resolution Surveys from a Light Aircraft

    Directory of Open Access Journals (Sweden)

    Conor Cahalane

    2017-02-01

    Airborne remote sensing, whether performed from conventional aerial survey platforms such as light aircraft or the more recent Remotely Piloted Airborne Systems (RPAS), has the ability to complement mapping generated using earth-orbiting satellites, particularly for areas that may experience prolonged cloud cover. Traditional aerial platforms are costly but capture high spectral resolution imagery over large areas. RPAS are relatively low-cost and provide very high resolution imagery, but this is limited to small areas. We believe that we are the first group to retrofit these new, low-cost, lightweight sensors in a traditional aircraft. Unlike RPAS surveys, which have a limited payload, this is the first time that a method has been designed to operate four distinct RPAS sensors simultaneously—hyperspectral, thermal, RGB and video. This means that imagery covering a broad range of the spectrum, captured during a single survey through different image capture techniques (frame, pushbroom, video), can be applied to investigate multiple aspects of the surrounding environment, such as soil moisture, vegetation vitality, topography or drainage. In this paper, we present the initial results validating our innovative hybrid system adapting dedicated RPAS sensors for a light aircraft sensor pod, thereby providing the benefits of both methodologies. Simultaneous image capture with a Nikon D800E SLR and a series of dedicated RPAS sensors, including a FLIR thermal imager, a four-band multispectral camera and a 100-band hyperspectral imager, was enabled by integration in a single sensor pod operating from a Cessna c172. However, to enable accurate sensor fusion for image analysis, each sensor must first be combined in a common vehicle coordinate system, and a method for triggering, time-stamping and calculating the position/pose of each sensor at the time of image capture must be devised. Initial tests were carried out over agricultural regions with

  8. An efficient and secure partial image encryption for wireless multimedia sensor networks using discrete wavelet transform, chaotic maps and substitution box

    Science.gov (United States)

    Khan, Muazzam A.; Ahmad, Jawad; Javaid, Qaisar; Saqib, Nazar A.

    2017-03-01

    Wireless Sensor Networks (WSNs) are widely deployed for monitoring physical activity and/or environmental conditions. Data gathered from a WSN are transmitted via the network to a central location for further processing. Numerous applications of WSNs can be found in smart homes, intelligent buildings, health care, energy-efficient smart grids and industrial control systems. In recent years, computer scientists have focused on finding more applications of WSNs in multimedia technologies, i.e. audio, video and digital images. Due to the bulky nature of multimedia data, WSNs process a large volume of multimedia data, which significantly increases computational complexity and hence reduces battery time. With respect to battery life constraints, image compression combined with secure transmission over a wide-ranging sensor network is an emerging and challenging task in Wireless Multimedia Sensor Networks. Due to the open nature of the Internet, transmission of data must be secured through a process known as encryption. As a result, there has for decades been an intensive demand for schemes that are energy efficient as well as highly secure. In this paper, a discrete wavelet-based partial image encryption scheme using a hashing algorithm, chaotic maps and Hussain's S-box is reported. The plaintext image is compressed via the discrete wavelet transform, and then the image is shuffled column-wise and row-wise via a Piece-wise Linear Chaotic Map (PWLCM) and a Nonlinear Chaotic Algorithm, respectively. To achieve higher security, the initial conditions for the PWLCM are made dependent on a hash function. The permuted image is bitwise XORed with a random matrix generated from an Intertwining Logistic map. To enhance the security further, the final ciphertext is obtained after substituting all elements with Hussain's substitution box. Experimental and statistical results confirm the strength of the anticipated scheme.
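
    A minimal sketch of the chaotic permutation and XOR stages named above. For brevity it uses the PWLCM for both row and column permutations, whereas the paper uses a separate nonlinear chaotic algorithm for one direction; the map parameter and initial conditions are placeholders (in the paper they are derived from a hash of the plaintext image), the XOR keystream is a simplified stand-in for the intertwining-logistic-map keystream, and the S-box substitution stage is omitted.

    ```python
    import numpy as np

    def pwlcm_step(x, p):
        """One iteration of the piece-wise linear chaotic map (0 < p < 0.5)."""
        if x < p:
            return x / p
        if x < 0.5:
            return (x - p) / (0.5 - p)
        return pwlcm_step(1.0 - x, p)          # the map is symmetric about 0.5

    def pwlcm_orbit(n, x0, p):
        """Return n successive PWLCM iterates starting from x0."""
        x, out = x0, np.empty(n)
        for i in range(n):
            x = pwlcm_step(x, p)
            out[i] = x
        return out

    def shuffle_image(img, x0_rows=0.345, x0_cols=0.678, p=0.27):
        """Row- and column-wise shuffling with PWLCM-derived permutations.
        Initial conditions here are placeholders, not hash-derived keys."""
        rows = np.argsort(pwlcm_orbit(img.shape[0], x0_rows, p))
        cols = np.argsort(pwlcm_orbit(img.shape[1], x0_cols, p))
        return img[rows][:, cols]

    # Toy usage on a random 8-bit "image".
    img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
    shuffled = shuffle_image(img)
    keystream = np.floor(pwlcm_orbit(img.size, 0.521, 0.31) * 256).astype(np.uint8)
    cipher = shuffled ^ keystream.reshape(img.shape)   # diffusion by XOR
    print(cipher.dtype, cipher.shape)
    ```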

  9. Comparison of perimeter trap crop varieties: effects on herbivory, pollination, and yield in butternut squash.

    Science.gov (United States)

    Adler, L S; Hazzard, R V

    2009-02-01

    Perimeter trap cropping (PTC) is a method of integrated pest management (IPM) in which the main crop is surrounded with a perimeter trap crop that is more attractive to pests. Blue Hubbard (Cucurbita maxima Duch.) is a highly effective trap crop for butternut squash (C. moschata Duch. ex Poir) attacked by striped cucumber beetles (Acalymma vittatum Fabricius), but its limited marketability may reduce adoption of PTC by growers. Research comparing border crop varieties is necessary to provide options for growers. Furthermore, pollinators are critical for cucurbit yield, and the effect of PTC on pollination to main crops is unknown. We examined the effect of five border treatments on herbivory, pollination, and yield in butternut squash and manipulated herbivory and pollination to compare their importance for main crop yield. Blue Hubbard, buttercup squash (C. maxima Duch.), and zucchini (C. pepo L.) were equally attractive to cucumber beetles. Border treatments did not affect butternut leaf damage, but butternut flowers had the fewest beetles when surrounded by Blue Hubbard or buttercup squash. Yield was highest in the Blue Hubbard and buttercup treatments, but this effect was not statistically significant. Native bees accounted for 87% of pollinator visits, and pollination did not limit yield. There was no evidence that border crops competed with the main crop for pollinators. Our results suggest that both buttercup squash and zucchini may be viable alternatives to Blue Hubbard as borders for the main crop of butternut squash. Thus, growers may have multiple border options that reduce pesticide use, effectively manage pests, and do not disturb mutualist interactions with pollinators.

  10. Research on photoconductor radiological sensors

    International Nuclear Information System (INIS)

    Beaumont, Francois

    1989-01-01

    Because of the evolution of medical imaging techniques towards digital systems, it is necessary to replace radiological film, which has many drawbacks, with a detector that is just as efficient and that quickly delivers a digitizable signal. The purpose of this thesis is to find new X-ray digital imaging processes using photoconductor materials such as amorphous selenium. After reviewing the principle of direct radiology and the functions to be served by the X-ray sensor (i.e. detection, memory, assignment, visualization), we explain the specifications. We especially show the constraints due to the object to be radiographed (condition of minimal exposure) and to the readout signal (electronic detection noise associated with a reading frequency). As a result of this study, a first photoconductor sensor could be designed. Its principle is based on photocarrier trapping at the dielectric-photoconductor structure interface. The readout system requires scanning a laser beam over the sensor surface. The dielectric-photoconductor structure enabled us to estimate the possibilities offered by the sensor and to build a complete X-ray imaging system. The originality of the thermo-dielectric sensor, which was studied next, is that it allows a thermally assigned readout. The chosen system consists in varying the capacitance of a ferroelectric polymer whose dielectric permittivity is low at room temperature. The thermo-dielectric material was studied by thermal or Joule-effect stimulation. During our experiments, trapping was found in a sensor made of amorphous selenium between two electrodes. This new effect was characterized and enabled us to propose a first interpretation. Finally, the comparison of these new sensor concepts with radiological film shows the advantage of the proposed solution. (author) [fr]

  11. Microscopy imaging device with advanced imaging properties

    Science.gov (United States)

    Ghosh, Kunal; Burns, Laurie; El Gamal, Abbas; Schnitzer, Mark J.; Cocker, Eric; Ho, Tatt Wei

    2015-11-24

    Systems, methods and devices are implemented for microscope imaging solutions. One embodiment of the present disclosure is directed toward an epifluorescence microscope. The microscope includes an image capture circuit including an array of optical sensors. An optical arrangement is configured to direct excitation light of less than about 1 mW to a target object in a field of view that is at least 0.5 mm², and to direct epi-fluorescence emission caused by the excitation light to the array of optical sensors. The optical arrangement and array of optical sensors are each sufficiently close to the target object to provide at least 2.5 μm resolution for an image of the field of view.

  12. Increasing Linear Dynamic Range of a CMOS Image Sensor

    Science.gov (United States)

    Pain, Bedabrata

    2007-01-01

    A generic design and a corresponding operating sequence have been developed for increasing the linear-response dynamic range of a complementary metal oxide/semiconductor (CMOS) image sensor. The design provides for linear calibrated dual-gain pixels that operate at high gain at a low signal level and at low gain at a signal level above a preset threshold. Unlike most prior designs for increasing dynamic range of an image sensor, this design does not entail any increase in noise (including fixed-pattern noise), decrease in responsivity or linearity, or degradation of photometric calibration. The figure is a simplified schematic diagram showing the circuit of one pixel and pertinent parts of its column readout circuitry. The conventional part of the pixel circuit includes a photodiode having a small capacitance, CD. The unconventional part includes an additional larger capacitance, CL, that can be connected to the photodiode via a transfer gate controlled in part by a latch. In the high-gain mode, the signal labeled TSR in the figure is held low through the latch, which also helps to adapt the gain on a pixel-by-pixel basis. Light must be coupled to the pixel through a microlens or by back illumination in order to obtain a high effective fill factor; this is necessary to ensure high quantum efficiency, a loss of which would minimize the efficacy of the dynamic- range-enhancement scheme. Once the level of illumination of the pixel exceeds the threshold, TSR is turned on, causing the transfer gate to conduct, thereby adding CL to the pixel capacitance. The added capacitance reduces the conversion gain, and increases the pixel electron-handling capacity, thereby providing an extension of the dynamic range. By use of an array of comparators also at the bottom of the column, photocharge voltages on sampling capacitors in each column are compared with a reference voltage to determine whether it is necessary to switch from the high-gain to the low-gain mode. Depending upon
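
    A small numerical sketch of the dual-gain idea described above: the pixel converts charge on the small photodiode capacitance until the output crosses a threshold, after which the larger capacitance is added, and the readout linearizes the result with the calibrated conversion gain of whichever mode was used. The capacitance values, threshold and signal levels below are illustrative assumptions, not the sensor's actual parameters.

    ```python
    import numpy as np

    Q_E = 1.602e-19   # electron charge (C)

    def pixel_response(electrons, c_small=2e-15, c_large=16e-15, v_threshold=0.8):
        """Simulate a dual-gain pixel: convert charge on the small capacitance
        until the output exceeds v_threshold, then add the large capacitance."""
        v_high_gain = electrons * Q_E / c_small
        if v_high_gain <= v_threshold:
            return v_high_gain, "high"
        return electrons * Q_E / (c_small + c_large), "low"

    def to_electrons(voltage, mode, c_small=2e-15, c_large=16e-15):
        """Linearize: convert the read voltage back to electrons using the
        calibrated conversion gain of whichever mode the pixel used."""
        cap = c_small if mode == "high" else c_small + c_large
        return voltage * cap / Q_E

    for n in (5_000, 80_000):                        # low- and high-signal cases
        v, mode = pixel_response(n)
        print(n, mode, round(to_electrons(v, mode)))  # recovered signal stays linear
    ```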

  13. Multi-Sensor Mud Detection

    Science.gov (United States)

    Rankin, Arturo L.; Matthies, Larry H.

    2010-01-01

    Robust mud detection is a critical perception requirement for Unmanned Ground Vehicle (UGV) autonomous off-road navigation. A military UGV stuck in a mud body during a mission may have to be sacrificed or rescued, both of which are unattractive options. There are several characteristics of mud that may be detectable with appropriate UGV-mounted sensors. For example, mud only occurs on the ground surface, is cooler than surrounding dry soil during the daytime under nominal weather conditions, is generally darker than surrounding dry soil in visible imagery, and is highly polarized. However, none of these cues are definitive on their own. Dry soil also occurs on the ground surface; shadows, snow, ice, and water can also be cooler than surrounding dry soil; shadows are also darker than surrounding dry soil in visible imagery; and cars, water, and some vegetation are also highly polarized. Shadows, snow, ice, water, cars, and vegetation can all be disambiguated from mud by using a suite of sensors that span multiple bands in the electromagnetic spectrum. Because there are military operations when it is imperative for UGVs to operate without emitting strong, detectable electromagnetic signals, passive sensors are desirable. JPL has developed a daytime mud detection capability using multiple passive imaging sensors. Cues for mud from multiple passive imaging sensors are fused into a single mud detection image using a rule base, and the resultant mud detection is localized in a terrain map using range data generated from a stereo pair of color cameras.
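
    The abstract describes per-pixel rule-based fusion of several cues (ground surface, cooler, darker, highly polarized) without giving the rules. The sketch below shows what such a fusion step can look like; every threshold, the two-of-three voting rule, and the random stand-in images are assumptions for illustration, not JPL's published rule base.

    ```python
    import numpy as np

    def mud_mask(is_ground, thermal, visible, dolp,
                 temp_margin=1.5, dark_margin=0.1, pol_threshold=0.3):
        """Per-pixel rule-based fusion of mud cues (all thresholds are placeholders).
          is_ground : bool mask from stereo range data (pixel lies on the ground)
          thermal   : radiometric temperature image
          visible   : grayscale intensity image, 0..1
          dolp      : degree of linear polarization image, 0..1
        """
        local_temp = np.nanmean(thermal[is_ground])
        local_bright = np.nanmean(visible[is_ground])
        cooler = thermal < (local_temp - temp_margin)     # mud is cooler than dry soil
        darker = visible < (local_bright - dark_margin)   # mud is darker than dry soil
        polarized = dolp > pol_threshold                  # wet surfaces polarize strongly
        # Require the geometric cue plus at least two of the three appearance cues.
        votes = cooler.astype(int) + darker.astype(int) + polarized.astype(int)
        return is_ground & (votes >= 2)

    # Toy example on random data standing in for co-registered sensor images.
    h, w = 120, 160
    rng = np.random.default_rng(1)
    mask = mud_mask(is_ground=np.ones((h, w), bool),
                    thermal=rng.normal(30.0, 2.0, (h, w)),
                    visible=rng.uniform(0.0, 1.0, (h, w)),
                    dolp=rng.uniform(0.0, 1.0, (h, w)))
    print(mask.mean())   # fraction of pixels flagged as mud in this toy frame
    ```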

  14. A 45 nm Stacked CMOS Image Sensor Process Technology for Submicron Pixel.

    Science.gov (United States)

    Takahashi, Seiji; Huang, Yi-Min; Sze, Jhy-Jyi; Wu, Tung-Ting; Guo, Fu-Sheng; Hsu, Wei-Cheng; Tseng, Tung-Hsiung; Liao, King; Kuo, Chin-Chia; Chen, Tzu-Hsiang; Chiang, Wei-Chieh; Chuang, Chun-Hao; Chou, Keng-Yu; Chung, Chi-Hsien; Chou, Kuo-Yu; Tseng, Chien-Hsien; Wang, Chuan-Joung; Yaung, Dun-Nien

    2017-12-05

    A submicron pixel's light and dark performance were studied by experiment and simulation. An advanced node technology incorporated with a stacked CMOS image sensor (CIS) is promising in that it may enhance performance. In this work, we demonstrated a low dark current of 3.2 e⁻/s at 60 °C, an ultra-low read noise of 0.90 e⁻·rms, a high full well capacity (FWC) of 4100 e⁻, and blooming of 0.5% in 0.9 μm pixels with a pixel supply voltage of 2.8 V. In addition, the simulation study result of 0.8 μm pixels is discussed.

  15. Empirical electro-optical and x-ray performance evaluation of CMOS active pixels sensor for low dose, high resolution x-ray medical imaging

    International Nuclear Information System (INIS)

    Arvanitis, C. D.; Bohndiek, S. E.; Royle, G.; Blue, A.; Liang, H. X.; Clark, A.; Prydderch, M.; Turchetta, R.; Speller, R.

    2007-01-01

    Monolithic complementary metal oxide semiconductor (CMOS) active pixel sensors with high performance have gained attention in the last few years in many scientific and space applications. In order to evaluate the increasing capabilities of this technology, in particular where low-dose, high-resolution x-ray medical imaging is required, a critical electro-optical and physical x-ray performance evaluation was carried out. The electro-optical performance includes read noise, full well capacity, interacting quantum efficiency, and pixel cross talk. The x-ray performance, including x-ray sensitivity, modulation transfer function, noise power spectrum, and detective quantum efficiency, has been evaluated in the mammographic energy range. The sensor is a 525x525 standard three-transistor CMOS active pixel sensor array with more than 75% fill factor and 25x25 μm pixel pitch. Reading at 10 f/s, it is found that the sensor has 114 electrons total additive noise, 10⁵ electrons full well capacity with shot-noise-limited operation, and 34% interacting quantum efficiency at 530 nm. Two different structured CsI:Tl phosphors, with thicknesses of 95 and 115 μm, respectively, have been optically coupled via a fiber optic plate to the array, resulting in two different system configurations. The sensitivity of the two system configurations was 43 and 47 electrons per x-ray incident on the sensor. The MTF at 10% of the two system configurations was 9.5 and 9 cycles/mm, with detective quantum efficiency of 0.45 and 0.48, respectively, close to zero frequency at ∼0.44 μC/kg (1.72 mR) detector entrance exposure. The detector was quantum limited at low spatial frequencies and its performance was comparable with high-resolution a:Si and charge coupled device based x-ray imagers. The detector also demonstrates almost an order of magnitude lower noise than active matrix flat panel imagers. The results suggest that CMOS active pixel sensors when coupled to structured CsI:Tl can
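
    For reference, the detective quantum efficiency cited above is conventionally computed from the measured MTF, the noise power spectrum and the incident photon fluence as DQE(f) = MTF(f)² / (q · NNPS(f)), where NNPS is the NPS normalized by the squared mean signal and q is the fluence in photons/mm². The sketch below evaluates that standard relationship; the numbers in it are illustrative, not the paper's measurements.

    ```python
    import numpy as np

    def dqe(mtf, nps, mean_signal, fluence_per_mm2):
        """Frequency-dependent detective quantum efficiency:
            DQE(f) = MTF(f)^2 / (q * NNPS(f)),
        where NNPS = NPS / mean_signal^2 and q is the incident photon fluence
        (photons per mm^2). Inputs are 1-D arrays sampled at the same frequencies."""
        nnps = np.asarray(nps) / mean_signal ** 2
        return np.asarray(mtf) ** 2 / (fluence_per_mm2 * nnps)

    # Illustrative numbers only (not the paper's data).
    freqs = np.linspace(0.0, 10.0, 6)          # spatial frequency, cycles/mm
    mtf = np.exp(-freqs / 6.0)                 # a plausible falling MTF
    nps = np.full_like(freqs, 2.0e-7)          # flat (white) noise power spectrum, mm^2
    print(dqe(mtf, nps, mean_signal=1.0, fluence_per_mm2=5.0e6))
    ```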

  16. Area G perimeter surface-soil and single-stage water sampling: Environmental surveillance for fiscal year 1993

    International Nuclear Information System (INIS)

    Conrad, R.; Childs, M.; Rivera-Dirks, C.; Coriz, F.

    1995-07-01

    Area G, in Technical Area 54, has been the principal facility at Los Alamos National Laboratory for the storage and disposal of low-level and transuranic (TRU) radioactive wastes since 1957. In the current environmental investigation, ESH-19 personnel collected soil and single-stage water samples around the perimeter of Area G to characterize possible contaminant movement through surface-water runoff. These samples were analyzed for tritium, total uranium, isotopic plutonium, americium-241 (soil only), and cesium-137. The metals mercury, lead, and barium were analyzed using x-ray fluorescence.

  17. Sensor influence in digital 3λ holographic interferometry

    International Nuclear Information System (INIS)

    Desse, J M; Picart, P; Tankam, P

    2011-01-01

    In digital holographic interferometry, the resolution of the reconstructed hologram depends on the pixel size and pixel number of the sensor used for recording. When different wavelengths are simultaneously used as the light source for the interferometer, the shape and the overlapping of the three filters of a color sensor strongly influence the three reconstructed images. This problem can be directly visualized in the 2D Fourier planes of the red, green and blue channels. To better understand this problem and to avoid parasitic images generated at the reconstruction, three different sensors have been tested: a CCD sensor equipped with a Bayer filter, a Foveon sensor and a 3CCD sensor. The first is a Bayer mosaic where one half of the pixels detect the green color and only one-quarter detect the red or blue color. As the missing data are interpolated among color detection positions, offsets and artifacts are generated. The second is a specific sensor constituted of three stacked photodiode layers. Its technology is different from that of the classical color mosaic sensor because each pixel location detects the three colors simultaneously. So the three colors are recorded simultaneously with identical spatial resolution, which corresponds to the spatial resolution of the sensor. However, the spectral response curve of the sensor is broad for each wavelength, since the color segmentation is based on the penetration depth of the photons in silicon. Finally, with a 3CCD sensor, each image is recorded on three different sensors with the same resolution. In order to test the sensor influence, we have developed a specific optical bench which allows the near-wake flow around a circular cylinder at Mach 0.45 to be characterized. Finally, the best results have been obtained with the 3CCD sensor.

  18. Development of a thinned back-illuminated CMOS active pixel sensor for extreme ultraviolet spectroscopy and imaging in space science

    International Nuclear Information System (INIS)

    Waltham, N.R.; Prydderch, M.; Mapson-Menard, H.; Pool, P.; Harris, A.

    2007-01-01

    We describe our programme to develop a large-format, science-grade, monolithic CMOS active pixel sensor for future space science missions, and in particular an extreme ultraviolet (EUV) spectrograph for solar physics studies on ESA's Solar Orbiter. Our route to EUV sensitivity relies on adapting the back-thinning and rear-illumination techniques first developed for CCD sensors. Our first large-format sensor consists of 4kx3k 5 μm pixels fabricated on a 0.25 μm CMOS imager process. Wafer samples of these sensors have been thinned by e2v technologies with the aim of obtaining good sensitivity at EUV wavelengths. We present results from both front- and back-illuminated versions of this sensor. We also present our plans to develop a new sensor of 2kx2k 10 μm pixels, which will be fabricated on a 0.35 μm CMOS process. In progress towards this goal, we have designed a test-structure consisting of six arrays of 512x512 10 μm pixels. Each of the arrays has been given a different pixel design to allow verification of our models, and our progress towards optimizing a design for minimal system readout noise and maximum dynamic range. These sensors will also be back-thinned for characterization at EUV wavelengths

  19. Creation of 3D Multi-Body Orthodontic Models by Using Independent Imaging Sensors

    Directory of Open Access Journals (Sweden)

    Armando Viviano Razionale

    2013-02-01

    In the field of dental health care, plaster models combined with 2D radiographs are widely used in clinical practice for orthodontic diagnoses. However, complex malocclusions can be better analyzed by exploiting 3D digital dental models, which allow virtual simulations and treatment planning processes. In this paper, dental data captured by independent imaging sensors are fused to create multi-body orthodontic models composed of teeth, oral soft tissues and alveolar bone structures. The methodology is based on integrating Cone-Beam Computed Tomography (CBCT) and surface structured-light scanning. The optical scanner is used to reconstruct tooth crowns and soft tissues (visible surfaces) through the digitalization of both patients' mouth impressions and plaster casts. These data are also used to guide the segmentation of internal dental tissues by processing CBCT data sets. The 3D individual dental tissues obtained by the optical scanner and the CBCT sensor are fused within multi-body orthodontic models without human supervision to identify target anatomical structures. The final multi-body models represent valuable virtual platforms for clinical diagnosis and treatment planning.

  20. A Shack-Hartmann Sensor for Single-Shot Multi-Contrast Imaging with Hard X-rays

    Directory of Open Access Journals (Sweden)

    Tomy dos Santos Rolo

    2018-05-01

    An array of compound refractive X-ray lenses (CRL) with 20 × 20 lenslets, a focal distance of 20 cm and a visibility of 0.93 is presented. It can be used as a Shack-Hartmann sensor for hard X-rays (SHARX) for wavefront sensing and permits true single-shot multi-contrast imaging of the dynamics of materials with a spatial resolution in the micrometer range, sensitivity to nanosized structures and temporal resolution on the microsecond scale. The object's absorption and its induced wavefront shift can be assessed simultaneously, together with information from diffraction channels. In contrast to established Hartmann sensors, the SHARX has an increased flux efficiency through focusing of the beam rather than blocking parts of it. We investigated the spatiotemporal behavior of a cavitation bubble induced by laser pulses. Furthermore, we validated the SHARX by measuring refraction angles of a single diamond CRL, where we obtained an angular resolution better than 4 μrad.
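
    A Shack-Hartmann sensor recovers the local wavefront slope (refraction angle) from the displacement of each lenslet's focal spot relative to its undisturbed position: for small angles, alpha = d / f. The sketch below shows that centroid-to-angle step using the 20 cm focal length quoted in the abstract; the pixel size, sub-image layout and the synthetic Gaussian spot are assumptions for illustration.

    ```python
    import numpy as np

    def spot_centroid(subimage):
        """Intensity-weighted centroid of one lenslet's focal spot (pixel units)."""
        ys, xs = np.indices(subimage.shape)
        total = subimage.sum()
        return np.array([(xs * subimage).sum(), (ys * subimage).sum()]) / total

    def refraction_angle(subimage, reference_centroid, pixel_size_m, focal_length_m=0.20):
        """Local refraction angle (rad) from the spot displacement relative to the
        undisturbed reference position: alpha = d / f for small angles.
        The 20 cm focal length is from the abstract; the pixel size is assumed."""
        d = (spot_centroid(subimage) - reference_centroid) * pixel_size_m
        return d / focal_length_m

    # Toy sub-image: a Gaussian spot displaced by ~2 pixels in x.
    ys, xs = np.indices((32, 32))
    spot = np.exp(-(((xs - 18.0) ** 2 + (ys - 16.0) ** 2) / 8.0))
    alpha = refraction_angle(spot, reference_centroid=np.array([16.0, 16.0]),
                             pixel_size_m=6.5e-6)
    print(alpha)   # ~[6.5e-5, 0] rad for the 2-pixel displacement in x
    ```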