WorldWideScience

Sample records for camera tubes

  1. Ultra-fast framing camera tube

    Science.gov (United States)

    Kalibjian, Ralph

    1981-01-01

    An electronic framing camera tube features focal plane image dissection and synchronized restoration of the dissected electron line images to form two-dimensional framed images. Ultra-fast framing is performed by first streaking a two-dimensional electron image across a narrow slit, thereby dissecting the two-dimensional electron image into sequential electron line images. The dissected electron line images are then restored into a framed image by a restorer deflector operated synchronously with the dissector deflector. The number of framed images on the tube's viewing screen is equal to the number of dissecting slits in the tube. The distinguishing features of this ultra-fast framing camera tube are the focal plane dissecting slits, and the synchronously-operated restorer deflector which restores the dissected electron line images into a two-dimensional framed image. The framing camera tube can produce image frames having high spatial resolution of optical events in the sub-100 picosecond range.

  2. New camera tube improves ultrasonic inspection system

    Science.gov (United States)

    Berger, H.; Collis, W. J.; Jacobs, J. E.

    1968-01-01

    Electron multiplier, incorporated into the camera tube of an ultrasonic imaging system, improves resolution, effectively shields low level circuits, and provides a high level signal input to the television camera. It is effective for inspection of metallic materials for bonds, voids, and homogeneity.

  3. Study of Permanent Magnet Focusing for Astronomical Camera Tubes

    Science.gov (United States)

    Long, D. C.; Lowrance, J. L.

    1975-01-01

    A design is developed for a permanent magnet assembly (PMA) useful as the magnetic focusing unit for the 35 and 70 mm (diagonal) format SEC tubes. Detailed PMA designs for both tubes are given, and all data on their magnetic configuration, size, weight, and the structure of magnetic shields adequate to screen the camera tube from the earth's magnetic field are presented. A digital computer is used for the PMA design simulations, and the expected operational performance of the PMA is ascertained through the calculation of a series of photoelectron trajectories. A large volume where the magnetic field uniformity is better than 0.5% appears obtainable, and the point spread function (PSF) and modulation transfer function (MTF) indicate nearly ideal performance. The MTF at 20 cycles per mm exceeds 90%. The weight and volume appear tractable for the large space telescope and for ground-based applications.

  4. Environmental test report for the WX-32335 SEC camera tube [for International Ultraviolet Explorer]

    Science.gov (United States)

    Malanoski, R. J.

    1973-01-01

    The environmental testing activity on the WX-32335 was carried out to determine if this tube type could withstand the environmental requirements established for the International Ultraviolet Explorer (IUE) camera tube (WX-32224). The results of the tests led to the following conclusions: (1) The WX-32335, as processed with a CsTe photocathode surface, can withstand the temperature extremes established for the IUE camera tube without damage to the photocathode surface and without introducing background signal in the tube after one hour of dark integration. (2) The WX-32335 built with a WX-32224 type target support structure can withstand the sinusoidal vibration requirements established for the IUE camera tube. (3) Although the vibration test of the WX-32335 type tubes built with the flat target ring structure could not be completed, there was no indication that these tubes could not withstand the sinusoidal vibration requirements established for the IUE camera tube.

  5. Camera Embedded Single Lumen Tube as a Rescue Device for Airway Handling during Lung Separation

    DEFF Research Database (Denmark)

    Højberg Holm, Jimmy; Andersen, Claus

    2016-01-01

    Lung isolation in thoracic surgery will usually be achieved either with a double-lumen tube (DLT) or a bronchial blocker (BB). However, even when conducted by anesthesiologists with particular interest and expertise in thoracic anesthesia, the procedure may be troublesome and time consuming... We report a case of unexpected technical difficulties when isolating the lung in pulmonary surgery for lung cancer, a problem that could lead to cancellation... Taking the small stature into account, use of a small conventional 35-Fr right-sided DLT was planned for the procedure. As it turned out, this tube could not be passed... ...for the surgery to proceed (Figure 2). The rest of the procedure was uneventful, with normal one-lung ventilation and a smooth awakening and extubation. Keywords: Thoracic anesthesia; Airway handling; VivaSight; VivaSight-SL; Lobectomy; Camera-embedded tube; Endotracheal; Lung isolation; Video tube

  6. Camera Embedded Single Lumen Tube as a Rescue Device for Airway Handling during Lung Separation

    DEFF Research Database (Denmark)

    Højberg Holm, Jimmy; Andersen, Claus

    2016-01-01

    Lung isolation in thoracic surgery will usually be achieved either with a double-lumen tube (DLT) or a bronchial blocker (BB). However, even when conducted by anesthesiologists with particular interest and expertise in thoracic anesthesia, the procedure may be troublesome and time consuming... Taking the small stature into account, use of a small conventional 35-Fr right-sided DLT was planned for the procedure. As it turned out, this tube could not be passed beyond the vocal cords because too much resistance was felt. We therefore changed to a smaller DLT, and as a DLT size 28-Fr is only available in a left-sided version [1], we opted for this. Unfortunately it turned out that our fiberoptic bronchoscope could not be advanced through an ET of this size... Keywords: Thoracic anesthesia; Airway handling; VivaSight; VivaSight-SL; Lobectomy; Camera-embedded tube; Endotracheal; Lung isolation; Video tube

  7. Recent developments on ISPA-cameras for gamma ray imaging: gamma imaging with an electrostatic crossed-focussed ISPA-tube

    CERN Document Server

    D'Ambrosio, C; Gys, Thierry; Leutz, H; Piedigrossi, D; Puertolas, D; Rosso, E

    2000-01-01

    The Imaging Silicon Pixel Array (ISPA)-tube is a position-sensitive hybrid photon detector. Originally developed for high-energy physics purposes, it has also been used for biomedical applications. Two kinds of ISPA-tube prototypes have been tested successfully in the field of gamma ray imaging. The current developments aim at obtaining a detector dedicated to single-photon emission imaging. In this paper, we present the first use in a gamma camera of a new ISPA-tube prototype having an increased active input surface of 40 mm diameter and a de-magnifying electron optics. The quartz input window of the tube is optically coupled to a 3.5 cm² YAlO₃:Ce detector array with 0.6 mm² single elements. (11 refs).

  8. Study of performance of small gamma camera consisting of crystal pixel array and position sensitive photomultiplier tube

    Institute of Scientific and Technical Information of China (English)

    ZHU Jie; LIU Shi-Tao; LEI Xiao-Wen; YAN Tian-Xin; XU Zi-Zong; WANG Zhao-Min

    2005-01-01

    The performance of a gamma camera with a NaI(Tl) array coupled to a position sensitive photomultiplier tube (PSPMT) R2486 has been studied. The pixel size of the NaI(Tl) crystal is 2 mm × 2 mm and the overall dimension of the array is 48.2 mm × 48.2 mm × 5 mm. There are 484 pixels in a 22 × 22 matrix. Because each pixel produces a well-focused light spot and restricts the spread of photons, the position resolution of the gamma camera is mainly determined by the pixel size. It is shown that the crystal pixel array can greatly reduce the shrinkage effect and improve intrinsic position resolution by restricting the spread of photons. Experimental results demonstrate that its position resolution and linearity are much improved compared with a gamma camera using planar crystals coupled to a PSPMT.

  9. YouTube War: Fighting in a World of Cameras in Every Cell Phone and Photoshop on Every Computer

    Science.gov (United States)

    2009-11-01

    capitalize on viewers’ desires to produce their own content. CBS News, as part of its coverage of the NCAA basketball tournament in 2007, encouraged... executed on camera (and as the beheadings-for-camera seemed to taper off, perhaps for fear that the raw savagery displayed was hurting the very move...

  10. Vacuum Camera Cooler

    Science.gov (United States)

    Laugen, Geoffrey A.

    2011-01-01

    Acquiring inexpensive moving video in a vacuum environment was previously impossible due to camera overheating, which is brought on by the lack of a cooling medium in vacuum. A water-jacketed camera cooler enclosure, machined and assembled from copper plate and tube, has been developed. The camera cooler (see figure) is cup-shaped and cooled by circulating water or nitrogen gas through copper tubing. The camera, a store-bought "spy type," is not designed to work in a vacuum; with some modifications the unit can be thermally connected when mounted in the cup portion of the camera cooler. The thermal conductivity is provided by copper tape between parts of the camera and the cooled enclosure. During initial testing of the demonstration unit, the camera cooler kept the CPU (central processing unit) of this video camera at operating temperature. This development allowed video recording of an in-progress test within a vacuum environment.

  11. Tower Camera

    Data.gov (United States)

    Oak Ridge National Laboratory — The tower camera in Barrow provides hourly images of the ground surrounding the tower. These images may be used to determine fractional snow cover as winter arrives, for...

  12. Cardiac cameras.

    Science.gov (United States)

    Travin, Mark I

    2011-05-01

    Cardiac imaging with radiotracers plays an important role in patient evaluation, and the development of suitable imaging instruments has been crucial. While initially performed with the rectilinear scanner, which slowly transmitted cardiac count distributions onto various printing media in a row-by-row fashion, the Anger scintillation camera allowed electronic determination of tracer energies and of the distribution of radioactive counts in 2D space. Increased sophistication of cardiac cameras and the development of powerful computers to analyze, display, and quantify data have been essential to making radionuclide cardiac imaging a key component of the cardiac work-up. Newer processing algorithms and solid state cameras, fundamentally different from the Anger camera, show promise to provide higher counting efficiency and resolution, leading to better image quality, greater patient comfort, and potentially lower radiation exposure. While the focus has been on myocardial perfusion imaging with single-photon emission computed tomography, increased use of positron emission tomography is broadening the field to include molecular imaging of the myocardium and of the coronary vasculature. Further advances may require integrating cardiac nuclear cameras with other imaging devices, i.e., hybrid imaging cameras. The goal is to image the heart and its physiological processes as accurately as possible, to prevent and cure disease processes.

  13. CCD Camera

    Science.gov (United States)

    Roth, Roger R.

    1983-01-01

    A CCD camera capable of observing a moving object which has varying intensities of radiation emanating therefrom and which may move at varying speeds is shown wherein there is substantially no overlapping of successive images and wherein the exposure times and scan times may be varied independently of each other.

  14. Mini gamma camera, camera system and method of use

    Science.gov (United States)

    Majewski, Stanislaw; Weisenberger, Andrew G.; Wojcik, Randolph F.

    2001-01-01

    A gamma camera comprising essentially, and in order from the front outer or gamma ray impinging surface: 1) a collimator, 2) a scintillator layer, 3) a light guide, 4) an array of position sensitive, high resolution photomultiplier tubes, and 5) printed circuitry for receipt of the output of the photomultipliers. There is also described a system wherein the output supplied by the high resolution, position sensitive photomultiplier tubes is communicated to: a) a digitizer and b) a computer where it is processed using advanced image processing techniques and a specific algorithm to calculate the center of gravity of any abnormality observed during imaging, and c) optional image display and telecommunications ports.

  15. Streak Tubes for Diagnostics of Lasers and Plasmas

    Science.gov (United States)

    Sokolov, A. Yu; Konovalov, P. I.; Nurtdinov, R. I.; Vikulin, M. P.; Pryanishnikov, I. G.; Dolotov, A. S.; Krapiva, P. S.

    2016-09-01

    Designing a facility for laser fusion research requires sufficient advancement in diagnostics techniques for lasers and plasmas, including those involving streak camera imaging. Maximum specifications of streak cameras depend on the parameters of streak tubes. The paper illustrates how these devices function, and which of their parameters are limiting. The paper presents a novel technological platform designed at VNIIA, which was used to develop a new generation of streak tubes. Using these streak tubes in streak cameras, the efficiency of streak camera imaging techniques can be improved by several orders of magnitude, and new techniques can be designed.

  16. Solid State Replacement of Rotating Mirror Cameras

    Energy Technology Data Exchange (ETDEWEB)

    Frank, A M; Bartolick, J M

    2006-08-25

    Rotating mirror cameras have been the mainstay of mega-frame-per-second imaging for decades. There is still no electronic camera that can match a film-based rotary mirror camera for the combination of frame count, speed, resolution and dynamic range. The rotary mirror cameras are predominantly used in the range of 0.1 to 100 microseconds per frame, for 25 to more than a hundred frames. Electron-tube gated cameras dominate the sub-microsecond regime but are frame-count limited. Video cameras are pushing into the microsecond regime but are resolution limited by the high data rates. An all solid state architecture, dubbed "In-situ Storage Image Sensor" or "ISIS", by Prof. Goji Etoh, has made its first appearance in the market and its evaluation is discussed. Recent work at Lawrence Livermore National Laboratory has concentrated both on evaluating the presently available technologies and on exploring the capabilities of the ISIS architecture. It is clear that although there is presently no single-chip camera that can simultaneously match the rotary mirror cameras, the ISIS architecture has the potential to approach their performance.

  17. Portable mini gamma camera for medical applications

    CERN Document Server

    Porras, E; Benlloch, J M; El-Djalil-Kadi-Hanifi, M; López, S; Pavon, N; Ruiz, J A; Sánchez, F; Sebastiá, A

    2002-01-01

    A small, portable and low-cost gamma camera for medical applications has been developed and clinically tested. This camera, based on a scintillator crystal and a Position Sensitive Photo-Multiplier Tube, has a useful field of view of 4.6 cm diameter and provides 2.2 mm of intrinsic spatial resolution. Its mobility and light weight allow it to reach the patient from any desired direction. This camera images small organs with high efficiency and so addresses the demand for devices for specific clinical applications. In this paper, we present the camera and briefly describe the procedures that have led us to choose its configuration and the image reconstruction method. The clinical tests and diagnostic capability are also presented and discussed.

  18. Neutron camera employing row and column summations

    Science.gov (United States)

    Clonts, Lloyd G.; Diawara, Yacouba; Donahue, Jr, Cornelius; Montcalm, Christopher A.; Riedel, Richard A.; Visscher, Theodore

    2016-06-14

    For each photomultiplier tube in an Anger camera, an R×S array of preamplifiers is provided to detect electrons generated within the photomultiplier tube. The outputs of the preamplifiers are digitized to measure the magnitude of the signals from each preamplifier. For each photomultiplier tube, a corresponding summation circuitry including R row summation circuits and S column summation circuits numerically adds the magnitudes of the signals from the preamplifiers for each row and for each column to generate histograms. For a P×Q array of photomultiplier tubes, P×Q summation circuitries generate P×Q row histograms including R entries and P×Q column histograms including S entries. The total set of histograms includes P×Q×(R+S) entries, which can be analyzed by a position calculation circuit to determine the locations of events (detection of a neutron).
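
    The row-and-column reduction described above lends itself to a short numerical illustration. The following Python sketch mimics the summation step and adds a toy centre-of-charge position estimate from the two histograms; the 4×8 preamplifier grid and the signal values are invented for illustration and are not taken from the patent.

      import numpy as np

      def row_column_histograms(preamp_signals):
          # Reduce an R x S grid of preamplifier magnitudes to an R-entry row
          # histogram and an S-entry column histogram, as the summation circuitry does.
          rows = preamp_signals.sum(axis=1)
          cols = preamp_signals.sum(axis=0)
          return rows, cols

      def locate_event(rows, cols):
          # Toy position estimate: charge-weighted mean index of each histogram.
          r = np.arange(rows.size)
          c = np.arange(cols.size)
          return (rows @ r) / rows.sum(), (cols @ c) / cols.sum()

      # Hypothetical 4 x 8 preamplifier grid with an event centred near row 1, column 5
      signals = np.zeros((4, 8))
      signals[1, 5] = 10.0
      signals[1, 4] = 4.0
      signals[2, 5] = 3.0
      rows, cols = row_column_histograms(signals)
      print(locate_event(rows, cols))   # approximately (1.18, 4.76)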

  19. Making Ceramic Cameras

    Science.gov (United States)

    Squibb, Matt

    2009-01-01

    This article describes how to make a clay camera. This idea of creating functional cameras from clay allows students to experience ceramics, photography, and painting all in one unit. (Contains 1 resource and 3 online resources.)

  20. Constrained space camera assembly

    Science.gov (United States)

    Heckendorn, Frank M.; Anderson, Erin K.; Robinson, Casandra W.; Haynes, Harriet B.

    1999-01-01

    A constrained space camera assembly which is intended to be lowered through a hole into a tank, a borehole or another cavity. The assembly includes a generally cylindrical chamber comprising a head and a body and a wiring-carrying conduit extending from the chamber. Means are included in the chamber for rotating the body about the head without breaking an airtight seal formed therebetween. The assembly may be pressurized and accompanied with a pressure sensing means for sensing if a breach has occurred in the assembly. In one embodiment, two cameras, separated from their respective lenses, are installed on a mounting apparatus disposed in the chamber. The mounting apparatus includes means allowing both longitudinal and lateral movement of the cameras. Moving the cameras longitudinally focuses the cameras, and moving the cameras laterally away from one another effectively converges the cameras so that close objects can be viewed. The assembly further includes means for moving lenses of different magnification forward of the cameras.

  1. Adapting Virtual Camera Behaviour

    DEFF Research Database (Denmark)

    Burelli, Paolo

    2013-01-01

    In a three-dimensional virtual environment aspects such as narrative and interaction completely depend on the camera since the camera defines the player’s point of view. Most research works in automatic camera control aim to take the control of this aspect from the player to automatically gen...

  2. Digital Pinhole Camera

    Science.gov (United States)

    Lancor, Rachael; Lancor, Brian

    2014-01-01

    In this article we describe how the classic pinhole camera demonstration can be adapted for use with digital cameras. Students can easily explore the effects of the size of the pinhole and its distance from the sensor on exposure time, magnification, and image quality. Instructions for constructing a digital pinhole camera and our method for…

  3. Results of the prototype camera for FACT

    Energy Technology Data Exchange (ETDEWEB)

    Anderhub, H. [ETH Zurich, Institute for Particle Physics, CH-8093 Zurich (Switzerland); Backes, M. [Technische Universitaet Dortmund, D-44221 Dortmund (Germany); Biland, A.; Boller, A.; Braun, I. [ETH Zurich, Institute for Particle Physics, CH-8093 Zurich (Switzerland); Bretz, T. [Ecole Polytechnique Federale de Lausanne, CH-1015 Lausanne (Switzerland); Commichau, S.; Commichau, V. [ETH Zurich, Institute for Particle Physics, CH-8093 Zurich (Switzerland); Dorner, D. [ETH Zurich, Institute for Particle Physics, CH-8093 Zurich (Switzerland); INTEGRAL Science Data Center, CH-1290 Versoix (Switzerland); Gendotti, A.; Grimm, O.; Gunten, H. von; Hildebrand, D.; Horisberger, U. [ETH Zurich, Institute for Particle Physics, CH-8093 Zurich (Switzerland); Koehne, J.-H. [Technische Universitaet Dortmund, D-44221 Dortmund (Germany); Kraehenbuehl, T., E-mail: thomas.kraehenbuehl@phys.ethz.c [ETH Zurich, Institute for Particle Physics, CH-8093 Zurich (Switzerland); Kranich, D.; Lorenz, E.; Lustermann, W. [ETH Zurich, Institute for Particle Physics, CH-8093 Zurich (Switzerland); Mannheim, K. [Universitaet Wuerzburg, D-97074 Wuerzburg (Germany)

    2011-05-21

    The maximization of the photon detection efficiency (PDE) is a key issue in the development of cameras for Imaging Atmospheric Cherenkov Telescopes. Geiger-mode Avalanche Photodiodes (G-APD) are a promising candidate to replace the commonly used photomultiplier tubes by offering a larger PDE and, in addition, easier handling. The FACT (First G-APD Cherenkov Telescope) project evaluates the feasibility of this change by building a camera based on 1440 G-APDs for an existing small telescope. As a first step towards a full camera, a prototype module using 144 G-APDs was successfully built and tested. The strong temperature dependence of G-APDs is compensated using a feedback system, which allows the gain of the G-APDs to be kept constant to within 0.5%.
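
    The abstract does not spell out the feedback mechanism; a common way to compensate the temperature dependence of G-APDs, sketched below, is to track the breakdown voltage as it drifts with temperature so that the overvoltage, and hence the gain, stays constant. The breakdown voltage and its temperature coefficient in the example are typical illustrative values, not numbers from the paper.

      def bias_for_constant_gain(temp_c, v_breakdown_25c, dvdt, overvoltage):
          # G-APD gain scales with the overvoltage (bias minus breakdown voltage).
          # The breakdown voltage drifts roughly linearly with temperature, so the
          # bias is re-computed to hold the overvoltage -- and thus the gain -- fixed.
          v_breakdown = v_breakdown_25c + dvdt * (temp_c - 25.0)
          return v_breakdown + overvoltage

      # Illustrative values only: 70 V breakdown at 25 C, ~55 mV/K drift, 1.1 V overvoltage
      for t in (10.0, 25.0, 40.0):
          print(t, round(bias_for_constant_gain(t, 70.0, 0.055, 1.1), 3))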

  4. Microchannel plate streak camera

    Science.gov (United States)

    Wang, Ching L.

    1989-01-01

    An improved streak camera in which a microchannel plate electron multiplier is used in place of, or in combination with, the photocathode used in prior streak cameras. The improved streak camera is far more sensitive to photons (UV to gamma-rays) than the conventional x-ray streak camera which uses a photocathode. The improved streak camera offers gamma-ray detection with high temporal resolution. It also offers low-energy x-ray detection without attenuation inside the cathode. Using the microchannel plate in the improved camera has resulted in a time resolution of about 150 ps, and has provided a sensitivity sufficient for 1000 keV x-rays.

  5. Microprocessor-controlled wide-range streak camera

    Science.gov (United States)

    Lewis, Amy E.; Hollabaugh, Craig

    2006-08-01

    Bechtel Nevada/NSTec recently announced deployment of their fifth generation streak camera. This camera incorporates many advanced features beyond those currently available for streak cameras. The arc-resistant driver includes a trigger lockout mechanism, actively monitors input trigger levels, and incorporates a high-voltage fault interrupter for user safety and tube protection. The camera is completely modular and may deflect over a variable full-sweep time of 15 nanoseconds to 500 microseconds. The camera design is compatible with both large- and small-format commercial tubes from several vendors. The embedded microprocessor offers Ethernet connectivity, and XML [extensible markup language]-based configuration management with non-volatile parameter storage using flash-based storage media. The camera's user interface is platform-independent (Microsoft Windows, Unix, Linux, Macintosh OSX) and is accessible using an AJAX [asynchronous JavaScript and XML]-equipped modern browser, such as Internet Explorer 6, Firefox, or Safari. User interface operation requires no installation of client software or browser plug-in technology. Automation software can also access the camera configuration and control using HTTP [hypertext transfer protocol]. The software architecture supports multiple simultaneous clients, multiple cameras, and multiple module access with a standard browser. The entire user interface can be customized.

  6. Solid state video cameras

    CERN Document Server

    Cristol, Y

    2013-01-01

    Solid State Video Cameras reviews the state of the art in the field of solid-state television cameras as compiled from patent literature. Organized into 10 chapters, the book begins with the basic array types of solid-state imagers and appropriate read-out circuits and methods. Documents relating to improvement of picture quality, such as spurious signal suppression, uniformity correction, or resolution enhancement, are also cited. The last part considers solid-state color cameras.

  7. LSST Camera Optics Design

    Energy Technology Data Exchange (ETDEWEB)

    Riot, V J; Olivier, S; Bauman, B; Pratuch, S; Seppala, L; Gilmore, D; Ku, J; Nordby, M; Foss, M; Antilogus, P; Morgado, N

    2012-05-24

    The Large Synoptic Survey Telescope (LSST) uses a novel, three-mirror telescope design feeding a camera system that includes a set of broad-band filters and three refractive corrector lenses to produce a flat field at the focal plane with a wide field of view. The optical design of the camera lenses and filters is integrated with the optical design of the telescope mirrors to optimize performance. We discuss the rationale for the LSST camera optics design, describe the methodology for fabricating, coating, mounting and testing the lenses and filters, and present the results of detailed analyses demonstrating that the camera optics will meet their performance goals.

  8. Ringfield lithographic camera

    Science.gov (United States)

    Sweatt, William C.

    1998-01-01

    A projection lithography camera is presented with a wide ringfield optimized so as to make efficient use of extreme ultraviolet radiation from a large area radiation source (e.g., D_source ≈ 0.5 mm). The camera comprises four aspheric mirrors optically arranged on a common axis of symmetry with an increased etendue for the camera system. The camera includes an aperture stop that is accessible through a plurality of partial aperture stops to synthesize the theoretical aperture stop. Radiation from a mask is focused to form a reduced image on a wafer, relative to the mask, by reflection from the four aspheric mirrors.

  9. Camera as Cultural Critique

    DEFF Research Database (Denmark)

    Suhr, Christian

    2015-01-01

    What does the use of cameras entail for the production of cultural critique in anthropology? Visual anthropological analysis and cultural critique starts at the very moment a camera is brought into the field or existing visual images are engaged. The framing, distances, and interactions between...... to establish analysis as a continued, iterative movement of transcultural dialogue and critique....

  10. Camera Operator and Videographer

    Science.gov (United States)

    Moore, Pam

    2007-01-01

    Television, video, and motion picture camera operators produce images that tell a story, inform or entertain an audience, or record an event. They use various cameras to shoot a wide range of material, including television series, news and sporting events, music videos, motion pictures, documentaries, and training sessions. Those who film or…

  11. Dry imaging cameras

    Directory of Open Access Journals (Sweden)

    I K Indrajit

    2011-01-01

    Dry imaging cameras are important hard copy devices in radiology. Using a dry imaging camera, multiformat images of digital modalities in radiology are created from a sealed unit of unexposed films. The functioning of a modern dry camera involves a blend of concurrent processes from diverse fields such as computing, mechanics, thermodynamics, optics, electricity and radiography. Broadly, hard copy devices are classified as laser-based and non-laser-based technologies. Compared with the working knowledge and technical awareness of other modalities in radiology, the understanding of the dry imaging camera is often superficial and neglected. To fill this void, this article outlines the key features of a modern dry camera and the important issues that impact radiology workflow.

  12. Thermal Cameras and Applications

    DEFF Research Database (Denmark)

    Gade, Rikke; Moeslund, Thomas B.

    2014-01-01

    Thermal cameras are passive sensors that capture the infrared radiation emitted by all objects with a temperature above absolute zero. This type of camera was originally developed as a surveillance and night vision tool for the military, but recently the price has dropped, significantly opening up...... a broader field of applications. Deploying this type of sensor in vision systems eliminates the illumination problems of normal greyscale and RGB cameras. This survey provides an overview of the current applications of thermal cameras. Applications include animals, agriculture, buildings, gas detection......, industrial, and military applications, as well as detection, tracking, and recognition of humans. Moreover, this survey describes the nature of thermal radiation and the technology of thermal cameras....

  13. Ear Tubes

    Science.gov (United States)

    ... of the ear drum or eustachian tube, Down Syndrome, cleft palate, and barotrauma (injury to the middle ear caused by a reduction of air pressure, ... specialist) may be warranted if you or your child has experienced repeated ... fluid in the middle ear, barotrauma, or have an anatomic abnormality that ...

  14. Radioisotope study of Eustachian tube. A preliminary report

    Energy Technology Data Exchange (ETDEWEB)

    De Rossi, G.; Campioni, P.; Vaccaro, A.

    1988-08-01

    Radioisotope studies of the Eustachian tube are suggested in the preoperative phase of tympanoplasty, in order to assess tubal drainage and secretion. The use of a gamma camera fitted to a computer allowed the authors to calculate some semi-quantitative parameters for an exact assessment of the radioactivity transit from the tympanic cavity to the pharyngeal cavity through the Eustachian tube.

  15. The power of YouTube

    Science.gov (United States)

    Moriarty, Philip

    2014-03-01

    As one of the presenters of the hugely successful Sixty Symbols series of YouTube science videos, Philip Moriarty describes his experiences in front of the camera and how they have transformed his ideas about bringing physics to wider audiences.

  16. Do Speed Cameras Reduce Collisions?

    OpenAIRE

    Skubic, Jeffrey; Johnson, Steven B.; Salvino, Chris; Vanhoy, Steven; Hu, Chengcheng

    2013-01-01

    We investigated the effects of speed cameras along a 26 mile segment in metropolitan Phoenix, Arizona. Motor vehicle collisions were retrospectively identified according to three time periods – before cameras were placed, while cameras were in place and after cameras were removed. A 14 mile segment in the same area without cameras was used for control purposes. Five confounding variables were eliminated. In this study, the placement or removal of interstate highway speed cameras did not indepe...

  17. Do speed cameras reduce collisions?

    Science.gov (United States)

    Skubic, Jeffrey; Johnson, Steven B; Salvino, Chris; Vanhoy, Steven; Hu, Chengcheng

    2013-01-01

    We investigated the effects of speed cameras along a 26 mile segment in metropolitan Phoenix, Arizona. Motor vehicle collisions were retrospectively identified according to three time periods - before cameras were placed, while cameras were in place and after cameras were removed. A 14 mile segment in the same area without cameras was used for control purposes. Five confounding variables were eliminated. In this study, the placement or removal of interstate highway speed cameras did not independently affect the incidence of motor vehicle collisions.

  18. Advanced CCD camera developments

    Energy Technology Data Exchange (ETDEWEB)

    Condor, A. [Lawrence Livermore National Lab., CA (United States)

    1994-11-15

    Two charge coupled device (CCD) camera systems are introduced and discussed, describing briefly the hardware involved and the data obtained in their various applications. The Advanced Development Group of the Defense Sciences Engineering Division has been actively designing, manufacturing, and fielding state-of-the-art CCD camera systems for over a decade. These systems were originally developed for the nuclear test program to record data from underground nuclear tests. Today, new and interesting applications for these systems have surfaced, and development is continuing in the area of advanced CCD camera systems, with a new CCD camera that will allow experimenters to replace film for x-ray imaging at the JANUS, USP, and NOVA laser facilities.

  19. TARGETLESS CAMERA CALIBRATION

    Directory of Open Access Journals (Sweden)

    L. Barazzetti

    2012-09-01

    In photogrammetry a camera is considered calibrated if its interior orientation parameters are known. These encompass the principal distance, the principal point position and some Additional Parameters used to model possible systematic errors. The current state of the art for automated camera calibration relies on the use of coded targets to accurately determine the image correspondences. This paper presents a new methodology for the efficient and rigorous photogrammetric calibration of digital cameras which no longer requires the use of targets. A set of images depicting a scene with good texture is sufficient for the extraction of natural corresponding image points. These are automatically matched with feature-based approaches and robust estimation techniques. The successive photogrammetric bundle adjustment retrieves the unknown camera parameters and their theoretical accuracies. Examples, considerations and comparisons with real data and different case studies are illustrated to show the potential of the proposed methodology.

  20. TOUCHSCREEN USING WEB CAMERA

    Directory of Open Access Journals (Sweden)

    Kuntal B. Adak

    2015-10-01

    In this paper we present a web camera based touchscreen system which uses a simple technique to detect and locate a finger. We have used a camera and a regular screen to achieve our goal. By capturing video and calculating the position of the finger on the screen, we can determine the touch position and perform an action at that location. Our method is easy and simple to implement, and the system requirements are less expensive compared to other techniques.
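
    The snippet above does not give the exact detection pipeline; the sketch below shows one simple way such a web-camera touchscreen can work with OpenCV: segment the hand by colour, take the largest contour, use its topmost point as the fingertip, and scale that point to screen coordinates. The HSV skin-tone range and the screen resolution are illustrative assumptions, not values from the paper.

      import cv2
      import numpy as np

      SCREEN_W, SCREEN_H = 1920, 1080                  # assumed display resolution
      LOWER_SKIN = np.array([0, 40, 60], np.uint8)     # illustrative HSV skin-tone range
      UPPER_SKIN = np.array([25, 255, 255], np.uint8)

      cap = cv2.VideoCapture(0)
      while True:
          ok, frame = cap.read()
          if not ok:
              break
          hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
          mask = cv2.inRange(hsv, LOWER_SKIN, UPPER_SKIN)
          contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
          if contours:
              hand = max(contours, key=cv2.contourArea)
              x, y = hand[hand[:, :, 1].argmin()][0]   # topmost contour point ~ fingertip
              h, w = frame.shape[:2]
              touch = (int(x) * SCREEN_W // w, int(y) * SCREEN_H // h)
              print("touch at", touch)                 # trigger a click/event here
          if cv2.waitKey(1) & 0xFF == 27:              # Esc quits
              break
      cap.release()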

  1. The Circular Camera Movement

    DEFF Research Database (Denmark)

    Hansen, Lennard Højbjerg

    2014-01-01

    It has been an accepted precept in film theory that specific stylistic features do not express specific content. Nevertheless, it is possible to find many examples in the history of film in which stylistic features do express specific content: for instance, the circular camera movement is used...... such as the circular camera movement. Keywords: embodied perception, embodied style, explicit narration, interpretation, style pattern, television style...

  2. Jejunostomy feeding tube

    Science.gov (United States)

    ... page: //medlineplus.gov/ency/patientinstructions/000181.htm Jejunostomy feeding tube To use the sharing features on this ... vomiting Your child's stomach is bloated Alternate Names Feeding - jejunostomy tube; G-J tube; J-tube; Jejunum ...

  3. Segment Based Camera Calibration

    Institute of Scientific and Technical Information of China (English)

    马颂德; 魏国庆; et al.

    1993-01-01

    The basic idea of calibrating a camera system in previous approaches is to determine camera parameters by using a set of known 3D points as calibration reference. In this paper, we present a method of camera calibration in which camera parameters are determined by a set of 3D lines. A set of constraints is derived on camera parameters in terms of perspective line mapping. From these constraints, the same perspective transformation matrix as that for point mapping can be computed linearly. The minimum number of calibration lines is 6. This result generalizes that of Liu, Huang and Faugeras [12] for camera location determination, in which at least 8 line correspondences are required for linear computation of camera location. Since line segments in an image can be located easily and more accurately than points, the use of lines as calibration reference tends to ease the computation in image preprocessing and to improve calibration accuracy. Experimental results on the calibration along with stereo reconstruction are reported.
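
    For readers unfamiliar with the point-mapping case that the abstract compares against, the following Python sketch shows the standard linear (DLT) estimate of the 3×4 perspective transformation matrix from 3D-2D point correspondences; the line-based constraint system of the paper is analogous but is not reproduced here, and all numbers are synthetic.

      import numpy as np

      def dlt_projection_matrix(points_3d, points_2d):
          # Linear (DLT) estimate of the 3x4 projection matrix P from point
          # correspondences; needs at least 6 points in general position.
          A = []
          for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
              A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
              A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
          # P (flattened) is the right singular vector with the smallest singular value
          _, _, vt = np.linalg.svd(np.asarray(A))
          return vt[-1].reshape(3, 4)

      # Synthetic check: project points with a known P, then recover it up to scale
      P_true = np.array([[800.0, 0.0, 320.0, 10.0],
                         [0.0, 800.0, 240.0, 5.0],
                         [0.0, 0.0, 1.0, 2.0]])
      pts3d = np.random.default_rng(1).uniform(-1.0, 1.0, (8, 3)) + np.array([0.0, 0.0, 5.0])
      proj = np.c_[pts3d, np.ones(len(pts3d))] @ P_true.T
      pts2d = proj[:, :2] / proj[:, 2:]
      P_est = dlt_projection_matrix(pts3d, pts2d)
      print(P_est * (P_true[2, 3] / P_est[2, 3]))   # matches P_true up to numerical noise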

  4. Neutron counting with cameras

    Energy Technology Data Exchange (ETDEWEB)

    Van Esch, Patrick; Crisanti, Marta; Mutti, Paolo [Institut Laue Langevin, Grenoble (France)

    2015-07-01

    A research project is presented in which we aim at counting individual neutrons with CCD-like cameras. We explore theoretically a technique that allows us to use imaging detectors as counting detectors at lower counting rates, and transitions smoothly to continuous imaging at higher counting rates. As such, the hope is to combine the good background rejection properties of standard neutron counting detectors with the absence of dead time of integrating neutron imaging cameras, as well as their very good spatial resolution. Compared to X-ray detection, the essence of thermal neutron detection is the nuclear conversion reaction. The released energies involved are of the order of a few MeV, while X-ray detection releases energies of the order of the photon energy, which is in the 10 keV range. Thanks to advances in camera technology which have resulted in increased quantum efficiency, lower noise, as well as increased frame rates up to 100 fps for CMOS-type cameras, this more than 100-fold higher available detection energy implies that the individual neutron detection light signal can be significantly above the noise level, thus allowing for discrimination and individual counting, which is hard to achieve with X-rays. The time scale of CMOS-type cameras doesn't allow one to consider time-of-flight measurements, but kinetic experiments in the 10 ms range are possible. The theory is then confronted with the first experimental results. (authors)
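
    A minimal sketch of the counting regime described above: at low rates each neutron conversion produces an isolated bright spot well above the camera noise, so thresholding a frame and labelling connected regions counts events. The noise level, threshold and spot model below are illustrative assumptions, not parameters from the project.

      import numpy as np
      from scipy import ndimage

      def count_events(frame, threshold):
          # Isolated above-threshold spots in a frame are counted as individual neutrons;
          # this only works while the rate is low enough that spots do not overlap.
          mask = frame > threshold
          _, n_events = ndimage.label(mask)
          return n_events

      # Synthetic frame: Gaussian read noise plus three bright conversion spots
      rng = np.random.default_rng(2)
      frame = rng.normal(100.0, 5.0, (256, 256))
      for r, c in [(40, 60), (128, 200), (210, 30)]:
          frame[r - 1:r + 2, c - 1:c + 2] += 400.0
      print(count_events(frame, threshold=150.0))   # -> 3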

  5. The Dark Energy Camera

    Energy Technology Data Exchange (ETDEWEB)

    Flaugher, B. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States). et al.

    2015-04-11

    The Dark Energy Camera is a new imager with a 2.2-degree diameter field of view mounted at the prime focus of the Victor M. Blanco 4-meter telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration, and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five element optical corrector, seven filters, a shutter with a 60 cm aperture, and a CCD focal plane of 250-μm thick fully depleted CCDs cooled inside a vacuum Dewar. The 570 Mpixel focal plane comprises 62 2k × 4k CCDs for imaging and 12 2k × 2k CCDs for guiding and focus. The CCDs have 15 μm × 15 μm pixels with a plate scale of 0.263" per pixel. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 seconds with 6-9 electrons readout noise. This paper provides a technical description of the camera's engineering, construction, installation, and current status.
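
    The quoted figures are self-consistent and can be cross-checked with a few lines of arithmetic; the effective focal length printed below is merely what the stated pixel pitch and plate scale imply, not a number quoted in the record.

      import math

      pixel_um = 15.0        # pixel pitch quoted above (microns)
      plate_scale = 0.263    # arcsec per pixel quoted above
      n_imaging, n_guide = 62, 12

      # 62 imaging CCDs of 2k x 4k plus 12 guide/focus CCDs of 2k x 2k
      total_pix = n_imaging * 2048 * 4096 + n_guide * 2048 * 2048
      print(round(total_pix / 1e6), "Mpixel")                               # ~570 Mpixel

      # Effective focal length implied by the plate scale: f = pixel_size / scale_in_radians
      arcsec_per_rad = 180.0 / math.pi * 3600.0
      focal_length_m = (pixel_um * 1e-6) * arcsec_per_rad / plate_scale
      print(round(focal_length_m, 1), "m implied effective focal length")   # ~11.8 m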

  6. CAOS-CMOS camera.

    Science.gov (United States)

    Riza, Nabeel A; La Torre, Juan Pablo; Amin, M Junaid

    2016-06-13

    Proposed and experimentally demonstrated is the CAOS-CMOS camera design that combines the coded access optical sensor (CAOS) imager platform with the CMOS multi-pixel optical sensor. The unique CAOS-CMOS camera engages the classic CMOS sensor light staring mode with the time-frequency-space agile pixel CAOS imager mode within one programmable optical unit to realize a high dynamic range imager for extreme light contrast conditions. The experimentally demonstrated CAOS-CMOS camera is built using a digital micromirror device, a silicon point photodetector with a variable gain amplifier, and a silicon CMOS sensor with a maximum rated 51.3 dB dynamic range. White-light imaging of three simultaneously viewed targets of different brightness, which is not possible with the CMOS sensor alone, is achieved by the CAOS-CMOS camera, demonstrating an 82.06 dB dynamic range. Applications for the camera include industrial machine vision, welding, laser analysis, automotive, night vision, surveillance and multispectral military systems.

  7. The Dark Energy Camera

    CERN Document Server

    Flaugher, B; Honscheid, K; Abbott, T M C; Alvarez, O; Angstadt, R; Annis, J T; Antonik, M; Ballester, O; Beaufore, L; Bernstein, G M; Bernstein, R A; Bigelow, B; Bonati, M; Boprie, D; Brooks, D; Buckley-Geer, E J; Campa, J; Cardiel-Sas, L; Castander, F J; Castilla, J; Cease, H; Cela-Ruiz, J M; Chappa, S; Chi, E; Cooper, C; da Costa, L N; Dede, E; Derylo, G; DePoy, D L; de Vicente, J; Doel, P; Drlica-Wagner, A; Eiting, J; Elliott, A E; Emes, J; Estrada, J; Neto, A Fausti; Finley, D A; Flores, R; Frieman, J; Gerdes, D; Gladders, M D; Gregory, B; Gutierrez, G R; Hao, J; Holland, S E; Holm, S; Huffman, D; Jackson, C; James, D J; Jonas, M; Karcher, A; Karliner, I; Kent, S; Kessler, R; Kozlovsky, M; Kron, R G; Kubik, D; Kuehn, K; Kuhlmann, S; Kuk, K; Lahav, O; Lathrop, A; Lee, J; Levi, M E; Lewis, P; Li, T S; Mandrichenko, I; Marshall, J L; Martinez, G; Merritt, K W; Miquel, R; Munoz, F; Neilsen, E H; Nichol, R C; Nord, B; Ogando, R; Olsen, J; Palio, N; Patton, K; Peoples, J; Plazas, A A; Rauch, J; Reil, K; Rheault, J -P; Roe, N A; Rogers, H; Roodman, A; Sanchez, E; Scarpine, V; Schindler, R H; Schmidt, R; Schmitt, R; Schubnell, M; Schultz, K; Schurter, P; Scott, L; Serrano, S; Shaw, T M; Smith, R C; Soares-Santos, M; Stefanik, A; Stuermer, W; Suchyta, E; Sypniewski, A; Tarle, G; Thaler, J; Tighe, R; Tran, C; Tucker, D; Walker, A R; Wang, G; Watson, M; Weaverdyck, C; Wester, W; Woods, R; Yanny, B

    2015-01-01

    The Dark Energy Camera is a new imager with a 2.2-degree diameter field of view mounted at the prime focus of the Victor M. Blanco 4-meter telescope on Cerro Tololo near La Serena, Chile. The camera was designed and constructed by the Dark Energy Survey Collaboration, and meets or exceeds the stringent requirements designed for the wide-field and supernova surveys for which the collaboration uses it. The camera consists of a five element optical corrector, seven filters, a shutter with a 60 cm aperture, and a CCD focal plane of 250 micron thick fully-depleted CCDs cooled inside a vacuum Dewar. The 570 Mpixel focal plane comprises 62 2k × 4k CCDs for imaging and 12 2k × 2k CCDs for guiding and focus. The CCDs have 15 micron × 15 micron pixels with a plate scale of 0.263 arc sec per pixel. A hexapod system provides state-of-the-art focus and alignment capability. The camera is read out in 20 seconds with 6-9 electrons readout noise. This paper provides a technical description of the camera's engineering, construct...

  8. photomultiplier tube

    CERN Multimedia

    photomultiplier tubes. A device to convert light into an electric signal (the name is often abbreviated to PM). Photomultipliers are used in all detectors based on scintillating material (i.e. based on large numbers of fibres which produce scintillation light at the passage of a charged particle). A photomultiplier consists of 3 main parts: firstly, a photocathode where photons are converted into electrons by the photoelectric effect; secondly, a multiplier chain consisting of a series of dynodes which multiply the number of electrons; finally, an anode, which collects the resulting current.

  9. photomultiplier tubes

    CERN Multimedia

    photomultiplier tubes. A device to convert light into an electric signal (the name is often abbreviated to PM). Photomultipliers are used in all detectors based on scintillating material (i.e. based on large numbers of fibres which produce scintillation light at the passage of a charged particle). A photomultiplier consists of 3 main parts: firstly, a photocathode where photons are converted into electrons by the photoelectric effect; secondly, a multiplier chain consisting of a series of dynodes which multiply the number of electrons; finally, an anode, which collects the resulting current.

  10. HIGH SPEED CAMERA

    Science.gov (United States)

    Rogers, B.T. Jr.; Davis, W.C.

    1957-12-17

    This patent relates to high speed cameras having resolution times of less than one-tenth of a microsecond, suitable for filming distinct sequences of a very fast event such as an explosion. This camera consists of a rotating mirror with reflecting surfaces on both sides, a narrow mirror acting as a slit in a focal plane shutter, various other mirror and lens systems, as well as an image recording surface. The combination of the rotating mirrors and the slit mirror causes discrete, narrow, separate pictures to fall upon the film plane, thereby forming a moving image increment of the photographed event. Placing a reflecting surface on each side of the rotating mirror cancels the image velocity that one side of the rotating mirror would impart, so that a camera having this short a resolution time is possible.

  11. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike

    2014-01-01

    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations, i.e., automatically controlling the virtual camera. We approach this problem by modelling it as a dynamic multi-objective optimisation problem and show how this metaphor allows a much richer expressiveness than a classical single-objective approach. Finally, we showcase the application of a multi-objective evolutionary algorithm to generate a shot...

  12. A tube-in-tube thermophotovoltaic generator

    Energy Technology Data Exchange (ETDEWEB)

    Ashcroft, J.; Campbell, B.; Depoy, D.

    1996-12-31

    A thermophotovoltaic device includes at least one thermal radiator tube, a cooling tube concentrically disposed within each thermal radiator tube and an array of thermophotovoltaic cells disposed on the exterior surface of the cooling tube. A shell having a first end and a second end surrounds the thermal radiator tube. Inner and outer tubesheets, each having an aperture corresponding to each cooling tube, are located at each end of the shell. The thermal radiator tube extends within the shell between the inner tubesheets. The cooling tube extends within the shell through the corresponding apertures of the two inner tubesheets to the corresponding apertures of the two outer tubesheets. A plurality of the thermal radiator tubes can be arranged in a staggered or an in-line configuration within the shell.

  13. Underwater camera with depth measurement

    Science.gov (United States)

    Wang, Wei-Chih; Lin, Keng-Ren; Tsui, Chi L.; Schipf, David; Leang, Jonathan

    2016-04-01

    The objective of this study is to develop an RGB-D (video + depth) camera that provides three-dimensional image data for use in the haptic feedback of a robotic underwater ordnance recovery system. Two camera systems were developed and studied. The first depth camera relies on structured light (as used by the Microsoft Kinect), where the displacement of an object is determined by variations of the geometry of a projected pattern. The other camera system is based on a Time of Flight (ToF) depth camera. The results of the structured light camera system show that the camera requires a stronger light source with a similar operating wavelength and bandwidth to achieve a desirable working distance in water. This approach might not be robust enough for our proposed underwater RGB-D camera system, as it would require a complete redesign of the light source component. The ToF camera system, instead, allows an arbitrary placement of light source and camera. The intensity output of the broadband LED light source in the ToF camera system can be increased by putting the LEDs into an array configuration, and the LEDs can be modulated comfortably with any waveform and frequency required by the ToF camera. In this paper, both cameras were evaluated and experiments were conducted to demonstrate the versatility of the ToF camera.
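
    For context, a continuous-wave ToF camera of the kind described recovers range from the phase shift of the modulated light. A minimal sketch of that standard relation follows, including the effect of the refractive index of water; the modulation frequency and phase value are illustrative, not taken from the paper.

      import math

      C_VACUUM = 299_792_458.0   # speed of light in vacuum, m/s

      def cw_tof_depth(phase_rad, mod_freq_hz, n_medium=1.33):
          # Continuous-wave ToF: range follows from the phase shift of the modulated light.
          # In water (n ~ 1.33) light travels slower, shortening both the range reading
          # and the unambiguous range relative to operation in air.
          v = C_VACUUM / n_medium
          return v * phase_rad / (4.0 * math.pi * mod_freq_hz)

      def unambiguous_range(mod_freq_hz, n_medium=1.33):
          return (C_VACUUM / n_medium) / (2.0 * mod_freq_hz)

      # Illustrative numbers: 20 MHz modulation, quarter-cycle phase shift, in water
      f_mod = 20e6
      print(round(cw_tof_depth(math.pi / 2.0, f_mod), 2), "m")     # ~1.41 m
      print(round(unambiguous_range(f_mod), 2), "m unambiguous")   # ~5.64 m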

  14. Communities, Cameras, and Conservation

    Science.gov (United States)

    Patterson, Barbara

    2012-01-01

    Communities, Cameras, and Conservation (CCC) is the most exciting and valuable program the author has seen in her 30 years of teaching field science courses. In this citizen science project, students and community volunteers collect data on mountain lions ("Puma concolor") at four natural areas and public parks along the Front Range of Colorado.…

  15. Make a Pinhole Camera

    Science.gov (United States)

    Fisher, Diane K.; Novati, Alexander

    2009-01-01

    On Earth, using ordinary visible light, one can create a single image of light recorded over time. Of course a movie or video is light recorded over time, but it is a series of instantaneous snapshots, rather than light and time both recorded on the same medium. A pinhole camera, which is simple to make out of ordinary materials and using ordinary…

  16. The PAU Camera

    Science.gov (United States)

    Casas, R.; Ballester, O.; Cardiel-Sas, L.; Carretero, J.; Castander, F. J.; Castilla, J.; Crocce, M.; de Vicente, J.; Delfino, M.; Fernández, E.; Fosalba, P.; García-Bellido, J.; Gaztañaga, E.; Grañena, F.; Jiménez, J.; Madrid, F.; Maiorino, M.; Martí, P.; Miquel, R.; Neissner, C.; Ponce, R.; Sánchez, E.; Serrano, S.; Sevilla, I.; Tonello, N.; Troyano, I.

    2011-11-01

    The PAU Camera (PAUCam) is a wide-field camera designed to be mounted at the William Herschel Telescope (WHT) prime focus, located at the Observatorio del Roque de los Muchachos on the island of La Palma (Canary Islands). Its primary function is to carry out a cosmological survey, the PAU Survey, covering an area of several hundred square degrees of sky. Its purpose is to determine positions and distances using photometric redshift techniques. To achieve accurate photo-z's, PAUCam will be equipped with 40 narrow-band filters covering the range from 450 to 850 nm, and six broad-band filters, those of the SDSS system plus the Y band. To fully cover the focal plane delivered by the telescope optics, 18 2k × 4k CCDs are needed. The pixels are square, 15 μm in size. The optical characteristics of the prime focus corrector deliver a field of view where eight of these CCDs will have an illumination of more than 95%, covering a field of 40 arc minutes. The rest of the CCDs will occupy the vignetted region, extending the field diameter to one degree. Two of the CCDs will be devoted to auto-guiding. This camera has some innovative features. Firstly, both the broad-band and the narrow-band filters will be placed in mobile trays, hosting at most 16 such filters. These are located inside the cryostat, a few millimeters in front of the CCDs, when observing. Secondly, a pressurized liquid nitrogen tank outside the camera will feed a boiler inside the cryostat with a controlled mass flow. The read-out electronics will use the Monsoon architecture, originally developed by NOAO, modified and manufactured by our team in the frame of the DECam project (the camera used in the DES Survey). PAUCam will also be available to the astronomical community of the WHT.

  17. Image Sensors Enhance Camera Technologies

    Science.gov (United States)

    2010-01-01

    In the 1990s, a Jet Propulsion Laboratory team led by Eric Fossum researched ways of improving complementary metal-oxide semiconductor (CMOS) image sensors in order to miniaturize cameras on spacecraft while maintaining scientific image quality. Fossum's team founded a company to commercialize the resulting CMOS active pixel sensor. Now called the Aptina Imaging Corporation, based in San Jose, California, the company has shipped over 1 billion sensors for use in applications such as digital cameras, camera phones, Web cameras, and automotive cameras. Today, one of every three cell phone cameras on the planet features Aptina's sensor technology.

  18. MISR radiometric camera-by-camera Cloud Mask V004

    Data.gov (United States)

    National Aeronautics and Space Administration — This file contains the Radiometric camera-by-camera Cloud Mask dataset. It is used to determine whether a scene is classified as clear or cloudy. A new parameter has...

  19. Feeding tube - infants

    Science.gov (United States)

    ... this page: //medlineplus.gov/ency/article/007235.htm Feeding tube - infants To use the sharing features on this page, please enable JavaScript. A feeding tube is a small, soft, plastic tube placed ...

  20. SPECT detectors: the Anger Camera and beyond

    Science.gov (United States)

    Peterson, Todd E.; Furenlid, Lars R.

    2011-09-01

    The development of radiation detectors capable of delivering spatial information about gamma-ray interactions was one of the key enabling technologies for nuclear medicine imaging and, eventually, single-photon emission computed tomography (SPECT). The continuous sodium iodide scintillator crystal coupled to an array of photomultiplier tubes, almost universally referred to as the Anger Camera after its inventor, has long been the dominant SPECT detector system. Nevertheless, many alternative materials and configurations have been investigated over the years. Technological advances as well as the emerging importance of specialized applications, such as cardiac and preclinical imaging, have spurred innovation such that alternatives to the Anger Camera are now part of commercial imaging systems. Increased computing power has made it practical to apply advanced signal processing and estimation schemes to make better use of the information contained in the detector signals. In this review we discuss the key performance properties of SPECT detectors and survey developments in both scintillator and semiconductor detectors and their readouts with an eye toward some of the practical issues at least in part responsible for the continuing prevalence of the Anger Camera in the clinic.

  1. Dual cameras acquisition and display system of retina-like sensor camera and rectangular sensor camera

    Science.gov (United States)

    Cao, Nan; Cao, Fengmei; Lin, Yabin; Bai, Tingzhu; Song, Shengyu

    2015-04-01

    For a new kind of retina-like sensor camera and a traditional rectangular sensor camera, a dual-camera acquisition and display system needs to be built. We introduce the principle and development of the retina-like sensor. Image coordinate transformation and interpolation based on sub-pixel interpolation need to be realized for the retina-like sensor's special pixel distribution. The hardware platform is composed of the retina-like sensor camera, a rectangular sensor camera, an image grabber and a PC. Combining the MIL and OpenCV libraries, the software is written in VC++ on VS 2010. Experimental results show that the system realizes acquisition and display from both cameras.

  2. Iterative reconstruction of detector response of an Anger gamma camera

    Science.gov (United States)

    Morozov, A.; Solovov, V.; Alves, F.; Domingos, V.; Martins, R.; Neves, F.; Chepel, V.

    2015-05-01

    Statistical event reconstruction techniques can give better results for gamma cameras than the traditional centroid method. However, implementation of such techniques requires detailed knowledge of the photomultiplier tube light-response functions. Here we describe an iterative method which allows one to obtain the response functions from flood irradiation data without imposing strict requirements on the spatial uniformity of the event distribution. A successful application of the method for medical gamma cameras is demonstrated using both simulated and experimental data. An implementation of the iterative reconstruction technique capable of operating in real time is presented. We show that this technique can also be used for monitoring photomultiplier gain variations.
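
    Once the light-response functions are known, a statistical position estimate can be as simple as maximizing a Poisson likelihood over a grid of candidate positions. The Python sketch below illustrates that step only; the Gaussian response model and the four-PMT geometry are invented for illustration, and the paper's iterative fitting of the response functions themselves is not reproduced here.

      import numpy as np

      def ml_position(signals, expected, grid_xy):
          # Maximum-likelihood event position: choose the grid point whose expected
          # PMT responses (the light-response functions) best explain the measured
          # signals under a Poisson model.
          mu = np.clip(expected, 1e-9, None)
          loglik = (signals * np.log(mu) - mu).sum(axis=1)
          return grid_xy[np.argmax(loglik)]

      # Toy geometry: 4 PMTs at the corners of a 50 mm square, Gaussian-shaped LRFs
      pmt_xy = np.array([[0.0, 0.0], [50.0, 0.0], [0.0, 50.0], [50.0, 50.0]])
      gx, gy = np.meshgrid(np.linspace(0.0, 50.0, 51), np.linspace(0.0, 50.0, 51))
      grid_xy = np.c_[gx.ravel(), gy.ravel()]

      def lrf(points):
          d2 = ((points[:, None, :] - pmt_xy[None, :, :]) ** 2).sum(axis=2)
          return 200.0 * np.exp(-d2 / (2.0 * 20.0 ** 2))   # expected photoelectrons per PMT

      rng = np.random.default_rng(3)
      true_xy = np.array([[18.0, 31.0]])
      signals = rng.poisson(lrf(true_xy))[0]
      print(ml_position(signals, lrf(grid_xy), grid_xy))   # close to (18, 31)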

  3. Combustion pinhole camera system

    Science.gov (United States)

    Witte, Arvel B.

    1984-02-21

    A pinhole camera system utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, and an external variable-density light filter which is coupled electronically to the vidicon automatic gain control (AGC). The key component of this system is the focused-purge pinhole optical port assembly, which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor.

  4. Gamma ray camera

    Science.gov (United States)

    Perez-Mendez, V.

    1997-01-21

    A gamma ray camera is disclosed for detecting rays emanating from a radiation source such as an isotope. The gamma ray camera includes a sensor array formed of a visible light crystal for converting incident gamma rays to a plurality of corresponding visible light photons, and a photosensor array responsive to the visible light photons in order to form an electronic image of the radiation therefrom. The photosensor array is adapted to record an integrated amount of charge proportional to the incident gamma rays closest to it, and includes a transparent metallic layer and a photodiode consisting of a p-i-n structure formed on one side of the transparent metallic layer and comprising an upper p-type layer, an intermediate layer and a lower n-type layer. In the preferred mode, the scintillator crystal is composed essentially of a cesium iodide (CsI) crystal, preferably doped with a predetermined amount of impurity, and the upper p-type layer, the intermediate layer and said n-type layer are essentially composed of hydrogenated amorphous silicon (a-Si:H). The gamma ray camera further includes a collimator interposed between the radiation source and the sensor array, and a readout circuit formed on one side of the photosensor array. 6 figs.

  5. The Star Formation Camera

    CERN Document Server

    Scowen, Paul A; Beasley, Matthew; Calzetti, Daniela; Desch, Steven; Fullerton, Alex; Gallagher, John; Lisman, Doug; Macenka, Steve; Malhotra, Sangeeta; McCaughrean, Mark; Nikzad, Shouleh; O'Connell, Robert; Oey, Sally; Padgett, Deborah; Rhoads, James; Roberge, Aki; Siegmund, Oswald; Shaklan, Stuart; Smith, Nathan; Stern, Daniel; Tumlinson, Jason; Windhorst, Rogier; Woodruff, Robert

    2009-01-01

    The Star Formation Camera (SFC) is a wide-field (~15' x 19', >280 arcmin^2), high-resolution (18x18 mas pixels) UV/optical dichroic camera designed for the Theia 4-m space telescope concept. SFC will deliver diffraction-limited images at lambda > 300 nm in both a blue (190-517 nm) and a red (517-1075 nm) channel simultaneously. Our aim is to conduct a comprehensive and systematic study of the astrophysical processes and environments relevant for the births and life cycles of stars and their planetary systems, and to investigate and understand the range of environments, feedback mechanisms, and other factors that most affect the outcome of the star and planet formation process. This program addresses the origins and evolution of stars, galaxies, and cosmic structure and has direct relevance for the formation and survival of planetary systems like our Solar System and planets like Earth. We present the design and performance specifications resulting from the implementation study of the camera, conducted ...

  6. Hemispherical Laue camera

    Science.gov (United States)

    Li, James C. M.; Chu, Sungnee G.

    1980-01-01

    A hemispherical Laue camera comprises a crystal sample mount for positioning a sample to be analyzed at the center of the sphere of a hemispherical, X-radiation-sensitive film cassette, a collimator, a stationary or rotating sample mount and a set of standard spherical projection spheres. X-radiation generated from an external source is directed through the collimator to impinge onto the single crystal sample on the stationary mount. The diffracted beam is recorded on the hemispherical X-radiation-sensitive film mounted inside the hemispherical film cassette in either transmission or back-reflection geometry. The distances travelled by X-radiation diffracted from the crystal to the hemispherical film are the same for all crystal planes which satisfy Bragg's law. The recorded diffraction spots or Laue spots on the film thereby preserve both the symmetry information of the crystal structure and the relative intensities, which are directly related to the relative structure factors of the crystal orientations. The diffraction pattern on the exposed film is compared with the known diffraction pattern on one of the standard spherical projection spheres for a specific crystal structure to determine the orientation of the crystal sample. By replacing the stationary sample support with a rotating sample mount, the hemispherical Laue camera can be used for crystal structure determination in a manner previously provided in conventional Debye-Scherrer cameras.

  7. Gamma ray camera

    Science.gov (United States)

    Perez-Mendez, Victor

    1997-01-01

    A gamma ray camera for detecting rays emanating from a radiation source such as an isotope. The gamma ray camera includes a sensor array formed of a visible light crystal for converting incident gamma rays to a plurality of corresponding visible light photons, and a photosensor array responsive to the visible light photons in order to form an electronic image of the radiation therefrom. The photosensor array is adapted to record an integrated amount of charge proportional to the incident gamma rays closest to it, and includes a transparent metallic layer and a photodiode consisting of a p-i-n structure formed on one side of the transparent metallic layer and comprising an upper p-type layer, an intermediate layer and a lower n-type layer. In the preferred mode, the scintillator crystal is composed essentially of a cesium iodide (CsI) crystal, preferably doped with a predetermined amount of impurity, and the upper p-type layer, the intermediate layer and said n-type layer are essentially composed of hydrogenated amorphous silicon (a-Si:H). The gamma ray camera further includes a collimator interposed between the radiation source and the sensor array, and a readout circuit formed on one side of the photosensor array.

  8. Upgraded cameras for the HESS imaging atmospheric Cherenkov telescopes

    Science.gov (United States)

    Giavitto, Gianluca; Ashton, Terry; Balzer, Arnim; Berge, David; Brun, Francois; Chaminade, Thomas; Delagnes, Eric; Fontaine, Gérard; Füßling, Matthias; Giebels, Berrie; Glicenstein, Jean-François; Gräber, Tobias; Hinton, James; Jahnke, Albert; Klepser, Stefan; Kossatz, Marko; Kretzschmann, Axel; Lefranc, Valentin; Leich, Holger; Lüdecke, Hartmut; Lypova, Iryna; Manigot, Pascal; Marandon, Vincent; Moulin, Emmanuel; de Naurois, Mathieu; Nayman, Patrick; Penno, Marek; Ross, Duncan; Salek, David; Schade, Markus; Schwab, Thomas; Simoni, Rachel; Stegmann, Christian; Steppa, Constantin; Thornhill, Julian; Toussnel, François

    2016-08-01

    The High Energy Stereoscopic System (H.E.S.S.) is an array of five imaging atmospheric Cherenkov telescopes, sensitive to cosmic gamma rays of energies between 30 GeV and several tens of TeV. Four of them started operations in 2003 and their photomultiplier tube (PMT) cameras are currently undergoing a major upgrade, with the goals of improving the overall performance of the array and reducing the failure rate of the ageing systems. With the exception of the 960 PMTs, all components inside the camera have been replaced: these include the readout and trigger electronics, the power, ventilation and pneumatic systems and the control and data acquisition software. New designs and technical solutions have been introduced: the readout makes use of the NECTAr analog memory chip, which samples and stores the PMT signals and was developed for the Cherenkov Telescope Array (CTA). The control of all hardware subsystems is carried out by an FPGA coupled to an embedded ARM computer, a modular design which has proven to be very fast and reliable. The new camera software is based on modern C++ libraries such as Apache Thrift, ØMQ and Protocol buffers, offering very good performance, robustness, flexibility and ease of development. The first camera was upgraded in 2015, the other three cameras are foreseen to follow in fall 2016. We describe the design, the performance, the results of the tests and the lessons learned from the first upgraded H.E.S.S. camera.

  9. Adaptive compressive sensing camera

    Science.gov (United States)

    Hsu, Charles; Hsu, Ming K.; Cha, Jae; Iwamura, Tomo; Landa, Joseph; Nguyen, Charles; Szu, Harold

    2013-05-01

    We have embedded an Adaptive Compressive Sensing (ACS) algorithm in a Charge-Coupled-Device (CCD) camera, based on the simple observation that each pixel is a charge bucket whose charge comes from Einstein's photoelectric conversion effect. Applying the manufacturing design principle, we allow each working component to be altered by at most one step. We then simulated what such a camera could do for real-world persistent surveillance, taking diurnal, all-weather, and seasonal variations into account. The savings in data storage are immense, and the order of magnitude of the saving is inversely proportional to the target angular speed. We designed two new CCD camera components. Thanks to mature CMOS (complementary metal-oxide-semiconductor) technology, the on-chip Sample and Hold (SAH) circuitry can be designed as a dual Photon Detector (PD) analog circuit for change detection, which decides whether to skip a frame or go forward at a sufficient sampling frame rate. For an admitted frame, a purely random sparse matrix [Φ] is implemented at the bucket-pixel level: the charge-transport bias voltage steers the charge toward neighborhood buckets or, if not, to the ground drainage. Since the snapshot image is not a video, we could not apply the usual MPEG video compression and Huffman entropy codec, nor a powerful WaveNet wrapper, at the sensor level. We compare (i) pre-processing by FFT, thresholding of the significant Fourier mode components and inverse FFT to check the PSNR; and (ii) post-processing image recovery performed selectively by a CDT&D adaptive version of linear programming with L1 minimization and L2 similarity. For (ii), the SAH circuitry must determine, in selecting new frames, the degree of information (d.o.i.) K(t), which dictates the purely random linear sparse combination of measurement data via [Φ]_{M,N}, with M(t) = K(t) log N(t).
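
    The recovery step referred to in (ii) can be illustrated with a small, generic L1-minimization solver. The sketch below is not the paper's CDT&D linear-programming scheme; it uses the simpler iterative soft-thresholding algorithm (ISTA) on a random +/-1 measurement matrix, purely to show how a sparse scene vector can be recovered from M << N compressive measurements.

```python
import numpy as np

def ista(Phi, y, lam=0.01, n_iter=300):
    """Solve min_x 0.5*||Phi x - y||^2 + lam*||x||_1 by iterative soft-thresholding."""
    L = np.linalg.norm(Phi, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        z = x - Phi.T @ (Phi @ x - y) / L    # gradient step on the quadratic term
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold
    return x

# tiny demo: a 5-sparse signal recovered from 64 random measurements of a length-256 vector
rng = np.random.default_rng(0)
n, m = 256, 64
x_true = np.zeros(n)
x_true[rng.choice(n, 5, replace=False)] = rng.normal(size=5)
Phi = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)
x_hat = ista(Phi, Phi @ x_true)
```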

  10. Mars Science Laboratory Engineering Cameras

    Science.gov (United States)

    Maki, Justin N.; Thiessen, David L.; Pourangi, Ali M.; Kobzeff, Peter A.; Lee, Steven W.; Dingizian, Arsham; Schwochert, Mark A.

    2012-01-01

    NASA's Mars Science Laboratory (MSL) Rover, which launched to Mars in 2011, is equipped with a set of 12 engineering cameras. These cameras are build-to-print copies of the Mars Exploration Rover (MER) cameras, which were sent to Mars in 2003. The engineering cameras weigh less than 300 grams each and use less than 3 W of power. Images returned from the engineering cameras are used to navigate the rover on the Martian surface, deploy the rover robotic arm, and ingest samples into the rover sample processing system. The navigation cameras (Navcams) are mounted to a pan/tilt mast and have a 45-degree square field of view (FOV) with a pixel scale of 0.82 mrad/pixel. The hazard avoidance cameras (Hazcams) are body-mounted to the rover chassis in the front and rear of the vehicle and have a 124-degree square FOV with a pixel scale of 2.1 mrad/pixel. All of the cameras utilize a frame-transfer CCD (charge-coupled device) with a 1024x1024 imaging region and red/near IR bandpass filters centered at 650 nm. The MSL engineering cameras are grouped into two sets of six: one set of cameras is connected to rover computer A and the other set is connected to rover computer B. The MSL rover carries 8 Hazcams and 4 Navcams.

  11. PAU camera: detectors characterization

    Science.gov (United States)

    Casas, Ricard; Ballester, Otger; Cardiel-Sas, Laia; Castilla, Javier; Jiménez, Jorge; Maiorino, Marino; Pío, Cristóbal; Sevilla, Ignacio; de Vicente, Juan

    2012-07-01

    The PAU Camera (PAUCam) [1,2] is a wide-field camera that will be mounted at the corrected prime focus of the William Herschel Telescope (Observatorio del Roque de los Muchachos, Canary Islands, Spain) in the coming months. The focal plane of PAUCam is composed of a mosaic of 18 CCD detectors of 2,048 x 4,176 pixels each, with a pixel size of 15 microns, manufactured by Hamamatsu Photonics K. K. This mosaic covers a field of view (FoV) of 60 arcmin (minutes of arc), of which 40 are unvignetted. The behaviour of these 18 devices, plus four spares, and their electronic response must be characterized and optimized for use in PAUCam. This work is being carried out in the laboratories of the ICE/IFAE and the CIEMAT. The electronic optimization of the CCD detectors is carried out by means of an OG (Output Gate) scan, maximizing the CTE (Charge Transfer Efficiency) while minimizing the read-out noise. The device characterization itself is obtained with different tests: the photon transfer curve (PTC), which yields the electronic gain; the linearity versus light stimulus; the full-well capacity; the cosmetic defects; the read-out noise; the dark current; the stability versus temperature; and the light remanence.

  12. Stereoscopic camera design

    Science.gov (United States)

    Montgomery, David J.; Jones, Christopher K.; Stewart, James N.; Smith, Alan

    2002-05-01

    It is clear from the literature that the majority of work in stereoscopic imaging is directed towards the development of modern stereoscopic displays. As costs come down, wider public interest in this technology is expected to increase. This new technology would require new methods of image formation. Advances in stereo computer graphics will of course lead to the creation of new stereo computer games, graphics in films etc. However, the consumer would also like to see real-world stereoscopic images, pictures of family, holiday snaps etc. Such scenery would have wide ranges of depth to accommodate and would need also to cope with moving objects, such as cars, and in particular other people. Thus, the consumer acceptance of auto/stereoscopic displays and 3D in general would be greatly enhanced by the existence of a quality stereoscopic camera. This paper will cover an analysis of existing stereoscopic camera designs and show that they can be categorized into four different types, with inherent advantages and disadvantages. A recommendation is then made with regard to 3D consumer still and video photography. The paper will go on to discuss this recommendation and describe its advantages and how it can be realized in practice.

  13. HONEY -- The Honeywell Camera

    Science.gov (United States)

    Clayton, C. A.; Wilkins, T. N.

    The Honeywell model 3000 colour graphic recorder system (hereafter referred to simply as Honeywell) has been bought by Starlink for producing publishable quality photographic hardcopy from the IKON image displays. Full colour and black & white images can be recorded on positive or negative 35mm film. The Honeywell consists of a built-in high resolution flat-faced monochrome video monitor, a red/green/blue colour filter mechanism and a 35mm camera. The device works on the direct video signals from the IKON. This means that changing the brightness or contrast on the IKON monitor will not affect any photographs that you take. The video signals from the IKON consist of separate red, green and blue signals. When you take a picture, the Honeywell takes the red, green and blue signals in turn and displays three pictures consecutively on its internal monitor. It takes an exposure through each of three filters (red, green and blue) onto the film in the camera. This builds up the complete colour picture on the film. Honeywell systems are installed at nine Starlink sites, namely Belfast (locally funded), Birmingham, Cambridge, Durham, Leicester, Manchester, Rutherford, ROE and UCL.

  14. Transmission electron microscope CCD camera

    Science.gov (United States)

    Downing, Kenneth H.

    1999-01-01

    In order to improve the performance of a CCD camera on a high voltage electron microscope, an electron decelerator is inserted between the microscope column and the CCD. This arrangement optimizes the interaction of the electron beam with the scintillator of the CCD camera while retaining optimization of the microscope optics and of the interaction of the beam with the specimen. Changing the electron beam energy between the specimen and camera allows both to be optimized.

  15. Forced Convective Condensation of Nonazeotropic Refrigerant Mixtures in Horizontal Annulus with Petal Shaped Fin Tubes

    Institute of Scientific and Technical Information of China (English)

    Wang, Shiping; Zhou, Xinqiu; et al.

    1995-01-01

    In this paper, condensation performance in a horizontal annulus was compared for a smooth tube, one saw-tooth finned tube (STF tube) and four petal-shaped fin tubes (PF tubes), using R113, R11 and their mixtures (vapor molar fractions of R11 at the test-section inlet of 0.384, 0.588 and 0.809) as working fluids. The mass flux at the test section ranged from 15 to 220 kg/(m2 s). A still camera and a video camera were used to record the flow pattern and the condensation phenomena. The condensation heat transfer coefficients (hc) of the mixtures were considerably lower than those of the pure fluids and did not change linearly with composition. The maximum degradation of the measured hc from the ideal value was 23% for the smooth tube, 65% for the STF tube and 67% for the PF tubes, occurring in the composition range of 0.4-0.6 vapor molar fraction of R11. For condensation of mixtures with an R11 molar fraction from 38% to 81%, the PF tubes had the highest hc, 10-25% higher than that of the STF tube and 480-580% higher than that of the smooth tube, because the petal-shaped fins of the PF tubes promote strong turbulence in the two-phase flow and reduce the mass-transfer resistance.

  16. Feeding tube insertion - gastrostomy

    Science.gov (United States)

    A gastrostomy feeding tube insertion is the placement of a feeding ... (source: //medlineplus.gov/ency/article/002937.htm)

  17. Tube Feeding Troubleshooting Guide

    Science.gov (United States)

    Tube Feeding Troubleshooting Guide: This guide is a tool to assist you, and should not replace your doctor’s ... everyone. Table of contents: Going Home with Tube Feedings ... Nausea and ...

  18. Neural Tube Defects

    Science.gov (United States)

    Neural tube defects are birth defects of the brain, spine, or spinal cord. They happen in the ... that she is pregnant. The two most common neural tube defects are spina bifida and anencephaly. In ...

  19. Design of microcontroller based system for automation of streak camera.

    Science.gov (United States)

    Joshi, M J; Upadhyay, J; Deshpande, P P; Sharma, M L; Navathe, C P

    2010-08-01

    A microcontroller-based system has been developed for automation of the S-20 optical streak camera, which is used as a diagnostic tool to measure ultrafast light phenomena. An 8-bit MCS-family microcontroller is employed to generate all control signals for the streak camera. All biasing voltages required for the various electrodes of the tube are generated using dc-to-dc converters. A high-voltage ramp signal is generated through a step generator unit followed by an integrator circuit and is applied to the camera's deflecting plates. The slope of the ramp can be changed by varying the values of the capacitor and inductor. A programmable digital delay generator has been developed for synchronization of the ramp signal with the optical signal. An independent hardwired interlock circuit has been developed for machine safety. A LabVIEW-based graphical user interface has been developed which enables the user to program the settings of the camera and capture the image. The image is displayed with intensity profiles along the horizontal and vertical axes. The streak camera was calibrated using nanosecond and femtosecond lasers.

  20. Design of microcontroller based system for automation of streak camera

    Science.gov (United States)

    Joshi, M. J.; Upadhyay, J.; Deshpande, P. P.; Sharma, M. L.; Navathe, C. P.

    2010-08-01

    A microcontroller-based system has been developed for automation of the S-20 optical streak camera, which is used as a diagnostic tool to measure ultrafast light phenomena. An 8-bit MCS-family microcontroller is employed to generate all control signals for the streak camera. All biasing voltages required for the various electrodes of the tube are generated using dc-to-dc converters. A high-voltage ramp signal is generated through a step generator unit followed by an integrator circuit and is applied to the camera's deflecting plates. The slope of the ramp can be changed by varying the values of the capacitor and inductor. A programmable digital delay generator has been developed for synchronization of the ramp signal with the optical signal. An independent hardwired interlock circuit has been developed for machine safety. A LabVIEW-based graphical user interface has been developed which enables the user to program the settings of the camera and capture the image. The image is displayed with intensity profiles along the horizontal and vertical axes. The streak camera was calibrated using nanosecond and femtosecond lasers.

  1. Coincidence ion imaging with a fast frame camera

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Suk Kyoung; Cudry, Fadia; Lin, Yun Fei; Lingenfelter, Steven; Winney, Alexander H.; Fan, Lin; Li, Wen, E-mail: wli@chem.wayne.edu [Department of Chemistry, Wayne State University, Detroit, Michigan 48202 (United States)

    2014-12-15

    A new time- and position-sensitive particle detection system based on a fast frame CMOS (complementary metal-oxide-semiconductor) camera is developed for coincidence ion imaging. The system is composed of four major components: a conventional microchannel plate/phosphor screen ion imager, a fast frame CMOS camera, a single-anode photomultiplier tube (PMT), and a high-speed digitizer. The system collects the positional information of ions from the fast frame camera through real-time centroiding, while the arrival times are obtained from the timing signal of the PMT processed by the high-speed digitizer. Multi-hit capability is achieved by correlating the intensity of ion spots on each camera frame with the peak heights on the corresponding time-of-flight spectrum of the PMT. Efficient computer algorithms are developed to process camera frames and digitizer traces in real time at a 1 kHz laser repetition rate. We demonstrate the capability of this system by detecting a momentum-matched co-fragment pair (methyl and iodine cations) produced from strong-field dissociative double ionization of methyl iodide.
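
    The real-time centroiding step can be sketched in a few lines. The example below is a hedged, offline illustration rather than the authors' 1 kHz implementation: it assumes a single 2-D frame, a simple global threshold (the value 50 is arbitrary), and uses scipy's connected-component labelling to return spot centroids and summed intensities, the latter being what would be matched against the PMT time-of-flight peak heights.

```python
import numpy as np
from scipy import ndimage

def centroid_spots(frame, threshold=50.0):
    """Return (row, col) centroids and summed intensities of bright spots in one frame."""
    mask = frame > threshold
    labels, n = ndimage.label(mask)                 # connected components above threshold
    if n == 0:
        return np.empty((0, 2)), np.empty(0)
    idx = np.arange(1, n + 1)
    centroids = np.array(ndimage.center_of_mass(frame, labels, idx))
    intensities = np.array(ndimage.sum(frame, labels, idx))   # used to match TOF peak heights
    return centroids, intensities
```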

  2. Camera artifacts in IUE spectra

    Science.gov (United States)

    Bruegman, O. W.; Crenshaw, D. M.

    1994-01-01

    This study of emission-line-mimicking features in the IUE cameras has produced an atlas of artifacts in high-dispersion images with an accompanying table of prominent artifacts, a table of prominent artifacts in the raw images, and a median image of the sky background for each IUE camera.

  3. Radiation camera motion correction system

    Science.gov (United States)

    Hoffer, P.B.

    1973-12-18

    The device determines the ratio of the intensity of radiation received by a radiation camera from two separate portions of the object. A correction signal is developed to maintain this ratio at a substantially constant value and this correction signal is combined with the camera signal to correct for object motion. (Official Gazette)

  4. Coherent infrared imaging camera (CIRIC)

    Energy Technology Data Exchange (ETDEWEB)

    Hutchinson, D.P.; Simpson, M.L.; Bennett, C.A.; Richards, R.K.; Emery, M.S.; Crutcher, R.I.; Sitter, D.N. Jr.; Wachter, E.A.; Huston, M.A.

    1995-07-01

    New developments in 2-D, wide-bandwidth HgCdTe (MCT) and GaAs quantum-well infrared photodetectors (QWIP) coupled with Monolithic Microwave Integrated Circuit (MMIC) technology are now making focal plane array coherent infrared (IR) cameras viable. Unlike conventional IR cameras which provide only thermal data about a scene or target, a coherent camera based on optical heterodyne interferometry will also provide spectral and range information. Each pixel of the camera, consisting of a single photo-sensitive heterodyne mixer followed by an intermediate frequency amplifier and illuminated by a separate local oscillator beam, constitutes a complete optical heterodyne receiver. Applications of coherent IR cameras are numerous and include target surveillance, range detection, chemical plume evolution, monitoring stack plume emissions, and wind shear detection.

  5. Camera sensitivity study

    Science.gov (United States)

    Schlueter, Jonathan; Murphey, Yi L.; Miller, John W. V.; Shridhar, Malayappan; Luo, Yun; Khairallah, Farid

    2004-12-01

    As the cost/performance ratio of vision systems improves with time, new classes of applications become feasible. One such area, automotive applications, is currently being investigated. Applications include occupant detection, collision avoidance and lane tracking. Interest in occupant detection has been spurred by federal automotive safety rules in response to injuries and fatalities caused by deployment of occupant-side air bags. In principle, a vision system could control airbag deployment to prevent this type of mishap. Employing vision technology here, however, presents a variety of challenges, which include controlling costs, the inability to control illumination, developing and training a reliable classification system, and loss of performance due to production variations arising from manufacturing tolerances and customer options. This paper describes the measures that have been developed to evaluate the sensitivity of an occupant detection system to these types of variations. Two procedures are described for evaluating how sensitive the classifier is to camera variations. The first procedure is based on classification accuracy, while the second evaluates feature differences.

  6. Proportional counter radiation camera

    Science.gov (United States)

    Borkowski, C.J.; Kopp, M.K.

    1974-01-15

    A gas-filled proportional counter camera that images photon-emitting sources is described. A two-dimensional, position-sensitive proportional multiwire counter is provided as the detector. The counter consists of a high-voltage anode screen sandwiched between orthogonally disposed planar arrays of multiple parallel strung, resistively coupled cathode wires. Two terminals from each of the cathode arrays are connected to separate timing circuitry to obtain separate X and Y coordinate signal values from pulse shape measurements to define the position of an event within the counter arrays, which may be recorded by various means for data display. The counter is further provided with a linear drift field which effectively enlarges the active gas volume of the counter and constrains the recoil electrons produced from ionizing radiation entering the counter to drift perpendicularly toward the planar detection arrays. A collimator is interposed between a subject to be imaged and the counter to transmit only the radiation from the subject which has a perpendicular trajectory with respect to the planar cathode arrays of the detector. (Official Gazette)

  7. Standard design for National Ignition Facility x-ray streak and framing cameras

    Energy Technology Data Exchange (ETDEWEB)

    Kimbrough, J. R.; Bell, P. M.; Bradley, D. K.; Holder, J. P.; Kalantar, D. K.; MacPhee, A. G.; Telford, S.

    2010-10-01

    The x-ray streak camera and x-ray framing camera for the National Ignition Facility were redesigned to improve electromagnetic pulse hardening, protect high voltage circuits from pressure transients, and maximize the use of common parts and operational software. Both instruments use the same PC104 based controller, interface, power supply, charge coupled device camera, protective hermetically sealed housing, and mechanical interfaces. Communication is over fiber optics with identical facility hardware for both instruments. Each has three triggers that can be either fiber optic or coax. High voltage protection consists of a vacuum sensor to enable the high voltage and pulsed microchannel plate phosphor voltage. In the streak camera, the high voltage is removed after the sweep. Both rely on the hardened aluminum box and a custom power supply to reduce electromagnetic pulse/electromagnetic interference (EMP/EMI) getting into the electronics. In addition, the streak camera has an EMP/EMI shield enclosing the front of the streak tube.

  8. Vision Sensors and Cameras

    Science.gov (United States)

    Hoefflinger, Bernd

    Silicon charge-coupled-device (CCD) imagers have been and are a specialty market ruled by a few companies for decades. Based on CMOS technologies, active-pixel sensors (APS) began to appear in 1990 at the 1 μm technology node. These pixels allow random access, global shutters, and they are compatible with focal-plane imaging systems combining sensing and first-level image processing. The progress towards smaller features and towards ultra-low leakage currents has provided reduced dark currents and μm-size pixels. All chips offer Mega-pixel resolution, and many have very high sensitivities equivalent to ASA 12.800. As a result, HDTV video cameras will become a commodity. Because charge-integration sensors suffer from a limited dynamic range, significant processing effort is spent on multiple exposure and piece-wise analog-digital conversion to reach ranges >10,000:1. The fundamental alternative is log-converting pixels with an eye-like response. This offers a range of almost a million to 1, constant contrast sensitivity and constant colors, important features in professional, technical and medical applications. 3D retino-morphic stacking of sensing and processing on top of each other is being revisited with sub-100 nm CMOS circuits and with TSV technology. With sensor outputs directly on top of neurons, neural focal-plane processing will regain momentum, and new levels of intelligent vision will be achieved. The industry push towards thinned wafers and TSV enables backside-illuminated and other pixels with a 100% fill-factor. 3D vision, which relies on stereo or on time-of-flight, high-speed circuitry, will also benefit from scaled-down CMOS technologies both because of their size as well as their higher speed.

  9. An Inexpensive Digital Infrared Camera

    Science.gov (United States)

    Mills, Allan

    2012-01-01

    Details are given for the conversion of an inexpensive webcam to a camera specifically sensitive to the near infrared (700-1000 nm). Some experiments and practical applications are suggested and illustrated. (Contains 9 figures.)

  10. The future of consumer cameras

    Science.gov (United States)

    Battiato, Sebastiano; Moltisanti, Marco

    2015-03-01

    In the last two decades, multimedia devices, and in particular imaging devices (camcorders, tablets, mobile phones, etc.), have spread dramatically. Moreover, their increasing computational performance, combined with higher storage capability, allows them to process large amounts of data. In this paper, an overview of the current trends in the consumer camera market and technology is given, also providing some details about the recent past (from the digital still camera up to today) and forthcoming key issues.

  11. Continuous monitoring of Hawaiian volcanoes using thermal cameras

    Science.gov (United States)

    Patrick, M. R.; Orr, T. R.; Antolik, L.; Lee, R.; Kamibayashi, K.

    2012-12-01

    Thermal cameras are becoming more common at volcanoes around the world, and have become a powerful tool for observing volcanic activity. Fixed, continuously recording thermal cameras have been installed by the Hawaiian Volcano Observatory in the last two years at four locations on Kilauea Volcano to better monitor its two ongoing eruptions. The summit eruption, which began in March 2008, hosts an active lava lake deep within a fume-filled vent crater. A thermal camera perched on the rim of Halema`uma`u Crater, acquiring an image every five seconds, has now captured about two years of sustained lava lake activity, including frequent lava level fluctuations, small explosions, and several draining events. This thermal camera has been able to "see" through the thick fume in the crater, providing truly 24/7 monitoring that would not be possible with normal webcams. The east rift zone eruption, which began in 1983, has chiefly consisted of effusion through lava tubes onto the surface, but over the past two years has been interrupted by an intrusion, lava fountaining, crater collapse, and perched lava lake growth and draining. The three thermal cameras on the east rift zone, all on Pu`u `O`o cone and acquiring an image every several minutes, have captured many of these changes and are providing an improved means for alerting observatory staff of new activity. Plans are underway to install a thermal camera at the summit of Mauna Loa to monitor and alert to any future changes there. Thermal cameras are more difficult to install, and image acquisition and processing are more complicated than with visual webcams. Our system is based in part on the successful thermal camera installations by Italian volcanologists on Stromboli and Vulcano. Equipment includes custom enclosures with IR transmissive windows, power, and telemetry. Data acquisition is based on ActiveX controls, and data management is done using automated Matlab scripts. Higher-level data processing, also done with

  12. Fuel nozzle tube retention

    Energy Technology Data Exchange (ETDEWEB)

    Cihlar, David William; Melton, Patrick Benedict

    2017-02-28

    A system for retaining a fuel nozzle premix tube includes a retention plate and a premix tube which extends downstream from an outlet of a premix passage defined along an aft side of a fuel plenum body. The premix tube includes an inlet end and a spring support feature which is disposed proximate to the inlet end. The premix tube extends through the retention plate. The spring retention feature is disposed between an aft side of the fuel plenum and the retention plate. The system further includes a spring which extends between the spring retention feature and the retention plate.

  13. Heated Tube Facility

    Data.gov (United States)

    Federal Laboratory Consortium — The Heated Tube Facility at NASA GRC investigates cooling issues by simulating conditions characteristic of rocket engine thrust chambers and high speed airbreathing...

  14. New high spatial resolution portable camera in medical imaging

    Science.gov (United States)

    Trotta, C.; Massari, R.; Palermo, N.; Scopinaro, F.; Soluri, A.

    2007-07-01

    In recent years, many studies have been carried out on portable gamma cameras in order to optimize a device for medical imaging. In this paper, we present a new type of gamma camera for low-energy detection, based on a position-sensitive photomultiplier tube (Hamamatsu Flat Panel H8500) and an innovative technique based on CsI(Tl) scintillation crystals inserted into the square holes of a tungsten collimator. The geometrical features of this collimator-scintillator structure, which affect the camera spatial resolution and sensitivity, were chosen to offer optimal performance in clinical functional examinations. Detector sensitivity, energy resolution and spatial resolution were measured, and the acquired image quality was evaluated with particular attention to the pixel identification capability. This lightweight (about 2 kg) portable gamma camera was developed thanks to a miniaturized resistive-chain electronic readout, combined with a dedicated compact 4-channel ADC board. This data acquisition board, designed by our research group, showed excellent performance with respect to a commercial PCI 6110E card (National Instruments) in terms of sampling period and additional on-board operations for data pre-processing.
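
    The position arithmetic behind a resistive-chain readout with a 4-channel ADC can be summarized in a short sketch. This is generic Anger-type logic under the assumption that the four digitized channels correspond to the four corners of the resistive network; the corner-to-axis assignment and any gain corrections of the actual board are not reproduced here.

```python
import numpy as np

def resistive_chain_position(a, b, c, d):
    """Normalised (x, y) interaction position from four corner signals.

    The assignment of a, b, c, d to physical corners is an assumption of this
    sketch; `total` is proportional to the deposited energy.
    """
    a, b, c, d = (np.asarray(v, dtype=float) for v in (a, b, c, d))
    total = a + b + c + d
    x = ((b + d) - (a + c)) / total
    y = ((a + b) - (c + d)) / total
    return x, y, total
```

    Pixel identification as described in the abstract then amounts to binning the (x, y) estimates into the known grid defined by the collimator-scintillator structure.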

  15. Picosecond X-ray streak camera dynamic range measurement

    Science.gov (United States)

    Zuber, C.; Bazzoli, S.; Brunel, P.; Fronty, J.-P.; Gontier, D.; Goulmy, C.; Raimbourg, J.; Rubbelynck, C.; Trosseille, C.

    2016-09-01

    Streak cameras are widely used to record the spatio-temporal evolution of laser-induced plasma. A prototype picosecond X-ray streak camera has been developed and tested by Commissariat à l'Énergie Atomique et aux Énergies Alternatives to meet the specific needs of the Laser MegaJoule. The dynamic range of this instrument is measured with picosecond X-ray pulses generated by the interaction of a laser beam and a copper target. The required value of 100 is reached only in the configurations combining the slowest sweeping speed and optimization of the streak tube electron throughput by an appropriate choice of the high voltages applied to its electrodes.

  16. G-APDs in Cherenkov astronomy: The FACT camera

    Energy Technology Data Exchange (ETDEWEB)

    Kraehenbuehl, T., E-mail: thomas.kraehenbuehl@phys.ethz.ch [ETH Zurich, Institute for Particle Physics, CH-8093 Zurich (Switzerland); Anderhub, H. [ETH Zurich, Institute for Particle Physics, CH-8093 Zurich (Switzerland); Backes, M. [Technische Universitaet Dortmund, D-44221 Dortmund (Germany); Biland, A.; Boller, A.; Braun, I. [ETH Zurich, Institute for Particle Physics, CH-8093 Zurich (Switzerland); Bretz, T. [Ecole Polytechnique Federale de Lausanne, CH-1015 Lausanne (Switzerland); Commichau, V.; Djambazov, L. [ETH Zurich, Institute for Particle Physics, CH-8093 Zurich (Switzerland); Dorner, D.; Farnier, C. [ISDC Data Center for Astrophysics, CH-1290 Versoix (Switzerland); Gendotti, A.; Grimm, O.; Gunten, H. von; Hildebrand, D.; Horisberger, U.; Huber, B.; Kim, K.-S. [ETH Zurich, Institute for Particle Physics, CH-8093 Zurich (Switzerland); Koehne, J.-H.; Krumm, B. [Technische Universitaet Dortmund, D-44221 Dortmund (Germany); and others

    2012-12-11

    Geiger-mode avalanche photodiodes (G-APD, SiPM) are a much discussed alternative to photomultiplier tubes in Cherenkov astronomy. The First G-APD Cherenkov Telescope (FACT) collaboration builds a camera based on a hexagonal array of 1440 G-APDs and has now finalized its construction phase. A light-collecting solid PMMA cone is glued to each G-APD to eliminate dead space between the G-APDs by increasing the active area, and to restrict the light collection angle of the sensor to the reflector area in order to reduce the amount of background light. The processing of the signals is integrated in the camera and includes the digitization using the domino ring sampling chip DRS4.

  17. SUB-CAMERA CALIBRATION OF A PENTA-CAMERA

    Directory of Open Access Journals (Sweden)

    K. Jacobsen

    2016-03-01

    Penta cameras, consisting of a nadir and four inclined cameras, are becoming more and more popular, having the advantage of also imaging facades in built-up areas from four directions. Such system cameras require a boresight calibration of the geometric relation of the cameras to each other, but also a calibration of the sub-cameras. Based on data sets of the ISPRS/EuroSDR benchmark for multi-platform photogrammetry, the inner orientation of the IGI Penta DigiCAM used has been analyzed. The required image coordinates of the blocks Dortmund and Zeche Zollern have been determined by Pix4Dmapper and have been independently adjusted and analyzed by the program system BLUH. Dense matching was provided by Pix4Dmapper with 4.1 million image points in 314 images and 3.9 million image points in 248 images, respectively. With up to 19 and 29 images per object point, respectively, the images are well connected; nevertheless, the high numbers of images per object point are concentrated at the block centres, while the inclined images outside the block centre are satisfactory but not very strongly connected. This leads to very high values for the Student test (T-test) of the finally used additional parameters or, in other words, the additional parameters are highly significant. The estimated radial-symmetric distortion of the nadir sub-camera corresponds to the laboratory calibration of IGI, but there are still radial-symmetric distortions also for the inclined cameras, with a size exceeding 5 μm, even if mentioned as negligible based on the laboratory calibration. Radial and tangential effects at the image corners are limited but still present. Remarkable angular affine systematic image errors can be seen, especially in the block Zeche Zollern. Such deformations are unusual for digital matrix cameras, but they can be caused by the correlation between inner and exterior orientation if only parallel flight lines are used. With the exception of the angular affinity, the systematic image errors
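
    For readers unfamiliar with the radial-symmetric distortion discussed above, the sketch below shows the usual odd-order polynomial correction applied to image coordinates. It is a hedged illustration of the general model only; the additional-parameter sets actually used in BLUH (and the calibrated coefficients of the DigiCAM sub-cameras) are not reproduced here.

```python
def radial_correction(x, y, k1, k2, x0=0.0, y0=0.0):
    """Correct image coordinates for radial-symmetric distortion.

    Model: delta_r = k1*r^3 + k2*r^5 about the principal point (x0, y0);
    the coefficients k1, k2 are placeholders, not calibrated values.
    """
    dx, dy = x - x0, y - y0
    r2 = dx * dx + dy * dy
    scale = k1 * r2 + k2 * r2 * r2     # equals delta_r / r
    return x + dx * scale, y + dy * scale
```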

  18. Steam generator tube failures

    Energy Technology Data Exchange (ETDEWEB)

    MacDonald, P.E.; Shah, V.N.; Ward, L.W.; Ellison, P.G.

    1996-04-01

    A review and summary of the available information on steam generator tubing failures and the impact of these failures on plant safety is presented. The following topics are covered: pressurized water reactor (PWR), Canadian deuterium uranium (CANDU) reactor, and Russian water moderated, water cooled energy reactor (VVER) steam generator degradation, PWR steam generator tube ruptures, the thermal-hydraulic response of a PWR plant with a faulted steam generator, the risk significance of steam generator tube rupture accidents, tubing inspection requirements and fitness-for-service criteria in various countries, and defect detection reliability and sizing accuracy. A significant number of steam generator tubes are defective and are removed from service or repaired each year. This widespread damage has been caused by many diverse degradation mechanisms, some of which are difficult to detect and predict. In addition, spontaneous tube ruptures have occurred at the rate of about one every 2 years over the last 20 years, and incipient tube ruptures (tube failures usually identified with leak detection monitors just before rupture) have been occurring at the rate of about one per year. These ruptures have caused complex plant transients which have not always been easy for the reactor operators to control. Our analysis shows that if more than 15 tubes rupture during a main steam line break, the system response could lead to core melting. Although spontaneous and induced steam generator tube ruptures are small contributors to the total core damage frequency calculated in probabilistic risk assessments, they are risk significant because the radionuclides are likely to bypass the reactor containment building. The frequency of steam generator tube ruptures can be significantly reduced through appropriate and timely inspections and repairs or removal from service.

  19. Traditional gamma cameras are preferred.

    Science.gov (United States)

    DePuey, E Gordon

    2016-08-01

    Although the new solid-state dedicated cardiac cameras provide excellent spatial and energy resolution and allow for markedly reduced SPECT acquisition times and/or injected radiopharmaceutical activity, they have some distinct disadvantages compared to traditional sodium iodide SPECT cameras. They are expensive. Attenuation correction is not available. Cardio-focused collimation, advantageous to increase depth-dependent resolution and myocardial count density, accentuates diaphragmatic attenuation and scatter from subdiaphragmatic structures. Although supplemental prone imaging is therefore routinely advised, many patients cannot tolerate it. Moreover, very large patients cannot be accommodated in the solid-state camera gantries. Since data are acquired simultaneously with an arc of solid-state detectors around the chest, no temporally dependent "rotating" projection images are obtained. Therefore, patient motion can be neither detected nor corrected. In contrast, traditional sodium iodide SPECT cameras provide rotating projection images to allow technologists and physicians to detect and correct patient motion and to accurately detect the position of soft tissue attenuators and to anticipate associated artifacts. Very large patients are easily accommodated. Low-dose x-ray attenuation correction is widely available. Also, relatively inexpensive low-count density software is provided by many vendors, allowing shorter SPECT acquisition times and reduced injected activity approaching that achievable with solid-state cameras.

  20. Nuclear probes and intraoperative gamma cameras.

    Science.gov (United States)

    Heller, Sherman; Zanzonico, Pat

    2011-05-01

    Gamma probes are now an important, well-established technology in the management of cancer, particularly in the detection of sentinel lymph nodes. Intraoperative sentinel lymph node as well as tumor detection may be improved under some circumstances by the use of beta (negatron or positron), rather than gamma detection, because the very short range (∼ 1 mm or less) of such particulate radiations eliminates the contribution of confounding counts from activity other than in the immediate vicinity of the detector. This has led to the development of intraoperative beta probes. Gamma camera imaging also benefits from short source-to-detector distances and minimal overlying tissue, and intraoperative small field-of-view gamma cameras have therefore been developed as well. Radiation detectors for intraoperative probes can generally be characterized as either scintillation or ionization detectors. Scintillators used in scintillation-detector probes include thallium-doped sodium iodide, thallium- and sodium-doped cesium iodide, and cerium-doped lutetium oxyorthosilicate. Alternatives to inorganic scintillators are plastic scintillators, solutions of organic scintillation compounds dissolved in an organic solvent that is subsequently polymerized to form a solid. Their combined high counting efficiency for beta particles and low counting efficiency for 511-keV annihilation γ-rays make plastic scintillators well-suited as intraoperative beta probes in general and positron probes in particular. Semiconductors used in ionization-detector probes include cadmium telluride, cadmium zinc telluride, and mercuric iodide. Clinical studies directly comparing scintillation and semiconductor intraoperative probes have not provided a clear choice between scintillation and ionization detector-based probes. The earliest small field-of-view intraoperative gamma camera systems were hand-held devices having fields of view of only 1.5-2.5 cm in diameter that used conventional thallium

  1. Hologram recording tubes

    Science.gov (United States)

    Rajchman, J. H.

    1973-01-01

    Optical memories allow extremely large numbers of bits to be stored and recalled in a matter of microseconds. Two recording tubes, similar to conventional image-converting tubes but having a soft-glass surface on which the hologram is recorded, do not degrade under repeated hologram read/write cycles.

  2. Image Intensifier Modules For Use With Commercially Available Solid State Cameras

    Science.gov (United States)

    Murphy, Howard; Tyler, Al; Lake, Donald W.

    1989-04-01

    configured as required by a specific camera application. Modular line and matrix scan cameras incorporating sensors with fiber optic faceplates (Fig 4) are also available. These units retain the advantages of interchangeability, simple construction, ruggedness, and optical precision offered by the more common lens input units. Fiber optic faceplate cameras are used for a wide variety of applications. A common usage involves mating of the Reticon-supplied camera to a customer-supplied intensifier tube for low light level and/or short exposure time situations.

  3. Wavy tube heat pumping

    Energy Technology Data Exchange (ETDEWEB)

    Haldeman, C. W.

    1985-12-03

    A PVC conduit about 4'' in diameter and a little more than 40 feet long is adapted for being seated in a hole in the earth and surrounds a coaxial copper tube along its length that carries Freon between a heat pump and a distributor at the bottom. A number of wavy conducting tubes located between the central conducting tube and the wall of the conduit interconnect the distributor with a Freon distributor at the top arranged for connection to the heat pump. The wavy conducting tubing is made by passing straight soft copper tubing between a pair of like opposed meshing gears, each having four convex points in space quadrature separated by four concave recesses, with the radius of curvature of each point slightly less than that of each concave recess.

  4. Categorising YouTube

    DEFF Research Database (Denmark)

    Simonsen, Thomas Mosebo

    2011-01-01

    This article provides a genre analytical approach to creating a typology of the User Generated Content (UGC) of YouTube. The article investigates the construction of navigation processes on the YouTube website. It suggests a pragmatic genre approach that is expanded through a focus on YouTube’s...... technological affordances. Through an analysis of the different pragmatic contexts of YouTube, it is argued that a taxonomic understanding of YouTube must be analysed in regards to the vacillation of a user-driven bottom-up folksonomy and a hierarchical browsing system that emphasises a culture of competition...... and which favours the already popular content of YouTube. With this taxonomic approach, the UGC videos are registered and analysed in terms of empirically based observations. The article identifies various UGC categories and their principal characteristics. Furthermore, general tendencies of the UGC within...

  5. Molybdenum Tube Characterization report

    Energy Technology Data Exchange (ETDEWEB)

    Beaux II, Miles Frank [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Usov, Igor Olegovich [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-02-07

    Chemical vapor deposition (CVD) techniques have been utilized to produce free-standing molybdenum tubes with the end goal of nuclear fuel clad applications. In order to produce tubes with properties desirable for this application, deposition rates were lowered requiring long deposition durations on the order of 50 hours. Standard CVD methods as well as fluidized-bed CVD (FBCVD) methods were applied towards these objectives. Characterization of the tubes produced in this manner revealed material suitable for fuel clad applications, but lacking necessary uniformity across the length of the tubes. The production of freestanding Mo tubes that possess the desired properties across their entire length represents an engineering challenge that can be overcome in a next iteration of the deposition system.

  6. Perceptual Color Characterization of Cameras

    Directory of Open Access Journals (Sweden)

    Javier Vazquez-Corral

    2014-12-01

    Color camera characterization, mapping outputs from the camera sensors to an independent color space such as \(XYZ\), is an important step in the camera processing pipeline. Until now, this procedure has been primarily solved by using a \(3 \times 3\) matrix obtained via a least-squares optimization. In this paper, we propose to use the spherical sampling method, recently published by Finlayson et al., to perform a perceptual color characterization. In particular, we search for the \(3 \times 3\) matrix that minimizes three different perceptual errors, one pixel-based and two spatially based. For the pixel-based case, we minimize the CIE \(\Delta E\) error, while for the spatial-based case, we minimize both the S-CIELAB error and the CID error measure. Our results demonstrate an improvement of approximately 3% for the \(\Delta E\) error, 7% for the S-CIELAB error and 13% for the CID error measure.
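
    The baseline procedure that the paper improves on, fitting a \(3 \times 3\) matrix by least squares, is easy to sketch. The example below is a hedged illustration of that baseline only (it does not implement the spherical-sampling perceptual optimization); `rgb` and `xyz` are assumed to be matched patch measurements, for example from a colour chart.

```python
import numpy as np

def fit_colour_matrix(rgb, xyz):
    """Least-squares 3x3 characterisation matrix M such that xyz ≈ rgb @ M.T.

    rgb, xyz : arrays of shape (n_patches, 3) holding matched measurements.
    """
    M_T, *_ = np.linalg.lstsq(rgb, xyz, rcond=None)   # solves rgb @ M_T ≈ xyz
    return M_T.T

# usage: xyz_pred = new_rgb @ fit_colour_matrix(train_rgb, train_xyz).T
```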

  7. Dark Energy Camera for Blanco

    Energy Technology Data Exchange (ETDEWEB)

    Binder, Gary A.; /Caltech /SLAC

    2010-08-25

    In order to make accurate measurements of dark energy, a system is needed to monitor the focus and alignment of the Dark Energy Camera (DECam) to be located on the Blanco 4m Telescope for the upcoming Dark Energy Survey. One new approach under development is to fit out-of-focus star images to a point spread function from which information about the focus and tilt of the camera can be obtained. As a first test of a new algorithm using this idea, simulated star images produced from a model of DECam in the optics software Zemax were fitted. Then, real images from the Mosaic II imager currently installed on the Blanco telescope were used to investigate the algorithm's capabilities. A number of problems with the algorithm were found, and more work is needed to understand its limitations and improve its capabilities so it can reliably predict camera alignment and focus.
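
    The focus-monitoring idea, extracting shape parameters from out-of-focus star images, can be illustrated with a much simpler stand-in model. The sketch below fits an elliptical Gaussian to a star cutout with scipy; the real DECam algorithm fits a Zemax-derived, donut-shaped defocused PSF, so this is only meant to show how fitted width and ellipticity parameters can serve as focus and tilt indicators.

```python
import numpy as np
from scipy.optimize import least_squares

def fit_star(image):
    """Fit an elliptical Gaussian plus constant background to a star cutout."""
    yy, xx = np.indices(image.shape)

    def residuals(p):
        amp, x0, y0, sx, sy, bg = p
        model = bg + amp * np.exp(-((xx - x0) ** 2 / (2 * sx ** 2) +
                                    (yy - y0) ** 2 / (2 * sy ** 2)))
        return (model - image).ravel()

    p0 = [float(image.max() - np.median(image)),        # amplitude guess
          image.shape[1] / 2.0, image.shape[0] / 2.0,   # centre guess
          3.0, 3.0, float(np.median(image))]            # widths and background
    fit = least_squares(residuals, p0)
    return fit.x    # amplitude, centre (x0, y0), widths (sx, sy), background
```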

  8. The GISMO-2 Bolometer Camera

    Science.gov (United States)

    Staguhn, Johannes G.; Benford, Dominic J.; Fixsen, Dale J.; Hilton, Gene; Irwin, Kent D.; Jhabvala, Christine A.; Kovacs, Attila; Leclercq, Samuel; Maher, Stephen F.; Miller, Timothy M.; Moseley, Samuel H.; Sharp, Elemer H.; Wollack, Edward J.

    2012-01-01

    We present the concept for the GISMO-2 bolometer camera, which we build for background-limited operation at the IRAM 30 m telescope on Pico Veleta, Spain. GISMO-2 will operate simultaneously in the 1 mm and 2 mm atmospheric windows. The 1 mm channel uses a 32 x 40 TES-based Backshort Under Grid (BUG) bolometer array; the 2 mm channel operates with a 16 x 16 BUG array. The camera utilizes almost the entire field of view provided by the telescope. The optical design of GISMO-2 was strongly influenced by our experience with the GISMO 2 mm bolometer camera, which is successfully operating at the 30 m telescope. GISMO is accessible to the astronomical community through the regular IRAM call for proposals.

  9. EDICAM (Event Detection Intelligent Camera)

    Energy Technology Data Exchange (ETDEWEB)

    Zoletnik, S. [Wigner RCP RMI, EURATOM Association, Budapest (Hungary); Szabolics, T., E-mail: szabolics.tamas@wigner.mta.hu [Wigner RCP RMI, EURATOM Association, Budapest (Hungary); Kocsis, G.; Szepesi, T.; Dunai, D. [Wigner RCP RMI, EURATOM Association, Budapest (Hungary)

    2013-10-15

    Highlights: ► We present EDICAM's hardware modules. ► We present EDICAM's main design concepts. ► This paper will describe EDICAM's firmware architecture. ► Operation principles description. ► Further developments. -- Abstract: A new type of fast framing camera has been developed for fusion applications by the Wigner Research Centre for Physics during the last few years. A new concept was designed for intelligent event-driven imaging which is capable of focusing image readout to Regions of Interest (ROIs) where and when predefined events occur. At present these events mean intensity changes and external triggers, but in the future more sophisticated methods might also be defined. The camera provides a 444 Hz frame rate at the full resolution of 1280 × 1024 pixels, but monitoring of smaller ROIs can be done in the 1–116 kHz range even during exposure of the full image. Keeping space limitations and the harsh environment in mind, the camera is divided into a small Sensor Module and a processing card interconnected by a fast 10 Gbit optical link. This camera hardware has been used for passive monitoring of the plasma in different devices, for example at ASDEX Upgrade and COMPASS, with the first version of its firmware. The new firmware and software package is now available and ready for testing the new event processing features. This paper will present the operation principle and features of the Event Detection Intelligent Camera (EDICAM). The device is intended to be the central element in the 10-camera monitoring system of the Wendelstein 7-X stellarator.

  10. Effect of Uniformly and Nonuniformly Coated Al2O3 Nanoparticles over Glass Tube Heater on Pool Boiling

    Directory of Open Access Journals (Sweden)

    Nitin Doifode

    2016-01-01

    The effect of uniformly and nonuniformly coated Al2O3 nanoparticles over a plain glass tube heater on pool boiling heat transfer was studied experimentally. A borosilicate glass tube coated with Al2O3 nanoparticles was used as the test heater. The boiling behaviour was studied using a high-speed camera. The results obtained for pool boiling show an enhancement in heat transfer for the nanoparticle-coated heater compared with the plain glass tube heater. The heat transfer coefficient for nonuniformly coated nanoparticles was also studied and compared with the uniformly coated and plain glass tubes. Coating the glass tube with nanoparticles increases its surface roughness and thereby creates more nucleation sites.

  11. What Are Neural Tube Defects?

    Science.gov (United States)

    ... NICHD Research Information Clinical Trials Resources and Publications Neural Tube Defects (NTDs): Condition Information Skip sharing on social media links Share this: Page Content What are neural tube defects? Neural (pronounced NOOR-uhl ) tube defects are ...

  12. 4π FOV compact Compton camera for nuclear material investigations

    Science.gov (United States)

    Lee, Wonho; Lee, Taewoong

    2011-10-01

    A compact Compton camera with a 4π field of view (FOV) was manufactured using design parameters optimized with the effective choice of gamma-ray interaction order determined from a Monte Carlo simulation. The camera consisted of six CsI(Na) planar scintillators with a pixelized structure that was coupled to position-sensitive photomultiplier tubes (H8500) consisting of multiple anodes connected to custom-made circuits. The size of the scintillator and each pixel was 4.4×4.4×0.5 and 0.2×0.2×0.5 cm, respectively. The total size of each detection module was only 5×5×6 cm and the distance between the detector modules was approximately 10 cm to maximize the camera performance, as calculated by the simulation. Therefore, the camera is quite portable for examining nuclear materials in areas such as harbors or nuclear power plants. The non-uniformity of the multi-anode PMTs was corrected using a novel readout circuit. Amplitude information of the signals from the electronics attached to the scintillator-coupled multi-anode PMTs was collected using a data acquisition board (cDAQ-9178), and the timing information was sent to an FPGA (SPARTAN3E). The FPGA picked the rising edges of the timing signals, and compared the edges of the signals from the six detection modules to select the coincident signals from a Compton pair only. The output of the FPGA triggered the DAQ board to send the effective Compton events to a computer. The Compton image was reconstructed, and the performance of the 4π FOV compact camera was examined.
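    The FPGA coincidence step can be pictured with a simple software analogue: keep only pairs of hits from different detection modules whose timing edges fall within a short coincidence window. The sketch below is a simplified stand-in for that logic, not the actual firmware; the module IDs, timestamps and window width are hypothetical.

        from itertools import combinations

        def coincident_pairs(hits, window_ns=50):
            # 'hits' is a list of (module_id, timestamp_ns) tuples from the six
            # detection modules; keep pairs from different modules whose rising
            # edges are closer in time than the coincidence window.
            ordered = sorted(hits, key=lambda h: h[1])
            pairs = []
            for (m1, t1), (m2, t2) in combinations(ordered, 2):
                if m1 != m2 and abs(t1 - t2) <= window_ns:
                    pairs.append(((m1, t1), (m2, t2)))
            return pairs

        # Two hits 20 ns apart in different modules form a Compton pair candidate.
        print(coincident_pairs([(0, 1000), (3, 1020), (5, 5000)]))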

  13. Isolated Fallopian Tube Torsion

    Directory of Open Access Journals (Sweden)

    S. Kardakis

    2013-01-01

    Full Text Available Isolated torsion of the Fallopian tube is a rare gynecological cause of acute lower abdominal pain, and diagnosis is difficult. There are no pathognomonic symptoms or clinical, imaging, or laboratory findings. A preoperative ultrasound showing tubular adnexal masses of heterogeneous echogenicity with a cystic component is often present. The diagnosis can rarely be made before operation, and laparoscopy is necessary to establish it. Unfortunately, surgery is often performed too late for tube conservation. Isolated Fallopian tube torsion should be suspected in cases of acute pelvic pain, and prompt intervention is necessary.

  14. The Camera Comes to Court.

    Science.gov (United States)

    Floren, Leola

    After the Lindbergh kidnapping trial in 1935, the American Bar Association sought to eliminate electronic equipment from courtroom proceedings. Eventually, all but two states adopted regulations applying that ban to some extent, and a 1965 Supreme Court decision encouraged the banning of television cameras at trials as well. Currently, some states…

  15. Camera Movement in Narrative Cinema

    DEFF Research Database (Denmark)

    Nielsen, Jakob Isak

    2007-01-01

    Just like art historians have focused on e.g. composition or lighting, this dissertation takes a single stylistic parameter as its object of study: camera movement. Within film studies this localized avenue of middle-level research has become increasingly viable under the aegis of a perspective k...

  16. OSIRIS camera barrel optomechanical design

    Science.gov (United States)

    Farah, Alejandro; Tejada, Carlos; Gonzalez, Jesus; Cobos, Francisco J.; Sanchez, Beatriz; Fuentes, Javier; Ruiz, Elfego

    2004-09-01

    A Camera Barrel, located in the OSIRIS imager/spectrograph for the Gran Telescopio Canarias (GTC), is described in this article. The barrel design has been developed by the Institute for Astronomy of the University of Mexico (IA-UNAM), in collaboration with the Institute for Astrophysics of Canarias (IAC), Spain. The barrel is being manufactured by the Engineering Center for Industrial Development (CIDESI) at Queretaro, Mexico. The Camera Barrel includes a set of eight lenses (three doublets and two singlets), with their respective supports and cells, as well as two subsystems: the Focusing Unit, which is a mechanism that modifies the first doublet relative position; and the Passive Displacement Unit (PDU), which uses the third doublet as thermal compensator to maintain the camera focal length and image quality when the ambient temperature changes. This article includes a brief description of the scientific instrument; describes the design criteria related with performance justification; and summarizes the specifications related with misalignment errors and generated stresses. The Camera Barrel components are described and analytical calculations, FEA simulations and error budgets are also included.

  17. Nasogastric feeding tube

    Science.gov (United States)

    ... page: //medlineplus.gov/ency/patientinstructions/000182.htm Nasogastric feeding tube To use the sharing features on this ... the nose. It can be used for all feedings or for giving a person extra calories. It ...

  18. Tube-Forming Assays.

    Science.gov (United States)

    Brown, Ryan M; Meah, Christopher J; Heath, Victoria L; Styles, Iain B; Bicknell, Roy

    2016-01-01

    Angiogenesis involves the generation of new blood vessels from the existing vasculature and is dependent on many growth factors and signaling events. In vivo angiogenesis is dynamic and complex, meaning assays are commonly utilized to explore specific targets for research into this area. Tube-forming assays offer an excellent overview of the molecular processes in angiogenesis. The Matrigel tube forming assay is a simple-to-implement but powerful tool for identifying biomolecules involved in angiogenesis. A detailed experimental protocol on the implementation of the assay is described in conjunction with an in-depth review of methods that can be applied to the analysis of the tube formation. In addition, an ImageJ plug-in is presented which allows automatic quantification of tube images reducing analysis times while removing user bias and subjectivity.

  19. Chest tube insertion - slideshow

    Science.gov (United States)

    ... presentations/100008.htm Chest tube insertion - series—Normal anatomy To use the sharing features ... pleural space is the space between the inner and outer lining of the lung. It is normally very thin, and lined only ...

  20. Snorkeling and Jones tubes

    OpenAIRE

    Lam, Lewis Y. W.; Weatherhead, Robert G.

    2015-01-01

    We report a case of tympanic membrane rupture during snorkeling in a 17-year-old young man who had previously undergone bilateral Jones tubes placed for epiphora. To our knowledge, this phenomenon has not been previously reported.

  1. Snorkeling and Jones tubes.

    Science.gov (United States)

    Lam, Lewis Y W; Weatherhead, Robert G

    2015-01-01

    We report a case of tympanic membrane rupture during snorkeling in a 17-year-old young man who had previously undergone bilateral Jones tubes placed for epiphora. To our knowledge, this phenomenon has not been previously reported.

  2. Kinking of medical tubes.

    Science.gov (United States)

    Ingles, David

    2004-05-01

    The phenomenon of kinking in medical tubing remains a problem for some applications, particularly critical ones such as transporting gases or fluids. Design features are described to prevent its occurrence.

  3. First Avalanche-photodiode camera test (FACT): A novel camera using G-APDs for the observation of very high-energy {gamma}-rays with Cherenkov telescopes

    Energy Technology Data Exchange (ETDEWEB)

    Braun, I. [ETH Zurich, CH-8093 Zurich (Switzerland); Commichau, S.C. [ETH Zurich, CH-8093 Zurich (Switzerland)], E-mail: commichau@phys.ethz.ch; Rissi, M. [ETH Zurich, CH-8093 Zurich (Switzerland); Backes, M. [Dortmund University of Technology, D-44221 Dortmund (Germany); Biland, A. [ETH Zurich, CH-8093 Zurich (Switzerland); Bretz, T. [University of Wuerzburg, D-97074 Wuerzburg (Germany); Britvitch, I.; Commichau, V.; Gunten, H. von; Hildebrand, D.; Horisberger, U.; Kranich, D. [ETH Zurich, CH-8093 Zurich (Switzerland); Lorenz, E. [ETH Zurich, CH-8093 Zurich (Switzerland); Max-Planck-Institut fuer Physik, D-80805 Muenchen (Germany); Lustermann, W. [ETH Zurich, CH-8093 Zurich (Switzerland); Mannheim, K. [University of Wuerzburg, D-97074 Wuerzburg (Germany); Neise, D. [Dortmund University of Technology, D-44221 Dortmund (Germany); Pauss, F. [ETH Zurich, CH-8093 Zurich (Switzerland); Pohl, M. [University of Geneva, CH-1211 Geneva (Switzerland); Renker, D. [Paul Scherrer Institut (PSI) Villigen, CH-5232 Villigen (Switzerland); Rhode, W. [Dortmund University of Technology, D-44221 Dortmund (Germany)] (and others)

    2009-10-21

    We present a project for a novel camera using Geiger-mode Avalanche Photodiodes (G-APDs), to be installed in a small telescope (the former HEGRA CT3) on the MAGIC site in La Palma (Canary Islands, Spain). This novel type of semiconductor photon detector provides several superior features compared to conventional photomultiplier tubes (PMTs). The most promising one is a much higher Photon Detection Efficiency.

  4. Magnesium tube hydroforming

    Energy Technology Data Exchange (ETDEWEB)

    Liewald, M.; Pop, R. [Institute for Metal Forming Technology (IFU), Stuttgart (Germany)

    2008-04-15

    Magnesium alloys reveal a good strength-to-weight ratio in the family of lightweight metals and have the potential to provide up to 30% mass savings compared to aluminium and up to 75% compared to steel. The use of sheet magnesium alloys for auto body applications is, however, limited due to the relatively low formability at room temperature. Within the scope of this paper, extruded magnesium tubes suitable for hydroforming applications have been investigated. Results obtained at room temperature using magnesium AZ31 tubes show that circumferential strains are limited to a maximum value of 4%. In order to examine the influence of forming temperature on tube formability, investigations have been carried out with a new die set for hot internal high pressure (IHP) forming at temperatures up to 400 C. Earlier investigations with magnesium AZ31 tubes have shown that fractures occur along the welding line in tubes extruded over a spider die, whereby a non-uniform expansion at bursting with an elongation value of 24% can be observed. A maximum circumferential strain of approx. 60% could be attained when seamless, mechanically pre-expanded and annealed tubes of the same alloy were used. The effect of annealing time on the material's forming properties shows a fine-grained structure for sufficient annealing times, with deterioration when the annealing time is increased too far. Hence, seamless ZM21 tubes have been used in the current investigations. With these tubes, an increased tensile fracture strain of 116% at 350 C is observed as against 19% at 20 C, obtained by tensile testing of milled specimens from the extruded tubes. This behaviour is also seen under the condition of tool contact during the IHP forming process. To determine the maximum circumferential strain at different forming temperatures and strain rates, the tubes are initially bulged in a die with square cross-section under plane stress conditions. Thereafter, the tubes are calibrated by using an

  5. Power vacuum tubes handbook

    CERN Document Server

    Whitaker, Jerry

    2012-01-01

    Providing examples of applications, Power Vacuum Tubes Handbook, Third Edition examines the underlying technology of each type of power vacuum tube device in common use today. The author presents basic principles, reports on new development efforts, and discusses implementation and maintenance considerations. Supporting mathematical equations and extensive technical illustrations and schematic diagrams help readers understand the material. Translate Principles into Specific Applications This one-stop reference is a hands-on guide for engineering personnel involved in the design, specification,

  6. A gas laser tube

    Energy Technology Data Exchange (ETDEWEB)

    Tetsuo, F.; Tokhikhide, N.

    1984-04-19

    A gas laser tube is described in which contamination of the laser gas mixture by the coolant is avoided, resulting in a longer service life of the mirrors. The holder contains two tubes, one inside the other. The laser gas mixture flows through the internal tube. An electrode is fastened to the holder. The coolant is pumped through the slot between the two tubes, for which a hole is cut into the holder. The external tube has a ring which serves to seal the cavity containing the coolant from the atmosphere. The internal tube has two rings, one to seal the laser gas mixture and the other to seal the coolant. A slot is located between these two rings, which leads to the atmosphere (the atmosphere layer). With this configuration, the degradation of the sealing properties of the internal ring caused by interaction with the atmospheric layer is not reflected in the purity of the laser gas mixture. Moreover, pollution of the mirrors caused by the penetration of the coolant into the cavity is eliminated.

  7. Dynamic tube/support interaction in heat exchanger tubes

    Energy Technology Data Exchange (ETDEWEB)

    Chen, S.S.

    1991-01-01

    The supports for heat exchanger tubes are usually plates with drilled holes; other types of supports have also been used. To facilitate manufacture and to allow for thermal expansion of the tubes, small clearances are used between tubes and tube supports. The dynamics of tube/support interaction in heat exchangers is fairly complicated. Understanding tube dynamics and its effects is important for heat exchangers. This paper summarizes the current state of the art on this subject and identifies future research needs. Specifically, the following topics are discussed: dynamics of loosely supported tubes, tube/support gap dynamics, tube response in flow, tube damage and wear, design considerations, and future research needs. 55 refs., 1 fig.

  8. Dynamic Camera Positioning and Reconfiguration for Multi-Camera Networks

    OpenAIRE

    Konda, Krishna Reddy

    2015-01-01

    The large availability of different types of cameras and lenses, together with the reduction in price of video sensors, has contributed to a widespread use of video surveillance systems, which have become a widely adopted tool to enforce security and safety, in detecting and preventing crimes and dangerous events. The possibility for personalization of such systems is generally very high, letting the user customize the sensing infrastructure, and deploying ad-hoc solutions based on the curren...

  9. Experimental study of the water jet induced by underwater electrical discharge in a narrow rectangular tube

    Science.gov (United States)

    Koita, T.; Zhu, Y.; Sun, M.

    2017-03-01

    This paper reports an experimental investigation on the effects of explosion depth and tube width on the water jet induced by an underwater electrical discharge in a narrow rectangular tube. The water jet formation and bubble structure were evaluated from the images recorded by a high-speed video camera. Two typical patterns of jet formation and four general patterns of bubble implosion were observed, depending on the explosion depth and tube width. The velocity of the water jet was calculated from the recorded images. The jet velocity was observed to depend on not only the explosion depth and energy, but also on the tube width. We proposed an empirical formula defining the water jet velocity in the tube as a function of the tube width and explosion depth and energy.
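    The jet velocity quoted above is obtained from the recorded frames; in essence it is the slope of the jet-front position versus time. A generic sketch of such a measurement is given below (not the authors' processing code); the front positions and frame rate are hypothetical.

        import numpy as np

        def jet_velocity(front_positions_mm, frame_rate_hz):
            # Fit a straight line to the tracked jet-front position over the
            # frame sequence and return the slope converted to m/s.
            t = np.arange(len(front_positions_mm)) / frame_rate_hz   # time [s]
            slope_mm_per_s, _ = np.polyfit(t, front_positions_mm, 1)
            return slope_mm_per_s / 1000.0

        # Hypothetical front positions read off five frames at 20 kHz (~24 m/s).
        print(jet_velocity([0.0, 1.2, 2.5, 3.6, 4.9], 20000))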

  10. Experimental study of the water jet induced by underwater electrical discharge in a narrow rectangular tube

    Science.gov (United States)

    Koita, T.; Zhu, Y.; Sun, M.

    2016-05-01

    This paper reports an experimental investigation on the effects of explosion depth and tube width on the water jet induced by an underwater electrical discharge in a narrow rectangular tube. The water jet formation and bubble structure were evaluated from the images recorded by a high-speed video camera. Two typical patterns of jet formation and four general patterns of bubble implosion were observed, depending on the explosion depth and tube width. The velocity of the water jet was calculated from the recorded images. The jet velocity was observed to depend on not only the explosion depth and energy, but also on the tube width. We proposed an empirical formula defining the water jet velocity in the tube as a function of the tube width and explosion depth and energy.

  11. FlashCam: a fully-digital camera for the medium-sized telescopes of the Cherenkov Telescope Array

    CERN Document Server

    Pühlhofer, G; Bernhard, S; Capasso, M; Diebold, S; Eisenkolb, F; Florin, D; Föhr, C; Funk, S; Gadola, A; Garrecht, F; Hermann, G; Jung, I; Kalekin, O; Kalkuhl, C; Kasperek, J; Kihm, T; Lahmann, R; Manalaysay, A; Marszalek, A; Pfeifer, M; Rajda, P J; Reimer, O; Santangelo, A; Schanz, T; Schwab, T; Steiner, S; Straumann, U; Tenzer, C; Vollhardt, A; Weitzel, Q; Werner, F; Wolf, D; Zietara, K

    2015-01-01

    The FlashCam group is currently preparing photomultiplier-tube based cameras proposed for the medium-sized telescopes (MST) of the Cherenkov Telescope Array (CTA). The cameras are designed around the FlashCam readout concept which is the first fully-digital readout system for Cherenkov cameras, based on commercial FADCs and FPGAs as key components for the front-end electronics modules and a high performance camera server as back-end. This contribution describes the progress of the full-scale FlashCam camera prototype currently under construction, as well as performance results also obtained with earlier demonstrator setups. Plans towards the production and implementation of FlashCams on site are also briefly presented.

  12. Architectural Design Document for Camera Models

    DEFF Research Database (Denmark)

    Thuesen, Gøsta

    1998-01-01

    Architecture of camera simulator models and data interface for the Maneuvering of Inspection/Servicing Vehicle (MIV) study.

  13. Calibration of cameras of the H.E.S.S. detector

    CERN Document Server

    Aharonian, F A; Aye, K M; Bazer-Bachi, A R; Beilicke, M; Benbow, W; Berge, D; Berghaus, P; Bernlöhr, K; Bolz, O; Boisson, C; Borgmeier, C; Breitling, F; Brown, A M; Chadwick, P M; Chitnis, V R; Chounet, L M; Cornils, R; Costamante, L; Degrange, B; De Jager, O C; Djannati-Atai, A; O'Connor-Drury, L; Ergin, T; Espigat, P; Feinstein, F; Fleury, P; Fontaine, G; Funk, S; Gallant, Y A; Giebels, B; Gillessen, S; Goret, P; Guy, J; Hadjichristidis, C; Hauser, M; Heinzelmann, G; Henri, G; Hermann, G; Hinton, J; Hofmann, W; Holleran, M; Horns, D; Jung, I; Khelifi, B; Komin, Nu; Konopelko, A; Latham, I J; Le Gallou, R; Lemoine, M; Lemiere, A; Leroy, N; Lohse, T; Marcowith, A; Masterson, C; McComb, T J L; De Naurois, Mathieu; Nolan, S J; Noutsos, A; Orford, K J; Osborne, J L; Ouchrif, M; Panter, M; Pelletier, G; Pita, S; Pohl, M; Pühlhofer, G; Punch, M; Raubenheimer, B C; Raue, M; Raux, J; Rayner, S M; Redondo, I; Reimer, A; Reimer, O; Ripken, J; Rivoal, M; Rob, L; Rolland, L; Rowell, G; Sahakian, V V; Sauge, L; Schlenker, S; Schlickeiser, R; Schuster, C; Schwanke, U; Siewert, M; Sol, H; Steenkamp, R; Stegmann, C; Tavernet, J P; Theoret, C G; Tluczykont, M; Van der Walt, D J; Vasileiadis, G; Vincent, P; Visser, B; Völk, H J; Wagner, S J

    2004-01-01

    H.E.S.S. - the High Energy Stereoscopic System - is a new system of large atmospheric Cherenkov telescopes for GeV/TeV astronomy. Each of the four telescopes of 107 m^2 mirror area is equipped with a 960-pixel photomultiplier-tube camera. This paper describes the methods used to convert the photomultiplier signals into the quantities needed for Cherenkov image analysis. Two independent calibration techniques have been applied in parallel to provide an estimation of uncertainties. Results on the long-term stability of the H.E.S.S. cameras are also presented.
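    The conversion from photomultiplier signals to image quantities typically amounts to a pedestal subtraction, a division by the single-photoelectron gain and a relative flat-field correction. The following sketch is a generic illustration of that chain with made-up numbers; it is not the H.E.S.S. calibration software interface.

        import numpy as np

        def adc_to_photoelectrons(adc, pedestal, gain_adc_per_pe, flat_field=1.0):
            # Subtract the electronic pedestal, convert ADC counts to
            # photoelectrons via the single-p.e. gain and apply a relative
            # flat-field coefficient for the pixel.
            return (np.asarray(adc, dtype=float) - pedestal) / gain_adc_per_pe * flat_field

        print(adc_to_photoelectrons([210, 250, 300], pedestal=200.0, gain_adc_per_pe=10.0))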

  14. A G-APD based Camera for Imaging Atmospheric Cherenkov Telescopes

    Energy Technology Data Exchange (ETDEWEB)

    Anderhub, H. [Eidgenoessische Technische Hochschule Zuerich, 8093 Zuerich (Switzerland); Backes, M. [Technische Universitaet Dortmund, 44221 Dortmund (Germany); Biland, A.; Boller, A.; Braun, I. [Eidgenoessische Technische Hochschule Zuerich, 8093 Zuerich (Switzerland); Bretz, T. [Ecole Polytechnique Federale de Lausanne, 1015 Lausanne (Switzerland); Commichau, S.; Commichau, V.; Dorner, D.; Gendotti, A. [Eidgenoessische Technische Hochschule Zuerich, 8093 Zuerich (Switzerland); Grimm, O., E-mail: oliver.grimm@phys.ethz.c [Eidgenoessische Technische Hochschule Zuerich, 8093 Zuerich (Switzerland); Gunten, H. von; Hildebrand, D.; Horisberger, U. [Eidgenoessische Technische Hochschule Zuerich, 8093 Zuerich (Switzerland); Koehne, J.-H. [Technische Universitaet Dortmund, 44221 Dortmund (Germany); Kraehenbuehl, T.; Kranich, D.; Lorenz, E.; Lustermann, W. [Eidgenoessische Technische Hochschule Zuerich, 8093 Zuerich (Switzerland); Mannheim, K. [Universitaet Wuerzburg, 97074 Wuerzburg (Germany)

    2011-02-01

    Imaging Atmospheric Cherenkov Telescopes (IACT) for Gamma-ray astronomy are presently using photomultiplier tubes as photo sensors. Geiger-mode avalanche photodiodes (G-APD) promise an improvement in sensitivity and, important for this application, ease of construction, operation and ruggedness. G-APDs have proven many of their features in the laboratory, but a qualified assessment of their performance in an IACT camera is best undertaken with a prototype. This paper describes the design and construction of a full-scale camera based on G-APDs realized within the FACT project (First G-APD Cherenkov Telescope).

  15. Oblique detonation waves stabilized in rectangular-cross-section bent tubes

    OpenAIRE

    2011-01-01

    Oblique detonation waves, which are generated by a fundamental detonation phenomenon occurring in bent tubes, may be applied to fuel combustion in high-efficiency engines such as a pulse detonation engine (PDE) and a rotating detonation engine (RDE). The present study has experimentally demonstrated that steady-state oblique detonation waves propagated stably through rectangular-cross-section bent tubes by visualizing these waves using a high-speed camera and the shadowgraph method. The obliq...

  16. Automated Placement of Multiple Stereo Cameras

    OpenAIRE

    Malik, Rahul; Bajcsy, Peter

    2008-01-01

    International audience; This paper presents a simulation framework for multiple stereo camera placement. Multiple stereo camera systems are becoming increasingly popular these days. Applications of multiple stereo camera systems such as tele-immersive systems enable cloning of dynamic scenes in real-time and delivering 3D information from multiple geographic locations to everyone for viewing it in virtual (immersive) 3D spaces. In order to make such multi stereo camera systems ubiquitous, sol...

  17. Mirrored Light Field Video Camera Adapter

    OpenAIRE

    Tsai, Dorian; Dansereau, Donald G.; Martin, Steve; Corke, Peter

    2016-01-01

    This paper proposes the design of a custom mirror-based light field camera adapter that is cheap, simple in construction, and accessible. Mirrors of different shape and orientation reflect the scene into an upwards-facing camera to create an array of virtual cameras with overlapping field of view at specified depths, and deliver video frame rate light fields. We describe the design, construction, decoding and calibration processes of our mirror-based light field camera adapter in preparation ...

  18. An optical metasurface planar camera

    CERN Document Server

    Arbabi, Amir; Kamali, Seyedeh Mahsa; Horie, Yu; Han, Seunghoon; Faraon, Andrei

    2016-01-01

    Optical metasurfaces are 2D arrays of nano-scatterers that modify optical wavefronts at subwavelength spatial resolution. They are poised to revolutionize optical design by enabling complex low cost systems where multiple metasurfaces are lithographically stacked on top of each other and are integrated with electronics. For imaging applications, metasurface stacks can perform sophisticated image corrections and can be directly integrated with image sensors. Here, we demonstrate this concept with a miniature flat camera integrating a monolithic metasurface lens doublet corrected for monochromatic aberrations, and an image sensor. The doublet lens, which acts as a fisheye photographic objective, has an f-number of 0.9, an angle-of-view larger than 60° × 60°, and operates at 850 nm wavelength with large transmission. The camera exhibits high image quality, which indicates the potential of this technology to produce a paradigm shift in future designs of imaging systems for microscopy, photograp...

  19. Combustion pinhole-camera system

    Science.gov (United States)

    Witte, A.B.

    1982-05-19

    A pinhole camera system is described utilizing a sealed optical-purge assembly which provides optical access into a coal combustor or other energy conversion reactors. The camera system basically consists of a focused-purge pinhole optical port assembly, a conventional TV vidicon receiver, an external, variable density light filter which is coupled electronically to the vidicon automatic gain control (agc). The key component of this system is the focused-purge pinhole optical port assembly which utilizes a purging inert gas to keep debris from entering the port and a lens arrangement which transfers the pinhole to the outside of the port assembly. One additional feature of the port assembly is that it is not flush with the interior of the combustor.

  20. HiRISE: The People's Camera

    Science.gov (United States)

    McEwen, A. S.; Eliason, E.; Gulick, V. C.; Spinoza, Y.; Beyer, R. A.; HiRISE Team

    2010-12-01

    The High Resolution Imaging Science Experiment (HiRISE) camera, orbiting Mars since 2006 on the Mars Reconnaissance Orbiter (MRO), has returned more than 17,000 large images with scales as small as 25 cm/pixel. From its beginning, the HiRISE team has followed “The People’s Camera” concept, with rapid release of useful images, explanations, and tools, and facilitating public image suggestions. The camera includes 14 CCDs, each read out into 2 data channels, so compressed images are returned from MRO as 28 long (up to 120,000 line) images that are 1024 pixels wide (or binned 2x2 to 512 pixels, etc.). This raw data is very difficult to use, especially for the public. At the HiRISE operations center the raw data are calibrated and processed into a series of B&W and color products, including browse images and JPEG2000-compressed images and tools to make it easy for everyone to explore these enormous images (see http://hirise.lpl.arizona.edu/). Automated pipelines do all of this processing, so we can keep up with the high data rate; images go directly to the format of the Planetary Data System (PDS). After students visually check each image product for errors, they are fully released just 1 month after receipt; captioned images (written by science team members) may be released sooner. These processed HiRISE images have been incorporated into tools such as Google Mars and World Wide Telescope for even greater accessibility. 51 Digital Terrain Models derived from HiRISE stereo pairs have been released, resulting in some spectacular flyover movies produced by members of the public and viewed up to 50,000 times according to YouTube. Public targeting began in 2007 via NASA Quest (http://marsoweb.nas.nasa.gov/HiRISE/quest/) and more than 200 images have been acquired, mostly by students and educators. At the beginning of 2010 we released HiWish (http://www.uahirise.org/hiwish/), opening HiRISE targeting to anyone in the world with Internet access, and already more

  1. SPEIR: A Ge Compton Camera

    Energy Technology Data Exchange (ETDEWEB)

    Mihailescu, L; Vetter, K M; Burks, M T; Hull, E L; Craig, W W

    2004-02-11

    The SPEctroscopic Imager for {gamma}-Rays (SPEIR) is a new concept of a compact {gamma}-ray imaging system of high efficiency and spectroscopic resolution with a 4-{pi} field-of-view. The system behind this concept employs double-sided segmented planar Ge detectors accompanied by the use of list-mode photon reconstruction methods to create a sensitive, compact Compton scatter camera.

  2. Graphic design of pinhole cameras

    Science.gov (United States)

    Edwards, H. B.; Chu, W. P.

    1979-01-01

    The paper describes a graphic technique for the analysis and optimization of pinhole size and focal length. The technique is based on the use of the transfer function of optical elements described by Scott (1959) to construct the transfer function of a circular pinhole camera. This transfer function is the response of a component or system to a pattern of lines having a sinusoidally varying radiance at varying spatial frequencies. Some specific examples of graphic design are presented.
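    A minimal numerical counterpart of that transfer-function analysis is the modulation transfer function (MTF) of the pinhole's geometric blur, which for a uniform circular blur disc of diameter D is 2*J1(pi*D*f)/(pi*D*f). The sketch below evaluates this term only; the full pinhole response would roughly be its product with a diffraction term, and the chosen diameter is an assumption.

        import numpy as np
        from scipy.special import j1

        def pinhole_geometric_mtf(freq_cyc_per_mm, blur_diameter_mm):
            # MTF of a uniform circular blur disc: 2*J1(pi*D*f) / (pi*D*f).
            x = np.pi * blur_diameter_mm * np.asarray(freq_cyc_per_mm, dtype=float)
            out = np.ones_like(x)
            nz = x != 0
            out[nz] = 2.0 * j1(x[nz]) / x[nz]
            return np.abs(out)

        f = np.array([0.0, 1.0, 2.0, 5.0])   # spatial frequency [cycles/mm]
        print(pinhole_geometric_mtf(f, blur_diameter_mm=0.3))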

  3. Image Based Camera Localization: an Overview

    OpenAIRE

    Wu, Yihong

    2016-01-01

    Recently, virtual reality, augmented reality, robotics, self-driving cars, and related fields have attracted much attention from the industrial community, and image-based camera localization is a key task in them. An overview of image-based camera localization is therefore needed. In this paper, such an overview is presented. It will be useful not only to researchers but also to engineers.

  4. 21 CFR 886.1120 - Opthalmic camera.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Opthalmic camera. 886.1120 Section 886.1120 Food... DEVICES OPHTHALMIC DEVICES Diagnostic Devices § 886.1120 Opthalmic camera. (a) Identification. An ophthalmic camera is an AC-powered device intended to take photographs of the eye and the surrounding...

  5. 21 CFR 892.1110 - Positron camera.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Positron camera. 892.1110 Section 892.1110 Food... DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.1110 Positron camera. (a) Identification. A positron camera is a device intended to image the distribution of positron-emitting radionuclides in the...

  6. 16 CFR 501.1 - Camera film.

    Science.gov (United States)

    2010-01-01

    ... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Camera film. 501.1 Section 501.1 Commercial Practices FEDERAL TRADE COMMISSION RULES, REGULATIONS, STATEMENT OF GENERAL POLICY OR INTERPRETATION AND... 500 § 501.1 Camera film. Camera film packaged and labeled for retail sale is exempt from the...

  7. Coaxial fundus camera for ophthalmology

    Science.gov (United States)

    de Matos, Luciana; Castro, Guilherme; Castro Neto, Jarbas C.

    2015-09-01

    A fundus camera for ophthalmology is a high-definition device which needs to provide low-light illumination of the human retina, high resolution in the retina, and a reflection-free image. Those constraints make its optical design very sophisticated, but the most difficult to comply with are the reflection-free illumination and the final alignment, due to the high number of non-coaxial optical components in the system. Reflection of the illumination, both in the objective and at the cornea, masks image quality, and a poor alignment makes the sophisticated optical design useless. In this work we developed a totally axial optical system for a non-mydriatic fundus camera. The illumination is performed by a LED ring, coaxial with the optical system and composed of IR or visible LEDs. The illumination ring is projected by the objective lens onto the cornea. The objective, LED illuminator, and CCD lens are coaxial, making the final alignment easy to perform. The CCD + capture lens module is a CCTV camera with autofocus and zoom built in, added to a 175 mm focal length doublet corrected for infinity, making the system easy to operate and very compact.

  8. Unassisted 3D camera calibration

    Science.gov (United States)

    Atanassov, Kalin; Ramachandra, Vikas; Nash, James; Goma, Sergio R.

    2012-03-01

    With the rapid growth of 3D technology, 3D image capture has become a critical part of the 3D feature set on mobile phones. 3D image quality is affected by the scene geometry as well as on-device processing. An automatic 3D system usually assumes known camera poses accomplished by factory calibration using a special chart. In real life settings, pose parameters estimated by factory calibration can be negatively impacted by movements of the lens barrel due to shaking, focusing, or camera drop. If any of these factors displaces the optical axes of either or both cameras, vertical disparity might exceed the maximum tolerable margin and the 3D user may experience eye strain or headaches. To make 3D capture more practical, one needs to consider unassisted (on arbitrary scenes) calibration. In this paper, we propose an algorithm that relies on detection and matching of keypoints between left and right images. Frames containing erroneous matches, along with frames with insufficiently rich keypoint constellations, are detected and discarded. Roll, pitch, yaw, and scale differences between left and right frames are then estimated. The algorithm performance is evaluated in terms of the remaining vertical disparity as compared to the maximum tolerable vertical disparity.
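    The core screening step, estimating the remaining vertical disparity from matched keypoints and flagging frames that exceed a tolerable margin, can be sketched as follows. This is a simplified stand-in for the algorithm in the paper; the coordinates and the tolerance threshold are assumptions.

        import numpy as np

        def vertical_disparity_check(kp_left, kp_right, max_tolerable_px=5.0):
            # kp_left/kp_right: N x 2 arrays of matched (x, y) pixel coordinates.
            # Return the median vertical disparity and whether it exceeds the
            # tolerable margin (which would call for recalibration).
            dy = np.asarray(kp_right, dtype=float)[:, 1] - np.asarray(kp_left, dtype=float)[:, 1]
            median_dy = float(np.median(dy))
            return median_dy, abs(median_dy) > max_tolerable_px

        left = np.array([[100, 200], [300, 410], [520, 640]])
        right = np.array([[112, 207], [311, 418], [533, 646]])
        print(vertical_disparity_check(left, right))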

  9. Effect of tube size on electromagnetic tube bulging

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    The commercial finite element code ANSYS was employed for the simulation of the electromagnetic tube bulging process. The finite element model and boundary conditions are thoroughly discussed. ANSYS/EMAG was used to model the time-varying electromagnetic field in order to obtain the radial and axial magnetic pressure acting on the tube. The magnetic pressure was then used as a boundary condition to model the high-velocity deformation of tubes of various lengths with ANSYS/LS-DYNA. The time-space distribution of magnetic pressure on tubes of various lengths is presented. The effects of tube size on the distribution of radial and axial magnetic pressure and on the high-velocity deformation are discussed. According to the ratio of radial magnetic pressure at the tube end to that at the tube center, and the corresponding dimensionless length ratio of tube to coil, free electromagnetic tube bulging is studied and classified. The calculated results show good agreement with practice.

  10. Categorising YouTube

    Directory of Open Access Journals (Sweden)

    Thomas Mosebo Simonsen

    2011-09-01

    Full Text Available This article provides a genre analytical approach to creating a typology of the User Generated Content (UGC) of YouTube. The article investigates the construction of navigation processes on the YouTube website. It suggests a pragmatic genre approach that is expanded through a focus on YouTube's technological affordances. Through an analysis of the different pragmatic contexts of YouTube, it is argued that a taxonomic understanding of YouTube must be analysed in regard to the vacillation between a user-driven bottom-up folksonomy and a hierarchical browsing system that emphasises a culture of competition and favours the already popular content of YouTube. With this taxonomic approach, the UGC videos are registered and analysed in terms of empirically based observations. The article identifies various UGC categories and their principal characteristics. Furthermore, general tendencies of the UGC within the interacting relationship of new and old genres are discussed. It is argued that a conventional categorical system is primarily of analytical and theoretical interest rather than of practical use.

  11. Spectrometry with consumer-quality CMOS cameras.

    Science.gov (United States)

    Scheeline, Alexander

    2015-01-01

    Many modern spectrometric instruments use diode arrays, charge-coupled arrays, or CMOS cameras for detection and measurement. As portable or point-of-use instruments are desirable, one would expect that instruments using the cameras in cellular telephones and tablet computers would be the basis of numerous instruments. However, no mass market for such devices has yet developed. The difficulties in using megapixel CMOS cameras for scientific measurements are discussed, and promising avenues for instrument development reviewed. Inexpensive alternatives to use of the built-in camera are also mentioned, as the long-term question is whether it is better to overcome the constraints of CMOS cameras or to bypass them.

  12. Single Camera Calibration in 3D Vision

    Directory of Open Access Journals (Sweden)

    Caius SULIMAN

    2009-12-01

    Full Text Available Camera calibration is a necessary step in 3D vision in order to extract metric information from 2D images. A camera is considered to be calibrated when the parameters of the camera are known (i.e., principal distance, lens distortion, focal length, etc.). In this paper we deal with a single camera calibration method, and with the help of this method we try to find the intrinsic and extrinsic camera parameters. The method was implemented with success in the programming and simulation environment Matlab.
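    As a reminder of what the intrinsic and extrinsic parameters mentioned above do, the sketch below projects a 3D point with an ideal pinhole model, x ~ K [R | t] X, ignoring lens distortion. The numeric values are illustrative only and unrelated to the paper's Matlab implementation.

        import numpy as np

        def project_points(points_3d, K, R, t):
            # Pinhole projection: world -> camera frame via (R, t), then through
            # the intrinsic matrix K, followed by perspective division.
            X = np.asarray(points_3d, dtype=float)
            cam = (R @ X.T + t.reshape(3, 1)).T
            uvw = (K @ cam.T).T
            return uvw[:, :2] / uvw[:, 2:3]

        K = np.array([[800.0, 0.0, 320.0],    # focal length and principal point [px]
                      [0.0, 800.0, 240.0],
                      [0.0, 0.0, 1.0]])
        R, t = np.eye(3), np.array([0.0, 0.0, 2.0])
        print(project_points([[0.1, -0.05, 1.0]], K, R, t))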

  13. Automatic inference of geometric camera parameters and intercamera topology in uncalibrated disjoint surveillance cameras

    OpenAIRE

    Hollander, R.J.M. den; Bouma, H.; Baan, J.; Eendebak, P. T.; Rest, J.H.C. van

    2015-01-01

    Person tracking across non-overlapping cameras and other types of video analytics benefit from spatial calibration information that allows an estimation of the distance between cameras and a relation between pixel coordinates and world coordinates within a camera. In a large environment with many cameras, or for frequent ad-hoc deployments of cameras, the cost of this calibration is high. This creates a barrier for the use of video analytics. Automating the calibration allows for a short conf...

  14. Characterization of the Series 1000 Camera System

    Energy Technology Data Exchange (ETDEWEB)

    Kimbrough, J; Moody, J; Bell, P; Landen, O

    2004-04-07

    The National Ignition Facility requires a compact network addressable scientific grade CCD camera for use in diagnostics ranging from streak cameras to gated x-ray imaging cameras. Due to the limited space inside the diagnostic, an analog and digital input/output option in the camera controller permits control of both the camera and the diagnostic by a single Ethernet link. The system consists of a Spectral Instruments Series 1000 camera, a PC104+ controller, and power supply. The 4k by 4k CCD camera has a dynamic range of 70 dB with less than 14 electron read noise at a 1MHz readout rate. The PC104+ controller includes 16 analog inputs, 4 analog outputs and 16 digital input/output lines for interfacing to diagnostic instrumentation. A description of the system and performance characterization is reported.

  15. Automatic calibration method for plenoptic camera

    Science.gov (United States)

    Luan, Yinsen; He, Xing; Xu, Bing; Yang, Ping; Tang, Guomao

    2016-04-01

    An automatic calibration method is proposed for a microlens-based plenoptic camera. First, all microlens images on the white image are searched and recognized automatically based on digital morphology. Then, the center points of microlens images are rearranged according to their relative position relationships. Consequently, the microlens images are located, i.e., the plenoptic camera is calibrated without the prior knowledge of camera parameters. Furthermore, this method is appropriate for all types of microlens-based plenoptic cameras, even the multifocus plenoptic camera, the plenoptic camera with arbitrarily arranged microlenses, or the plenoptic camera with different sizes of microlenses. Finally, we verify our method by the raw data of Lytro. The experiments show that our method has higher intelligence than the methods published before.
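    A bare-bones software analogue of locating microlens image centers on a white (flat-field) image is to threshold the bright spots and take the intensity centroid of each connected component. The sketch below does exactly that with scipy.ndimage; it is a simplified stand-in for the morphology-based search described in the abstract, and the threshold is an assumption.

        import numpy as np
        from scipy import ndimage

        def find_microlens_centers(white_image, rel_threshold=0.5):
            # Threshold relative to the image maximum, label connected bright
            # regions and return the intensity-weighted centroid of each one.
            img = np.asarray(white_image, dtype=float)
            labels, n = ndimage.label(img > rel_threshold * img.max())
            return ndimage.center_of_mass(img, labels, range(1, n + 1))

        # Toy white image with two bright spots standing in for microlens images.
        w = np.zeros((40, 40))
        w[10:14, 10:14] = 1.0
        w[25:29, 30:34] = 1.0
        print(find_microlens_centers(w))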

  16. Local Heat Transfer for Finned-Tube Heat Exchangers using Oval Tubes

    Energy Technology Data Exchange (ETDEWEB)

    O' Brien, James Edward; Sohal, Manohar Singh

    2000-08-01

    This paper presents the results of an experimental study of forced convection heat transfer in a narrow rectangular duct fitted with either a circular tube or an elliptical tube in crossflow. The duct was designed to simulate a single passage in a fin-tube heat exchanger. Heat transfer measurements were obtained using a transient technique in which a heated airflow is suddenly introduced to the test section. High-resolution local fin-surface temperature distributions were obtained at several times after initiation of the transient using an imaging infrared camera. Corresponding local fin-surface heat transfer coefficient distributions were then calculated from a locally applied one-dimensional semi-infinite inverse heat conduction model. Heat transfer results were obtained over an airflow rate ranging from 1.56 x 10^-3 to 15.6 x 10^-3 kg/s. These flow rates correspond to a duct-height Reynolds number range of 630 – 6300 with a duct height of 1.106 cm and a duct width-to-height ratio, W/H, of 11.25. The test cylinder was sized such that the diameter-to-duct height ratio, D/H, is 5. The elliptical tube had an aspect ratio of 3:1 and a/H equal to 4.33. Results presented in this paper reveal visual and quantitative details of local fin-surface heat transfer distributions in the vicinity of circular and oval tubes and their relationship to the complex horseshoe vortex system that forms in the flow stagnation region. Fin-surface stagnation-region Nusselt numbers are shown to be proportional to the square root of the Reynolds number.
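    The duct-height Reynolds number quoted above follows from the mass flow rate and duct geometry, and the paper reports the stagnation-region Nusselt number scaling with its square root. The sketch below reproduces that arithmetic; the air viscosity and the proportionality constant are assumptions, so the numbers are only indicative.

        # Re_H = (m_dot / A) * H / mu for a rectangular duct of width W and height H,
        # and the square-root scaling Nu ~ C * sqrt(Re_H) reported for the fin
        # stagnation region. C and mu are placeholders, not values from the paper.
        def duct_reynolds(mass_flow_kg_s, duct_width_m, duct_height_m, mu=1.85e-5):
            area = duct_width_m * duct_height_m
            mass_flux = mass_flow_kg_s / area          # rho * U  [kg/(m^2 s)]
            return mass_flux * duct_height_m / mu

        def stagnation_nusselt(re_h, c=1.0):
            return c * re_h ** 0.5

        # Duct height 1.106 cm and W/H = 11.25, as in the abstract.
        re = duct_reynolds(1.56e-3, duct_width_m=0.1244, duct_height_m=0.01106)
        print(round(re), round(stagnation_nusselt(re), 1))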

  17. Radiometric calibration for MWIR cameras

    Science.gov (United States)

    Yang, Hyunjin; Chun, Joohwan; Seo, Doo Chun; Yang, Jiyeon

    2012-06-01

    Korean Multi-purpose Satellite-3A (KOMPSAT-3A), which weighs about 1,000 kg, is scheduled to be launched in 2013 and will be located at a sun-synchronous orbit (SSO) of 530 km in altitude. This is Korea's first satellite to orbit with a mid-wave infrared (MWIR) image sensor, which is currently being developed at Korea Aerospace Research Institute (KARI). The missions envisioned include forest fire surveillance, measurement of the ocean surface temperature, national defense and crop harvest estimates. In this paper, we explain the MWIR scene generation software and atmospheric compensation techniques for the infrared (IR) camera that we are currently developing. The MWIR scene generation software we have developed takes into account sky thermal emission, path emission, target emission, sky solar scattering and ground reflection based on MODTRAN data. This software will be used for generating the radiation image at the satellite camera, which requires an atmospheric compensation algorithm and validation of the accuracy of the temperature obtained in our result. The image visibility restoration algorithm is a method for removing the effect of the atmosphere between the camera and an object. This algorithm works between the satellite and the Earth, to predict the object temperature contaminated by the Earth's atmosphere and solar radiation. Commonly, to compensate for the atmospheric effect, software such as MODTRAN is used for modeling the atmosphere. Our algorithm does not require additional software to obtain the surface temperature. However, it needs adjustment of the visibility restoration parameters, and the precision of the result still needs to be studied.

  18. The Flutter Shutter Camera Simulator

    Directory of Open Access Journals (Sweden)

    Yohann Tendero

    2012-10-01

    Full Text Available The proposed method simulates an embedded flutter shutter camera implemented either analogically or numerically, and computes its performance. The goal of the flutter shutter is to make motion blur invertible, by a "fluttering" shutter that opens and closes on a well chosen sequence of time intervals. In the simulations the motion is assumed uniform, and the user can choose its velocity. Several types of flutter shutter codes are tested and evaluated: the original ones considered by the inventors, the classic motion blur, and finally several analog or numerical optimal codes proposed recently. In all cases the exact SNR of the deconvolved result is also computed.
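    The reason a fluttered exposure makes motion blur invertible can be seen directly in the frequency domain: an ordinary box exposure has near-zeros in its spectrum, while a well-chosen on/off code does not. The sketch below builds a blur kernel from a binary shutter code under the uniform-motion assumption and reports the worst-case spectral attenuation; the code sequence used here is hypothetical, not one of the published optimal codes.

        import numpy as np

        def flutter_kernel(code, samples_per_slot=4):
            # Blur kernel for uniform motion of one code slot per exposure slot,
            # normalised to unit sum.
            k = np.repeat(np.asarray(code, dtype=float), samples_per_slot)
            return k / k.sum()

        def worst_case_attenuation(kernel, n_freq=256):
            # Smallest DFT magnitude of the kernel: the larger it is, the better
            # conditioned the deconvolution.
            return np.abs(np.fft.fft(kernel, n_freq)).min()

        box = flutter_kernel([1] * 13)                                   # ordinary shutter
        code = flutter_kernel([1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 1])   # hypothetical code
        print(worst_case_attenuation(box), worst_case_attenuation(code))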

  19. Cryogenic mechanism for ISO camera

    Science.gov (United States)

    Luciano, G.

    1987-12-01

    The Infrared Space Observatory (ISO) camera configuration, architecture, materials, tribology, motorization, and development status are outlined. The operating temperature is 2 to 3 K, at 2.5 to 18 microns. Selected material is a titanium alloy, with MoS2/TiC lubrication. A stepping motor drives the ball-bearing mounted wheels to which the optical elements are fixed. Model test results are satisfactory, and also confirm the validity of the test facilities, particularly for vibration tests at 4K.

  20. Heat-shrink plastic tubing seals joints in glass tubing

    Science.gov (United States)

    Del Duca, B.; Downey, A.

    1968-01-01

    Small units of standard glass apparatus held together by short lengths of transparent heat-shrinkable polyolefin tubing. The tubing is shrunk over glass O-ring type connectors having O-rings but no lubricant.

  1. Thermal anomaly at the Earth's surface associated with a lava tube

    Science.gov (United States)

    Piombo, Antonello; Di Bari, Marco; Tallarico, Andrea; Dragoni, Michele

    2016-10-01

    Lava tubes are frequently encountered in volcanic areas. The formation of lava tubes has strong implications for volcanic hazard during effusive eruptions. The thermal dissipation of lava flowing in a tube is reduced with respect to lava flowing in an open channel, so the lava may threaten areas that would not be reached by flows in open channels: for this reason it is important to detect the presence of lava tubes. In this work we propose a model to detect the presence and the characteristics of lava tubes from their thermal footprint at the surface. We model numerically the temperature distribution and the heat flow, both in the steady and the transient state, and we take into account the principal thermal effects due to the presence of an active lava tube, i.e. conduction to the ground and the atmosphere, and convection and radiation in the atmosphere. We assume that the lava fluid is at high temperature, in motion inside a sloping tube under the force of gravity. The thermal profile across the tube direction, in particular the width of the temperature curve, allows evaluation of the depth of the tube. The values of the maximum temperature and of the tube depth allow estimation of the area of the tube section. The shape of the temperature curve and its asymmetry can give information about the geometry of the tube. If we observe volcanic areas at different times with thermal cameras, we can detect anomalies and evaluate their causes during an eruption; in particular, we can evaluate whether or not they are due to active lava flows and what their state is. For lava tubes, we can connect thermal anomalies with lava tube position, characteristics and state.

  2. Cladding tube manufacturing technology

    Energy Technology Data Exchange (ETDEWEB)

    Hahn, R. [Kraftwerk Union AG, Mulheim (Germany); Jeong, Y.H.; Baek, B.J.; Kim, K.H.; Kim, S.J.; Choi, B.K.; Kim, J.M. [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1999-04-01

    This report gives an overview of the manufacturing routine of PWR cladding tubes. The routine essentially consists of a series of deformation and annealing processes which are necessary to transform the ingot geometry to tube dimensions. By changing shape, microstructure and structure-related properties are altered simultaneously. First, a short overview is given of the basics of that part of deformation geometry which is related to tube reducing operations. Then those processes of the manufacturing routine which change the microstructure are depicted, and the influence of certain process parameters on microstructure and material properties is shown. The influence of the resulting microstructure on material properties is not discussed in detail, since it is described in my previous report 'Alloy Development for High Burnup Cladding.' Because of their continuing paramount importance, and because manufacturing data and their influence on properties for other alloys are not so well established or published, the descriptions mostly relate to Zry4 tube manufacturing, and are only brief for other alloys. (author). 9 refs., 46 figs.

  3. Downhole pulse tube refrigerators

    Energy Technology Data Exchange (ETDEWEB)

    Swift, G.; Gardner, D. [Los Alamos National Lab., NM (United States). Condensed Matter and Thermal Physics Group

    1997-12-01

    This report summarizes a preliminary design study to explore the plausibility of using pulse tube refrigeration to cool instruments in a hot down-hole environment. The original motivation was to maintain Dave Reagor's high-temperature superconducting electronics at 75 K, but the study has evolved to include three target design criteria: cooling at 30 C in a 300 C environment, cooling at 75 K in a 50 C environment, and cooling at both 75 K and 30 C in a 250 C environment. These specific temperatures were chosen arbitrarily, as representative of what is possible. The primary goals are low cost, reliability, and small package diameter. Pulse-tube refrigeration is a rapidly growing sub-field of cryogenic refrigeration. The pulse tube refrigerator has recently become the simplest, cheapest, most rugged and reliable low-power cryocooler. The authors expect this technology to be applicable downhole because the ratio of hot to cold temperatures (in absolute units, such as kelvin) of interest in deep drilling is comparable to the ratios routinely achieved with cryogenic pulse-tube refrigerators.

  4. Prawns in Bamboo Tube

    Institute of Scientific and Technical Information of China (English)

    1998-01-01

    Ingredients: 400 grams Jiwei prawns, 25 grams pork shreds, 5 grams sliced garlic. Condiments: 5 grams cooking oil, minced ginger root and scallions, cooking wine, salt, pepper and MSG (optional) Method: 1. Place the shelled prawns into a bowl and mix with all the condiments. 2. Stuff the prawns into a fresh bamboo tube,

  5. Light field panorama by a plenoptic camera

    Science.gov (United States)

    Xue, Zhou; Baboulaz, Loic; Prandoni, Paolo; Vetterli, Martin

    2013-03-01

    The consumer-grade plenoptic camera Lytro draws a lot of interest from both the academic and industrial worlds. However, its low resolution in both the spatial and angular domains prevents it from being used for fine and detailed light field acquisition. This paper proposes to use a plenoptic camera as an image scanner and perform light field stitching to increase the size of the acquired light field data. We consider a simplified plenoptic camera model comprising a pinhole camera moving behind a thin lens. Based on this model, we describe how to perform light field acquisition and stitching under two different scenarios: by camera translation or by camera translation and rotation. In both cases, we assume the camera motion to be known. In the case of camera translation, we show how the acquired light fields should be resampled to increase the spatial range and ultimately obtain a wider field of view. In the case of camera translation and rotation, the camera motion is calculated such that the light fields can be directly stitched and extended in the angular domain. Simulation results verify our approach and demonstrate the potential of the motion model for further light field applications such as registration and super-resolution.

  6. Computational cameras: convergence of optics and processing.

    Science.gov (United States)

    Zhou, Changyin; Nayar, Shree K

    2011-12-01

    A computational camera uses a combination of optics and processing to produce images that cannot be captured with traditional cameras. In the last decade, computational imaging has emerged as a vibrant field of research. A wide variety of computational cameras has been demonstrated to encode more useful visual information in the captured images, as compared with conventional cameras. In this paper, we survey computational cameras from two perspectives. First, we present a taxonomy of computational camera designs according to the coding approaches, including object side coding, pupil plane coding, sensor side coding, illumination coding, camera arrays and clusters, and unconventional imaging systems. Second, we use the abstract notion of light field representation as a general tool to describe computational camera designs, where each camera can be formulated as a projection of a high-dimensional light field to a 2-D image sensor. We show how individual optical devices transform light fields and use these transforms to illustrate how different computational camera designs (collections of optical devices) capture and encode useful visual information.

  7. A Unifying Theory for Camera Calibration.

    Science.gov (United States)

    Ramalingam, SriKumar; Sturm, Peter

    2016-07-19

    This paper proposes a unified theory for calibrating a wide variety of camera models such as pinhole, fisheye, catadioptric, and multi-camera networks. We model any camera as a set of image pixels and their associated camera rays in space. Every pixel measures the light traveling along a (half-) ray in 3-space associated with that pixel. By this definition, calibration simply refers to the computation of the mapping between pixels and the associated 3D rays. Such a mapping can be computed using images of calibration grids, which are objects with known 3D geometry, taken from unknown positions. This general camera model allows the representation of non-central cameras; we also consider two special subclasses, namely central and axial cameras. In a central camera, all rays intersect in a single point, whereas the rays are completely arbitrary in a non-central one. Axial cameras are an intermediate case: the camera rays intersect a single line. In this work, we show the theory for calibrating central, axial and non-central models using calibration grids, which can be either three-dimensional or planar.
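    In this generic view, a calibration is literally a table from pixels to rays. The sketch below builds such a table for the special case of a central pinhole camera, where every ray passes through one projection centre; a non-central model would simply store a different origin per pixel. The intrinsic matrix used is a made-up example.

        import numpy as np

        def central_camera_rays(width, height, K):
            # Return, for every pixel, the (origin, unit direction) of its ray in
            # camera coordinates; all origins coincide for a central camera.
            Kinv = np.linalg.inv(K)
            rays = {}
            for v in range(height):
                for u in range(width):
                    d = Kinv @ np.array([u + 0.5, v + 0.5, 1.0])
                    rays[(u, v)] = (np.zeros(3), d / np.linalg.norm(d))
            return rays

        K = np.array([[500.0, 0.0, 2.0],
                      [0.0, 500.0, 1.5],
                      [0.0, 0.0, 1.0]])
        print(central_camera_rays(4, 3, K)[(0, 0)])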

  8. Optimising camera traps for monitoring small mammals.

    Directory of Open Access Journals (Sweden)

    Alistair S Glen

    Full Text Available Practical techniques are required to monitor invasive animals, which are often cryptic and occur at low density. Camera traps have potential for this purpose, but may have problems detecting and identifying small species. A further challenge is how to standardise the size of each camera's field of view so capture rates are comparable between different places and times. We investigated the optimal specifications for a low-cost camera trap for small mammals. The factors tested were 1) trigger speed, 2) passive infrared vs. microwave sensor, 3) white vs. infrared flash, and 4) still photographs vs. video. We also tested a new approach to standardise each camera's field of view. We compared the success rates of four camera trap designs in detecting and taking recognisable photographs of captive stoats (Mustela erminea), feral cats (Felis catus) and hedgehogs (Erinaceus europaeus). Trigger speeds of 0.2-2.1 s captured photographs of all three target species unless the animal was running at high speed. The camera with a microwave sensor was prone to false triggers, and often failed to trigger when an animal moved in front of it. A white flash produced photographs that were more readily identified to species than those obtained under infrared light. However, a white flash may be more likely to frighten target animals, potentially affecting detection probabilities. Video footage achieved similar success rates to still cameras but required more processing time and computer memory. Placing two camera traps side by side achieved a higher success rate than using a single camera. Camera traps show considerable promise for monitoring invasive mammal control operations. Further research should address how best to standardise the size of each camera's field of view, maximise the probability that an animal encountering a camera trap will be detected, and eliminate visible or audible cues emitted by camera traps.

  9. Optimising camera traps for monitoring small mammals.

    Science.gov (United States)

    Glen, Alistair S; Cockburn, Stuart; Nichols, Margaret; Ekanayake, Jagath; Warburton, Bruce

    2013-01-01

    Practical techniques are required to monitor invasive animals, which are often cryptic and occur at low density. Camera traps have potential for this purpose, but may have problems detecting and identifying small species. A further challenge is how to standardise the size of each camera's field of view so capture rates are comparable between different places and times. We investigated the optimal specifications for a low-cost camera trap for small mammals. The factors tested were 1) trigger speed, 2) passive infrared vs. microwave sensor, 3) white vs. infrared flash, and 4) still photographs vs. video. We also tested a new approach to standardise each camera's field of view. We compared the success rates of four camera trap designs in detecting and taking recognisable photographs of captive stoats (Mustela erminea), feral cats (Felis catus) and hedgehogs (Erinaceus europaeus). Trigger speeds of 0.2-2.1 s captured photographs of all three target species unless the animal was running at high speed. The camera with a microwave sensor was prone to false triggers, and often failed to trigger when an animal moved in front of it. A white flash produced photographs that were more readily identified to species than those obtained under infrared light. However, a white flash may be more likely to frighten target animals, potentially affecting detection probabilities. Video footage achieved similar success rates to still cameras but required more processing time and computer memory. Placing two camera traps side by side achieved a higher success rate than using a single camera. Camera traps show considerable promise for monitoring invasive mammal control operations. Further research should address how best to standardise the size of each camera's field of view, maximise the probability that an animal encountering a camera trap will be detected, and eliminate visible or audible cues emitted by camera traps.

  10. Restore condition of Incore thimble tubes in guide tubes

    Energy Technology Data Exchange (ETDEWEB)

    Solanas, A.; Izquierdo, J.

    2014-07-01

    Aging of a nuclear power plant and the succession of outages lead to wear and twisting of the thimble tubes, but also to fouling of the incore guide tubes. These effects can create friction, so a high force must be applied to withdraw the thimble tubes. (Author)

  11. Eustachian tube function in children after insertion of ventilation tubes.

    NARCIS (Netherlands)

    Heerbeek, N. van; Ingels, K.J.A.O.; Snik, A.F.M.; Zielhuis, G.A.

    2001-01-01

    This study was performed to assess the effect of the insertion of ventilation tubes and the subsequent aeration of the middle ear on eustachian tube (ET) function in children. Manometric ET function tests were performed repeatedly for 3 months after the placement of ventilation tubes in 83 children

  12. The Zwicky Transient Facility Camera

    Science.gov (United States)

    Dekany, Richard; Smith, Roger M.; Belicki, Justin; Delacroix, Alexandre; Duggan, Gina; Feeney, Michael; Hale, David; Kaye, Stephen; Milburn, Jennifer; Murphy, Patrick; Porter, Michael; Reiley, Daniel J.; Riddle, Reed L.; Rodriguez, Hector; Bellm, Eric C.

    2016-08-01

    The Zwicky Transient Facility Camera (ZTFC) is a key element of the ZTF Observing System, the integrated system of optoelectromechanical instrumentation tasked to acquire the wide-field, high-cadence time-domain astronomical data at the heart of the Zwicky Transient Facility. The ZTFC consists of a compact cryostat with large vacuum window protecting a mosaic of 16 large, wafer-scale science CCDs and 4 smaller guide/focus CCDs, a sophisticated vacuum interface board which carries data as electrical signals out of the cryostat, an electromechanical window frame for securing externally inserted optical filter selections, and associated cryo-thermal/vacuum system support elements. The ZTFC provides an instantaneous 47 deg² field of view, limited by primary mirror vignetting in its Schmidt telescope prime focus configuration. We report here on the design and performance of the ZTF CCD camera cryostat and report results from extensive Joule-Thomson cryocooler tests that may be of broad interest to the instrumentation community.

  13. A neutron pinhole camera for PF-24 source: Conceptual design and optimization

    Science.gov (United States)

    Bielecki, J.; Wójcik-Gargula, A.; Wiacek, U.; Scholz, M.; Igielski, A.; Drozdowicz, K.; Woźnicka, U.

    2015-07-01

    A fast-neutron pinhole camera based on small-area (5 mm × 5 mm) BCF-12 scintillation detectors with nanosecond time resolution has been designed. The pinhole camera is dedicated to the investigation of the spatial and temporal distributions of DD neutrons from the Plasma Focus (PF-24) source. The geometrical parameters of the camera have been optimized in terms of maximum neutron flux at the imaging plane by means of MCNP calculations. The detection system consists of four closely packed scintillation detectors coupled via long optical fibres to Hamamatsu H3164-10 photomultiplier tubes. The pinhole consists of a specially designed, 420 mm long copper collimator with an effective aperture of 1.7 mm mounted inside a cylindrical polyethylene tube. The performance of the presented detection system in the mixed (hard X-ray and neutron) radiation field of the PF-24 plasma focus device has been tested. The results of the tests showed that the small-area BCF-12 scintillation detectors can be successfully applied as the detection system of the neutron pinhole camera for the PF-24 device.

  14. MAGIC-II Camera Slow Control Software

    CERN Document Server

    Steinke, B; Tridon, D Borla

    2009-01-01

    The Imaging Atmospheric Cherenkov Telescope MAGIC I has recently been extended to a stereoscopic system by adding a second 17 m telescope, MAGIC-II. One of the major improvements of the second telescope is its camera. The Camera Control Program is embedded in the telescope control software as an independent subsystem. The Camera Control Program is effective software for monitoring and controlling the camera values and their settings, and is written in the visual programming language LabVIEW. The two main parts, the Central Variables File, which stores all information of the pixel and other camera parameters, and the Comm Control Routine, which controls changes in possible settings, provide a reliable operation. A safety routine protects the camera from misuse by accidental commands, from bad weather conditions and from hardware errors by automatic reactions.

  15. Enteral Tube Feeding and Pneumonia

    Science.gov (United States)

    Gray, David Sheridan; Kimmel, David

    2006-01-01

    To determine the effects of enteral tube feeding on the incidence of pneumonia, we performed a retrospective review of all clients at our institution who had gastrostomy or jejunostomy tubes placed over a 10-year period. Ninety-three subjects had a history of pneumonia before feeding tube insertion. Eighty had gastrostomy and 13, jejunostomy…

  16. Movement-based Interaction in Camera Spaces

    DEFF Research Database (Denmark)

    Eriksson, Eva; Riisgaard Hansen, Thomas; Lykke-Olesen, Andreas

    2006-01-01

    In this paper we present three concepts that address movement-based interaction using camera tracking. Based on our work with several movement-based projects we present four selected applications, and use these applications to leverage our discussion, and to describe our three main concepts: space, relations, and feedback. We see these as central for describing and analysing movement-based systems using camera tracking and we show how these three concepts can be used to analyse other camera tracking applications.

  17. Development of biostereometric experiments. [stereometric camera system

    Science.gov (United States)

    Herron, R. E.

    1978-01-01

    The stereometric camera was designed for close-range techniques in biostereometrics. The camera focusing distance of 360 mm to infinity covers a broad field of close-range photogrammetry. The design provides for a separate unit for the lens system and interchangeable backs on the camera for the use of single frame film exposure, roll-type film cassettes, or glass plates. The system incorporates the use of a surface contrast optical projector.

  18. Comparative evaluation of consumer grade cameras and mobile phone cameras for close range photogrammetry

    Science.gov (United States)

    Chikatsu, Hirofumi; Takahashi, Yoji

    2009-08-01

    The authors have been concentrating on developing convenient 3D measurement methods using consumer grade digital cameras, and concluded that consumer grade digital cameras are expected to become a useful photogrammetric device for various close range application fields. On the other hand, mobile phone cameras with 10 megapixels have appeared on the market in Japan. In these circumstances, we are faced with the epoch-making question of whether mobile phone cameras are able to take the place of consumer grade digital cameras in close range photogrammetric applications. In order to evaluate the potential of mobile phone cameras in close range photogrammetry, a comparative evaluation between mobile phone cameras and consumer grade digital cameras is presented in this paper with respect to lens distortion, reliability, stability and robustness. Calibration tests for 16 mobile phone cameras and 50 consumer grade digital cameras were conducted indoors using a test target. Furthermore, the practicability of mobile phone cameras for close range photogrammetry was evaluated outdoors. This paper shows that mobile phone cameras have the ability to take the place of consumer grade digital cameras and to develop the market in digital photogrammetric fields.

  19. Object tracking using multiple camera video streams

    Science.gov (United States)

    Mehrubeoglu, Mehrube; Rojas, Diego; McLauchlan, Lifford

    2010-05-01

    Two synchronized cameras are utilized to obtain independent video streams to detect moving objects from two different viewing angles. The video frames are directly correlated in time. Moving objects in image frames from the two cameras are identified and tagged for tracking. One advantage of such a system involves overcoming effects of occlusions that could result in an object in partial or full view in one camera, when the same object is fully visible in another camera. Object registration is achieved by determining the location of common features in the moving object across simultaneous frames. Perspective differences are adjusted. Combining information from images from multiple cameras increases robustness of the tracking process. Motion tracking is achieved by determining anomalies caused by the objects' movement across frames in time in each and the combined video information. The path of each object is determined heuristically. Accuracy of detection is dependent on the speed of the object as well as variations in direction of motion. Fast cameras increase accuracy but limit the speed and complexity of the algorithm. Such an imaging system has applications in traffic analysis, surveillance and security, as well as object modeling from multi-view images. The system can easily be expanded by increasing the number of cameras such that there is an overlap between the scenes from at least two cameras in proximity. An object can then be tracked long distances or across multiple cameras continuously, applicable, for example, in wireless sensor networks for surveillance or navigation.

  20. Omnidirectional Underwater Camera Design and Calibration

    Directory of Open Access Journals (Sweden)

    Josep Bosch

    2015-03-01

    Full Text Available This paper presents the development of an underwater omnidirectional multi-camera system (OMS) based on a commercially available six-camera system, originally designed for land applications. A full calibration method is presented for the estimation of both the intrinsic and extrinsic parameters, which is able to cope with wide-angle lenses and non-overlapping cameras simultaneously. This method is valid for any OMS in both land and water applications. For underwater use, a customized housing is required, which often leads to strong image distortion due to refraction among the different media. This phenomenon makes the basic pinhole camera model invalid for underwater cameras, especially when using wide-angle lenses, and requires the explicit modeling of the individual optical rays. To address this problem, a ray tracing approach has been adopted to create a field-of-view (FOV) simulator for underwater cameras. The simulator allows for the testing of different housing geometries and optics for the cameras to ensure complete hemisphere coverage in underwater operation. This paper describes the design and testing of a compact custom housing for a commercial off-the-shelf OMS camera (Ladybug 3) and presents the first results of its use. A proposed three-stage calibration process allows for the estimation of all of the relevant camera parameters. Experimental results are presented, which illustrate the performance of the calibration method and validate the approach.
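
    The ray-tracing simulator mentioned above ultimately rests on refraction at the air/housing/water interfaces; the sketch below applies the standard vector form of Snell's law to a single flat interface (the indices, the flat-port geometry and the NumPy implementation are assumptions, not the authors' code).

```python
import numpy as np

def refract(d, n, n1, n2):
    """Refract unit direction d at an interface with unit normal n (pointing
    back towards the incoming ray), going from index n1 to index n2."""
    d = d / np.linalg.norm(d)
    n = n / np.linalg.norm(n)
    cos_i = -np.dot(n, d)
    r = n1 / n2
    k = 1.0 - r * r * (1.0 - cos_i * cos_i)
    if k < 0.0:
        return None                        # total internal reflection
    return r * d + (r * cos_i - np.sqrt(k)) * n

# A ray leaving the housing (air, n = 1.00) into water (n = 1.33) bends towards
# the normal, which is what shrinks the effective field of view underwater.
d_air = np.array([0.3, 0.0, 1.0])
print(refract(d_air, np.array([0.0, 0.0, -1.0]), 1.00, 1.33))
```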

  1. Vortex tube optimization theory

    Energy Technology Data Exchange (ETDEWEB)

    Lewins, Jeffery [Cambridge Univ., Magdalene Coll., Cambridge (United Kingdom); Bejan, Adrian [Duke Univ., Dept. of Mechanical Engineering and Materials Science, Durham, NC (United States)

    1999-11-01

    The Ranque-Hilsch vortex tube splits a single high pressure stream of gas into cold and warm streams. Simple models for the vortex tube combined with regenerative precooling are given from which an optimisation can be undertaken. Two such optimisations are needed: the first shows that at any given cut or fraction of the cold stream, the best refrigerative load, allowing for the temperature lift, is nearly half the maximum loading that would result in no lift. The second optimisation shows that the optimum cut is an equal division of the vortex streams between hot and cold. Bounds are obtainable within this theory for the performance of the system for a given gas and pressure ratio. (Author)

  2. Neural tube defects

    Directory of Open Access Journals (Sweden)

    M.E. Marshall

    1981-09-01

    Full Text Available Neural tube defects refer to any defect in the morphogenesis of the neural tube, the most common types being spina bifida and anencephaly. Spina bifida has been recognised in skeletons found in north-eastern Morocco and estimated to have an age of almost 12 000 years. It was also known to the ancient Greek and Arabian physicians who thought that the bony defect was due to the tumour. The term spina bifida was first used by Professor Nicolai Tulp of Amsterdam in 1652. Many other terms have been used to describe this defect, but spina bifida remains the most useful general term, as it describes the separation of the vertebral elements in the midline.

  3. Camera processing with chromatic aberration.

    Science.gov (United States)

    Korneliussen, Jan Tore; Hirakawa, Keigo

    2014-10-01

    Since the refractive index of materials commonly used for lenses depends on the wavelength of light, practical camera optics fail to converge light to a single point on an image plane. Known as chromatic aberration, this phenomenon distorts image details by introducing magnification error, defocus blur, and color fringes. Though achromatic and apochromatic lens designs reduce chromatic aberration to a degree, they are complex and expensive and they do not offer a perfect correction. In this paper, we propose a new postcapture processing scheme designed to overcome these problems computationally. Specifically, the proposed solution comprises a chromatic aberration-tolerant demosaicking algorithm and a post-demosaicking chromatic aberration correction step. Experiments with simulated and real sensor data verify that the chromatic aberration is effectively corrected.
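
    As a hedged illustration of post-capture correction in general (not the paper's algorithm), the toy sketch below models lateral chromatic aberration as a slightly different magnification per colour channel and re-registers the red and blue channels onto green; the scale factors and nearest-neighbour resampling are assumptions.

```python
import numpy as np

def rescale_channel(chan, scale):
    """Return `chan` magnified by `scale` about the image centre
    (nearest-neighbour resampling, for brevity)."""
    h, w = chan.shape
    yc, xc = (h - 1) / 2.0, (w - 1) / 2.0
    yy, xx = np.mgrid[0:h, 0:w]
    ys = np.clip(np.round(yc + (yy - yc) / scale).astype(int), 0, h - 1)
    xs = np.clip(np.round(xc + (xx - xc) / scale).astype(int), 0, w - 1)
    return chan[ys, xs]

def correct_lateral_ca(rgb, scale_r=1.002, scale_b=0.998):
    """Re-register R and B onto G, assuming per-channel magnifications."""
    out = rgb.copy()
    out[..., 0] = rescale_channel(rgb[..., 0], 1.0 / scale_r)   # undo red magnification
    out[..., 2] = rescale_channel(rgb[..., 2], 1.0 / scale_b)   # undo blue magnification
    return out

img = np.random.rand(480, 640, 3)
print(correct_lateral_ca(img).shape)
```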

  4. Gesture recognition on smart cameras

    Science.gov (United States)

    Dziri, Aziz; Chevobbe, Stephane; Darouich, Mehdi

    2013-02-01

    Gesture recognition is a feature in human-machine interaction that allows more natural interaction without the use of complex devices. For this reason, several methods of gesture recognition have been developed in recent years. However, most real-time methods are designed to operate on a Personal Computer with high computing resources and memory. In this paper, we analyze relevant methods found in the literature in order to investigate the ability of smart cameras to execute gesture recognition algorithms. We elaborate two hand gesture recognition pipelines. The first method is based on invariant moments extraction and the second on fingertip detection. The hand detection method used for both pipelines is based on skin color segmentation. The results obtained show that the un-optimized versions of the invariant moments method and the fingertip detection method can reach 10 fps on an embedded processor and use about 200 kB of memory.
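
    A minimal sketch of the common front end of such pipelines is shown below: RGB skin-colour segmentation followed by a scale- and translation-invariant moment of the resulting mask. The threshold rule and the use of the first Hu moment are illustrative assumptions, not the paper's tuned parameters.

```python
import numpy as np

def skin_mask(rgb):
    """Very rough RGB skin-colour rule (real systems tune this or use YCbCr/HSV)."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    return (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b) & (np.abs(r - g) > 15)

def hu1(mask):
    """First Hu invariant moment (eta20 + eta02) of a binary mask."""
    ys, xs = np.nonzero(mask)
    m00 = len(xs)
    if m00 == 0:
        return 0.0
    xbar, ybar = xs.mean(), ys.mean()
    mu20 = ((xs - xbar) ** 2).sum()
    mu02 = ((ys - ybar) ** 2).sum()
    eta20 = mu20 / m00 ** 2       # normalized central moments (p + q = 2)
    eta02 = mu02 / m00 ** 2
    return eta20 + eta02

frame = (np.random.rand(240, 320, 3) * 255).astype(np.uint8)   # stand-in camera frame
print(hu1(skin_mask(frame)))
```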

  5. Framework for Evaluating Camera Opinions

    Directory of Open Access Journals (Sweden)

    K.M. Subramanian

    2015-03-01

    Full Text Available Opinion mining plays an important role in text mining applications in brand and product positioning, customer relationship management, consumer attitude detection and market research. The applications lead to a new generation of companies/products meant for online market perception, online content monitoring and reputation management. Expansion of the web inspires users to contribute/express opinions via blogs, videos and social networking sites. Such platforms provide valuable information for analysis of sentiment pertaining to a product or service. This study investigates the performance of various feature extraction methods and classification algorithms for opinion mining. Opinions expressed on the Amazon website for cameras are collected and used for evaluation. Features are extracted from the opinions using Term Frequency and Inverse Document Frequency (TF-IDF). Feature transformation is achieved through Principal Component Analysis (PCA) and kernel PCA. Naïve Bayes, K Nearest Neighbor, and Classification and Regression Trees (CART) classification algorithms classify the extracted features.
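
    A compact sketch of such a pipeline with scikit-learn appears below; the toy reviews, the Gaussian Naïve Bayes variant and the two-component PCA are assumptions for illustration, not the study's actual data or settings.

```python
# Illustrative TF-IDF -> PCA -> classifier pipeline on toy camera reviews.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import PCA
from sklearn.naive_bayes import GaussianNB

opinions = ["great camera, sharp pictures and long battery life",
            "terrible autofocus, blurry images, waste of money",
            "excellent low light performance, highly recommended",
            "poor build quality and the lens jams constantly"]
labels = [1, 0, 1, 0]                                # 1 = positive, 0 = negative

tfidf = TfidfVectorizer()
X = tfidf.fit_transform(opinions).toarray()          # TF-IDF feature extraction

X_reduced = PCA(n_components=2).fit_transform(X)     # feature transformation

clf = GaussianNB().fit(X_reduced, labels)            # Naive Bayes classification
print(clf.predict(X_reduced))
```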

  6. Illumination box and camera system

    Science.gov (United States)

    Haas, Jeffrey S.; Kelly, Fredrick R.; Bushman, John F.; Wiefel, Michael H.; Jensen, Wayne A.; Klunder, Gregory L.

    2002-01-01

    A hand portable, field-deployable thin-layer chromatography (TLC) unit and a hand portable, battery-operated unit for development, illumination, and data acquisition of the TLC plates contain many miniaturized features that permit a large number of samples to be processed efficiently. The TLC unit includes a solvent tank, a holder for TLC plates, and a variety of tool chambers for storing TLC plates, solvent, and pipettes. After processing in the TLC unit, a TLC plate is positioned in a collapsible illumination box, where the box and a CCD camera are optically aligned for optimal pixel resolution of the CCD images of the TLC plate. The TLC system includes an improved development chamber for chemical development of TLC plates that prevents solvent overflow.

  7. LROC - Lunar Reconnaissance Orbiter Camera

    Science.gov (United States)

    Robinson, M. S.; Eliason, E.; Hiesinger, H.; Jolliff, B. L.; McEwen, A.; Malin, M. C.; Ravine, M. A.; Thomas, P. C.; Turtle, E. P.

    2009-12-01

    The Lunar Reconnaissance Orbiter (LRO) went into lunar orbit on 23 June 2009. The LRO Camera (LROC) acquired its first lunar images on June 30 and commenced full scale testing and commissioning on July 10. The LROC consists of two narrow-angle cameras (NACs) that provide 0.5 m scale panchromatic images over a combined 5 km swath, and a wide-angle camera (WAC) to provide images at a scale of 100 m per pixel in five visible wavelength bands (415, 566, 604, 643, and 689 nm) and 400 m per pixel in two ultraviolet bands (321 nm and 360 nm) from the nominal 50 km orbit. Early operations were designed to test the performance of the cameras under all nominal operating conditions and provided a baseline for future calibrations. Test sequences included off-nadir slews to image stars and the Earth, 90° yaw sequences to collect flat field calibration data, night imaging for background characterization, and systematic mapping to test performance. LRO initially was placed into a terminator orbit resulting in images acquired under low signal conditions. Over the next three months the incidence angle at the spacecraft’s equator crossing gradually decreased towards high noon, providing a range of illumination conditions. Several hundred south polar images were collected in support of impact site selection for the LCROSS mission; details can be seen in many of the shadows. Commissioning phase images not only proved the instruments’ overall performance was nominal, but also that many geologic features of the lunar surface are well preserved at the meter-scale. Of particular note is the variety of impact-induced morphologies preserved in a near pristine state in and around kilometer-scale and larger young Copernican age impact craters that include: abundant evidence of impact melt of a variety of rheological properties, including coherent flows with surface textures and planimetric properties reflecting supersolidus (e.g., liquid melt) emplacement, blocks delicately perched on

  8. Primary fallopian tube carcinoma

    Directory of Open Access Journals (Sweden)

    Mladenović-Segedi Ljiljana

    2009-01-01

    Full Text Available Introduction. Primary fallopian tube carcinoma is extremely rare, making up 0.3-1.6% of all female genital tract malignancies. Although the etiology of this tumor is unknown, it is suggested to be associated with chronic tubal inflammation, infertility, tuberculous salpingitis and tubal endometriosis. High parity is considered to be protective. Cytogenetic studies show the disease to be associated with overexpression of p53, HER2/neu and c-myb. There is also some evidence that BRCA1 and BRCA2 mutations have a role in tumorigenesis. Clinical features. The most prevalent symptoms of fallopian tube carcinoma are abdominal pain and abnormal vaginal discharge/bleeding, and the most common finding is an adnexal mass. In many patients, fallopian tube carcinoma is asymptomatic. Diagnosis. Due to its rarity, preoperative diagnosis of primary fallopian tube carcinoma is rarely made. It is usually misdiagnosed as ovarian carcinoma, tuboovarian abscess or ectopic pregnancy. Sonographic features of the tumor are non-specific and include the presence of a fluid-filled adnexal structure with a significant solid component, a sausage-shaped mass, a cystic mass with papillary projections within, a cystic mass with cog wheel appearance and an ovoid-shaped structure containing an incomplete separation and a highly vascular solid nodule. More than 80% of patients have elevated pretreatment serum CA-125 levels, which is useful in follow-up after the definitive treatment. Treatment. The treatment approach is similar to that of ovarian carcinoma, and includes total abdominal hysterectomy and bilateral salpingo-oophorectomy. Staging is followed with chemotherapy.

  9. HRSC: High resolution stereo camera

    Science.gov (United States)

    Neukum, G.; Jaumann, R.; Basilevsky, A.T.; Dumke, A.; Van Gasselt, S.; Giese, B.; Hauber, E.; Head, J. W.; Heipke, C.; Hoekzema, N.; Hoffmann, H.; Greeley, R.; Gwinner, K.; Kirk, R.; Markiewicz, W.; McCord, T.B.; Michael, G.; Muller, Jan-Peter; Murray, J.B.; Oberst, J.; Pinet, P.; Pischel, R.; Roatsch, T.; Scholten, F.; Willner, K.

    2009-01-01

    The High Resolution Stereo Camera (HRSC) on Mars Express has delivered a wealth of image data, amounting to over 2.5 TB from the start of the mapping phase in January 2004 to September 2008. In that time, more than a third of Mars was covered at a resolution of 10-20 m/pixel in stereo and colour. After five years in orbit, HRSC is still in excellent shape, and it could continue to operate for many more years. HRSC has proven its ability to close the gap between the low-resolution Viking image data and the high-resolution Mars Orbiter Camera images, leading to a global picture of the geological evolution of Mars that is now much clearer than ever before. Derived highest-resolution terrain model data have closed major gaps and provided an unprecedented insight into the shape of the surface, which is paramount not only for surface analysis and geological interpretation, but also for combination with and analysis of data from other instruments, as well as in planning for future missions. This chapter presents the scientific output from data analysis and high-level data processing, complemented by a summary of how the experiment is conducted by the HRSC team members working in geoscience, atmospheric science, photogrammetry and spectrophotometry. Many of these contributions have been or will be published in peer-reviewed journals and special issues. They form a cross-section of the scientific output, either by summarising the new geoscientific picture of Mars provided by HRSC or by detailing some of the topics of data analysis concerning photogrammetry, cartography and spectral data analysis.

  10. Traveling-Wave Tubes

    Science.gov (United States)

    Kory, Carol L.

    1998-01-01

    The traveling-wave tube (TWT) is a vacuum device invented in the early 1940's used for amplification at microwave frequencies. Amplification is attained by surrendering kinetic energy from an electron beam to a radio frequency (RF) electromagnetic wave. The demand for vacuum devices has been decreased largely by the advent of solid-state devices. However, although solid state devices have replaced vacuum devices in many areas, there are still many applications such as radar, electronic countermeasures and satellite communications, that require operating characteristics such as high power (Watts to Megawatts), high frequency (below 1 GHz to over 100 GHz) and large bandwidth that only vacuum devices can provide. Vacuum devices are also deemed irreplaceable in the music industry where musicians treasure their tube-based amplifiers claiming that the solid-state and digital counterparts could never provide the same "warmth" (3). The term traveling-wave tube includes both fast-wave and slow-wave devices. This article will concentrate on slow-wave devices as the vast majority of TWTs in operation fall into this category.

  11. MISR FIRSTLOOK radiometric camera-by-camera Cloud Mask V001

    Data.gov (United States)

    National Aeronautics and Space Administration — This file contains the FIRSTLOOK Radiometric camera-by-camera Cloud Mask (RCCM) dataset produced using ancillary inputs (RCCT) from the previous time period. It is...

  12. Reliability of steam generator tubing

    Energy Technology Data Exchange (ETDEWEB)

    Kadokami, E. [Mitsubishi Heavy Industries Ltd., Hyogo-ku (Japan)

    1997-02-01

    The author presents results on studies made of the reliability of steam generator (SG) tubing. The basis for this work is that in Japan the issue of defects in SG tubing is addressed by the approach that any detected defect should be repaired, either by plugging the tube or sleeving it. However, this leaves open the issue that there is a detection limit in practice, and what is the effect of nondetectable cracks on the performance of tubing. These studies were commissioned to look at the safety issues involved in degraded SG tubing. The program has looked at a number of different issues. First was an assessment of the penetration and opening behavior of tube flaws due to internal pressure in the tubing. They have studied: penetration behavior of the tube flaws; primary water leakage from through-wall flaws; opening behavior of through-wall flaws. In addition they have looked at the question of the reliability of tubing with flaws during normal plant operation. Also there have been studies done on the consequences of tube rupture accidents on the integrity of neighboring tubes.

  13. Camera Inspection Arm for Boiling Water Reactors - 13330

    Energy Technology Data Exchange (ETDEWEB)

    Martin, Scott; Rood, Marc [S.A. Technology, 3985 S. Lincoln Ave, Loveland, CO 80537 (United States)

    2013-07-01

    Boiling Water Reactor (BWR) outage maintenance tasks can be time-consuming and hazardous. Reactor facilities are continuously looking for quicker, safer, and more effective methods of performing routine inspection during these outages. In 2011, S.A. Technology (SAT) was approached by Energy Northwest to provide a remote system capable of increasing efficiencies related to Reactor Pressure Vessel (RPV) internal inspection activities. The specific intent of the system discussed was to inspect recirculation jet pumps in a manner that did not require manual tooling, and could be performed independently of other ongoing inspection activities. In 2012, SAT developed a compact, remote, camera inspection arm to create a safer, more efficient outage environment. This arm incorporates a compact and lightweight design along with the innovative use of bi-stable composite tubes to provide a six-degree of freedom inspection tool capable of reducing dose uptake, reducing crew size, and reducing the overall critical path for jet pump inspections. The prototype camera inspection arm unit is scheduled for final testing in early 2013 in preparation for the Columbia Generating Station refueling outage in the spring of 2013. (authors)

  14. Metrology Camera System of Prime Focus Spectrograph for Subaru Telescope

    CERN Document Server

    Wang, Shiang-Yu; Huang, Pin-Jie; Ling, Hung-Hsu; Karr, Jennifer; Chang, Yin-Chang; Hu, Yen-Shan; Hsu, Shu-Fu; Chen, Hsin-Yo; Gunn, James E; Reiley, Dan J; Tamura, Naoyuki; Takato, Naruhisa; Shimono, Atsushi

    2016-01-01

    The Prime Focus Spectrograph (PFS) is a new optical/near-infrared multi-fiber spectrograph designed for the prime focus of the 8.2m Subaru telescope. PFS will cover a 1.3 degree diameter field with 2394 fibers to complement the imaging capabilities of Hyper SuprimeCam. To retain high throughput, the final positioning accuracy between the fibers and observing targets of PFS is required to be less than 10 µm. The metrology camera system (MCS) serves as the optical encoder of the fiber motors for the configuring of fibers. MCS provides the fiber positions within a 5 µm error over the 45 cm focal plane. The information from MCS will be fed into the fiber positioner control system for closed-loop control. MCS will be located at the Cassegrain focus of the Subaru telescope in order to cover the whole focal plane with one 50M pixel Canon CMOS camera. It is a 380 mm Schmidt-type telescope which generates a uniform spot size with a 10 micron FWHM across the field for reasonable sampling of the PSF. Carbon fiber tubes are ...

  15. Trajectory association across multiple airborne cameras.

    Science.gov (United States)

    Sheikh, Yaser Ajmal; Shah, Mubarak

    2008-02-01

    A camera mounted on an aerial vehicle provides an excellent means for monitoring large areas of a scene. Utilizing several such cameras on different aerial vehicles allows further flexibility, in terms of increased visual scope and in the pursuit of multiple targets. In this paper, we address the problem of associating objects across multiple airborne cameras. Since the cameras are moving and often widely separated, direct appearance-based or proximity-based constraints cannot be used. Instead, we exploit geometric constraints on the relationship between the motion of each object across cameras, to test multiple association hypotheses, without assuming any prior calibration information. Given our scene model, we propose a likelihood function for evaluating a hypothesized association between observations in multiple cameras that is geometrically motivated. Since multiple cameras exist, ensuring coherency in association is an essential requirement, e.g. that transitive closure is maintained between more than two cameras. To ensure such coherency we pose the problem of maximizing the likelihood function as a k-dimensional matching and use an approximation to find the optimal assignment of association. Using the proposed error function, canonical trajectories of each object and optimal estimates of inter-camera transformations (in a maximum likelihood sense) are computed. Finally, we show that as a result of associating objects across the cameras, a concurrent visualization of multiple aerial video streams is possible and that, under special conditions, trajectories interrupted due to occlusion or missing detections can be repaired. Results are shown on a number of real and controlled scenarios with multiple objects observed by multiple cameras, validating our qualitative models, and through simulation quantitative performance is also reported.
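
    For the special case of only two cameras, the association step reduces to a bipartite assignment; the sketch below, with a made-up cost matrix standing in for the paper's geometric negative log-likelihood, solves it with SciPy's Hungarian-algorithm implementation.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Stand-in cost matrix: entry (i, j) plays the role of the negative
# log-likelihood of pairing trajectory i in camera A with trajectory j in
# camera B (the real likelihood in the paper is geometrically motivated).
neg_log_likelihood = np.array([[0.2, 2.5, 3.1],
                               [2.7, 0.4, 2.9],
                               [3.3, 2.8, 0.1]])

rows, cols = linear_sum_assignment(neg_log_likelihood)   # optimal assignment
for i, j in zip(rows, cols):
    print(f"object {i} in camera A  <->  object {j} in camera B")
```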

  16. Hybrid endotracheal tubes

    Science.gov (United States)

    Sakezles, Christopher Thomas

    Intubation involves the placement of a tube into the tracheal lumen and is prescribed in any setting in which the airway must be stabilized or the patient anesthetized. The purpose of the endotracheal tube in these procedures is to maintain a viable airway, facilitate mechanical ventilation, allow the administration of anesthetics, and prevent the reflux of vomitus into the lungs. In order to satisfy these requirements a nearly airtight seal must be maintained between the tube and the tracheal lining. Most conventional endotracheal tubes provide this seal by employing a cuff that is inflated once the tube is in place. However, the design of this cuff and properties of the material are a source of irritation and injury to the tracheal tissues. In fact, the complication rate for endotracheal intubation is reported to be between 10 and 60%, with manifestations ranging from severe sore throat to erosion through the tracheal wall. These complications are caused by a combination of the materials employed and the forces exerted by the cuff on the tracheal tissues. In particular, the abrasive action of the cuff shears cells from the lining, epithelium adhering to the cuff is removed during extubation, and normal forces exerted on the basement tissues disrupt the blood supply and cause pressure necrosis. The complications associated with tracheal intubation may be reduced or eliminated by employing airway devices constructed from hydrogel materials. Hydrogels are a class of crosslinked polymers which swell in the presence of moisture, and may contain more than 95% water by weight. For the current study, several prototype airway devices were constructed from hydrogel materials including poly(vinyl alcohol), poly(hydroxyethyl methacrylate), and poly(vinyl pyrrolidone). The raw hydrogel materials from this group were subjected to tensile, swelling, and biocompatibility testing, while the finished devices were subjected to extensive mechanical simulation and animal trials

  17. Small prototype gamma camera based on wavelength-shifting fibres

    Science.gov (United States)

    Castro, I. F.; Soares, A. J.; Moutinho, L. M.; Veloso, J. F. C. A.

    2012-01-01

    We are studying and developing a small field of view gamma camera based on wavelength-shifting optical fibres coupled to both sides of an inorganic scintillation crystal and using for the light readout highly sensitive photon detectors, namely silicon photomultipliers (SiPMs) and high efficiency multi-anode photomultiplier tubes (MaPMTs). The coupling of the fibres in orthogonal directions allows obtaining 2D position information, while the energy signal is provided by a PMT. A first prototype laboratory system has been developed using a custom-made 50 × 50 × 3 mm3 CsI(Na) crystal with embedded 1 mm diameter fibres and reading out the light from several fibres in each direction, both with individual SiPMs and with a MaPMT. Proof-of-concept studies and results obtained with these systems using 57Co are presented and compared. The application of optical fibres combined with highly sensitive SiPMs or MaPMTs as light sensors in a compact gamma camera has the potential to improve the spatial resolution to the 1-2 mm FWHM level, thus improving the sensitivity of typical scintigraphy techniques and making such camera clinically useful. Results demonstrate the feasibility and imaging capability of the system using both types of photon detectors for imaging. In the case of SiPMs, a temperature cooling system is necessary to improve the SNR and consequently achieve a better imaging performance. The development of larger prototypes with 10 × 10 cm2 and 12 × 12 cm2 is under way, using 1 mm2 SiPMs and 64 anode PMTs, respectively.

  18. Automatic inference of geometric camera parameters and intercamera topology in uncalibrated disjoint surveillance cameras

    NARCIS (Netherlands)

    Hollander, R.J.M. den; Bouma, H.; Baan, J.; Eendebak, P.T.; Rest, J.H.C. van

    2015-01-01

    Person tracking across non-overlapping cameras and other types of video analytics benefit from spatial calibration information that allows an estimation of the distance between cameras and a relation between pixel coordinates and world coordinates within a camera. In a large environment with many ca

  19. Improving Situational Awareness in camera surveillance by combining top-view maps with camera images

    NARCIS (Netherlands)

    Kooi, F.L.; Zeeders, R.

    2009-01-01

    The goal of the experiment described is to improve today's camera surveillance in public spaces. Three designs with the camera images combined on a top-view map were compared to each other and to the current situation in camera surveillance. The goal was to test which design makes spatial relationsh

  20. Camera self-calibration from translation by referring to a known camera.

    Science.gov (United States)

    Zhao, Bin; Hu, Zhaozheng

    2015-09-01

    This paper presents a novel linear method for camera self-calibration by referring to a known (or calibrated) camera. The method requires at least three images, with two images generated by the uncalibrated camera from pure translation and one image generated by the known reference camera. We first propose a method to compute the infinite homography from scene depths. Based on this, we use two images generated by translating the uncalibrated camera to recover scene depths, which are further utilized to linearly compute the infinite homography between an arbitrary uncalibrated image and the image from the known camera. With the known camera as reference, the computed infinite homography is readily decomposed for camera calibration. The proposed self-calibration method has been tested with simulation and real image data. Experimental results demonstrate that the method is practical and accurate. This paper proposes using a "known reference camera" for camera calibration. The pure translation required by the method is much easier to perform than some of the strict motions used in the literature, such as pure rotation. The proposed self-calibration method has good potential for solving online camera calibration problems, which has important applications, especially for multicamera and zooming camera systems.
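
    The final decomposition step can be sketched as follows, assuming the standard relation H_inf = K R K_ref^-1 between the infinite homography, the unknown intrinsics K and the known reference intrinsics K_ref; the synthetic round-trip check and the NumPy implementation are illustrative, not the paper's code.

```python
import numpy as np

# From H_inf = K R K_ref^-1 and R R^T = I it follows that
# (H_inf K_ref)(H_inf K_ref)^T = K K^T, so K can be recovered by a
# triangular factorization (depth recovery and homography estimation omitted).
def intrinsics_from_infinite_homography(H_inf, K_ref):
    M = H_inf @ K_ref
    A = M @ M.T
    A = A / A[2, 2]                      # remove the unknown scale of H_inf
    J = np.fliplr(np.eye(3))             # exchange matrix
    L = np.linalg.cholesky(J @ A @ J)    # lower-triangular factor of the flipped matrix
    K = J @ L @ J                        # upper-triangular calibration matrix
    return K / K[2, 2]

# Round-trip check with synthetic data.
K_true = np.array([[900.0, 0.0, 310.0], [0.0, 880.0, 245.0], [0.0, 0.0, 1.0]])
K_ref  = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
theta = 0.1
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0, 0.0, 1.0]])
H_inf = K_true @ R @ np.linalg.inv(K_ref)
print(np.round(intrinsics_from_infinite_homography(H_inf, K_ref), 3))
```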

  1. Optimal Camera Placement for Motion Capture Systems.

    Science.gov (United States)

    Rahimian, Pooya; Kearney, Joseph K

    2017-03-01

    Optical motion capture is based on estimating the three-dimensional positions of markers by triangulation from multiple cameras. Successful performance depends on points being visible from at least two cameras and on the accuracy of the triangulation. Triangulation accuracy is strongly related to the positions and orientations of the cameras. Thus, the configuration of the camera network has a critical impact on performance. A poor camera configuration may result in a low quality three-dimensional (3D) estimation and consequently low quality of tracking. This paper introduces and compares two methods for camera placement. The first method is based on a metric that computes target point visibility in the presence of dynamic occlusion from cameras with "good" views. The second method is based on the distribution of views of target points. Efficient algorithms, based on simulated annealing, are introduced for estimating the optimal configuration of cameras for the two metrics and a given distribution of target points. The accuracy and robustness of the algorithms are evaluated through both simulation and empirical measurement. Implementations of the two methods are available for download as tools for the community.
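
    A toy version of the simulated-annealing search is sketched below; the circular camera layout, the simple "seen by at least two cameras" objective and all numeric parameters are stand-ins for the paper's visibility and view-distribution metrics.

```python
import numpy as np

rng = np.random.default_rng(1)
targets = rng.uniform(-2, 2, size=(200, 2))   # stand-in target point distribution
HALF_FOV = np.deg2rad(25)
RADIUS = 6.0

def coverage(angles):
    """Number of target points inside the FOV of at least two cameras."""
    cams = RADIUS * np.stack([np.cos(angles), np.sin(angles)], axis=1)
    seen = np.zeros(len(targets), dtype=int)
    for c in cams:
        view_dir = -c / np.linalg.norm(c)                   # each camera looks at the origin
        to_t = targets - c
        to_t = to_t / np.linalg.norm(to_t, axis=1, keepdims=True)
        seen += (to_t @ view_dir > np.cos(HALF_FOV)).astype(int)
    return int(np.sum(seen >= 2))

angles = rng.uniform(0, 2 * np.pi, size=4)                  # 4 cameras on a circle
current, temp = coverage(angles), 1.0
for _ in range(2000):
    cand = angles + rng.normal(0.0, 0.2, size=angles.shape) # random perturbation
    s = coverage(cand)
    # accept improvements always, worse configurations with a temperature-dependent probability
    if s >= current or rng.random() < np.exp((s - current) / max(temp, 1e-6)):
        angles, current = cand, s
    temp *= 0.995                                            # cooling schedule
print(current, np.round(np.rad2deg(angles) % 360, 1))
```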

  2. Thermal Cameras in School Laboratory Activities

    Science.gov (United States)

    Haglund, Jesper; Jeppsson, Fredrik; Hedberg, David; Schönborn, Konrad J.

    2015-01-01

    Thermal cameras offer real-time visual access to otherwise invisible thermal phenomena, which are conceptually demanding for learners during traditional teaching. We present three studies of students' conduction of laboratory activities that employ thermal cameras to teach challenging thermal concepts in grades 4, 7 and 10-12. Visualization of…

  3. Depth Estimation Using a Sliding Camera.

    Science.gov (United States)

    Ge, Kailin; Hu, Han; Feng, Jianjiang; Zhou, Jie

    2016-02-01

    Image-based 3D reconstruction technology is widely used in different fields. The conventional algorithms are mainly based on stereo matching between two or more fixed cameras, and high accuracy can only be achieved using a large camera array, which is very expensive and inconvenient in many applications. Another popular choice is utilizing structure-from-motion methods for arbitrarily placed camera(s). However, due to too many degrees of freedom, its computational cost is heavy and its accuracy is rather limited. In this paper, we propose a novel depth estimation algorithm using a sliding camera system. By analyzing the geometric properties of the camera system, we design a camera pose initialization algorithm that can work satisfactorily with only a small number of feature points and is robust to noise. For pixels corresponding to different depths, an adaptive iterative algorithm is proposed to choose optimal frames for stereo matching, which can take advantage of the continuously changing camera pose and also greatly reduce computation time. The proposed algorithm can also be easily extended to handle less constrained situations (such as using a camera mounted on a moving robot or vehicle). Experimental results on both synthetic and real-world data have illustrated the effectiveness of the proposed algorithm.
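
    The trade-off behind choosing frames for stereo matching can be seen from the usual rectified-pair relation Z = f·B/d: a longer baseline B gives finer depth resolution for the same disparity error but makes matching harder. The numbers in the short sketch below are made up for illustration.

```python
def depth_from_disparity(f_px, baseline_m, disparity_px):
    """Depth of a point seen with disparity d in a rectified pair: Z = f * B / d."""
    return f_px * baseline_m / disparity_px

f_px = 1200.0                                    # assumed focal length in pixels
for baseline in (0.02, 0.10, 0.50):              # candidate slide distances in metres
    print(baseline, depth_from_disparity(f_px, baseline, disparity_px=8.0))
```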

  4. Securing Embedded Smart Cameras with Trusted Computing

    Directory of Open Access Journals (Sweden)

    Winkler Thomas

    2011-01-01

    Full Text Available Camera systems are used in many applications including video surveillance for crime prevention and investigation, traffic monitoring on highways or building monitoring and automation. With the shift from analog towards digital systems, the capabilities of cameras are constantly increasing. Today's smart camera systems come with considerable computing power, large memory, and wired or wireless communication interfaces. With onboard image processing and analysis capabilities, cameras not only open new possibilities but also raise new challenges. Often overlooked are potential security issues of the camera system. The increasing amount of software running on the cameras turns them into attractive targets for attackers. Therefore, the protection of camera devices and delivered data is of critical importance. In this work we present an embedded camera prototype that uses Trusted Computing to provide security guarantees for streamed videos. With a hardware-based security solution, we ensure integrity, authenticity, and confidentiality of videos. Furthermore, we incorporate image timestamping, detection of platform reboots, and reporting of the system status. This work is not limited to theoretical considerations but also describes the implementation of a prototype system. Extensive evaluation results illustrate the practical feasibility of the approach.

  5. Cameras Monitor Spacecraft Integrity to Prevent Failures

    Science.gov (United States)

    2014-01-01

    The Jet Propulsion Laboratory contracted Malin Space Science Systems Inc. to outfit Curiosity with four of its cameras using the latest commercial imaging technology. The company parlayed the knowledge gained while working with NASA to develop an off-the-shelf line of cameras, along with a digital video recorder, designed to help troubleshoot problems that may arise on satellites in space.

  6. CCD Color Camera Characterization for Image Measurements

    NARCIS (Netherlands)

    Withagen, P.J.; Groen, F.C.A.; Schutte, K.

    2007-01-01

    In this article, we will analyze a range of different types of cameras for their use in measurements. We verify a general model of a charge-coupled device camera using experiments. This model includes gain and offset, additive and multiplicative noise, and gamma correction. It is shown that for sever
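
    A forward simulation of the camera model named in the abstract (gain, offset, additive and multiplicative noise, and gamma correction) is sketched below; all parameter values are assumed, not taken from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

def ccd_model(irradiance, gain=2.0, offset=10.0, sigma_add=1.5,
              sigma_mult=0.01, gamma=2.2, full_scale=255.0):
    """Toy forward model: multiplicative noise, gain, offset, additive noise, gamma."""
    signal = irradiance * (1.0 + sigma_mult * rng.normal(size=irradiance.shape))       # multiplicative noise
    signal = gain * signal + offset + sigma_add * rng.normal(size=irradiance.shape)    # gain, offset, additive noise
    signal = np.clip(signal, 0.0, full_scale)
    return full_scale * (signal / full_scale) ** (1.0 / gamma)                          # gamma correction

scene = np.linspace(0.0, 100.0, 11)    # stand-in scene irradiance values
print(np.round(ccd_model(scene), 1))
```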

  7. A BASIC CAMERA UNIT FOR MEDICAL PHOTOGRAPHY.

    Science.gov (United States)

    SMIALOWSKI, A; CURRIE, D J

    1964-08-22

    A camera unit suitable for most medical photographic purposes is described. The unit comprises a single-lens reflex camera, an electronic flash unit and supplementary lenses. Simple instructions for use of this basic unit are presented. The unit is entirely suitable for taking fine-quality photographs of most medical subjects by persons who have had little photographic training.

  8. AIM: Ames Imaging Module Spacecraft Camera

    Science.gov (United States)

    Thompson, Sarah

    2015-01-01

    The AIM camera is a small, lightweight, low power, low cost imaging system developed at NASA Ames. Though it has imaging capabilities similar to those of $1M plus spacecraft cameras, it does so on a fraction of the mass, power and cost budget.

  9. Creating and Using a Camera Obscura

    Science.gov (United States)

    Quinnell, Justin

    2012-01-01

    The camera obscura (Latin for "darkened room") is the earliest optical device and goes back over 2500 years. The small pinhole or lens at the front of the room allows light to enter and this is then "projected" onto a screen inside the room. This differs from a camera, which projects its image onto light-sensitive material.…

  10. Rosetta Star Tracker and Navigation Camera

    DEFF Research Database (Denmark)

    Thuesen, Gøsta

    1998-01-01

    Proposal in response to the Invitation to Tender (ITT) issued by Matra Marconi Space (MSS) for the procurement of the ROSETTA Star Tracker and Navigation Camera.

  11. Fazendo 3d com uma camera so

    CERN Document Server

    Lunazzi, J J

    2010-01-01

    A simple system to make stereo photographs or videos, based on just two mirrors that split the image field, was created in 1989 and recently adapted to a digital camera setup.

  12. Creating and Using a Camera Obscura

    Science.gov (United States)

    Quinnell, Justin

    2012-01-01

    The camera obscura (Latin for "darkened room") is the earliest optical device and goes back over 2500 years. The small pinhole or lens at the front of the room allows light to enter and this is then "projected" onto a screen inside the room. This differs from a camera, which projects its image onto light-sensitive material. Originally images were…

  13. Active spectral imaging nondestructive evaluation (SINDE) camera

    Energy Technology Data Exchange (ETDEWEB)

    Simova, E.; Rochefort, P.A., E-mail: eli.simova@cnl.ca [Canadian Nuclear Laboratories, Chalk River, Ontario (Canada)

    2016-06-15

    A proof-of-concept video camera for active spectral imaging nondestructive evaluation has been demonstrated. An active multispectral imaging technique has been implemented in the visible and near infrared by using light emitting diodes with wavelengths spanning from 400 to 970 nm. This shows how the camera can be used in nondestructive evaluation to inspect surfaces and spectrally identify materials and corrosion. (author)

  14. Adapting virtual camera behaviour through player modelling

    DEFF Research Database (Denmark)

    Burelli, Paolo; Yannakakis, Georgios N.

    2015-01-01

    Research in virtual camera control has focused primarily on finding methods to allow designers to place cameras effectively and efficiently in dynamic and unpredictable environments, and to generate complex and dynamic plans for cinematography in virtual environments. In this article, we propose a novel approach to virtual camera control, which builds upon camera control and player modelling to provide the user with an adaptive point-of-view. To achieve this goal, we propose a methodology to model the player's preferences on virtual camera movements and we employ the resulting models to tailor the viewpoint movements to the player type and her game-play style. Ultimately, the methodology is applied to a 3D platform game and is evaluated through a controlled experiment; the results suggest that the resulting adaptive cinematographic experience is favoured by some player types and it can generate...

  15. Incremental activity modeling in multiple disjoint cameras.

    Science.gov (United States)

    Loy, Chen Change; Xiang, Tao; Gong, Shaogang

    2012-09-01

    Activity modeling and unusual event detection in a network of cameras is challenging, particularly when the camera views are not overlapped. We show that it is possible to detect unusual events in multiple disjoint cameras as context-incoherent patterns through incremental learning of time delayed dependencies between distributed local activities observed within and across camera views. Specifically, we model multicamera activities using a Time Delayed Probabilistic Graphical Model (TD-PGM) with different nodes representing activities in different decomposed regions from different views and the directed links between nodes encoding their time delayed dependencies. To deal with visual context changes, we formulate a novel incremental learning method for modeling time delayed dependencies that change over time. We validate the effectiveness of the proposed approach using a synthetic data set and videos captured from a camera network installed at a busy underground station.

  16. Flow visualization by mobile phone cameras

    Science.gov (United States)

    Cierpka, Christian; Hain, Rainer; Buchmann, Nicolas A.

    2016-06-01

    Mobile smart phones have completely changed people's communication within the last ten years. However, these devices do not only offer communication through different channels but also tools and applications for fun and recreation. In this respect, mobile phones now include relatively fast (up to 240 Hz) cameras to capture high-speed videos of sport events or other fast processes. The article therefore explores the possibility to make use of this development and the widespread availability of these cameras in terms of velocity measurements for industrial or technical applications and fluid dynamics education in high schools and at universities. The requirements for a simplistic PIV (particle image velocimetry) system are discussed. A model experiment of a free water jet was used to prove the concept and shed some light on the achievable quality and determine bottlenecks by comparing the results obtained with a mobile phone camera with data taken by a high-speed camera suited for scientific experiments.
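
    The core of a simplistic PIV evaluation is a cross-correlation between two interrogation windows; the self-contained sketch below recovers a known shift between two synthetic particle images via an FFT-based correlation (the synthetic frames and window size are assumptions, not the article's setup).

```python
import numpy as np

rng = np.random.default_rng(2)
frame_a = np.zeros((64, 64))
frame_a[rng.integers(0, 64, 80), rng.integers(0, 64, 80)] = 1.0   # fake particle image
true_shift = (3, 5)
frame_b = np.roll(frame_a, true_shift, axis=(0, 1))               # displaced second frame

def piv_shift(a, b):
    """Displacement between two windows from the peak of their circular cross-correlation."""
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap displacements larger than half the window back to negative values
    return tuple(p - s if p > s // 2 else p for p, s in zip(peak, corr.shape))

print(piv_shift(frame_a, frame_b))    # expected (3, 5)
```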

  17. Gamma camera performance: technical assessment protocol

    Energy Technology Data Exchange (ETDEWEB)

    Bolster, A.A. [West Glasgow Hospitals NHS Trust, London (United Kingdom). Dept. of Clinical Physics; Waddington, W.A. [University College London Hospitals NHS Trust, London (United Kingdom). Inst. of Nuclear Medicine

    1996-12-31

    This protocol addresses the performance assessment of single and dual headed gamma cameras. No attempt is made to assess the performance of any associated computing systems. Evaluations are usually performed on a gamma camera commercially available within the United Kingdom and recently installed at a clinical site. In consultation with the manufacturer, GCAT selects the site and liaises with local staff to arrange a mutually convenient time for assessment. The manufacturer is encouraged to have a representative present during the evaluation. Three to four days are typically required for the evaluation team to perform the necessary measurements. When access time is limited, the team will modify the protocol to test the camera as thoroughly as possible. Data are acquired on the camera's computer system and are subsequently transferred to the independent GCAT computer system for analysis. This transfer from site computer to the independent system is effected via a hardware interface and Interfile data transfer. (author).

  18. Modelling Virtual Camera Behaviour Through Player Gaze

    DEFF Research Database (Denmark)

    Picardi, Andrea; Burelli, Paolo; Yannakakis, Georgios N.

    2012-01-01

    In a three-dimensional virtual environment, aspects such as narrative and interaction largely depend on the placement and animation of the virtual camera. Therefore, virtual camera control plays a critical role in player experience and, thereby, in the overall quality of a computer game. Both game industry and game AI research focus on the development of increasingly sophisticated systems to automate the control of the virtual camera integrating artificial intelligence algorithms within physical simulations. However, in both industry and academia little research has been carried out on the relationship between virtual camera, game-play and player behaviour. We run a game user experiment to shed some light on this relationship and identify relevant differences between camera behaviours through different game sessions, playing behaviours and player gaze patterns. Results show that users can...

  19. Lava Tube Collapse Pits

    Science.gov (United States)

    2004-01-01

    [figure removed for brevity, see original site] We will be looking at collapse pits for the next two weeks. Collapse pits on Mars are formed in several ways. In volcanic areas, channelized lava flows can form roofs which insulate the flowing lava. These features are termed lava tubes on Earth and are common features in basaltic flows. After the lava has drained, parts of the roof of the tube will collapse under its own weight. These collapse pits will only be as deep as the bottom of the original lava tube. Another type of collapse feature associated with volcanic areas arises when very large eruptions completely evacuate the magma chamber beneath the volcano. The weight of the volcano will cause the entire edifice to subside into the void space below it. Structural features including fractures and graben will form during the subsidence. Many times collapse pits will form within the graben. In addition to volcanic collapse pits, Mars has many collapse pits formed when volatiles (such as subsurface ice) are released from the surface layers. As the volatiles leave, the weight of the surrounding rock causes collapse pits to form. These collapse pits are found in the southern hemisphere of Mars. They are likely lava tube collapse pits related to flows from Hadriaca Patera. Image information: VIS instrument. Latitude -36.8, Longitude 89.6 East (270.4 West). 19 meter/pixel resolution. Note: this THEMIS visual image has not been radiometrically nor geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track direction to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time. NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D

  20. Cloud Computing with Context Cameras

    CERN Document Server

    Pickles, A J

    2013-01-01

    We summarize methods and plans to monitor and calibrate photometric observations with our autonomous, robotic network of 2m, 1m and 40cm telescopes. These are sited globally to optimize our ability to observe time-variable sources. Wide field "context" cameras are aligned with our network telescopes and cycle every 2 minutes through BVriz filters, spanning our optical range. We measure instantaneous zero-point offsets and transparency (throughput) against calibrators in the 5-12m range from the all-sky Tycho2 catalog, and periodically against primary standards. Similar measurements are made for all our science images, with typical fields of view of 0.5 degrees. These are matched against Landolt, Stetson and Sloan standards, and against calibrators in the 10-17m range from the all-sky APASS catalog. Such measurements provide pretty good instantaneous flux calibration, often to better than 5%, even in cloudy conditions. Zero-point and transparency measurements can be used to characterize, monitor and inter-comp...
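
    As a rough illustration of how such zero-point and transparency numbers can be derived, the sketch below (Python, with invented function names and example values) matches measured instrumental fluxes of catalog stars to their catalog magnitudes and takes the median offset; a drop in the zero point relative to a clear-sky reference then translates directly into a throughput (transparency) estimate.

      import numpy as np

      def zero_point(instr_flux_adu, catalog_mag):
          # instr_flux_adu: measured instrumental fluxes (ADU/s) of matched catalog stars
          # catalog_mag:    their catalog magnitudes (e.g. from Tycho2 or APASS)
          instr_mag = -2.5 * np.log10(np.asarray(instr_flux_adu, dtype=float))
          zp = np.asarray(catalog_mag, dtype=float) - instr_mag   # mag = instr_mag + zp
          return np.median(zp)

      # A 0.5 mag drop in zero point corresponds to roughly 63% throughput (transparency)
      ref_zp, cloudy_zp = 22.3, 21.8
      print(10 ** (-0.4 * (ref_zp - cloudy_zp)))   # ~0.63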

  1. Practical intraoperative stereo camera calibration.

    Science.gov (United States)

    Pratt, Philip; Bergeles, Christos; Darzi, Ara; Yang, Guang-Zhong

    2014-01-01

    Many of the currently available stereo endoscopes employed during minimally invasive surgical procedures have shallow depths of field. Consequently, focus settings are adjusted from time to time in order to achieve the best view of the operative workspace. Invalidating any prior calibration procedure, this presents a significant problem for image guidance applications as they typically rely on the calibrated camera parameters for a variety of geometric tasks, including triangulation, registration and scene reconstruction. While recalibration can be performed intraoperatively, this invariably results in a major disruption to workflow, and can be seen to represent a genuine barrier to the widespread adoption of image guidance technologies. The novel solution described herein constructs a model of the stereo endoscope across the continuum of focus settings, thereby reducing the number of degrees of freedom to one, such that a single view of reference geometry will determine the calibration uniquely. No special hardware or access to proprietary interfaces is required, and the method is ready for evaluation during human cases. A thorough quantitative analysis indicates that the resulting intrinsic and extrinsic parameters lead to calibrations as accurate as those derived from multiple pattern views.
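
    A minimal sketch of the one-degree-of-freedom idea, with invented numbers: intrinsics measured offline at a few focus settings are interpolated so that a single scalar (the focus setting, recoverable from one view of reference geometry) fixes the whole calibration. This is only an illustration of the concept, not the authors' model.

      import numpy as np

      # Offline: focal length tabulated at a few focus settings (values are made up)
      focus_grid = np.array([0.0, 0.5, 1.0])        # normalised focus setting
      f_grid = np.array([760.0, 810.0, 870.0])      # focal length in pixels

      def intrinsics(focus, cx=320.0, cy=240.0):
          # Interpolate along the focus continuum; one scalar determines the calibration
          f = np.interp(focus, focus_grid, f_grid)
          return np.array([[f, 0.0, cx],
                           [0.0, f, cy],
                           [0.0, 0.0, 1.0]])

      # Intraoperatively, a single view of reference geometry is used to pick the value
      # of `focus` (e.g. by scanning for minimum reprojection error), after which
      # intrinsics(focus) stands in for a full recalibration.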

  2. Smart Camera Technology Increases Quality

    Science.gov (United States)

    2004-01-01

    When it comes to real-time image processing, everyone is an expert. People begin processing images at birth and rapidly learn to control their responses through the real-time processing of the human visual system. The human eye captures an enormous amount of information in the form of light images. In order to keep the brain from becoming overloaded with all the data, portions of an image are processed at a higher resolution than others, such as a traffic light changing colors. In the same manner, image processing products strive to extract the information stored in light in the most efficient way possible. Digital cameras available today capture millions of pixels worth of information from incident light. However, at frame rates more than a few per second, existing digital interfaces are overwhelmed. All the user can do is store several frames to memory until that memory is full and then subsequent information is lost. New technology pairs existing digital interface technology with an off-the-shelf complementary metal oxide semiconductor (CMOS) imager to provide more than 500 frames per second of specialty image processing. The result is a cost-effective detection system unlike any other.

  3. True three-dimensional camera

    Science.gov (United States)

    Kornreich, Philipp; Farell, Bart

    2013-01-01

    An imager that can measure the distance from each pixel to the point on the object that is in focus at the pixel is described. This is accomplished by short photo-conducting lightguides at each pixel. In the eye the rods and cones are the fiber-like lightguides. The device uses ambient light that is only coherent in spherical shell-shaped light packets of thickness of one coherence length. Modern semiconductor technology permits the construction of lightguides shorter than a coherence length of ambient light. Each of the frequency components of the broad band light arriving at a pixel has a phase proportional to the distance from an object point to its image pixel. Light frequency components in the packet arriving at a pixel through a convex lens add constructively only if the light comes from the object point in focus at this pixel. The light in packets from all other object points cancels. Thus the pixel receives light from one object point only. The lightguide has contacts along its length. The lightguide charge carriers are generated by the light patterns. These light patterns, and thus the photocurrent, shift in response to the phase of the input signal. Thus, the photocurrent is a function of the distance from the pixel to its object point. Applications include autonomous vehicle navigation and robotic vision. Another application is a crude teleportation system consisting of a camera and a three-dimensional printer at a remote location.

  4. NIR Camera/spectrograph: TEQUILA

    Science.gov (United States)

    Ruiz, E.; Sohn, E.; Cruz-Gonzalez, I.; Salas, L.; Parraga, A.; Torres, R.; Perez, M.; Cobos, F.; Tejada, C.; Iriarte, A.

    1998-11-01

    We describe the configuration and operation modes of the IR camera/spectrograph called TEQUILA, based on a 1024 x 1024 HgCdTe FPA (HAWAII). The optical system will allow three possible modes of operation: direct imaging, low and medium resolution spectroscopy, and polarimetry. The basic system is being designed to consist of the following: 1) An LN2 dewar that houses the FPA together with the preamplifiers and a 24-position filter cylinder. 2) Control and readout electronics based on DSP modules linked to a workstation through fiber optics. 3) An optomechanical assembly cooled to -30 °C that provides efficient operation of the instrument in its various modes. 4) A control module for the moving parts of the instrument. The opto-mechanical assembly will have the necessary provisions to install a scanning Fabry-Perot interferometer and an adaptive optics correction system. The final image acquisition and control of the whole instrument is carried out in a workstation to provide the observer with a friendly environment. The system will operate at the 2.1 m telescope at the Observatorio Astronomico Nacional in San Pedro Martir, B.C. (Mexico), and is intended to be a first-light instrument for the new 7.8 m Mexican Infrared-Optical Telescope (TIM).

  5. Tube coalescence in the Jingfudong lava tube and implications for lava flow hazard of Tengchong volcanism

    OpenAIRE

    Zhengquan Chen; Yongshun Liu; Haiquan Wei; Jiandong Xu; Wenfeng Guo

    2016-01-01

    Tube-fed structures occur as a general phenomenon in Tengchong basic lavas, in the form of lava tubes, lava plugs and tube-related collapse depressions. We deduced the development of the Laoguipo lava flows, which host the longest lava tube (the Jingfudong lava tube) developed in the Tengchong volcanic area. Following detailed documentation of the morphology of the Jingfudong lava tube, we propose that it was formed through vertical coalescence of at least three tubes. The coalescence...

  6. Compact Toroid Propagation in a Magnetized Drift Tube

    Science.gov (United States)

    Horton, Robert D.; Baker, Kevin L.; Hwang, David Q.; Evans, Russell W.

    2000-10-01

    Injection of a spheromak-like compact toroid (SCT) plasma into a toroidal plasma confinement device may require the SCT to propagate through a drift tube region occupied by a pre-existing magnetic field. This field is expected to exert a retarding force on the SCT, but may also result in a beneficial compression. The effects of transverse and longitudinal magnetic fields will be measured using the CTIX compact-toroid injector, together with a fast framing camera with an axial view of the formation, coaxial, and drift-tube regions. In the case of longitudinal magnetic field, comparisons will be made with the predictions of two-dimensional numerical simulation. The use of localized magnetic field to reduce plasma bridging of the insulating gap will also be investigated.

  7. Steam generator tube integrity program

    Energy Technology Data Exchange (ETDEWEB)

    Dierks, D.R.; Shack, W.J. [Argonne National Laboratory, IL (United States); Muscara, J.

    1996-03-01

    A new research program on steam generator tubing degradation is being sponsored by the U.S. Nuclear Regulatory Commission (NRC) at Argonne National Laboratory. This program is intended to support a performance-based steam generator tube integrity rule. Critical areas addressed by the program include evaluation of the processes used for the in-service inspection of steam generator tubes and recommendations for improving the reliability and accuracy of inspections; validation and improvement of correlations for evaluating integrity and leakage of degraded steam generator tubes; and validation and improvement of correlations and models for predicting degradation in steam generator tubes as aging occurs. The studies will focus on mill-annealed Alloy 600 tubing; however, tests will also be performed on replacement materials such as thermally treated Alloy 600 or 690. An overview of the technical work planned for the program is given.

  8. The special relativistic shock tube

    Science.gov (United States)

    Thompson, Kevin W.

    1986-01-01

    The shock-tube problem has served as a popular test for numerical hydrodynamics codes. The development of relativistic hydrodynamics codes has created a need for a similar test problem in relativistic hydrodynamics. The analytical solution to the special relativistic shock-tube problem is presented here. The relativistic shock-jump conditions and rarefaction solution which make up the shock tube are derived. The Newtonian limit of the calculations is given throughout.
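
    The relativistic solution itself is involved, but the Newtonian limit mentioned in the abstract reduces to the familiar ideal-gas shock-jump (Rankine-Hugoniot) relations. The sketch below evaluates those non-relativistic relations for a given upstream Mach number; it is offered as a reference point, not as the paper's relativistic formulae.

      def shock_jump(M1, gamma=1.4):
          # Newtonian Rankine-Hugoniot jump conditions across a normal shock
          # in an ideal gas with upstream Mach number M1.
          rho_ratio = (gamma + 1.0) * M1**2 / ((gamma - 1.0) * M1**2 + 2.0)
          p_ratio = (2.0 * gamma * M1**2 - (gamma - 1.0)) / (gamma + 1.0)
          T_ratio = p_ratio / rho_ratio
          return rho_ratio, p_ratio, T_ratio

      print(shock_jump(2.0))   # (~2.667, 4.5, ~1.688) for gamma = 1.4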

  9. Tubing for augmented heat transfer

    Energy Technology Data Exchange (ETDEWEB)

    Yampolsky, J.S.; Pavlics, P.

    1983-08-01

    The objectives of the program reported were: to determine the heat transfer and friction characteristics on the outside of spiral fluted tubing in single phase flow of water, and to assess the relative cost of a heat exchanger constructed with spiral fluted tubing with one using conventional smooth tubing. An application is examined where an isolation water/water heat exchanger was used to transfer the heat from a gaseous diffusion plant to an external system for energy recovery. (LEW)

  10. Diffusion in a Curved Tube

    OpenAIRE

    Ogawa, Naohisa

    2011-01-01

    The diffusion of particles in confining walls forming a tube is discussed. Such a transport phenomenon is observed in biological cells and porous media. We consider the case in which the tube is winding with curvature and torsion, and the thickness of the tube is sufficiently small compared with its curvature radius. We discuss how geometrical quantities appear in a quasi-one-dimensional diffusion equation.

  11. Alternate tube plugging criteria for steam generator tubes

    Energy Technology Data Exchange (ETDEWEB)

    Cueto-Felgueroso, C.; Aparicio, C.B. [Tecnatom, S.A., Madrid (Spain)

    1997-02-01

    The tubing of the Steam Generators constitutes more than half of the reactor coolant pressure boundary. Specific requirements governing the maintenance of steam generator tube integrity are set in Plant Technical Specifications and in Section XI of the ASME Boiler and Pressure Vessel Code. The operating experience of steam generator tubes in PWR plants has shown the existence of several types of degradation processes. Each of these has a specific cause and affects one or more zones of the tubes. In the case of Spanish power plants, and depending on the particular plant considered, the relevant mechanisms are Primary Water Stress Corrosion Cracking (PWSCC) at the roll transition zone (RTZ), Outside Diameter Stress Corrosion Cracking (ODSCC) at the Tube Support Plate (TSP) intersections, and fretting with the Anti-Vibration Bars (AVBs) or with the support plates in the preheater zone. In-service inspection by eddy currents constitutes the standard method for assuring SG tube integrity and permits the monitoring of defects during the service life of the plant. When the degradation reaches a determined limit, called the plugging limit, the SG tube must be either repaired or retired from service by plugging. Customarily, the plugging limit is related to the depth of the defect. Such depth is typically 40% of the wall thickness of the tube and is applicable to any type of defect in the tube. In its origin, that limit was established for tubes thinned by wastage, which was the predominant degradation mechanism in the seventies. The application of this criterion to axial crack-like defects, such as those due to PWSCC in the roll transition zone, has led to an excessive and unnecessary number of tubes being plugged. This has led to the development of defect-specific plugging criteria. Examples of the application of such criteria are discussed in the article.

  12. Autonomous Multicamera Tracking on Embedded Smart Cameras

    Directory of Open Access Journals (Sweden)

    Bischof Horst

    2007-01-01

    Full Text Available There is currently a strong trend towards the deployment of advanced computer vision methods on embedded systems. This deployment is very challenging since embedded platforms often provide limited resources such as computing performance, memory, and power. In this paper we present a multicamera tracking method on distributed, embedded smart cameras. Smart cameras combine video sensing, processing, and communication on a single embedded device which is equipped with a multiprocessor computation and communication infrastructure. Our multicamera tracking approach focuses on a fully decentralized handover procedure between adjacent cameras. The basic idea is to initiate a single tracking instance in the multicamera system for each object of interest. The tracker follows the supervised object over the camera network, migrating to the camera which observes the object. Thus, no central coordination is required resulting in an autonomous and scalable tracking approach. We have fully implemented this novel multicamera tracking approach on our embedded smart cameras. Tracking is achieved by the well-known CamShift algorithm; the handover procedure is realized using a mobile agent system available on the smart camera network. Our approach has been successfully evaluated on tracking persons at our campus.
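
    A minimal single-camera sketch of the CamShift tracking step mentioned above, using OpenCV. The hue-histogram initialisation and the handover comment are illustrative assumptions, not the authors' mobile-agent implementation.

      import cv2

      def init_hist(frame_bgr, roi):
          # Build a hue histogram of the object inside roi = (x, y, w, h)
          x, y, w, h = roi
          hsv = cv2.cvtColor(frame_bgr[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
          hist = cv2.calcHist([hsv], [0], None, [16], [0, 180])
          cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
          return hist

      def track_step(frame_bgr, hist, window):
          # One CamShift iteration; returns the updated search window
          hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
          backproj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
          criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1.0)
          _box, window = cv2.CamShift(backproj, window, criteria)
          # In the handover scheme, a window drifting toward the image border would
          # trigger migration of the tracking agent to the neighbouring smart camera.
          return window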

  13. Sky camera geometric calibration using solar observations

    Science.gov (United States)

    Urquhart, Bryan; Kurtz, Ben; Kleissl, Jan

    2016-09-01

    A camera model and associated automated calibration procedure for stationary daytime sky imaging cameras is presented. The specific modeling and calibration needs are motivated by remotely deployed cameras used to forecast solar power production where cameras point skyward and use 180° fisheye lenses. Sun position in the sky and on the image plane provides a simple and automated approach to calibration; special equipment or calibration patterns are not required. Sun position in the sky is modeled using a solar position algorithm (requiring latitude, longitude, altitude and time as inputs). Sun position on the image plane is detected using a simple image processing algorithm. The performance evaluation focuses on the calibration of a camera employing a fisheye lens with an equisolid angle projection, but the camera model is general enough to treat most fixed focal length, central, dioptric camera systems with a photo objective lens. Calibration errors scale with the noise level of the sun position measurement in the image plane, but the calibration is robust across a large range of noise in the sun position. Calibration performance on clear days ranged from 0.94 to 1.24 pixels root mean square error.
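
    The sketch below illustrates the geometric core of such a calibration under an equisolid-angle fisheye model (r = 2 f sin(theta/2)). The parameter set is deliberately reduced (focal length, principal point, azimuth offset) compared with the full camera model in the paper, and the function names and initial guess are invented.

      import numpy as np
      from scipy.optimize import least_squares

      def project_equisolid(zenith, azimuth, params):
          # Map sun zenith/azimuth angles (radians) to image coordinates using
          # the equisolid-angle model r = 2 f sin(zenith / 2).
          f, cx, cy, az0 = params
          r = 2.0 * f * np.sin(zenith / 2.0)
          u = cx + r * np.sin(azimuth - az0)
          v = cy - r * np.cos(azimuth - az0)
          return u, v

      def calibrate(zen, azi, u_obs, v_obs, guess=(700.0, 960.0, 960.0, 0.0)):
          # Least-squares fit of the reduced model to detected sun positions
          def residual(p):
              u, v = project_equisolid(zen, azi, p)
              return np.concatenate([u - u_obs, v - v_obs])
          return least_squares(residual, guess).x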

  14. Automatic camera tracking for remote manipulators

    Energy Technology Data Exchange (ETDEWEB)

    Stoughton, R.S.; Martin, H.L.; Bentz, R.R.

    1984-04-01

    The problem of automatic camera tracking of mobile objects is addressed with specific reference to remote manipulators and using either fixed or mobile cameras. The technique uses a kinematic approach employing 4 x 4 coordinate transformation matrices to solve for the needed camera PAN and TILT angles. No vision feedback systems are used, as the required input data are obtained entirely from position sensors from the manipulator and the camera-positioning system. All hardware requirements are generally satisfied by currently available remote manipulator systems with a supervisory computer. The system discussed here implements linear plus on/off (bang-bang) closed-loop control with a ±2° deadband. The deadband area is desirable to avoid operator seasickness caused by continuous camera movement. Programming considerations for camera control, including operator interface options, are discussed. The example problem presented is based on an actual implementation using a PDP 11/34 computer, a TeleOperator Systems SM-229 manipulator, and an Oak Ridge National Laboratory (ORNL) camera-positioning system. 3 references, 6 figures, 2 tables.
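
    A much-simplified sketch of the kinematic idea: transform the manipulator's position into the camera-positioner frame with a 4 x 4 homogeneous transform, read off pan and tilt, and drive the camera with an on/off command outside a ±2° deadband. Frame conventions and rates here are illustrative assumptions, not the original implementation.

      import numpy as np

      def pan_tilt_to_target(T_cam_world, target_xyz):
          # Pan/tilt angles (degrees) that point a camera at a world-frame target.
          # T_cam_world: 4x4 homogeneous transform of the camera base in the world frame
          p = np.linalg.inv(T_cam_world) @ np.append(target_xyz, 1.0)
          x, y, z = p[:3]
          pan = np.degrees(np.arctan2(y, x))
          tilt = np.degrees(np.arctan2(z, np.hypot(x, y)))
          return pan, tilt

      def bang_bang(error_deg, deadband=2.0, rate=5.0):
          # On/off drive command; the deadband avoids continuous camera motion
          # and the associated operator "seasickness".
          if abs(error_deg) <= deadband:
              return 0.0
          return rate if error_deg > 0 else -rate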

  15. Learning from YouTube [Video Book

    Science.gov (United States)

    Juhasz, Alexandra

    2011-01-01

    YouTube is a mess. YouTube is for amateurs. YouTube dissolves the real. YouTube is host to inconceivable combos. YouTube is best for corporate-made community. YouTube is badly baked. These are a few of the things Media Studies professor Alexandra Juhasz (and her class) learned about YouTube when she set out to investigate what actually happens…

  16. Charm production in flux tubes

    CERN Document Server

    Aguiar, C E; Nazareth, R A M S; Pech, G

    1996-01-01

    We argue that the non-perturbative Schwinger mechanism may play an important role in the hadronic production of charm. We present a flux tube model which assumes that the colliding hadrons become color charged because of gluon exchange, and that a single non-elementary flux tube is built up as they recede. The strong chromoelectric field inside this tube creates quark pairs (including charmed ones) and the ensuing color screening breaks the tube into excited hadronic clusters. In their turn these clusters, or 'fireballs', decay statistically into the final hadrons. The model is able to account for the soft production of charmed, strange and lighter hadrons within a unified framework.

  17. Charm production in flux tubes

    Science.gov (United States)

    Aguiar, C. E.; Kodama, T.; Nazareth, R. A. M. S.; Pech, G.

    1996-01-01

    We argue that the nonperturbative Schwinger mechanism may play an important role in the hadronic production of charm. We present a flux tube model which assumes that the colliding hadrons become color charged because of gluon exchange, and that a single nonelementary flux tube is built up as they recede. The strong chromoelectric field inside this tube creates quark pairs (including charmed ones) and the ensuing color screening breaks the tube into excited hadronic clusters. In their turn these clusters, or "fireballs," decay statistically into the final hadrons. The model is able to account for the soft production of charmed, strange, and lighter hadrons within a unified framework.

  18. YouTube and 'psychiatry'.

    Science.gov (United States)

    Gordon, Robert; Miller, John; Collins, Noel

    2015-12-01

    YouTube is a video-sharing website that is increasingly used to share and disseminate health-related information, particularly among younger people. There are reports that social media sites, such as YouTube, are being used to communicate an anti-psychiatry message but this has never been confirmed in any published analysis of YouTube clip content. This descriptive study revealed that the representation of 'psychiatry' during summer 2012 was predominantly negative. A subsequent smaller re-analysis suggests that the negative portrayal of 'psychiatry' on YouTube is a stable phenomenon. The significance of this and how it could be addressed are discussed.

  19. Method for producing a tube

    Science.gov (United States)

    Peterson, Kenneth A.; Rohde, Steven B.; Pfeifer, Kent B.; Turner, Timothy S.

    2007-01-02

    A method is described for producing tubular substrates having parallel spaced concentric rings of electrical conductors that can be used as the drift tube of an Ion Mobility Spectrometer (IMS). The invention comprises providing electrodes on the inside of a tube that are electrically connected to the outside of the tube through conductors that extend between adjacent plies of substrate that are combined to form the tube. Tubular substrates are formed from flexible polymeric printed wiring board materials, ceramic materials and material compositions of glass and ceramic, commonly known as Low Temperature Co-Fired Ceramic (LTCC). The adjacent plies are sealed together around the electrode.

  20. Electronic cameras for low-light microscopy.

    Science.gov (United States)

    Rasnik, Ivan; French, Todd; Jacobson, Ken; Berland, Keith

    2013-01-01

    This chapter introduces electronic cameras, discusses the various parameters considered for evaluating their performance, and describes some of the key features of different camera formats. The chapter also presents a basic understanding of how electronic cameras function and how their properties can be exploited to optimize image quality under low-light conditions. Although there are many types of cameras available for microscopy, the most reliable type is the charge-coupled device (CCD) camera, which remains preferred for high-performance systems. If time resolution and frame rate are of no concern, slow-scan CCDs certainly offer the best available performance, both in terms of the signal-to-noise ratio and their spatial resolution. Slow-scan cameras are thus the first choice for experiments using fixed specimens, such as measurements using immunofluorescence and fluorescence in situ hybridization. However, if video rate imaging is required, one need not evaluate slow-scan CCD cameras. A very basic video CCD may suffice if samples are heavily labeled or are not perturbed by high intensity illumination. When video rate imaging is required for very dim specimens, the electron multiplying CCD camera is probably the most appropriate at this technological stage. Intensified CCDs provide a unique tool for applications in which high-speed gating is required. Variable-integration-time video cameras are very attractive options if one needs to acquire images at video rate as well as with longer integration times for less bright samples. This flexibility can facilitate many diverse applications with highly varied light levels.

  1. Intelligent Camera for Surface Defect Inspection

    Institute of Scientific and Technical Information of China (English)

    CHENG Wan-sheng; ZHAO Jie; WANG Ke-cheng

    2007-01-01

    An intelligent camera for surface defect inspection is presented which can pre-process the surface image of a rolled strip and pick out defective areas at a speed of 1600 meters per minute. The camera is made up of a high-speed line CCD, a 60 Mb/s CCD digitizer with a correlated double sampling function, and a field programmable gate array (FPGA), which can quickly distinguish defective areas using a perceptron embedded in the FPGA, so that the amount of data to be further processed is dramatically reduced. Experiments show that the camera can keep up with high production speeds and can reduce the cost and complexity of automated surface inspection systems.
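
    A toy illustration (with invented weights and threshold) of the kind of perceptron decision the FPGA applies to each window of line-scan data, so that only flagged windows are passed on for further processing.

      import numpy as np

      def defect_candidate(window, weights, bias):
          # Single-layer perceptron decision on one window of line-scan pixels
          return float(np.dot(weights, window)) + bias > 0.0

      # Toy example: dark streaks on a bright strip surface (numbers are made up)
      weights = -np.ones(8) / 8.0      # responds to low mean intensity
      bias = 100.0
      print(defect_candidate(np.full(8, 200.0), weights, bias))  # False (clean area)
      print(defect_candidate(np.full(8, 60.0), weights, bias))   # True  (possible defect)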

  2. Multi-digital Still Cameras with CCD

    Institute of Scientific and Technical Information of China (English)

    LIU Wen-jing; LONG Zai-chuan; XIONG Ping; HUAN Yao-xiong

    2006-01-01

    The digital still camera is a typical tool for capturing digital images. With the development of IC technology and optimization algorithms, the performance of digital still cameras (DSCs) will become more and more powerful. But can more and better information be obtained by combining the information from multiple digital still cameras? Experiments show that the answer is yes. By using multiple DSCs at different angles, various 3-D information about the object is obtained.

  3. Fuzzy logic control for camera tracking system

    Science.gov (United States)

    Lea, Robert N.; Fritz, R. H.; Giarratano, J.; Jani, Yashvant

    1992-01-01

    A concept utilizing fuzzy theory has been developed for a camera tracking system to provide support for proximity operations and traffic management around the Space Station Freedom. Fuzzy sets and fuzzy logic based reasoning are used in a control system which utilizes images from a camera and generates required pan and tilt commands to track and maintain a moving target in the camera's field of view. This control system can be implemented on a fuzzy chip to provide an intelligent sensor for autonomous operations. Capabilities of the control system can be expanded to include approach, handover to other sensors, caution and warning messages.
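
    A toy Mamdani-style sketch of the kind of rule base described: fuzzify the horizontal tracking error, apply three rules, and defuzzify to a pan rate. Membership limits and output rates are invented for illustration and are not the values of the actual control system.

      def tri(x, a, b, c):
          # Triangular membership function with feet at a and c and peak at b
          return max(0.0, min((x - a) / (b - a), (c - x) / (c - b)))

      def fuzzy_pan_rate(error_deg):
          # Fuzzify the horizontal error into three sets
          mu = {
              "neg":  tri(error_deg, -30.0, -15.0, 0.0),
              "zero": tri(error_deg, -5.0, 0.0, 5.0),
              "pos":  tri(error_deg, 0.0, 15.0, 30.0),
          }
          # Rule outputs: pan left / hold / pan right (degrees per second)
          out = {"neg": -10.0, "zero": 0.0, "pos": 10.0}
          num = sum(mu[k] * out[k] for k in mu)
          den = sum(mu.values()) or 1.0
          return num / den    # weighted-average (centroid-style) defuzzification

      print(fuzzy_pan_rate(3.0))   # ~3.3 deg/s pan toward the target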

  4. A Benchmark for Virtual Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Yannakakis, Georgios N.

    2015-01-01

    Automatically animating and placing the virtual camera in a dynamic environment is a challenging task. The camera is expected to maximise and maintain a set of properties — i.e. visual composition — while smoothly moving through the environment and avoiding obstacles. A large number of different....... For this reason, in this paper, we propose a benchmark for the problem of virtual camera control and we analyse a number of different problems in different virtual environments. Each of these scenarios is described through a set of complexity measures and, as a result of this analysis, a subset of scenarios...

  5. Close-range photogrammetry with video cameras

    Science.gov (United States)

    Burner, A. W.; Snow, W. L.; Goad, W. K.

    1985-01-01

    Examples of photogrammetric measurements made with video cameras uncorrected for electronic and optical lens distortions are presented. The measurement and correction of electronic distortions of video cameras using both bilinear and polynomial interpolation are discussed. Examples showing the relative stability of electronic distortions over long periods of time are presented. Having corrected for electronic distortion, the data are further corrected for lens distortion using the plumb line method. Examples of close-range photogrammetric data taken with video cameras corrected for both electronic and optical lens distortion are presented.
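
    As a sketch of the optical correction step, the snippet below applies a two-term polynomial radial model about the principal point. Sign and normalisation conventions vary between implementations, and in practice the coefficients would come from a plumb-line fit (straight lines in the scene must map to straight lines in the corrected image).

      def correct_radial(u, v, cx, cy, k1, k2):
          # (u, v): measured image coordinates; (cx, cy): principal point
          # k1, k2: radial distortion coefficients from a plumb-line calibration
          x, y = u - cx, v - cy
          r2 = x * x + y * y
          scale = 1.0 + k1 * r2 + k2 * r2 * r2
          return cx + x * scale, cy + y * scale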

  6. Task Panel Sensing with a Movable Camera

    Science.gov (United States)

    Wolfe, William J.; Mathis, Donald W.; Magee, Michael; Hoff, William A.

    1990-03-01

    This paper discusses the integration of model-based computer vision with a robot planning system. The vision system deals with structured objects with several movable parts (the "Task Panel"). The robot planning system controls a T3-746 manipulator that has a gripper and a wrist-mounted camera. There are two control functions: move the gripper into position for manipulating the panel fixtures (doors, latches, etc.), and move the camera into positions preferred by the vision system. This paper emphasizes the issues related to repositioning the camera for improved viewpoints.

  7. Effect of the sequence of tube rolling in a tube bundle of a shell and tube heat exchanger on the stress-deformed state of the tube sheet

    Science.gov (United States)

    Tselishchev, M. F.; Plotnikov, P. N.; Brodov, Yu. M.

    2015-11-01

    The rolling of tubes into the tube sheet of a heat exchanger with U-shaped tubes, as exemplified by the vapor cooler GP-24, was simulated. The simulation was performed using the finite element method with account of the elastic-plastic properties of the tube and tube sheet materials. The simulation consisted of two stages; at the first stage, the maximum and residual contact stress in the conjunction of a separate tube and the tube sheet was determined using the "equivalent sleeve" model; at the second stage, the obtained contact stress was applied to the hole surface in the tube sheet. Thus, different tube rolling sequences were simulated: from the center to the periphery of the tube sheet and from the periphery to the center along a spiral line. The studies showed that the tube rolling sequence noticeably influences the value of the tube sheet residual deflection for the same rolling parameters of separate tubes. Residual deflection of the tube sheet in different planes was determined. It was established that the smallest residual deflection corresponds to the tube rolling sequence from the periphery to the center of the tube sheet. The following dependences were obtained for different rolling sequences: maximum deformation of the tube sheet as a function of the number of rolled tubes, residual deformation of the tube sheet along its surface, and residual deflection of the tube sheet as a function of the rotation angle at the periphery. The preferred sequence of tube rolling for minimizing the tube sheet deformation is indicated.

  8. The photothermal camera - a new non destructive inspection tool; La camera photothermique - une nouvelle methode de controle non destructif

    Energy Technology Data Exchange (ETDEWEB)

    Piriou, M. [AREVA NP Centre Technique SFE - Zone Industrielle et Portuaire Sud - BP13 - 71380 Saint Marcel (France)

    2007-07-01

    The Photothermal Camera, developed by the Non-Destructive Inspection Department at AREVA NP's Technical Center, is a device created to replace penetrant testing, a method whose drawbacks include environmental pollutants, industrial complexity and potential operator exposure. We have already seen how the Photothermal Camera can work alongside or instead of conventional surface inspection techniques such as penetrant, magnetic particle or eddy currents. With it, users can detect without any surface contact ligament defects or openings measuring just a few microns on rough oxidized, machined or welded metal parts. It also enables them to work on geometrically varied surfaces, hot parts or insulating (dielectric) materials without interference from the magnetic properties of the inspected part. The Photothermal Camera method has already been used for in situ inspections of tube/plate welds on an intermediate heat exchanger of the Phenix fast reactor. It also replaced the penetrant method for weld inspections on the ITER vacuum chamber, for weld crack detection on vessel head adapter J-welds, and for detecting cracks brought on by heat crazing. What sets this innovative method apart from others is its ability to operate at distances of up to two meters from the inspected part, as well as its remote control functionality at distances of up to 15 meters (or more via Ethernet), and its emissions-free environmental cleanliness. These make it a true alternative to penetrant testing, to the benefit of operator and environmental protection. (author)

  9. Infrared imaging of LED lighting tubes and fluorescent tubes

    Science.gov (United States)

    Siikanen, Sami; Kivi, Sini; Kauppinen, Timo; Juuti, Mikko

    2011-05-01

    The low energy efficiency of conventional light sources is mainly caused by generation of waste heat. We used infrared (IR) imaging in order to monitor the heating of both LED tube luminaires and ordinary T8 fluorescent tubes. The IR images showed clearly how the surface temperatures of the fluorescent tube ends quickly rose up to about +50...+70°C, whereas the highest surface temperatures seen on the LED tubes were only about +30...+40°C. The IR images demonstrated how the heat produced by the individual LED chips can be efficiently guided to the supporting structure in order to keep the LED emitters cool and hence maintain efficient operation. The consumed electrical power and produced illuminance were also recorded during 24 hour measurements. In order to assess the total luminous efficacy of the luminaires, separate luminous flux measurements were made in a large integrating sphere. The currently available LED tubes showed efficacies of up to 88 lm/W, whereas a standard "cool white" T8 fluorescent tube produced ca. 75 lm/W. Both lamp types gave ca. 110 - 130 lx right below the ceiling-mounted luminaire, but the LED tubes consume only 40 - 55% of the electric power compared to fluorescent tubes.

  10. Towards Adaptive Virtual Camera Control In Computer Games

    OpenAIRE

    Burelli, Paolo; Yannakakis, Georgios N.

    2011-01-01

    Automatic camera control aims to define a framework to control virtual camera movements in dynamic and unpredictable virtual environments while ensuring a set of desired visual properties. We investigate the relationship between camera placement and playing behaviour in games and build a user model of the camera behaviour that can be used to control camera movements based on player preferences. For this purpose, we collect eye gaze, camera and game-play data from subjects playing a 3D platf...

  11. Nasogastric tube syndrome induced by an indwelling long intestinal tube.

    Science.gov (United States)

    Sano, Naoki; Yamamoto, Masayoshi; Nagai, Kentaro; Yamada, Keiichi; Ohkohchi, Nobuhiro

    2016-04-21

    The nasogastric tube (NGT) has become a frequently used device to alleviate gastrointestinal symptoms. Nasogastric tube syndrome (NTS) is an uncommon but potentially life-threatening complication of an indwelling NGT. NTS is characterized by acute upper airway obstruction due to bilateral vocal cord paralysis. We report a case of a 76-year-old man with NTS, induced by an indwelling long intestinal tube. He was admitted to our hospital for treatment of sigmoid colon cancer. He underwent sigmoidectomy to release a bowel obstruction, and had a long intestinal tube inserted to decompress the intestinal tract. He presented with acute dyspnea following prolonged intestinal intubation, and bronchoscopy showed bilateral vocal cord paralysis. The NGT was removed immediately, and tracheotomy was performed. The patient was finally discharged in a fully recovered state. NTS should be considered in patients complaining of acute upper airway obstruction, not only those with an NGT inserted but also those with a long intestinal tube.

  12. Camera vibration measurement using blinking light-emitting diode array.

    Science.gov (United States)

    Nishi, Kazuki; Matsuda, Yuichi

    2017-01-23

    We present a new method for measuring camera vibrations such as camera shake and shutter shock. This method successfully detects the vibration trajectory and transient waveforms from the camera image itself. We employ a time-varying pattern as the camera test chart instead of the conventional static pattern. This pattern is implemented using a specially developed blinking light-emitting-diode array. We describe the theoretical framework and pattern analysis of the camera image for measuring camera vibrations. Our verification experiments show that our method has a detection accuracy and sensitivity of 0.1 pixels, and is robust against image distortion. Measurement results of camera vibrations in commercial cameras are also demonstrated.

  13. Piezoelectric Rotary Tube Motor

    Science.gov (United States)

    Fisher, Charles D.; Badescu, Mircea; Braun, David F.; Culhane, Robert

    2011-01-01

    A custom rotary SQUIGGLE(Registered TradeMark) motor has been developed that sets new benchmarks for small motor size, high position resolution, and high torque without gear reduction. Its capabilities cannot be achieved with conventional electromagnetic motors. It consists of piezoelectric plates mounted on a square flexible tube. The plates are actuated via voltage waveforms 90° out of phase at the resonant frequency of the device to create rotary motion. The motors were incorporated into a two-axis positioner that was designed for fiber-fed spectroscopy for ground-based and space-based projects. The positioner enables large-scale celestial object surveys to take place in a practical amount of time.

  14. Contrail study with ground-based cameras

    Directory of Open Access Journals (Sweden)

    U. Schumann

    2013-08-01

    Full Text Available Photogrammetric methods and analysis results for contrails observed with wide-angle cameras are described. Four cameras of two different types (view angle −1. With this information, the aircraft causing the contrails are identified by comparison to traffic waypoint data. The observations are compared with synthetic camera pictures of contrails simulated with the contrail prediction model CoCiP, a Lagrangian model using air traffic movement data and numerical weather prediction (NWP data as input. The results provide tests for the NWP and contrail models. The cameras show spreading and thickening contrails suggesting ice-supersaturation in the ambient air. The ice-supersaturated layer is found thicker and more humid in this case than predicted by the NWP model used. The simulated and observed contrail positions agree up to differences caused by uncertain wind data. The contrail widths, which depend on wake vortex spreading, ambient shear and turbulence, were partly wider than simulated.

  15. Planetary camera control improves microfiche production

    Science.gov (United States)

    Chesterton, W. L.; Lewis, E. B.

    1965-01-01

    Microfiche is prepared using an automatic control system for a planetary camera. The system provides blank end-of-row exposures and signals card completion so the legend of the next card may be photographed.

  16. Calibration Procedures on Oblique Camera Setups

    Science.gov (United States)

    Kemper, G.; Melykuti, B.; Yu, C.

    2016-06-01

    Beside the creation of virtual animated 3D City models, analysis for homeland security and city planning, the accurate determination of geometric features out of oblique imagery is an important task today. Due to the huge number of single images, the reduction of control points forces the use of direct referencing devices. This requires precise camera calibration and additional adjustment procedures. This paper aims to show the workflow of the various calibration steps and will present examples of the calibration flight with the final 3D City model. In contrast to most other software, the oblique cameras are not used as co-registered sensors in relation to the nadir one; all camera images enter the AT process as single pre-oriented data. This enables a better post calibration in order to detect variations in the single camera calibrations and other mechanical effects. The sensor shown (Oblique Imager) is based on 5 Phase One cameras, where the nadir one has 80 MPix and is equipped with a 50 mm lens, while the oblique ones capture images with 50 MPix using 80 mm lenses. The cameras are mounted robustly inside a housing to protect them against physical and thermal deformations. The sensor head also hosts an IMU which is connected to a POS AV GNSS Receiver. The sensor is stabilized by a gyro-mount which creates floating Antenna-IMU lever arms. They had to be registered together with the Raw GNSS-IMU Data. The camera calibration procedure was performed based on a special calibration flight with 351 shots of all 5 cameras and the registered GPS/IMU data. This specific mission was designed at two different altitudes with additional cross lines at each flying height. The five images from each exposure position have no overlaps, but in the block there are many overlaps resulting in up to 200 measurements per point. On each photo there were on average 110 well-distributed measured points, which is a satisfactory number for the camera calibration. In a first step with the help of

  17. Research of Camera Calibration Based on DSP

    Directory of Open Access Journals (Sweden)

    Zheng Zhang

    2013-09-01

    Full Text Available To take advantage of the efficiency and stability of DSPs in data processing and of the functions of the OpenCV library, this study puts forward a scheme for camera calibration in a DSP embedded system. An algorithm for camera calibration based on OpenCV is designed by analyzing the camera model and lens distortion. The port of EMCV to the DSP is completed, and the camera calibration algorithm is migrated and optimized based on the CCS development environment and the DSP/BIOS system. While realizing the calibration function, the algorithm improves the efficiency of program execution and the precision of calibration, and lays the foundation for further research on visual localization based on DSP embedded systems.
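
    For reference, a minimal host-side version of the OpenCV-style chessboard calibration that such a port would implement; the file names, board geometry and square size below are placeholder assumptions.

      import cv2
      import numpy as np

      pattern = (9, 6)   # inner corners per chessboard row/column (example board)
      objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
      objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * 25.0  # mm

      obj_points, img_points, image_size = [], [], None
      for fname in ["view1.png", "view2.png", "view3.png"]:   # hypothetical images
          gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
          if gray is None:
              continue
          image_size = gray.shape[::-1]
          found, corners = cv2.findChessboardCorners(gray, pattern)
          if found:
              obj_points.append(objp)
              img_points.append(corners)

      # Intrinsic matrix and distortion coefficients from all detected views
      rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
          obj_points, img_points, image_size, None, None)
      print("RMS reprojection error:", rms)
      print("Camera matrix:", K)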

  18. A Survey of Catadioptric Omnidirectional Camera Calibration

    Directory of Open Access Journals (Sweden)

    Yan Zhang

    2013-02-01

    Full Text Available Over the past dozen years computer vision has become more popular; the omnidirectional camera, with its larger field of view, has been widely used in many fields, such as robot navigation, visual surveillance, virtual reality, three-dimensional reconstruction, and so on. Camera calibration is an essential step to obtain three-dimensional geometric information from a two-dimensional image. Meanwhile, the omnidirectional camera image has catadioptric distortion, which needs to be corrected in many applications; thus the study of such camera calibration methods has important theoretical significance and practical applications. This paper firstly introduces the research status of catadioptric omnidirectional imaging systems; then the image formation process of the catadioptric omnidirectional imaging system is given; finally a simple classification of omnidirectional imaging methods is given, and the advantages and disadvantages of these methods are discussed.

  19. High-performance digital color video camera

    Science.gov (United States)

    Parulski, Kenneth A.; D'Luna, Lionel J.; Benamati, Brian L.; Shelley, Paul R.

    1992-01-01

    Typical one-chip color cameras use analog video processing circuits. An improved digital camera architecture has been developed using a dual-slope A/D conversion technique and two full-custom CMOS digital video processing integrated circuits, the color filter array (CFA) processor and the RGB postprocessor. The system used a 768 x 484 active-element interline transfer CCD with a new field-staggered 3G color filter pattern and a lenslet overlay, which doubles the sensitivity of the camera. The industrial-quality digital camera design offers improved image quality, reliability, and manufacturability, while meeting aggressive size, power, and cost constraints. The CFA processor digital VLSI chip includes color filter interpolation processing, an optical black clamp, defect correction, white balance, and gain control. The RGB postprocessor digital integrated circuit includes a color correction matrix, gamma correction, 2D edge enhancement, and circuits to control the black balance, lens aperture, and focus.

  20. Increase in the Array Television Camera Sensitivity

    Science.gov (United States)

    Shakhrukhanov, O. S.

    A simple adder circuit for successive television frames, which makes it possible to considerably increase the sensitivity of such radiation detectors, is described using the QN902K array television camera as an example.
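
    The underlying effect can be illustrated numerically: summing or averaging N successive frames improves the signal-to-noise ratio by roughly sqrt(N) for uncorrelated noise. The sketch below uses synthetic frames with invented signal and noise levels.

      import numpy as np

      def average_frames(frames):
          # Average N successive frames (the adder circuit effectively sums them)
          stack = np.stack(frames).astype(np.float64)
          return stack.mean(axis=0)

      rng = np.random.default_rng(0)
      truth = np.full((64, 64), 10.0)                  # constant scene brightness
      frames = [truth + rng.normal(0.0, 5.0, truth.shape) for _ in range(16)]
      single_snr = truth.mean() / (frames[0] - truth).std()
      stack_snr = truth.mean() / (average_frames(frames) - truth).std()
      print(single_snr, stack_snr)   # stacked SNR is roughly sqrt(16) = 4x higher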

  1. POLICE BODY CAMERAS: SEEING MAY BE BELIEVING

    Directory of Open Access Journals (Sweden)

    Noel Otu

    2016-11-01

    Full Text Available While the concept of body-mounted cameras (BMC) worn by police officers is a controversial issue, it is not new. Since the early 2000s, police departments across the United States, England, Brazil, and Australia have been implementing wearable cameras. Like all devices used in policing, body-mounted cameras can create a sense of increased power, but also additional responsibilities for both the agencies and individual officers. This paper examines the public debate regarding body-mounted cameras. The conclusions drawn show that while these devices can provide information about incidents relating to police–citizen encounters, and can deter citizen and police misbehavior, they can also violate a citizen’s privacy rights. This paper outlines several ramifications for practice as well as implications for policy.

  2. Selecting the Right Camera for Your Desktop.

    Science.gov (United States)

    Rhodes, John

    1997-01-01

    Provides an overview of camera options and selection criteria for desktop videoconferencing. Key factors in image quality are discussed, including lighting, resolution, and signal-to-noise ratio; and steps to improve image quality are suggested. (LRW)

  3. CALIBRATION PROCEDURES ON OBLIQUE CAMERA SETUPS

    Directory of Open Access Journals (Sweden)

    G. Kemper

    2016-06-01

    Full Text Available Beside the creation of virtual animated 3D City models, analysis for homeland security and city planning, the accurate determination of geometric features out of oblique imagery is an important task today. Due to the huge number of single images, the reduction of control points forces the use of direct referencing devices. This requires precise camera calibration and additional adjustment procedures. This paper aims to show the workflow of the various calibration steps and will present examples of the calibration flight with the final 3D City model. In contrast to most other software, the oblique cameras are not used as co-registered sensors in relation to the nadir one; all camera images enter the AT process as single pre-oriented data. This enables a better post calibration in order to detect variations in the single camera calibration and other mechanical effects. The sensor shown (Oblique Imager) is based on 5 Phase One cameras, where the nadir one has 80 MPix and is equipped with a 50 mm lens, while the oblique ones capture images with 50 MPix using 80 mm lenses. The cameras are mounted robustly inside a housing to protect them against physical and thermal deformations. The sensor head also hosts an IMU which is connected to a POS AV GNSS Receiver. The sensor is stabilized by a gyro-mount which creates floating Antenna–IMU lever arms. They had to be registered together with the Raw GNSS-IMU Data. The camera calibration procedure was performed based on a special calibration flight with 351 shots of all 5 cameras and the registered GPS/IMU data. This specific mission was designed at two different altitudes with additional cross lines at each flying height. The five images from each exposure position have no overlaps, but in the block there are many overlaps resulting in up to 200 measurements per point. On each photo there were on average 110 well-distributed measured points, which is a satisfactory number for the camera calibration. In a first

  4. Compact stereo endoscopic camera using microprism arrays.

    Science.gov (United States)

    Yang, Sung-Pyo; Kim, Jae-Jun; Jang, Kyung-Won; Song, Weon-Kook; Jeong, Ki-Hun

    2016-03-15

    This work reports a microprism array (MPA) based compact stereo endoscopic camera with a single image sensor. The MPAs were monolithically fabricated by using two-step photolithography and geometry-guided resist reflow to form an appropriate prism angle for stereo image pair formation. The fabricated MPAs were transferred onto a glass substrate with a UV curable resin replica by using polydimethylsiloxane (PDMS) replica molding and then successfully integrated in front of a single camera module. The stereo endoscopic camera with MPA splits an image into two stereo images and successfully demonstrates the binocular disparities between the stereo image pairs for objects with different distances. This stereo endoscopic camera can serve as a compact and 3D imaging platform for medical, industrial, or military uses.

  5. Ge Quantum Dot Infrared Imaging Camera Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Luna Innovations Incorporated proposes to develop a high performance Ge quantum dots-based infrared (IR) imaging camera on Si substrate. The high sensitivity, large...

  6. Vacuum compatible miniature CCD camera head

    Science.gov (United States)

    Conder, Alan D.

    2000-01-01

    A charge-coupled device (CCD) camera head which can replace film for digital imaging of visible light, ultraviolet radiation, and soft to penetrating x-rays, such as within a target chamber where laser-produced plasmas are studied. The camera head is small, versatile, and capable of operating both in and out of a vacuum environment. The CCD camera head uses PC boards with an internal heat sink connected to the chassis for heat dissipation, which allows for close (0.04", for example) stacking of the PC boards. Integration of this CCD camera head into existing instrumentation provides a substantial enhancement of diagnostic capabilities for studying high energy density plasmas, for a variety of military, industrial, and medical imaging applications.

  7. Flaming on YouTube

    NARCIS (Netherlands)

    Moor, Peter J.; Heuvelman, Ard; Verleur, Ria

    2010-01-01

    In this explorative study, flaming on YouTube was studied using surveys of YouTube users. Flaming is defined as displaying hostility by insulting, swearing or using otherwise offensive language. Three general conclusions were drawn. First, although many users said that they themselves do not flame,

  8. Interpreting Shock Tube Ignition Data

    Science.gov (United States)

    2003-10-01

    Paper 03F-61, D. F. Davidson and R. K. Hanson, Mechanical Engineering Department, Stanford University, Stanford, CA 94305. Chemical kinetic modelers make extensive use of shock tube ignition data ... times only for high concentrations (of order 1% fuel or greater). The requirements of engine (IC, HCCI, CI and SI) modelers also present a different...

  9. CMOS Camera Array With Onboard Memory

    Science.gov (United States)

    Gat, Nahum

    2009-01-01

    A compact CMOS (complementary metal oxide semiconductor) camera system has been developed with high resolution (1.3 Megapixels), a USB (universal serial bus) 2.0 interface, and an onboard memory. Exposure times, and other operating parameters, are sent from a control PC via the USB port. Data from the camera can be received via the USB port and the interface allows for simple control and data capture through a laptop computer.

  10. A stereoscopic lens for digital cinema cameras

    Science.gov (United States)

    Lipton, Lenny; Rupkalvis, John

    2015-03-01

    Live-action stereoscopic feature films are, for the most part, produced using a costly post-production process to convert planar cinematography into stereo-pair images and are only occasionally shot stereoscopically using bulky dual-cameras that are adaptations of the Ramsdell rig. The stereoscopic lens design described here might very well encourage more live-action image capture because it uses standard digital cinema cameras and workflow to save time and money.

  11. Analyzing storage media of digital camera

    OpenAIRE

    Chow, KP; Tse, KWH; Law, FYW; Ieong, RSC; Kwan, MYK; Tse, H.; Lai, PKY

    2009-01-01

    Digital photography has become popular in recent years. Photographs have become common tools for people to record every tiny part of their daily lives. By analyzing the storage media of a digital camera, crime investigators may extract a lot of useful information to reconstruct the events. In this work, we discuss a few approaches to analyzing these kinds of storage media of digital cameras. A hypothetical crime case is used as a case study to demonstrate the concepts. © 2009 IEEE.

  12. Compact pnCCD-based X-ray camera with high spatial and energy resolution: a color X-ray camera.

    Science.gov (United States)

    Scharf, O; Ihle, S; Ordavo, I; Arkadiev, V; Bjeoumikhov, A; Bjeoumikhova, S; Buzanich, G; Gubzhokov, R; Günther, A; Hartmann, R; Kühbacher, M; Lang, M; Langhoff, N; Liebel, A; Radtke, M; Reinholz, U; Riesemeier, H; Soltau, H; Strüder, L; Thünemann, A F; Wedell, R

    2011-04-01

    For many applications there is a requirement for nondestructive analytical investigation of the elemental distribution in a sample. With the improvement of X-ray optics and spectroscopic X-ray imagers, full field X-ray fluorescence (FF-XRF) methods are feasible. A new device for high-resolution X-ray imaging, an energy and spatial resolving X-ray camera, is presented. The basic idea behind this so-called "color X-ray camera" (CXC) is to combine an energy dispersive array detector for X-rays, in this case a pnCCD, with polycapillary optics. Imaging is achieved using multiframe recording of the energy and the point of impact of single photons. The camera was tested using a laboratory 30 μm microfocus X-ray tube and synchrotron radiation from BESSY II at the BAMline facility. These experiments demonstrate the suitability of the camera for X-ray fluorescence analytics. The camera simultaneously records 69,696 spectra with an energy resolution of 152 eV for manganese K(α) with a spatial resolution of 50 μm over an imaging area of 12.7 × 12.7 mm(2). It is sensitive to photons in the energy region between 3 and 40 keV, limited by a 50 μm beryllium window, and the sensitive thickness of 450 μm of the chip. Online preview of the sample is possible as the software updates the sums of the counts for certain energy channel ranges during the measurement and displays 2-D false-color maps as well as spectra of selected regions. The complete data cube of 264 × 264 spectra is saved for further qualitative and quantitative processing.

  13. Single camera stereo using structure from motion

    Science.gov (United States)

    McBride, Jonah; Snorrason, Magnus; Goodsell, Thomas; Eaton, Ross; Stevens, Mark R.

    2005-05-01

    Mobile robot designers frequently look to computer vision to solve navigation, obstacle avoidance, and object detection problems such as those encountered in parking lot surveillance. Stereo reconstruction is a useful technique in this domain and can be done in two ways. The first requires a fixed stereo camera rig to provide two side-by-side images; the second uses a single camera in motion to provide the images. While stereo rigs can be accurately calibrated in advance, they rely on a fixed baseline distance between the two cameras. The advantage of a single-camera method is the flexibility to change the baseline distance to best match each scenario. This directly increases the robustness of the stereo algorithm and increases the effective range of the system. The challenge comes from accurately rectifying the images into an ideal stereo pair. Structure from motion (SFM) can be used to compute the camera motion between the two images, but its accuracy is limited and small errors can cause rectified images to be misaligned. We present a single-camera stereo system that incorporates a Levenberg-Marquardt minimization of rectification parameters to bring the rectified images into alignment.
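
    A compact sketch of the uncalibrated rectification step using standard OpenCV routines (not the authors' SFM pipeline): estimate the fundamental matrix from matched points, then compute rectifying homographies. In the paper, the rectification parameters would additionally be refined (e.g. by a Levenberg-Marquardt step) to remove residual misalignment.

      import cv2
      import numpy as np

      def rectify_pair(pts1, pts2, image_size):
          # pts1, pts2: Nx2 arrays of matched feature points from the two images
          # image_size: (width, height) of the images
          F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.99)
          inl1 = pts1[mask.ravel() == 1]
          inl2 = pts2[mask.ravel() == 1]
          ok, H1, H2 = cv2.stereoRectifyUncalibrated(inl1, inl2, F, image_size)
          if not ok:
              raise RuntimeError("rectification failed")
          return H1, H2    # homographies that warp each image into a rectified pair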

  14. The Use of Camera Traps in Wildlife

    Directory of Open Access Journals (Sweden)

    Yasin Uçarlı

    2013-11-01

    Full Text Available Camera traps are increasingly used for abundance and density estimates of wildlife species. Camera traps are a very good alternative to direct observation, particularly in steep terrain, in densely vegetated areas, or for nocturnal species. The main reason for using camera traps is that they eliminate economic, personnel, and time costs by operating continuously at different points at the same time. Camera traps are motion- and heat-sensitive and, depending on the model, can take photos or video. Crossing points and feeding or mating areas of the focal species are priority locations for setting camera traps. The population size can be estimated from the images combined with capture-recapture methods, and the population density is the population size divided by the effective sampling area. Mating and breeding seasons, habitat choice, group structure, and survival rates of the focal species can also be derived from the images. Camera traps are thus very useful for obtaining the necessary data on elusive species economically, for planning and conservation efforts.
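
    The capture-recapture arithmetic mentioned above can be made concrete with the (Chapman-corrected) Lincoln-Petersen estimator; the counts and sampling area below are invented for illustration.

      def lincoln_petersen(marked_first, caught_second, recaptured):
          # Chapman-corrected Lincoln-Petersen population estimate:
          # marked_first  - individuals identified in the first camera-trap session
          # caught_second - individuals identified in the second session
          # recaptured    - individuals seen in both sessions
          return ((marked_first + 1) * (caught_second + 1)) / (recaptured + 1) - 1

      # Example: 30 individuals in session 1, 25 in session 2, 10 seen in both
      population = lincoln_petersen(30, 25, 10)
      density = population / 120.0     # animals per km^2 for a 120 km^2 sampling area
      print(round(population, 1), round(density, 3))   # ~72.3 animals, ~0.602 /km^2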

  15. A comparison of colour micrographs obtained with a charge-coupled device (CCD) camera and a 35-mm camera

    DEFF Research Database (Denmark)

    Pedersen, Mads Møller; Smedegaard, Jesper; Jensen, Peter Koch

    2005-01-01

    ophthalmology, colour CCD camera, colour film, digital imaging, resolution, micrographs, histopathology, light microscopy

  16. Lag Camera: A Moving Multi-Camera Array for Scene-Acquisition

    Directory of Open Access Journals (Sweden)

    Yi Xu

    2007-04-01

    Full Text Available Many applications, such as telepresence, virtual reality, and interactive walkthroughs, require a three-dimensional (3D model of real-world environments. Methods such as lightfields, geometric reconstruction and computer vision use cameras to acquire visual samples of the environment and construct a model. Unfortunately, obtaining models of real-world locations is a challenging task. In particular, important environments are often actively in use, containing moving objects, such as people entering and leaving the scene. The methods previously listed have difficulty in capturing the color and structure of the environment in the presence of moving and temporary occluders. We describe a class of cameras called lag cameras. The main concept is to generalize a camera to take samples over space and time. Such a camera can easily and interactively detect moving objects while continuously moving through the environment. Moreover, since both the lag camera and occluder are moving, the scene behind the occluder is captured by the lag camera even from viewpoints where the occluder lies in between the lag camera and the hidden scene. We demonstrate an implementation of a lag camera, complete with analysis and captured environments.

  17. Water-storage-tube systems. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Hemker, P.

    1981-12-24

    Passive solar collection/storage/distribution systems were surveyed, designed, fabricated, and mechanically and thermally tested. The types studied were clear and opaque fiberglass tubes, metal tubes with plastic liners, and thermosyphoning tubes. (MHR)

  18. Vidicon storage tube electrical input/output

    Science.gov (United States)

    Lipoma, P.

    1972-01-01

    Electrical data storage tube is assembled from standard vidicon tube using conventional amplification and control circuits. Vidicon storage tube is simple, inexpensive and has an erase and preparation time of less than 5 microseconds.

  19. The electronics system for the LBNL positron emission mammography (PEM) camera

    CERN Document Server

    Moses, W W; Baker, K; Jones, W; Lenox, M; Ho, M H; Weng, M

    2001-01-01

    Describes the electronics for a high-performance positron emission mammography (PEM) camera. It is based on the electronics for a human brain positron emission tomography (PET) camera (the Siemens/CTI HRRT), modified to use a detector module that incorporates a photodiode (PD) array. An application-specific integrated circuit (ASIC) services the photodetector (PD) array, amplifying its signal and identifying the crystal of interaction. Another ASIC services the photomultiplier tube (PMT), measuring its output and providing a timing signal. Field-programmable gate arrays (FPGAs) and lookup RAMs are used to apply crystal-by-crystal correction factors and measure the energy deposit and the interaction depth (based on the PD/PMT ratio). Additional FPGAs provide event multiplexing, derandomization, coincidence detection, and real-time rebinning. Embedded PC/104 microprocessors provide communication, real-time control, and system configuration. Extensive use of FPGAs makes the overall design extremely flexible, all...

  20. Design considerations for a high-spatial-resolution positron camera with dense-drift-space MWPC's

    Science.gov (United States)

    Delguerra, A.; Perez-Mendez, V.; Schwartz, G.; Nelson, W. R.

    1982-10-01

    A multiplane Positron Camera is proposed, made of six MWPC modules arranged to form the lateral surface of a hexagonal prism. Each module (50 x 50 sq cm) has a 2 cm thick lead-glass tube converter on both sides of a MWPC pressurized to 2 atm. Experimental measurements are presented to show how to reduce the parallax error by determining in which of the two converter layers the photon has interacted. The results of a detailed Monte Carlo calculation for the efficiency of this type of converter are shown to be in excellent agreement with the experimental measurements. The expected performance of the Positron Camera is presented: a true coincidence rate of 56,000 counts/s (with an equal accidental coincidence rate and a 30% Compton scatter contamination) and a spatial resolution better than 5.0 mm (FWHM) for a 400 μCi point-like source embedded in a 10 cm radius water phantom.

  1. Impact of laser phase and amplitude noises on streak camera temporal resolution

    Energy Technology Data Exchange (ETDEWEB)

    Wlotzko, V., E-mail: wlotzko@optronis.com [ICube, UMR 7357, University of Strasbourg and CNRS, 23 rue du Loess, 67037 Strasbourg (France); Optronis GmbH, Ludwigstrasse 2, 77694 Kehl (Germany); Uhring, W. [ICube, UMR 7357, University of Strasbourg and CNRS, 23 rue du Loess, 67037 Strasbourg (France); Summ, P. [Optronis GmbH, Ludwigstrasse 2, 77694 Kehl (Germany)

    2015-09-15

    Streak cameras are now reaching sub-picosecond temporal resolution. In cumulative acquisition mode, this resolution does not rely solely on the performance of the electronics or the vacuum tube but also on the characteristics of the light source. The light source, usually an actively mode-locked laser, is affected by phase and amplitude noises. In this paper, the theoretical effects of such noises on the synchronization of the streak system are studied in synchroscan and triggered modes. More precisely, the contribution of band-pass filters, delays, and time walk is ascertained. Methods to compute the resulting synchronization jitter are depicted. The results are verified by measurement with a streak camera combined with a Ti:Al₂O₃ solid state laser oscillator and also a fiber oscillator.

  2. Unsteady pressure-sensitive paint measurement based on the heterodyne method using low frame rate camera.

    Science.gov (United States)

    Matsuda, Yu; Yorita, Daisuke; Egami, Yasuhiro; Kameya, Tomohiro; Kakihara, Noriaki; Yamaguchi, Hiroki; Asai, Keisuke; Niimi, Tomohide

    2013-10-01

    The pressure-sensitive paint technique based on the heterodyne method was proposed for the precise pressure measurement of unsteady flow fields. This measurement is realized by detecting the beat signal that results from interference between a modulating illumination light source and a pressure fluctuation. The beat signal is captured by a camera with a considerably lower frame rate than the frequency of the pressure fluctuation. By carefully adjusting the frequency of the light and the camera frame rate, the signal at the frequency of interest is detected, while the noise signals at other frequencies are eliminated. To demonstrate the proposed method, we measured the pressure fluctuations in a resonance tube at the fundamental, second, and third harmonics. The pressure fluctuation distributions were successfully obtained and were consistent with measurements from a pressure transducer. The proposed method is a useful technique for measuring unsteady phenomena.
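
    A minimal sketch of the frequency bookkeeping behind the heterodyne approach described above; the numbers are illustrative assumptions, not the authors' settings.

        # Frequency bookkeeping for the heterodyne detection (illustrative values only).
        f_pressure = 1000.0  # Hz, pressure fluctuation of interest
        f_light = 1002.0     # Hz, modulation frequency of the illumination
        f_beat = abs(f_pressure - f_light)  # 2 Hz beat carried by the paint luminescence

        f_camera = 10.0      # Hz, low frame rate of the camera
        assert f_beat < f_camera / 2, "the beat must stay below the camera's Nyquist frequency"

        # Pressure fluctuations at other frequencies produce beats far from f_beat, so they are
        # rejected when the recorded pixel time series is demodulated at f_beat.
        print(f"sampling a {f_pressure} Hz fluctuation through a {f_beat} Hz beat at {f_camera} fps")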

  3. Sausage Instabilities on top of Kinking Lengthening Current-Carrying Magnetic Flux Tubes

    Science.gov (United States)

    von der Linden, Jens; You, Setthivoine

    2015-11-01

    Observations indicate that the dynamics of magnetic flux tubes in our cosmos and terrestrial experiments involve fast topological change beyond MHD reconnection. Recent experiments suggest that hierarchies of instabilities coupling disparate plasma scales could be responsible for this fast topological change by accessing two-fluid and kinetic scales. This study will explore the possibility of sausage instabilities developing on top of a kink instability in lengthening current-carrying magnetic flux tubes. Current-driven flux tubes evolve over a wide range of aspect ratios k and current-to-magnetic-flux ratios λ. An analytical stability criterion and numerical investigations, based on applying Newcomb's variational approach to idealized magnetic flux tubes with core and skin currents, indicate a dependence of the stability boundaries on current profiles and overlapping kink and sausage unstable regions in the k-λ trajectory of the flux tubes. A triple-electrode planar plasma gun (Mochi.LabJet) is designed to generate flux tubes with discrete core and skin currents. Measurements from a fast-framing camera and a high resolution magnetic probe are being assembled into stability maps of the k-λ space of flux tubes. This work was sponsored in part by the US DOE Grant DE-SC0010340.

  4. SU-E-E-06: Teaching About the Gamma Camera and Ultrasound Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Lowe, M; Spiro, A [Loyola University Maryland, Baltimore, Maryland (United States); Vogel, R [Iowa Doppler Products, Iowa City, Iowa (United States); Donaldson, N; Gosselin, C [Rockhurst University, Kansas City, MO (United States)

    2015-06-15

    Purpose: Instructional modules on applications of physics in medicine are being developed. The target audience consists of students who have had an introductory undergraduate physics course. This presentation will concentrate on an active learning approach to teach the principles of the gamma camera. There will also be a description of an apparatus to teach ultrasound imaging. Methods: Since a real gamma camera is not feasible in the undergraduate classroom, we have developed two types of optical apparatus that teach the main principles. To understand the collimator, LEDs mimic gamma emitters in the body, and the photons pass through an array of tubes. The distance, spacing, diameter, and length of the tubes can be varied to understand the effect upon the resolution of the image. To determine the positions of the gamma emitters, a second apparatus uses a movable green laser, fluorescent plastic in lieu of the scintillation crystal, acrylic rods that mimic the PMTs, and a photodetector to measure the intensity. The position of the laser is calculated with a centroid algorithm. To teach the principles of ultrasound imaging, we are using the sound head and pulser box of an educational product, a variable gain amplifier, a rotation table, a digital oscilloscope, Matlab software, and phantoms. Results: Gamma camera curriculum materials have been implemented in the classroom at Loyola in 2014 and 2015. Written work shows good knowledge retention and a more complete understanding of the material. Preliminary ultrasound imaging materials were run in 2015. Conclusion: Active learning methods add another dimension to descriptions in textbooks and are effective in keeping the students engaged during class time. The teaching apparatus for the gamma camera and ultrasound imaging can be expanded to include more cases, and could potentially improve students’ understanding of artifacts and distortions in the images.
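
    As an illustration of the centroid algorithm mentioned above, the following sketch estimates the light-spot position from the intensities read at the acrylic-rod "PMTs"; the coordinates and intensities are made-up values, not measurements from the classroom apparatus.

        import numpy as np

        # Made-up detector geometry and intensities for one "event".
        rod_positions = np.array([0.0, 1.0, 2.0, 3.0])   # positions of the acrylic rods (cm)
        signals = np.array([0.05, 0.60, 0.30, 0.05])     # photodetector intensity at each rod

        x_event = np.sum(rod_positions * signals) / np.sum(signals)  # intensity-weighted centroid
        print(f"estimated laser spot position: {x_event:.2f} cm")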

  5. Camera Calibration Accuracy at Different UAV Flying Heights

    Science.gov (United States)

    Yusoff, A. R.; Ariff, M. F. M.; Idris, K. M.; Majid, Z.; Chong, A. K.

    2017-02-01

    Unmanned Aerial Vehicles (UAVs) can be used to acquire highly accurate data in deformation surveys, and low-cost digital cameras are commonly used in UAV mapping. Thus, camera calibration is considered important in obtaining high-accuracy UAV mapping using low-cost digital cameras. The main focus of this study was to calibrate the UAV camera at different camera distances and check the measurement accuracy. The scope of this study included camera calibration in the laboratory and in the field, and a UAV image mapping accuracy assessment using the calibration parameters obtained at different camera distances. The camera distances used for the calibration image acquisition and the mapping accuracy assessment were 1.5 metres in the laboratory, and 15 and 25 metres in the field, using a Sony NEX6 digital camera. A large calibration field and a portable calibration frame were used as the tools for the camera calibration and for checking the accuracy of the measurement at different camera distances. The bundle adjustment concept was applied in the Australis software to perform the camera calibration and accuracy assessment. The results showed that a camera distance of 25 metres is the optimum object distance, giving the best accuracy in both the laboratory and the outdoor mapping. In conclusion, camera calibration should be carried out at several camera distances, and the best camera parameters should be selected to achieve highly accurate UAV image mapping.

  6. How to Build Your Own Document Camera for around $100

    Science.gov (United States)

    Van Orden, Stephen

    2010-01-01

    Document cameras can have great utility in second language classrooms. However, entry-level consumer document cameras start at around $350. This article describes how the author built three document cameras and offers suggestions for how teachers can successfully build their own quality document camera using a webcam for around $100.

  7. 16 CFR 1025.45 - In camera materials.

    Science.gov (United States)

    2010-01-01

    16 CFR 1025.45 (Commercial Practices; Proceedings; Hearings): In camera materials. (a) Definition. In camera materials are documents... excluded from the public record. (b) In camera treatment of documents and testimony. The Presiding...

  8. Development and Characterization of a Single Line of Sight Framing Camera

    Energy Technology Data Exchange (ETDEWEB)

    Bradley, D K; Bell, P M; Dymoke-Bradshaw, A K L; Hares, J D; Bahr, R E; Smalyuk, V A

    2000-06-13

    We present initial characterization data from a new single line of sight (SLOS) x-ray framing camera. The instrument uses an image dissecting structure inside an electron optic tube to produce up to four simultaneous DC images from a single image incident on the cathode and a microchannel plate based device to provide the temporal gating of those images. A series of gated images have been obtained using a short pulse UV laser source, and the spatial resolution of those images is compared to those obtained using a more traditional MCP based system.

  9. A 3D high-resolution gamma camera for radiopharmaceutical studies with small animals

    CERN Document Server

    Loudos, G K; Giokaris, N D; Styliaris, E; Archimandritis, S C; Varvarigou, A D; Papanicolas, C N; Majewski, S; Weisenberger, D; Pani, R; Scopinaro, F; Uzunoglu, N K; Maintas, D; Stefanis, K

    2003-01-01

    The results of studies conducted with a small field of view tomographic gamma camera based on a Position Sensitive Photomultiplier Tube are reported. The system has been used for the evaluation of radiopharmaceuticals in small animals. Phantom studies have shown a spatial resolution of 2 mm in planar and 2-3 mm in tomographic imaging. Imaging studies in mice have been carried out both in 2D and 3D. Conventional radiopharmaceuticals have been used and the results have been compared with images from a clinically used system.

  10. PEG tubes: dealing with complications.

    Science.gov (United States)

    Malhi, Hardip; Thompson, Rosie

    A percutaneous endoscopic gastrostomy tube can be used to deliver nutrition, hydration and medicines directly into the patient's stomach. Patients will require a tube if they are unable to swallow safely, putting them at risk of aspiration of food, drink and medicines into their lungs. It is vital that nurses are aware of the complications that may arise when caring for a patient with a PEG tube. It is equally important that nurses know how to deal with these complications or from where to seek advice. This article provides a quick troubleshooting guide to help nurses deal with complications that can arise with PEG feeding.

  11. Acoustical studies on corrugated tubes

    Science.gov (United States)

    Balaguru, Rajavel

    Corrugated tubes and pipes offer greater global flexibility combined with local rigidity. They are used in numerous engineering applications such as vacuum cleaner hosing, air conditioning systems of aircraft and automobiles, HVAC control systems of heating ducts in buildings, compact heat exchangers, medical equipment and offshore gas and oil transportation flexible riser pipelines. Recently there has been renewed research interest in analyzing the flow through a corrugated tube to understand the underlying mechanism of the so-called whistling, although whistling in such a tube was identified in the early twentieth century. The phenomenon of whistling in a corrugated tube is interesting because an airflow through a smooth-walled tube of similar dimensions will not generate any whistling tones. Study of whistling in corrugated tubes is important because it not only causes an undesirable noise problem but also results in flow-acoustic coupling. Such a coupling can cause significant structural vibrations due to flow-acoustic-structure interaction. This interaction would cause flow-induced vibrations that could result in severe damage to mechanical systems having corrugated tubes. In this research work, sound generation (whistling) in corrugated tubes due to airflow is analyzed using experimental as well as Computational Fluid Dynamics-Large Eddy Simulation (CFD-LES) techniques. Sound generation mechanisms resulting in whistling have been investigated. The whistling in terms of frequencies and sound pressure levels for different flow velocities is studied. Analytical and experimental studies are carried out to understand the influence of various parameters of corrugated tubes such as cavity length, cavity width, cavity depth, pitch, Reynolds numbers and number of corrugations. The results indicate that there is good agreement between theoretically calculated, computationally predicted and experimentally measured whistling frequencies and sound pressure levels.

  12. Electronic components, tubes and transistors

    CERN Document Server

    Dummer, G W A

    1965-01-01

    Electronic Components, Tubes and Transistors aims to bridge the gap between the basic measurement theory of resistance, capacitance, and inductance and the practical application of electronic components in equipments. The more practical or usage aspect of electron tubes and semiconductors is given emphasis over theory. The essential characteristics of each main type of component, tube, and transistor are summarized. This book is comprised of six chapters and begins with a discussion on the essential characteristics in terms of the parameters usually required in choosing a resistor, including s

  13. Assessment and confirmation of tracheal intubation when capnography fails: a novel use for an USB camera.

    Science.gov (United States)

    Karippacheril, John George; Umesh, Goneppanavar; Nanda, Shetty

    2013-10-01

    A 62-year-old male with a right pyriform fossa lesion extending to the right arytenoid and obscuring the glottic inlet was planned for laser-assisted excision. Direct laryngoscopic assessment after topicalization of the airway showed a Cormack-Lehane grade 3 view. We report a case where, in the absence of a fiberscope, a novel inexpensive Universal Serial Bus camera was used to obtain an optimal laryngoscopic view. This provided direct visual confirmation of tracheal intubation with a Laser Flex tube when capnography failed to show any trace. Capnography may not be reliable as a sole indicator of confirmation of correct endotracheal tube placement. Video laryngoscopy may provide additional confirmation of endotracheal intubation.

  14. Hidden cameras everything you need to know about covert recording, undercover cameras and secret filming

    CERN Document Server

    Plomin, Joe

    2016-01-01

    Providing authoritative information on the practicalities of using hidden cameras to expose abuse or wrongdoing, this book is vital reading for anyone who may use or encounter secret filming. It gives specific advice on using phones or covert cameras and unravels the complex legal and ethical issues that need to be considered.

  15. Mobile phone camera benchmarking: combination of camera speed and image quality

    Science.gov (United States)

    Peltoketo, Veli-Tapani

    2014-01-01

    When a mobile phone camera is tested and benchmarked, the significance of quality metrics is widely acknowledged. There are also existing methods to evaluate the camera speed. For example, ISO 15781 defines several measurements to evaluate various camera system delays. However, the speed or rapidity metrics of the mobile phone's camera system have not been used together with the quality metrics, even though camera speed has become an increasingly important camera performance feature. There are several tasks in this work. Firstly, the most important image quality metrics are collected from the standards and papers. Secondly, the speed-related metrics of a mobile phone's camera system are collected from the standards and papers, and novel speed metrics are also identified. Thirdly, combinations of the quality and speed metrics are validated using mobile phones on the market. The measurements are made against the application programming interfaces of different operating systems. Finally, the results are evaluated and conclusions are made. The result of this work gives detailed benchmarking results of mobile phone camera systems on the market. The paper also proposes a combined benchmarking metric that includes both quality and speed parameters.
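
    A hedged sketch of one way such quality and speed metrics could be folded into a single benchmarking score; the metric names, normalisation ranges and weights are assumptions for illustration, not the scheme defined in the paper.

        # Illustrative metric values for one phone (not measurements from the paper).
        phone = {
            "sharpness_mtf50": 0.32,  # cycles/pixel, higher is better
            "visual_noise": 2.1,      # lower is better
            "shot_to_shot_s": 0.9,    # seconds, lower is better
            "autofocus_s": 0.45,      # seconds, lower is better
        }

        def normalise(value, worst, best):
            """Map a raw metric onto 0..1, where 1 is best."""
            span = best - worst
            return max(0.0, min(1.0, (value - worst) / span))

        score = (
            0.35 * normalise(phone["sharpness_mtf50"], 0.10, 0.45)
            + 0.25 * normalise(-phone["visual_noise"], -5.0, -0.5)
            + 0.20 * normalise(-phone["shot_to_shot_s"], -3.0, -0.2)
            + 0.20 * normalise(-phone["autofocus_s"], -2.0, -0.1)
        )
        print(f"combined benchmark score: {score:.2f}")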

  16. Modulated CMOS camera for fluorescence lifetime microscopy.

    Science.gov (United States)

    Chen, Hongtao; Holst, Gerhard; Gratton, Enrico

    2015-12-01

    Widefield frequency-domain fluorescence lifetime imaging microscopy (FD-FLIM) is a fast and accurate method to measure the fluorescence lifetime of entire images. However, the complexity and high costs involved in construction of such a system limit the extensive use of this technique. PCO AG recently released the first luminescence lifetime imaging camera based on a high frequency modulated CMOS image sensor, QMFLIM2. Here we tested the camera and provide operational procedures to calibrate it and to improve the accuracy using corrections necessary for image analysis. With its flexible input/output options, we are able to use a modulated laser diode or a 20 MHz pulsed white supercontinuum laser as the light source. The output of the camera consists of a stack of modulated images that can be analyzed by the SimFCS software using the phasor approach. The nonuniform system response across the image sensor must be calibrated at the pixel level. This pixel calibration is crucial and needed for every camera setting, e.g. modulation frequency and exposure time. A significant dependency of the modulation signal on the intensity was also observed, and hence an additional calibration is needed for each pixel depending on the pixel intensity level. These corrections are important not only for the fundamental frequency, but also for the higher harmonics when using the pulsed supercontinuum laser. With these post-data-acquisition corrections, the PCO CMOS-FLIM camera can be used for various biomedical applications requiring a large frame and high-speed acquisition.
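
    As a sketch of the phasor approach mentioned above, the following computes per-pixel phasor coordinates and a phase lifetime from a stack of modulated images; the stack layout, number of phase steps and modulation frequency are assumptions, and the per-pixel calibration discussed in the paper is omitted.

        import numpy as np

        f_mod = 20e6      # Hz, modulation frequency (illustrative)
        n_phases = 8      # assumed number of phase steps in the recorded stack
        phases = 2 * np.pi * np.arange(n_phases) / n_phases
        stack = np.random.rand(n_phases, 256, 256)  # placeholder for the stack of modulated images

        dc = stack.sum(axis=0)
        g = (stack * np.cos(phases)[:, None, None]).sum(axis=0) / dc  # phasor x-coordinate per pixel
        s = (stack * np.sin(phases)[:, None, None]).sum(axis=0) / dc  # phasor y-coordinate per pixel

        # Phase lifetime per pixel under a single-exponential assumption: tau = tan(phi) / (2*pi*f).
        tau_phase = (s / g) / (2 * np.pi * f_mod)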

  17. Design of Endoscopic Capsule With Multiple Cameras.

    Science.gov (United States)

    Gu, Yingke; Xie, Xiang; Li, Guolin; Sun, Tianjia; Wang, Dan; Yin, Zheng; Zhang, Pengfei; Wang, Zhihua

    2015-08-01

    In order to reduce the miss rate of wireless capsule endoscopy, in this paper we propose a new endoscopic capsule system with multiple cameras. A master-slave architecture, including an efficient bus architecture and a four-level clock management architecture, is applied for the Multiple Cameras Endoscopic Capsule (MCEC). To cover more area of the gastrointestinal tract wall with low power, multiple cameras with a smart image capture strategy, including movement-sensitive control and camera selection, are used in the MCEC. To reduce the data transfer bandwidth and power consumption and to prolong the MCEC's working life, a low-complexity image compressor with a PSNR of 40.7 dB and a compression rate of 86% is implemented. A chipset is designed and implemented for the MCEC, and a six-camera endoscopic capsule prototype is implemented using the chipset. With the smart image capture strategy, the coverage rate of the MCEC prototype can reach 98% and its power consumption is only about 7.1 mW.

  18. Calibration of action cameras for photogrammetric purposes.

    Science.gov (United States)

    Balletti, Caterina; Guerra, Francesco; Tsioukas, Vassilios; Vernier, Paolo

    2014-09-18

    The use of action cameras for photogrammetry purposes is not widespread due to the fact that until recently the images provided by the sensors, using either still or video capture mode, were not big enough to perform and provide the appropriate analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and more importantly (c) able to provide both still images and video sequences of high resolution. In order to be able to use the sensor of action cameras we must apply a careful and reliable self-calibration prior to the use of any photogrammetric procedure, a relatively difficult scenario because of the short focal length of the camera and its wide-angle lens that is used to obtain the maximum possible resolution of images. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each of the still and video image capturing modes of a novel action camera, the GoPro Hero 3 camera, that can provide still images up to 12 Mp and video up to 8 Mp resolution.
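
    A minimal sketch of a chessboard-based self-calibration with OpenCV in the spirit of the workflow described above; the board geometry, file names and the pinhole-plus-distortion model are assumptions (the authors' software and settings may differ).

        import glob
        import cv2
        import numpy as np

        pattern = (9, 6)  # inner corners of an assumed chessboard target
        objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
        objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

        obj_points, img_points = [], []
        for name in glob.glob("gopro_*.jpg"):      # hypothetical calibration frames
            gray = cv2.imread(name, cv2.IMREAD_GRAYSCALE)
            found, corners = cv2.findChessboardCorners(gray, pattern)
            if found:
                obj_points.append(objp)
                img_points.append(corners)

        rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
            obj_points, img_points, gray.shape[::-1], None, None)

        # Undistorted scenes can then be produced for every still image or video frame.
        undistorted = cv2.undistort(cv2.imread("gopro_0001.jpg"), K, dist)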

  19. Calibration of Action Cameras for Photogrammetric Purposes

    Directory of Open Access Journals (Sweden)

    Caterina Balletti

    2014-09-01

    Full Text Available The use of action cameras for photogrammetry purposes is not widespread due to the fact that until recently the images provided by the sensors, using either still or video capture mode, were not big enough to perform and provide the appropriate analysis with the necessary photogrammetric accuracy. However, several manufacturers have recently produced and released new lightweight devices which are: (a) easy to handle, (b) capable of performing under extreme conditions and, more importantly, (c) able to provide both still images and video sequences of high resolution. In order to be able to use the sensor of action cameras we must apply a careful and reliable self-calibration prior to the use of any photogrammetric procedure, a relatively difficult scenario because of the short focal length of the camera and its wide-angle lens that is used to obtain the maximum possible resolution of images. Special software, using functions of the OpenCV library, has been created to perform both the calibration and the production of undistorted scenes for each of the still and video image capturing modes of a novel action camera, the GoPro Hero 3 camera, that can provide still images up to 12 Mp and video up to 8 Mp resolution.

  20. Designing Camera Networks by Convex Quadratic Programming

    KAUST Repository

    Ghanem, Bernard

    2015-05-04

    In this paper, we study the problem of automatic camera placement for computer graphics and computer vision applications. We extend the problem formulations of previous work by proposing a novel way to incorporate visibility constraints and camera-to-camera relationships. For example, the placement solution can be encouraged to have cameras that image the same important locations from different viewing directions, which can enable reconstruction and surveillance tasks to perform better. We show that the general camera placement problem can be formulated mathematically as a convex binary quadratic program (BQP) under linear constraints. Moreover, we propose an optimization strategy with a favorable trade-off between speed and solution quality. Our solution is almost as fast as a greedy treatment of the problem, but the quality is significantly higher, so much so that it is comparable to exact solutions that take orders of magnitude more computation time. Because it is computationally attractive, our method also allows users to explore the space of solutions for variations in input parameters. To evaluate its effectiveness, we show a range of 3D results on real-world floorplans (garage, hotel, mall, and airport).

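    A toy illustration of the binary quadratic objective behind camera placement: x is a 0/1 vector over candidate camera poses, a linear term rewards coverage and a quadratic term rewards camera pairs that view the same location from different directions. The values are made up and the tiny instance is solved by brute force, not by the optimization strategy proposed in the paper.

        import itertools
        import numpy as np

        n = 6                                   # candidate camera poses
        rng = np.random.default_rng(0)
        b = rng.random(n)                       # linear term: per-camera coverage reward
        A = rng.random((n, n))
        A = (A + A.T) / 2                       # quadratic term: pairwise multi-view reward
        budget = 3                              # number of cameras we are allowed to place

        best_x, best_val = None, -np.inf
        for combo in itertools.combinations(range(n), budget):
            x = np.zeros(n)
            x[list(combo)] = 1.0
            val = x @ A @ x + b @ x             # binary quadratic objective
            if val > best_val:
                best_x, best_val = x, val

        print("selected cameras:", np.flatnonzero(best_x), "objective:", round(best_val, 3))
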
  1. Camera Calibration with Radial Variance Component Estimation

    Science.gov (United States)

    Mélykuti, B.; Kruck, E. J.

    2014-11-01

    Camera calibration plays an increasingly important role nowadays. Besides true digital aerial survey cameras, the photogrammetric market is dominated by a large number of non-metric digital cameras mounted on UAVs or other low-weight flying platforms. In-flight calibration of these systems plays a significant role in enhancing the geometric accuracy of survey photos considerably. Photo measurements are expected to be more precise in the center of the image than along the edges or in the corners. Using statistical methods, the accuracy of photo measurements has been analyzed in dependence on the distance of points from the image center. This test provides a curve for the measurement precision as a function of the photo radius. A large number of camera types have been tested with well-penetrated point measurements in image space. The results of the tests lead to a general conclusion showing a functional connection between accuracy and radial distance, and provide a method to check and enhance the geometrical capability of the cameras with respect to these results.
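
    A small sketch of the kind of radial precision analysis described above: image-point residuals are binned by distance from the image centre and the spread per bin is reported. The residuals here are synthetic placeholders, not measurements from the paper.

        import numpy as np

        rng = np.random.default_rng(1)
        radius = rng.uniform(0.0, 1.0, 5000)                 # normalised distance from image centre
        residual = rng.normal(0.0, 0.5 + 1.5 * radius ** 2)  # synthetic residuals, worse towards corners

        bins = np.linspace(0.0, 1.0, 11)
        bin_index = np.digitize(radius, bins)
        for i in range(1, len(bins)):
            sel = bin_index == i
            if sel.any():
                print(f"r in [{bins[i - 1]:.1f}, {bins[i]:.1f}): sigma = {residual[sel].std():.2f} px")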

  2. Metrology camera system of prime focus spectrograph for Subaru telescope

    Science.gov (United States)

    Wang, Shiang-Yu; Chou, Richard C. Y.; Huang, Pin-Jie; Ling, Hung-Hsu; Karr, Jennifer; Chang, Yin-Chang; Hu, Yen-Sang; Hsu, Shu-Fu; Chen, Hsin-Yo; Gunn, James E.; Reiley, Dan J.; Tamura, Naoyuki; Takato, Naruhisa; Shimono, Atsushi

    2016-08-01

    The Prime Focus Spectrograph (PFS) is a new optical/near-infrared multi-fiber spectrograph designed for the prime focus of the 8.2m Subaru telescope. PFS will cover a 1.3 degree diameter field with 2394 fibers to complement the imaging capabilities of Hyper Suprime-Cam. To retain high throughput, the final positioning accuracy between the fibers and observing targets of PFS is required to be less than 10 microns. The metrology camera system (MCS) serves as the optical encoder of the fiber motors for the configuring of fibers. MCS provides the fiber positions with less than a 5 micron error over the 45 cm focal plane. The information from MCS will be fed into the fiber positioner control system for closed-loop control. MCS will be located at the Cassegrain focus of the Subaru telescope in order to cover the whole focal plane with one 50M pixel Canon CMOS camera. It is a 380 mm Schmidt-type telescope which generates a uniform spot size with a 10 micron FWHM across the field for reasonable sampling of the point spread function. Carbon fiber tubes are used to provide a stable structure over the operating conditions without focus adjustments. The CMOS sensor can be read in 0.8 s to reduce the overhead for the fiber configuration. The positions of all fibers can be obtained within 0.5 s after the readout of the frame. This enables the overall fiber configuration to be completed in less than 2 minutes. MCS will be installed inside a standard Subaru Cassegrain Box. All components that generate heat are located inside a glycol-cooled cabinet to reduce possible image motion due to heat. The optics and camera for MCS have been delivered and tested. The mechanical parts and supporting structure are ready as of spring 2016. The integration of MCS will start in the summer of 2016. In this report, the performance of the MCS components, the alignment and testing procedure as well as the status of the PFS MCS will be presented.

  3. Calibration method for a central catadioptric-perspective camera system.

    Science.gov (United States)

    He, Bingwei; Chen, Zhipeng; Li, Youfu

    2012-11-01

    A central catadioptric-perspective camera system is widely used nowadays. A critical problem is that current calibration methods cannot determine the extrinsic parameters between the central catadioptric camera and a perspective camera effectively. We present a novel calibration method for a central catadioptric-perspective camera system, in which the central catadioptric camera has a hyperbolic mirror. Two cameras are used to capture images of one calibration pattern at different spatial positions. A virtual camera is constructed at the origin of the central catadioptric camera and faced toward the calibration pattern. The transformation between the virtual camera and the calibration pattern could be computed first and the extrinsic parameters between the central catadioptric camera and the calibration pattern could be obtained. Three-dimensional reconstruction results of the calibration pattern show a high accuracy and validate the feasibility of our method.

  4. Speed cameras : how they work and what effect they have.

    OpenAIRE

    2011-01-01

    Much research has been carried out into the effects of speed cameras, and the research shows consistently positive results. International review studies report that speed cameras produce a reduction of approximately 20% in personal injury crashes on road sections where cameras are used. In the Netherlands, research also indicates positive effects on speed behaviour and road safety. Dutch drivers find speed cameras in fixed pole-mounted positions more acceptable than cameras in hidden police c...

  5. Tube-wave seismic imaging

    Science.gov (United States)

    Korneev, Valeri A [LaFayette, CA

    2009-05-05

    The detailed analysis of cross well seismic data for a gas reservoir in Texas revealed two newly detected seismic wave effects, recorded approximately 2000 feet above the reservoir. A tube-wave (150) is initiated in a source well (110) by a source (111), travels in the source well (110), is coupled to a geological feature (140), propagates (151) through the geological feature (140), is coupled back to a tube-wave (152) at a receiver well (120), and is received by receiver(s) (121) in either the same well (110) or a different receiving well (120). The tube-wave has been shown to be extremely sensitive to changes in reservoir characteristics. Tube-waves appear to couple most effectively to reservoirs where the well casing is perforated, allowing direct fluid contact from the interior of a well case to the reservoir.

  6. NEI You Tube Videos: Amblyopia

    Medline Plus


  7. Pulse tube refrigerator; Parusukan reitoki

    Energy Technology Data Exchange (ETDEWEB)

    Hozumi, Yoshikazu [University of Tsukuba, Tsukuba (Japan); Shiraishi, Masao [Hiroshima University, Hiroshima (Japan)

    1999-06-05

    In the cryogenic field, research and development on high-temperature superconductivity and its peripheral technology is active, and the development of very-low-temperature refrigerators is one of its results. Such research and development has mainly been advanced for refrigerators intended for installation in aerospace planes. There is also a special, small very-low-temperature refrigerator called the 'pulse tube refrigerator', whose practical application to semiconductor cooling using high-temperature superconductivity has recently been attempted. At present, basic research to elucidate the refrigeration phenomenon of the pulse tube refrigerator and to develop a high-performance pulse tube refrigerator is being carried out experimentally at the Mechanical Engineering Lab. of the Agency of Industrial Sci. and Technology, Ministry of International Trade and Industry, and by numerical simulation at Chiyoda Corp. In this report, the pulse tube refrigerator is introduced, and its application in the chemical engineering field is considered. (NEDO)

  8. Lunar Core Drive Tubes Summary

    Data.gov (United States)

    National Aeronautics and Space Administration — Contains a brief summary and high resolution imagery from various lunar rock and core drive tubes collected from the Apollo and Luna missions to the moon.

  9. Mechanical Design of the LSST Camera

    Energy Technology Data Exchange (ETDEWEB)

    Nordby, Martin; Bowden, Gordon; Foss, Mike; Guiffre, Gary; /SLAC; Ku, John; /Unlisted; Schindler, Rafe; /SLAC

    2008-06-13

    The LSST camera is a tightly packaged, hermetically-sealed system that is cantilevered into the main beam of the LSST telescope. It is comprised of three refractive lenses, on-board storage for five large filters, a high-precision shutter, and a cryostat that houses the 3.2 giga-pixel CCD focal plane along with its support electronics. The physically large optics and focal plane demand large structural elements to support them, but the overall size of the camera and its components must be minimized to reduce impact on the image stability. Also, focal plane and optics motions must be minimized to reduce systematic errors in image reconstruction. Design and analysis for the camera body and cryostat will be detailed.

  10. Generating Stereoscopic Television Images With One Camera

    Science.gov (United States)

    Coan, Paul P.

    1996-01-01

    Straightforward technique for generating stereoscopic television images involves use of single television camera translated laterally between left- and right-eye positions. Camera acquires one of images (left- or right-eye image), and video signal from image delayed while camera translated to position where it acquires other image. Length of delay chosen so both images displayed simultaneously or as nearly simultaneously as necessary to obtain stereoscopic effect. Technique amenable to zooming in on small areas within broad scenes. Potential applications include three-dimensional viewing of geological features and meteorological events from spacecraft and aircraft, inspection of workpieces moving along conveyor belts, and aiding ground and water search-and-rescue operations. Also used to generate and display imagery for public education and general information, and possibly for medical purposes.
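
    A hedged sketch of the delay-buffer idea: frames from the laterally translating camera are held until the camera has moved one baseline, so that the stored left-eye frame can be shown together with the live right-eye frame. All numbers are illustrative assumptions, not values from the technique description.

        from collections import deque

        baseline_m = 0.065        # desired inter-ocular baseline
        camera_speed_mps = 0.5    # lateral translation speed of the camera
        frame_rate = 30.0         # frames per second

        delay_frames = round(baseline_m / camera_speed_mps * frame_rate)  # about 4 frames here
        buffer = deque(maxlen=delay_frames + 1)

        def stereo_pair(new_frame):
            """Return (left, right) once the buffered frames span one full baseline."""
            buffer.append(new_frame)
            if len(buffer) <= delay_frames:
                return None
            return buffer[0], buffer[-1]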

  11. HIGH SPEED KERR CELL FRAMING CAMERA

    Science.gov (United States)

    Goss, W.C.; Gilley, L.F.

    1964-01-01

    The present invention relates to a high speed camera utilizing a Kerr cell shutter and a novel optical delay system having no moving parts. The camera can selectively photograph at least 6 frames within 9 x 10⁻⁸ seconds during any such time interval of an occurring event. The invention utilizes particularly an optical system which views and transmits 6 images of an event to a multi-channeled optical delay relay system. The delay relay system has optical paths of successively increased length in whole multiples of the first channel optical path length, into which optical paths the 6 images are transmitted. The successively delayed images are accepted from the exit of the delay relay system by an optical image focusing means, which in turn directs the images into a Kerr cell shutter disposed to intercept the image paths. A camera is disposed to simultaneously view and record the 6 images during a single exposure of the Kerr cell shutter. (AEC)

  12. Tube wall thickness measurement apparatus

    Energy Technology Data Exchange (ETDEWEB)

    Lagasse, P.R.

    1987-01-06

    An apparatus is described for measuring the thickness of a tube's wall for the tube's entire length and circumference by determining the deviation of the tube wall thickness from the known thickness of a selected standard item, the apparatus comprising: a. a base; b. a first support member having first and second ends, the first end being connected to the base, the first support member having a sufficiently small circumference that the tube can be slid over the first support member; c. a spherical element, the spherical element being connected to the second end of the first support member. The spherical element has a sufficiently small circumference at its equator that the tube can be slid over the spherical element, the spherical element having at its equator a larger circumference than the first support member; d. a second support member having first and second ends, the first end being connected to the base, the second support member being spaced apart from the first support member; e. a positioning element connected to and moveable relative to the second support member; and f. an indicator connected to the positioning element and being moveable thereby to a location proximate the spherical element. The indicator includes a contact ball for contacting the selected standard item and holding it against the spherical element, the contact ball contacting the tube when the tube is disposed about the spherical element. The indicator includes a dial having a rotatable needle for indicating the deviation of the tube wall thickness from the thickness of the selected standard item, the rotatable needle being operatively connected to and responsive to the position of the contact ball.

  13. Phase camera experiment for Advanced Virgo

    Energy Technology Data Exchange (ETDEWEB)

    Agatsuma, Kazuhiro, E-mail: agatsuma@nikhef.nl [National Institute for Subatomic Physics, Amsterdam (Netherlands); Beuzekom, Martin van; Schaaf, Laura van der [National Institute for Subatomic Physics, Amsterdam (Netherlands); Brand, Jo van den [National Institute for Subatomic Physics, Amsterdam (Netherlands); VU University, Amsterdam (Netherlands)

    2016-07-11

    We report on a study of the phase camera, which is a frequency selective wave-front sensor of a laser beam. This sensor is utilized for monitoring sidebands produced by phase modulations in a gravitational wave (GW) detector. Regarding the operation of the GW detectors, the laser modulation/demodulation method is used to measure mirror displacements and used for the position controls. This plays a significant role because the quality of controls affects the noise level of the GW detector. The phase camera is able to monitor each sideband separately, which has a great benefit for the manipulation of the delicate controls. Also, overcoming mirror aberrations will be an essential part of Advanced Virgo (AdV), which is a GW detector close to Pisa. Especially low-frequency sidebands can be affected greatly by aberrations in one of the interferometer cavities. The phase cameras allow tracking such changes because the state of the sidebands gives information on mirror aberrations. A prototype of the phase camera has been developed and is currently being tested. The performance checks are almost completed and the installation of the optics at the AdV site has started. After the installation and commissioning, the phase camera will be combined with a thermal compensation system that consists of CO₂ lasers and compensation plates. In this paper, we focus on the prototype and show some limitations from the scanner performance. - Highlights: • The phase camera is being developed for a gravitational wave detector. • The scanner performance limits the operation speed and layout design of the system. • An operation range was found by measuring the frequency response of the scanner.

  14. Small Orbital Stereo Tracking Camera Technology Development

    Science.gov (United States)

    Bryan, Tom; MacLeod, Todd; Gagliano, Larry

    2016-01-01

    On-orbit small debris tracking and characterization is a technical gap in current National Space Situational Awareness that is necessary to safeguard orbital assets and crew; this gap poses a major risk of MOD damage to the ISS and Exploration vehicles. In 2015 this technology was added to NASA's Office of Chief Technologist roadmap. For missions flying in, assembled in, or staging from LEO, the physical threat to vehicle and crew must be characterized in order to design the proper level of MOD impact shielding and appropriate mission design restrictions, and the debris flux and size population need to be verified against ground RADAR tracking. Use of the ISS for in-situ orbital debris tracking development provides attitude, power, data and orbital access without a dedicated spacecraft or restricted operations on board a host vehicle as a secondary payload. The sensor is applicable to in-situ measurement of orbital debris flux and population in other orbits or on other vehicles, could enhance safety on and around the ISS, and some of the technologies are extensible to monitoring of extraterrestrial debris as well. To help accomplish this, new technologies must be developed quickly. The Small Orbital Stereo Tracking Camera is one such up-and-coming technology. It consists of flying a pair of intensified megapixel telephoto cameras to evaluate Orbital Debris (OD) monitoring in proximity to the International Space Station. It will demonstrate on-orbit (in-situ) optical tracking of various sized objects versus ground RADAR tracking and small OD models. The cameras are based on flight-proven Advanced Video Guidance Sensor pixel-to-spot algorithms (Orbital Express) and military targeting cameras. By using twin cameras, stereo images can be provided for ranging and mission redundancy. When pointed into the orbital velocity vector (RAM), objects approaching or near the stereo camera set can be differentiated from the stars moving upward in the background.

  15. Dermatology on YouTube.

    Science.gov (United States)

    Boyers, Lindsay N; Quest, Tyler; Karimkhani, Chante; Connett, Jessica; Dellavalle, Robert P

    2014-06-15

    YouTube reaches upwards of six billion users on a monthly basis and is a unique source of information distribution and communication. Although the influence of YouTube on personal health decision-making is well established, this study assessed the type of content and viewership of a broad scope of dermatology-related content on YouTube. Select terms (i.e. dermatology, sun protection, skin cancer, skin cancer awareness, and skin conditions) were searched on YouTube. Overall, the results included 100 videos with over 47 million viewers. Advocacy was the most prevalent content type at 24% of the total search results. These 100 videos were "shared" a total of 101,173 times and have driven 6,325 subscriptions to distinct YouTube user pages. Of the total videos, 35% were uploaded by or featured an MD/DO/PhD in dermatology or another specialty/field, 2% an FNP/PA, 1% an RN, and 62% others. As one of the most trafficked global sites on the Internet, YouTube is a valuable resource for dermatologists, physicians in other specialties, and the general public to share their dermatology-related content and gain subscribers. However, challenges of accessing and determining evidence-based data remain an issue.

  16. Tube wall thickness measurement apparatus

    Energy Technology Data Exchange (ETDEWEB)

    Lagasse, P.R.

    1985-06-21

    An apparatus for measuring the thickness of a tube's wall for the tube's entire length and radius by determining the deviation of the tube wall thickness from the known thickness of a selected standard item. The apparatus comprises a base and a first support member having first and second ends. The first end is connected to the base and the second end is connected to a spherical element. A second support member is connected to the base and spaced apart from the first support member. A positioning element is connected to and movable relative to the second support member. An indicator is connected to the positioning element and is movable to a location proximate the spherical element. The indicator includes a contact ball for first contacting the selected standard item and holding it against the spherical element. The contact ball then contacts the tube when the tube is disposed about the spherical element. The indicator includes a dial having a rotatable needle for indicating the deviation of the tube wall thickness from the thickness of the selected standard item.

  17. Tube wall thickness measurement apparatus

    Science.gov (United States)

    Lagasse, P.R.

    1985-06-21

    An apparatus for measuring the thickness of a tube's wall for the tube's entire length and radius by determining the deviation of the tube wall thickness from the known thickness of a selected standard item. The apparatus comprises a base and a first support member having first and second ends. The first end is connected to the base and the second end is connected to a spherical element. A second support member is connected to the base and spaced apart from the first support member. A positioning element is connected to and movable relative to the second support member. An indicator is connected to the positioning element and is movable to a location proximate the spherical element. The indicator includes a contact ball for first contacting the selected standard item and holding it against the spherical element. The contact ball then contacts the tube when the tube is disposed about the spherical element. The indicator includes a dial having a rotatable needle for indicating the deviation of the tube wall thickness from the thickness of the selected standard item.

  18. Tube wall thickness measurement apparatus

    Energy Technology Data Exchange (ETDEWEB)

    Lagasse, Paul R. (Santa Fe, NM)

    1987-01-01

    An apparatus for measuring the thickness of a tube's wall for the tube's entire length and circumference by determining the deviation of the tube wall thickness from the known thickness of a selected standard item. The apparatus comprises a base and a first support member having first and second ends. The first end is connected to the base and the second end is connected to a spherical element. A second support member is connected to the base and spaced apart from the first support member. A positioning element is connected to and movable relative to the second support member. An indicator is connected to the positioning element and is movable to a location proximate the spherical element. The indicator includes a contact ball for first contacting the selected standard item and holding it against the spherical element. The contact ball then contacts the tube when the tube is disposed about the spherical element. The indicator includes a dial having a rotatable needle for indicating the deviation of the tube wall thickness from the thickness of the selected standard item.

  19. A laser tube position regulator

    Energy Technology Data Exchange (ETDEWEB)

    Sinyitiro, A.; Norio, K.

    1984-03-26

    An improved design is patented for a mechanism and method of regulating, with a high degree of accuracy, the position of a laser tube in a gas laser inside the optical resonator formed by external mirrors. The laser tube is held in two holders. Each holder contains an L-shaped bracket which supports a semitransparent plate. The plate is positioned so that its center is over the center of the end of the tube, which is in the form of a Brewster window. A narrow parallel beam is directed along the tube axis from an external auxiliary laser. The beam passes through the semitransparent mirror of the optical resonator in the adjusted laser, through the first Brewster window, the tube itself, and the second Brewster window, and is reflected back in the reverse direction from a fully reflecting mirror in the optical resonator. This provides partial reflection of the beam from the outer surface of each Brewster window. The tube position in the holders is regulated continuously so that the luminous spots from the beams reflected off the Brewster windows fall on the centers of the semitransparent plates, the centers being designated as the points of intersection.

  20. Virtual camera synthesis for soccer game replays

    Directory of Open Access Journals (Sweden)

    S. Sagas

    2013-07-01

    Full Text Available In this paper, we present a set of tools developed during the creation of a platform that allows the automatic generation of virtual views in a live soccer game production. Observing the scene through a multi-camera system, a 3D approximation of the players is computed and used for the synthesis of virtual views. The system is suitable both for static scenes, to create bullet time effects, and for video applications, where the virtual camera moves as the game plays.

  1. Nitrogen camera: detection of antipersonnel mines

    Science.gov (United States)

    Trower, W. Peter; Saunders, Anna W.; Shvedunov, Vasiliy I.

    1997-01-01

    We describe a nuclear technique, the nitrogen camera, with which we have produced images of elemental nitrogen in concentrations and with surface densities typical of buried plastic anti-personnel mines. We have, under laboratory conditions, obtained images of nitrogen in amounts substantially less than in these small 200 g mines. We report our progress in creating the enabling technology to make the nitrogen camera a field deployable instrument: a mobile 70 MeV electron racetrack microtron and scintillator/semiconductor materials and the detectors based on them.

  2. Camera-enabled techniques for organic synthesis

    Directory of Open Access Journals (Sweden)

    Steven V. Ley

    2013-05-01

    Full Text Available A great deal of time is spent within synthetic chemistry laboratories on non-value-adding activities such as sample preparation and work-up operations, and labour intensive activities such as extended periods of continued data collection. Using digital cameras connected to computer vision algorithms, camera-enabled apparatus can perform some of these processes in an automated fashion, allowing skilled chemists to spend their time more productively. In this review we describe recent advances in this field of chemical synthesis and discuss how they will lead to advanced synthesis laboratories of the future.

  3. Camera-enabled techniques for organic synthesis

    Science.gov (United States)

    Ingham, Richard J; O’Brien, Matthew; Browne, Duncan L

    2013-01-01

    Summary A great deal of time is spent within synthetic chemistry laboratories on non-value-adding activities such as sample preparation and work-up operations, and labour intensive activities such as extended periods of continued data collection. Using digital cameras connected to computer vision algorithms, camera-enabled apparatus can perform some of these processes in an automated fashion, allowing skilled chemists to spend their time more productively. In this review we describe recent advances in this field of chemical synthesis and discuss how they will lead to advanced synthesis laboratories of the future. PMID:23766820

  4. Analysis of Brown camera distortion model

    Science.gov (United States)

    Nowakowski, Artur; Skarbek, Władysław

    2013-10-01

    Contemporary image acquisition devices introduce optical distortion into the image. It results in pixel displacement and therefore needs to be compensated for in many computer vision applications. The distortion is usually modeled by the Brown distortion model, whose parameters can be included in the camera calibration task. In this paper we describe the original model and its dependencies, and analyze the orthogonality with regard to radius of its decentering distortion component. We also report experiments with the camera calibration algorithm included in the OpenCV library; in particular, the stability of distortion parameter estimation is evaluated.
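
    For reference, a minimal sketch of the Brown model in the radial-plus-decentering form used by OpenCV-style calibration, applied to normalized image coordinates; the coefficient values are illustrative only.

        import numpy as np

        k1, k2, k3 = -0.25, 0.08, 0.0   # radial coefficients (illustrative)
        p1, p2 = 1e-3, -5e-4            # decentering (tangential) coefficients (illustrative)

        def brown_distort(x, y):
            """Apply the Brown model to normalized image coordinates (x, y)."""
            r2 = x * x + y * y
            radial = 1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
            x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
            y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
            return x_d, y_d

        print(brown_distort(0.3, -0.2))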

  5. Vasomotor assessment by camera-based photoplethysmography

    Directory of Open Access Journals (Sweden)

    Trumpp Alexander

    2016-09-01

    Camera-based photoplethysmography (cbPPG) is a novel technique that allows the contactless acquisition of cardio-respiratory signals. Previous works on cbPPG most often focused on heart rate extraction. This contribution is directed at the assessment of vasomotor activity by means of cameras. In an experimental study, we show that vasodilation and vasoconstriction both lead to significant changes in cbPPG signals. Our findings underline the potential of cbPPG to monitor vasomotor functions in real-life applications.

  6. A multidetector scintillation camera with 254 channels

    DEFF Research Database (Denmark)

    Sveinsdottir, E; Larsen, B; Rommer, P

    1977-01-01

    A computer-based scintillation camera has been designed for both dynamic and static radionuclide studies. The detecting head has 254 independent sodium iodide crystals, each with a photomultiplier and amplifier. In dynamic measurements simultaneous events can be recorded, and 1 million total counts per second can be accommodated with less than 0.5% loss in any one channel. This corresponds to a calculated deadtime of 5 nsec. The multidetector camera is being used for 133Xe dynamic studies of regional cerebral blood flow in man and for 99mTc and 197Hg static imaging of the brain.
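
    As a rough consistency check (a back-of-the-envelope estimate, not a figure from the record), the quoted numbers agree with the non-paralyzable dead-time approximation, in which the fractional loss f at count rate R with dead time \tau satisfies

        f \approx R\,\tau \quad\Rightarrow\quad \tau \approx \frac{f}{R} = \frac{0.005}{10^{6}\,\mathrm{s^{-1}}} = 5\,\mathrm{ns}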

  7. Digital Camera as Gloss Measurement Device

    Directory of Open Access Journals (Sweden)

    Mihálik A.

    2016-05-01

    Nowadays digital cameras with both high resolution and high dynamic range (HDR) can be considered as parallel multiple sensors producing multiple measurements at once. In this paper we describe a technique for processing the captured HDR data and then fitting them to theoretical surface reflection models in the form of a bidirectional reflectance distribution function (BRDF). Finally, the tabular BRDF can be used to calculate the gloss reflection of the surface. We compare the glossiness captured by the digital camera with gloss measured with an industry device and conclude that the results agree well in our experiments.
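
    The fitting step can be illustrated with a simple least-squares fit of a Phong-style gloss lobe to reflectance samples. The data below are synthetic placeholders standing in for the camera-measured HDR values, and the lobe model is only one possible choice; the paper itself works with a tabular BRDF.

        # Sketch: fit a Phong-style gloss lobe to reflectance samples (synthetic placeholders
        # standing in for camera-measured HDR values; not the paper's tabular-BRDF pipeline).
        import numpy as np
        from scipy.optimize import curve_fit

        def phong_lobe(theta, kd, ks, n):
            """Diffuse term plus a specular lobe around the mirror direction (theta in radians)."""
            return kd + ks * np.cos(theta) ** n

        theta = np.linspace(0, np.pi / 3, 30)               # angles away from the mirror direction
        measured = phong_lobe(theta, 0.2, 0.7, 40) + np.random.normal(0, 0.01, theta.size)

        (kd, ks, n), _ = curve_fit(phong_lobe, theta, measured, p0=[0.1, 0.5, 10])
        print(f"diffuse={kd:.2f}  specular={ks:.2f}  gloss exponent={n:.1f}")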

  8. Evaluation of mobile phone camera benchmarking using objective camera speed and image quality metrics

    Science.gov (United States)

    Peltoketo, Veli-Tapani

    2014-11-01

    When a mobile phone camera is tested and benchmarked, the significance of image quality metrics is widely acknowledged. There are also existing methods to evaluate camera speed. However, the speed or rapidity metrics of the mobile phone's camera system have not been used together with the quality metrics, even though camera speed has become a more and more important performance feature. There are several tasks in this work. First, the most important image quality and speed-related metrics of a mobile phone's camera system are collected from standards and papers, and novel speed metrics are identified. Second, combinations of the quality and speed metrics are validated using mobile phones on the market. The measurements are done against the application programming interfaces of different operating systems. Finally, the results are evaluated and conclusions are drawn. The paper defines a solution for combining different image quality and speed metrics into a single benchmarking score. A proposal for the combined benchmarking metric is evaluated using measurements of 25 mobile phone cameras on the market. The paper is a continuation of previous benchmarking work, expanded with visual noise measurement and updates for the latest mobile phone versions.
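
    A sketch of how separate metrics might be folded into one benchmarking score is given below. The metric names, reference values and weights are illustrative assumptions, not the scoring formula defined in the paper.

        # Sketch: combine normalised image-quality and speed metrics into a single score.
        def combined_score(metrics, references, weights):
            """Normalise each metric against its reference value, then take a weighted average.
            For lower-is-better metrics (e.g. shutter lag), pass reference/value instead."""
            total = sum(weights.values())
            score = 0.0
            for name, value in metrics.items():
                normalised = min(value / references[name], 1.0)     # cap at the reference level
                score += weights[name] * normalised
            return 100.0 * score / total

        phone = {"resolution_lpmm": 55.0, "snr_db": 36.0, "shots_per_second": 4.2}
        reference = {"resolution_lpmm": 60.0, "snr_db": 40.0, "shots_per_second": 5.0}
        weight = {"resolution_lpmm": 0.4, "snr_db": 0.3, "shots_per_second": 0.3}

        print(f"benchmark score: {combined_score(phone, reference, weight):.1f}/100")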

  9. Development of a high resolution gamma camera system using finely grooved GAGG scintillator

    Energy Technology Data Exchange (ETDEWEB)

    Yamamoto, Seiichi [Radiological and Medical Laboratory Sciences, Nagoya University Graduate School of Medicine (Japan); Kataoka, Jun; Oshima, Tsubasa [Research Institute for Science and Engineering, Waseda University (Japan); Ogata, Yoshimune [Radiological and Medical Laboratory Sciences, Nagoya University Graduate School of Medicine (Japan); Watabe, Tadashi; Ikeda, Hayato; Kanai, Yasukazu; Hatazawa, Jun [Osaka University Graduate School of Medicine (Japan)

    2016-06-11

    High resolution gamma cameras require small pixel scintillator blocks with high light output. However, manufacturing a small pixel scintillator block is difficult when the pixel size becomes small. To overcome this limitation, we developed a high resolution gamma camera system using a finely grooved Ce-doped Gd{sub 3}Al{sub 2}Ga{sub 3}O{sub 12} (GAGG) plate. Our gamma camera's detector consists of a 1-mm-thick finely grooved GAGG plate that is optically coupled to a 1-in. position sensitive photomultiplier tube (PSPMT). The grooved GAGG plate has 0.2×0.2 mm pixels with 0.05-mm wide slits (between the pixels) that were manufactured using a dicing saw. We used a Hamamatsu 1-in. square high quantum efficiency (HQE) PSPMT (R8900-100-C12). The energy resolution for Co-57 gamma photons (122 keV) was 18.5% FWHM. The intrinsic spatial resolution was estimated to be 0.7-mm FWHM. With a 0.5-mm diameter pinhole collimator mounted on its front, we achieved a high resolution, small field-of-view gamma camera. The system spatial resolution for Co-57 gamma photons was 1.0-mm FWHM, and the sensitivity was 0.0025% at 10 mm from the collimator surface. Images of a mouse administered Tc-99m HMDP showed the fine structures of the mouse body. The developed high resolution small pixel GAGG gamma camera is promising for such small animal imaging.

  10. Free Piston Double Diaphragm Shock Tube

    OpenAIRE

    OGURA, Eiji; FUNABIKI, Katsushi; SATO, Shunichi; Abe, Takashi; 小倉, 栄二; 船曳, 勝之; 佐藤, 俊逸; 安部, 隆士

    1997-01-01

    A free piston double diaphragm shock tube was newly developed for the generation of high Mach number shock waves. Its characteristics were investigated for various operation parameters, such as the strength of the diaphragm at the end of the compression tube, the initial pressure of the low pressure tube, the initial pressure of the medium pressure tube and the volume of the compression tube. Under the restriction of fixed pressures for the driver high pressure tube (32×10^5 Pa) and the low pressure tube (40 Pa) in...

  11. Towards Adaptive Virtual Camera Control In Computer Games

    DEFF Research Database (Denmark)

    Burelli, Paolo; Yannakakis, Georgios N.

    2011-01-01

    Automatic camera control aims to define a framework to control virtual camera movements in dynamic and unpredictable virtual environments while ensuring a set of desired visual properties. We investigate the relationship between camera placement and playing behaviour in games and build a user model of the camera behaviour that can be used to control camera movements based on player preferences. For this purpose, we collect eye gaze, camera and game-play data from subjects playing a 3D platform game, we cluster gaze and camera information to identify camera behaviour profiles and we employ machine learning to build predictive models of the virtual camera behaviour. The performance of the models on unseen data reveals accuracies above 70% for all the player behaviour types identified. The characteristics of the generated models, their limits and their use for creating adaptive automatic...

  12. Pump element for a tube pump

    DEFF Research Database (Denmark)

    2011-01-01

    relative to the rod element so as to allow for a fluid flow in the tube through the first valve member, along the rod element, and through the second valve member. The tube comprises an at least partly flexible tube portion between the valve members such that a repeated deformation of the flexible tube portion acts to alternately close and open the valve members, thereby generating a fluid flow through the tube. The invention further relates to a pump element comprising at least two non-return valve members connected by a rod element, and for insertion in an at least partly flexible tube in such a tube pump as mentioned above, thereby acting to generate a fluid flow through the tube upon repeated deformation of the tube between the two valve members. The pump element may comprise a connecting part for coupling to another tube and may comprise a sealing part establishing a fluid-tight connection

  13. Multimodal sensing-based camera applications

    Science.gov (United States)

    Bordallo López, Miguel; Hannuksela, Jari; Silvén, J. Olli; Vehviläinen, Markku

    2011-02-01

    The increased sensing and computing capabilities of mobile devices can provide for enhanced mobile user experience. Integrating the data from different sensors offers a way to improve application performance in camera-based applications. A key advantage of using cameras as an input modality is that it enables recognizing the context. Therefore, computer vision has been traditionally utilized in user interfaces to observe and automatically detect the user actions. The imaging applications can also make use of various sensors for improving the interactivity and the robustness of the system. In this context, two applications fusing the sensor data with the results obtained from video analysis have been implemented on a Nokia Nseries mobile device. The first solution is a real-time user interface that can be used for browsing large images. The solution enables the display to be controlled by the motion of the user's hand using the built-in sensors as complementary information. The second application is a real-time panorama builder that uses the device's accelerometers to improve the overall quality, providing also instructions during the capture. The experiments show that fusing the sensor data improves camera-based applications especially when the conditions are not optimal for approaches using camera data alone.

  14. Mapping large environments with an omnivideo camera

    NARCIS (Netherlands)

    Esteban, I.; Booij, O.; Zivkovic, Z.; Krose, B.

    2009-01-01

    We study the problem of mapping a large indoor environment using an omnivideo camera. Local features from omnivideo images and epipolar geometry are used to compute the relative pose between pairs of images. These poses are then used in an Extended Information Filter using a trajectory-based representation

  15. Parametrizable cameras for 3D computational steering

    NARCIS (Netherlands)

    Mulder, J.D.; Wijk, J.J. van

    1997-01-01

    We present a method for the definition of multiple views in 3D interfaces for computational steering. The method uses the concept of a point-based parametrizable camera object. This concept enables a user to create and configure multiple views on his custom 3D interface in an intuitive graphical manner.

  16. Increased Automation in Stereo Camera Calibration Techniques

    Directory of Open Access Journals (Sweden)

    Brandi House

    2006-08-01

    Robotic vision has become a very popular field in recent years due to the numerous promising applications it may enhance. However, errors within the cameras and in their perception of their environment can cause applications in robotics to fail. To help correct these internal and external imperfections, stereo camera calibrations are performed. There are currently many accurate methods of camera calibration available; however, most or all of them are time consuming and labor intensive. This research seeks to automate the most labor intensive aspects of a popular calibration technique developed by Jean-Yves Bouguet. His process requires manual selection of the extreme corners of a checkerboard pattern. The modified process uses embedded LEDs in the checkerboard pattern to act as active fiducials. Images are captured of the checkerboard with the LEDs on and off in rapid succession. The difference of the two images automatically highlights the location of the four extreme corners, and these corner locations take the place of the manual selections. With this modification to the calibration routine, upwards of eighty mouse clicks are eliminated per stereo calibration. Preliminary test results indicate that accuracy is not substantially affected by the modified procedure. Improved automation of camera calibration procedures may finally overcome the barriers to the use of calibration in practice.
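
    The difference-imaging step lends itself to a short sketch: subtract the LEDs-off frame from the LEDs-on frame and keep the brightest blobs as the four fiducial corners. The file names, blur size and blob count are assumptions for illustration; the actual routine is the modified Bouguet procedure described above.

        # Sketch of LED-difference fiducial detection (file names and parameters are assumed).
        import cv2
        import numpy as np

        on = cv2.imread("board_leds_on.png", cv2.IMREAD_GRAYSCALE)
        off = cv2.imread("board_leds_off.png", cv2.IMREAD_GRAYSCALE)

        diff = cv2.GaussianBlur(cv2.absdiff(on, off), (5, 5), 0)   # LEDs dominate the difference
        _, mask = cv2.threshold(diff, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        contours = sorted(contours, key=cv2.contourArea, reverse=True)[:4]   # four extreme corners

        corners = []
        for c in contours:
            m = cv2.moments(c)
            corners.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))       # blob centroid
        print("detected fiducial corners:", corners)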

  17. The Legal Implications of Surveillance Cameras

    Science.gov (United States)

    Steketee, Amy M.

    2012-01-01

    The nature of school security has changed dramatically over the last decade. Schools employ various measures, from metal detectors to identification badges to drug testing, to promote the safety and security of staff and students. One of the increasingly prevalent measures is the use of security cameras. In fact, the U.S. Department of Education…

  18. Autofocus method for scanning remote sensing cameras.

    Science.gov (United States)

    Lv, Hengyi; Han, Chengshan; Xue, Xucheng; Hu, Changhong; Yao, Cheng

    2015-07-10

    Autofocus methods are conventionally based on capturing the same scene from a series of focal plane positions. As a result, it has been difficult to apply this technique to scanning remote sensing cameras, where the scene changes continuously. To realize autofocus in scanning remote sensing cameras, a novel autofocus method is investigated in this paper. Instead of introducing additional mechanisms or optics, the overlapped pixels of the adjacent CCD sensors on the focal plane are employed. Two images corresponding to the same scene on the ground can be captured at different times. Further, one focusing step is performed during the time interval, so that the two images are obtained at different focal plane positions. Subsequently, the direction of the next focusing step is calculated from the two images. The analysis shows that the method operates without restriction on the time consumption of the algorithm and carries general focus measures and algorithms over from digital still cameras to scanning remote sensing cameras. The experimental results show that the proposed method is applicable to the entire focus measure family; the error ratio is, on average, no more than 0.2% and drops to 0% with a reliability improvement, which is lower than that of prevalent approaches (12%). The proposed method is demonstrated to be effective and has potential in other scanning imaging applications.
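
    The decision step can be sketched as follows: compute a focus measure on the two overlapped-pixel images taken before and after one focusing step, and move the focus mechanism toward the sharper image. The variance-of-Laplacian measure and the image file names are assumptions; the point of the paper is that the approach carries over to the general family of focus measures.

        # Sketch: compare a focus measure on the two overlapped-pixel images and pick a direction.
        import cv2

        def focus_measure(image_path):
            img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
            return cv2.Laplacian(img, cv2.CV_64F).var()        # variance of the Laplacian

        before = focus_measure("overlap_before_step.png")      # focal-plane position p
        after = focus_measure("overlap_after_step.png")        # focal-plane position p + one step

        direction = +1 if after > before else -1               # keep stepping if sharpness improved
        print(f"focus measures: {before:.1f} -> {after:.1f}, next step direction: {direction:+d}")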

  19. Lights, Camera, Read! Arizona Reading Program Manual.

    Science.gov (United States)

    Arizona State Dept. of Library, Archives and Public Records, Phoenix.

    This document is the manual for the Arizona Reading Program (ARP) 2003 entitled "Lights, Camera, Read!" This theme spotlights books that were made into movies, and allows readers to appreciate favorite novels and stories that have progressed to the movie screen. The manual consists of eight sections. The Introduction includes welcome letters from…

  20. Camera! Action! Collaborate with Digital Moviemaking

    Science.gov (United States)

    Swan, Kathleen Owings; Hofer, Mark; Levstik, Linda S.

    2007-01-01

    Broadly defined, digital moviemaking integrates a variety of media (images, sound, text, video, narration) to communicate with an audience. There is near-ubiquitous access to the necessary software (MovieMaker and iMovie are bundled free with their respective operating systems) and hardware (computers with Internet access, digital cameras, etc.).…

  1. Metasurface lens: Shrinking the camera size

    Science.gov (United States)

    Sun, Cheng

    2017-01-01

    A miniaturized camera has been developed by integrating a planar metasurface lens doublet with a CMOS image sensor. The metasurface lens doublet corrects the monochromatic aberration and thus delivers nearly diffraction-limited image quality over a wide field of view.

  2. Camera shutter is actuated by electric signal

    Science.gov (United States)

    Neff, J. E.

    1964-01-01

    Rotary solenoid energized by an electric signal opens a camera shutter, and when the solenoid is de-energized a spring closes it. By the use of a microswitch, the shutter may be opened and closed in one continuous, rapid operation when the solenoid is actuated.

  3. Digital Camera Control for Faster Inspection

    Science.gov (United States)

    Brown, Katharine; Siekierski, James D.; Mangieri, Mark L.; Dekome, Kent; Cobarruvias, John; Piplani, Perry J.; Busa, Joel

    2009-01-01

    Digital Camera Control Software (DCCS) is a computer program for controlling a boom and a boom-mounted camera used to inspect the external surface of a space shuttle in orbit around the Earth. Running in a laptop computer in the space-shuttle crew cabin, DCCS commands integrated displays and controls. By means of a simple one-button command, a crewmember can view low- resolution images to quickly spot problem areas and can then cause a rapid transition to high- resolution images. The crewmember can command that camera settings apply to a specific small area of interest within the field of view of the camera so as to maximize image quality within that area. DCCS also provides critical high-resolution images to a ground screening team, which analyzes the images to assess damage (if any); in so doing, DCCS enables the team to clear initially suspect areas more quickly than would otherwise be possible and further saves time by minimizing the probability of re-imaging of areas already inspected. On the basis of experience with a previous version (2.0) of the software, the present version (3.0) incorporates a number of advanced imaging features that optimize crewmember capability and efficiency.

  4. Video Analysis with a Web Camera

    Science.gov (United States)

    Wyrembeck, Edward P.

    2009-01-01

    Recent advances in technology have made video capture and analysis in the introductory physics lab even more affordable and accessible. The purchase of a relatively inexpensive web camera is all you need if you already have a newer computer and Vernier's Logger Pro 3 software. In addition to Logger Pro 3, other video analysis tools such as…

  5. Teaching Camera Calibration by a Constructivist Methodology

    Science.gov (United States)

    Samper, D.; Santolaria, J.; Pastor, J. J.; Aguilar, J. J.

    2010-01-01

    This article describes the Metrovisionlab simulation software and practical sessions designed to teach the most important machine vision camera calibration aspects in courses for senior undergraduate students. By following a constructivist methodology, having received introductory theoretical classes, students use the Metrovisionlab application to…

  6. Camera Systems Rapidly Scan Large Structures

    Science.gov (United States)

    2013-01-01

    Needing a method to quickly scan large structures like an aircraft wing, Langley Research Center developed the line scanning thermography (LST) system. LST works in tandem with a moving infrared camera to capture how a material responds to changes in temperature. Princeton Junction, New Jersey-based MISTRAS Group Inc. now licenses the technology and uses it in power stations and industrial plants.

  7. Measuring rainfall with low-cost cameras

    Science.gov (United States)

    Allamano, Paola; Cavagnero, Paolo; Croci, Alberto; Laio, Francesco

    2016-04-01

    In Allamano et al. (2015), we propose to retrieve quantitative measures of rainfall intensity by relying on the acquisition and analysis of images captured with professional cameras (the SmartRAIN technique in the following). SmartRAIN is based on the fundamentals of camera optics and exploits the intensity changes due to drop passages in a picture. The main steps of the method include: i) drop detection, ii) blur effect removal, iii) estimation of drop velocities, iv) drop positioning in the control volume, and v) rain rate estimation. The method has been applied to real rain events with errors of the order of ±20%. This work aims to bridge the gap between the need to acquire images with professional cameras and the possibility of exporting the technique to low-cost webcams. We apply the image processing algorithm to frames registered with low-cost cameras both in the lab (i.e., controlled rain intensity) and in field conditions. The resulting images are characterized by lower resolutions and significant distortions with respect to professional camera pictures, and are acquired with a fixed aperture and a rolling shutter. All these hardware limitations exert relevant effects on the readability of the resulting images and may affect the quality of the rainfall estimate. We demonstrate that a proper knowledge of the image acquisition hardware allows one to fully explain the artefacts and distortions due to the hardware, and that, by correcting these effects before applying the image processing algorithm, quantitative rain intensity measures are obtainable with good accuracy even with low-cost modules.

  8. A novel fully integrated handheld gamma camera

    Energy Technology Data Exchange (ETDEWEB)

    Massari, R.; Ucci, A.; Campisi, C. [Biostructure and Bioimaging Institute (IBB), National Research Council of Italy (CNR), Rome (Italy); Scopinaro, F. [University of Rome “La Sapienza”, S. Andrea Hospital, Rome (Italy); Soluri, A., E-mail: alessandro.soluri@ibb.cnr.it [Biostructure and Bioimaging Institute (IBB), National Research Council of Italy (CNR), Rome (Italy)

    2016-10-01

    In this paper, we present an innovative, fully integrated handheld gamma camera, designed to combine in the same device the gamma-ray detector, the display and the embedded computing system. The low power consumption allows the prototype to be battery operated. To be useful in radioguided surgery, an intraoperative gamma camera must be very easy to handle, since it must be moved to find a suitable view. Consequently, we have developed the first prototype of a fully integrated, compact and lightweight gamma camera for fast imaging of radiopharmaceuticals. The device can operate without cables across the sterile field, so it may be easily used in the operating theater for radioguided surgery. The prototype consists of a Silicon Photomultiplier (SiPM) array coupled with a proprietary scintillation structure based on CsI(Tl) crystals. To read the SiPM output signals, we have developed very low power readout electronics and a dedicated analog-to-digital conversion system. One of the most critical aspects we faced in designing the prototype was the low power consumption, which is mandatory for a battery operated device. We have applied this detection device to the lymphoscintigraphy technique (sentinel lymph node mapping), comparing the results obtained with those of a commercial gamma camera (Philips SKYLight). The results confirm a rapid response of the device and an adequate spatial resolution for use in scintigraphic imaging. This work confirms the feasibility of a small gamma camera with an integrated display. This device is designed for radioguided surgery and small organ imaging, but it could easily be combined with surgical navigation systems.

  9. Experimental and numerical study on the growth and collapse of a bubble in a narrow tube

    Institute of Scientific and Technical Information of China (English)

    Bao-Yu Ni; A-Man Zhang; Qian-Xi Wang; Bin Wang

    2012-01-01

    The growth, expansion and collapse of a bubble in a narrow tube are studied using both experiments and numerical simulations. In the experiments, the bubble is generated by an electric spark in a water tank and recorded by a high-speed camera system. In the numerical simulations, the evolution of the bubble is solved by adopting an axisymmetric boundary integral equation, considering the surface tension effect. The results of experiments and numerical simulations are compared and good agreement is achieved. Both show that a counter-jet forms and penetrates the bubble at the end of the collapse stage, before a ring-type bubble forms. Under the attraction of the tube wall due to the Bjerknes force, a ring jet is generated, pointing towards the tube. On this basis, some physical quantities such as the pressure on the tube wall and the kinetic energy are calculated in a case study. The effects of tube diameter and tube length on the bubble's behaviour are also investigated.

  10. NEW VERSATILE CAMERA CALIBRATION TECHNIQUE BASED ON LINEAR RECTIFICATION

    Institute of Scientific and Technical Information of China (English)

    Pan Feng; Wang Xuanyin

    2004-01-01

    A new versatile camera calibration technique for machine vision using off-the-shelf cameras is described. Aimed at the large distortion of off-the-shelf cameras, a new camera distortion rectification technology based on line-rectification is proposed. A full-camera-distortion model is introduced and a linear algorithm is provided to obtain the solution. After the camera rectification, intrinsic and extrinsic parameters are obtained based on the relationship between the homography and the absolute conic. This technology needs neither a high-accuracy three-dimensional calibration block, nor a complicated translation or rotation platform. Both simulations and experiments show that this method is effective and robust.
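
    A planar-target calibration in the same spirit (homography-based intrinsics plus a distortion model) can be run with OpenCV's standard routine, shown below as a stand-in rather than the authors' own linear algorithm. The board geometry and image file names are assumptions.

        # Sketch: checkerboard calibration with OpenCV (stand-in for the paper's algorithm).
        import glob
        import cv2
        import numpy as np

        pattern = (9, 6)                                        # inner corners of the checkerboard
        objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
        objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)   # planar target, Z = 0

        obj_points, img_points, size = [], [], None
        for name in glob.glob("calib_*.png"):
            gray = cv2.imread(name, cv2.IMREAD_GRAYSCALE)
            found, corners = cv2.findChessboardCorners(gray, pattern)
            if found:
                obj_points.append(objp)
                img_points.append(corners)
                size = gray.shape[::-1]

        rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_points, img_points, size, None, None)
        print("reprojection RMS:", rms)
        print("intrinsic matrix:\n", K)
        print("distortion coefficients:", dist.ravel())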

  11. Minimizing radiation dose to patient and staff during fluoroscopic, nasoenteral tube insertions

    Energy Technology Data Exchange (ETDEWEB)

    Rudin, S.; Bednarek, D.R. (New York Univ., NY (United States). School of Medicine)

    1992-02-01

    Since the fluoroscopic image during nasoenteral tube placements is used for guidance and not for diagnosis, a lower-contrast image with increased quantum mottle can easily be tolerated. The three methods investigated to reduce the radiation dose rate consisted of removing the camera from the image intensifier output phosphor, and setting the fluoroscopic mA to the minimum value so that the kVp could be maximized. Fluoroscopic frozen video frames of a clinical tube insertion comparing the images with and without the dose-saving techniques are presented. Measurements of the radiation dose rates using a Plexiglas phantom show that the dose to patient and staff during fluoroscopically guided nasoenteral tube placements can be reduced by over a factor of 10 without adversely affecting the actual placement procedure. (author).

  12. Physics of magnetic flux tubes

    CERN Document Server

    Ryutova, Margarita

    2015-01-01

    This book is the first account of the physics of magnetic flux tubes, from their fundamental properties to collective phenomena in ensembles of flux tubes. The physics of magnetic flux tubes is absolutely vital for understanding fundamental physical processes in the solar atmosphere shaped and governed by magnetic fields. High-resolution and high-cadence observations from recent space and ground-based instruments, taken simultaneously at different heights and temperatures, not only show the ubiquity of filamentary structure formation but also allow one to study how various events are interconnected by systems of magnetic flux tubes. The book covers both theory and observations. Theoretical models presented in analytical and phenomenological forms are tailored for practical applications. These are combined with state-of-the-art observations, from early decisive ones to the most recent data, that open a new phase-space for exploring the Sun and Sun-like stars. The concept of magnetic flux tubes is central to various magn...

  13. VUV Testing of Science Cameras at MSFC: QE Measurement of the CLASP Flight Cameras

    Science.gov (United States)

    Champey, Patrick R.; Kobayashi, Ken; Winebarger, A.; Cirtain, J.; Hyde, D.; Robertson, B.; Beabout, B.; Beabout, D.; Stewart, M.

    2015-01-01

    The NASA Marshall Space Flight Center (MSFC) has developed a science camera suitable for sub-orbital missions for observations in the UV, EUV and soft X-ray. Six cameras were built and tested for the Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP), a joint National Astronomical Observatory of Japan (NAOJ) and MSFC sounding rocket mission. The CLASP camera design includes a frame-transfer e2v CCD57-10 512x512 detector, dual channel analog readout electronics and an internally mounted cold block. At the flight operating temperature of -20 C, the CLASP cameras achieved the low-noise performance requirements (≤25 e- read noise and ≤10 e-/sec/pix dark current), in addition to maintaining a stable gain of approximately 2.0 e-/DN. The e2v CCD57-10 detectors were coated with Lumogen-E to improve quantum efficiency (QE) at the Lyman-α wavelength. A vacuum ultra-violet (VUV) monochromator and a NIST calibrated photodiode were employed to measure the QE of each camera. Four flight-like cameras were tested in a high-vacuum chamber, which was configured to run several tests intended to verify the QE, gain, read noise, dark current and residual non-linearity of the CCD. We present and discuss the QE measurements performed on the CLASP cameras. We also discuss the high-vacuum system outfitted for testing of UV and EUV science cameras at MSFC.

  14. Drift tubes of Linac 2

    CERN Multimedia

    1977-01-01

    With the advent of the 800 MeV PS Booster in 1972, the original injector of the PS, a 50 MeV Alvarez-type proton linac, had reached its limits, in terms of intensity and stability. In 1973 one therefore decided to build a new linac (Linac 2), also with a drift-tube Alvarez structure and an energy of 50 MeV. It had a new Cockcroft-Walton preinjector with 750 keV, instead of the previous one with 500 keV. Linac 2 was put into service in 1980. The old Linac 1 was then used for the study of, and later operation with, various types of ions. This picture shows Linac 2 drift-tubes, suspended on stems coming from the top, in contrast to Linac 1, where the drift-tubes stood on stems coming from the bottom.

  15. [The Use of a Tracheal Tube for Guiding Nasogastric Tube Insertion].

    Science.gov (United States)

    Saima, Shunsuke; Asai, Takashi; Okuda, Yasuhisa

    2016-04-01

    An obese patient was scheduled for shoulder joint surgery under general anesthesia. After induction of anesthesia and tracheal intubation, insertion of a gastric tube was difficult. A new tracheal tube was prepared, the connecter was removed, and the tube was cut longitudinally. The tube was inserted orally into the esophagus. A gastric tube was passed through the nose, and its tip was taken out of the mouth. The tip of the gastric tube was passed through the tracheal tube, and its correct position in the stomach was confirmed by auscultation of the epigastrium. The tracheal tube was carefully taken out from the esophagus leaving the gastric tube in the stomach. The cut tracheal tube was peeled off from the gastric tube. Correct positioning of the gastric tube was re-confirmed.

  16. Registration of Sub-Sequence and Multi-Camera Reconstructions for Camera Motion Estimation

    Directory of Open Access Journals (Sweden)

    Michael Wand

    2010-08-01

    This paper presents different application scenarios for which the registration of sub-sequence reconstructions or multi-camera reconstructions is essential for successful camera motion estimation and 3D reconstruction from video. The registration is achieved by merging unconnected feature point tracks between the reconstructions. One application is drift removal for sequential camera motion estimation of long sequences. The state-of-the-art in drift removal is to apply a RANSAC approach to find unconnected feature point tracks. In this paper an alternative spectral algorithm for pairwise matching of unconnected feature point tracks is used. It is then shown that the algorithms can be combined and applied to novel scenarios where independent camera motion estimations must be registered into a common global coordinate system. In the first scenario multiple moving cameras, which capture the same scene simultaneously, are registered. A second new scenario occurs in situations where the tracking of feature points during sequential camera motion estimation fails completely, e.g., due to large occluding objects in the foreground, and the unconnected tracks of the independent reconstructions must be merged. In the third scenario image sequences of the same scene, which are captured under different illuminations, are registered. Several experiments with challenging real video sequences demonstrate that the presented techniques work in practice.
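
    Once corresponding 3D points between two reconstructions have been found, whether by RANSAC or by the spectral track matching described above, the registration itself reduces to estimating a similarity transform. The sketch below applies the standard SVD (Umeyama-style) solution to synthetic placeholder points and illustrates only this final alignment step.

        # Sketch: align two reconstructions with a similarity transform (scale, rotation, translation).
        import numpy as np

        def align_similarity(src, dst):
            """Return s, R, t such that dst is approximately s * R @ src + t (row-wise points)."""
            mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
            src_c, dst_c = src - mu_s, dst - mu_d
            U, S, Vt = np.linalg.svd(dst_c.T @ src_c)          # cross-covariance of the point sets
            D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])   # guard against reflections
            R = U @ D @ Vt
            s = np.trace(np.diag(S) @ D) / (src_c ** 2).sum()
            t = mu_d - s * R @ mu_s
            return s, R, t

        a = 0.6
        true_R = np.array([[np.cos(a), -np.sin(a), 0.0],
                           [np.sin(a),  np.cos(a), 0.0],
                           [0.0,        0.0,       1.0]])
        src = np.random.rand(20, 3)
        dst = 2.0 * src @ true_R.T + np.array([1.0, -0.5, 3.0])

        s, R, t = align_similarity(src, dst)
        print("estimated scale:", round(s, 3))                  # should be close to 2.0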

  17. The AOTF-based NO2 camera

    Science.gov (United States)

    Dekemper, Emmanuel; Vanhamel, Jurgen; Van Opstal, Bert; Fussen, Didier

    2016-12-01

    The abundance of NO2 in the boundary layer relates to air quality and pollution source monitoring. Observing the spatiotemporal distribution of NO2 above well-delimited (flue gas stacks, volcanoes, ships) or more extended sources (cities) allows for applications such as monitoring emission fluxes or studying the plume's dynamic chemistry and transport. So far, most attempts to map the NO2 field from the ground have been made with visible-light scanning grating spectrometers. Benefiting from a high retrieval accuracy, they only achieve a relatively low spatiotemporal resolution that hampers the detection of dynamic features. We present a new type of passive remote sensing instrument aiming at the measurement of the 2-D distributions of NO2 slant column densities (SCDs) with a high spatiotemporal resolution. The measurement principle has strong similarities with the popular filter-based SO2 camera, as it relies on spectral images taken at wavelengths where the molecule's absorption cross section is different. Contrary to the SO2 camera, the spectral selection is performed by an acousto-optical tunable filter (AOTF) capable of resolving the target molecule's spectral features. The NO2 camera's capabilities are demonstrated by imaging the NO2 abundance in the plume of a coal-fired power plant. During this experiment, the 2-D distribution of the NO2 SCD was retrieved with a temporal resolution of 3 min and a spatial sampling of 50 cm (over a 250 × 250 m2 area). The detection limit was close to 5 × 1016 molecules cm-2, with a maximum detected SCD of 4 × 1017 molecules cm-2. Illustrating the added value of the NO2 camera measurements, the data reveal the dynamics of the NO to NO2 conversion in the early plume with an unprecedented resolution: from its release in the air, and for 100 m upwards, the observed NO2 plume concentration increased at a rate of 0.75-1.25 g s-1. In joint campaigns with SO2 cameras, the NO2 camera could also help in removing the bias introduced by the

  18. National Guidelines for Digital Camera Systems Certification

    Science.gov (United States)

    Yaron, Yaron; Keinan, Eran; Benhamu, Moshe; Regev, Ronen; Zalmanzon, Garry

    2016-06-01

    Digital camera systems are a key component in the production of reliable, geometrically accurate, high-resolution geospatial products. These systems have replaced film imaging in photogrammetric data capturing. Today, we see a proliferation of imaging sensors collecting photographs in different ground resolutions, spectral bands, swath sizes, radiometric characteristics and accuracies, carried on different mobile platforms. In addition, these imaging sensors are combined with navigational tools (such as GPS and IMU), active sensors such as laser scanning and powerful processing tools to obtain high quality geospatial products. The quality (accuracy, completeness, consistency, etc.) of these geospatial products is based on the use of calibrated, high-quality digital camera systems. The new survey regulations of the state of Israel specify the quality requirements for each geospatial product including: maps at different scales and for different purposes, elevation models, orthophotographs, three-dimensional models at different levels of detail (LOD) and more. In addition, the regulations require that digital camera systems used for mapping purposes should be certified using a rigorous mapping system certification and validation process, which is specified in the Director General Instructions. The Director General Instructions for digital camera system certification specify a two-step process as follows: 1. Theoretical analysis of system components that includes: study of the accuracy of each component and an integrative error propagation evaluation, examination of the radiometric and spectral response curves of the imaging sensors, the calibration requirements, and the working procedures. 2. Empirical study of the digital mapping system that examines a typical project (product scale, flight height, number and configuration of ground control points and process). The study examines all aspects of the final product including its accuracy and the product pixel size

  19. Working session 3: Tubing integrity

    Energy Technology Data Exchange (ETDEWEB)

    Cueto-Felgueroso, C. [Tecnatom, S.A., San Sebastian de los Reyes, Madrid (Spain); Strosnider, J. [NRC, Washington, DC (United States)

    1997-02-01

    Twenty-three individuals representing nine countries (Belgium, Canada, the Czech Republic, France, Japan, the Slovak Republic, Spain, the UK, and the US) participated in the session on tube integrity. These individuals represented utilities, vendors, consultants and regulatory authorities. The major subjects discussed by the group included overall objectives of managing steam generator tube degradation, necessary elements of a steam generator degradation management program, the concept of degradation specific management, structural integrity evaluations, leakage evaluations, and specific degradation mechanisms. The group's discussions on these subjects, including conclusions and recommendations, are summarized in this article.

  20. Mechanical Instabilities of Biological Tubes

    Science.gov (United States)

    Hannezo, Edouard; Prost, Jacques; Joanny, Jean-François

    2012-07-01

    We study theoretically the morphologies of biological tubes affected by various pathologies. When epithelial cells grow, the negative tension produced by their division provokes a buckling instability. Several shapes are investigated: varicose, dilated, sinuous, or sausagelike. They are all found in pathologies of tracheal, renal tubes, or arteries. The final shape depends crucially on the mechanical parameters of the tissues: Young’s modulus, wall-to-lumen ratio, homeostatic pressure. We argue that since tissues must be in quasistatic mechanical equilibrium, abnormal shapes convey information as to what causes the pathology. We calculate a phase diagram of tubular instabilities which could be a helpful guide for investigating the underlying genetic regulation.

  1. Orifice plates and venturi tubes

    CERN Document Server

    Reader-Harris, Michael

    2015-01-01

    This book gives the background to differential-pressure flow measurement and goes through the requirements explaining the reason for them. For those who want to use an orifice plate or a Venturi tube the standard ISO 5167 and its associated Technical Reports give the instructions required.  However, they rarely tell the users why they should follow certain instructions.  This book helps users of the ISO standards for orifice plates and Venturi tubes to understand the reasons why the standards are as they are, to apply them effectively, and to understand the consequences of deviations from the standards.

  2. Anatomy of the Eustachian Tube.

    Science.gov (United States)

    Leuwer, Rudolf

    2016-10-01

    The eustachian tube consists of 2 compartments: the Rüdinger's safety canal and the auxiliary gap. It is surrounded by a cartilaginous wall on the craniomedial side and a membranous wall on the inferolateral side. The eustachian tube cartilage is firmly attached to the skull base by the lateral and the medial suspensory ligaments, which are separated by the medial Ostmann fat pad. The function of the isometric tensor veli palatini muscle is modulated by hypomochlia, which have an influence on the muscular force vectors.

  3. Method for out-of-focus camera calibration.

    Science.gov (United States)

    Bell, Tyler; Xu, Jing; Zhang, Song

    2016-03-20

    State-of-the-art camera calibration methods assume that the camera is at least nearly in focus and thus fail if the camera is substantially defocused. This paper presents a method which enables the accurate calibration of an out-of-focus camera. Specifically, the proposed method uses a digital display (e.g., liquid crystal display monitor) to generate fringe patterns that encode feature points into the carrier phase; these feature points can be accurately recovered, even if the fringe patterns are substantially blurred (i.e., the camera is substantially defocused). Experiments demonstrated that the proposed method can accurately calibrate a camera regardless of the amount of defocusing: the focal length difference is approximately 0.2% when the camera is focused compared to when the camera is substantially defocused.
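
    The core idea, encoding positions in the carrier phase of displayed fringes and recovering that phase even from heavily blurred captures, can be sketched with simulated data. The resolution, fringe period, number of phase shifts and the Gaussian blur standing in for defocus are all assumptions for illustration.

        # Sketch: phase-shifted fringes survive defocus; blur attenuates contrast but keeps the phase.
        import numpy as np
        from scipy.ndimage import gaussian_filter

        width, period, steps = 640, 32, 4
        x = np.arange(width)
        shifts = 2 * np.pi * np.arange(steps) / steps

        patterns = [0.5 + 0.5 * np.cos(2 * np.pi * x / period + s) for s in shifts]  # displayed fringes
        captures = [gaussian_filter(p, sigma=15) for p in patterns]                  # defocused captures

        num = sum(c * np.sin(s) for c, s in zip(captures, shifts))
        den = sum(c * np.cos(s) for c, s in zip(captures, shifts))
        wrapped_phase = np.arctan2(-num, den)                   # standard N-step phase estimator
        print("recovered phase at pixel 100:", wrapped_phase[100])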

  4. World's fastest and most sensitive astronomical camera

    Science.gov (United States)

    2009-06-01

    The next generation of instruments for ground-based telescopes took a leap forward with the development of a new ultra-fast camera that can take 1500 finely exposed images per second even when observing extremely faint objects. The first 240x240 pixel images with the world's fastest high precision faint light camera were obtained through a collaborative effort between ESO and three French laboratories from the French Centre National de la Recherche Scientifique/Institut National des Sciences de l'Univers (CNRS/INSU). Cameras such as this are key components of the next generation of adaptive optics instruments of Europe's ground-based astronomy flagship facility, the ESO Very Large Telescope (VLT). "The performance of this breakthrough camera is without an equivalent anywhere in the world. The camera will enable great leaps forward in many areas of the study of the Universe," says Norbert Hubin, head of the Adaptive Optics department at ESO. OCam will be part of the second-generation VLT instrument SPHERE. To be installed in 2011, SPHERE will take images of giant exoplanets orbiting nearby stars. A fast camera such as this is needed as an essential component for the modern adaptive optics instruments used on the largest ground-based telescopes. Telescopes on the ground suffer from the blurring effect induced by atmospheric turbulence. This turbulence causes the stars to twinkle in a way that delights poets, but frustrates astronomers, since it blurs the finest details of the images. Adaptive optics techniques overcome this major drawback, so that ground-based telescopes can produce images that are as sharp as if taken from space. Adaptive optics is based on real-time corrections computed from images obtained by a special camera working at very high speeds. Nowadays, this means many hundreds of times each second. The new generation instruments require these

  5. Depth estimation and camera calibration of a focused plenoptic camera for visual odometry

    Science.gov (United States)

    Zeller, Niclas; Quint, Franz; Stilla, Uwe

    2016-08-01

    This paper presents new and improved methods of depth estimation and camera calibration for visual odometry with a focused plenoptic camera. For depth estimation we adapt an algorithm previously used in structure-from-motion approaches to work with images of a focused plenoptic camera. In the raw image of a plenoptic camera, scene patches are recorded in several micro-images under slightly different angles. This leads to a multi-view stereo problem. To reduce the complexity, we divide this into multiple binocular stereo problems. For each pixel with sufficient gradient we estimate a virtual (uncalibrated) depth based on local intensity error minimization. The estimated depth is characterized by the variance of the estimate and is subsequently updated with the estimates from other micro-images. Updating is performed in a Kalman-like fashion. The result of depth estimation in a single image of the plenoptic camera is a probabilistic depth map, where each depth pixel consists of an estimated virtual depth and a corresponding variance. Since the resulting image of the plenoptic camera contains two planes, the optical image and the depth map, camera calibration is divided into two separate sub-problems. The optical path is calibrated based on a traditional calibration method. For calibrating the depth map we introduce two novel model-based methods, which define the relation between the virtual depth, estimated from the light-field image, and the metric object distance. These two methods are compared to a well-known curve fitting approach. Both model-based methods show significant advantages compared to the curve fitting method. For visual odometry we fuse the probabilistic depth map gained from one shot of the plenoptic camera with the depth data gained by finding stereo correspondences between subsequent synthesized intensity images of the plenoptic camera. These images can be synthesized totally focused, and thus finding stereo correspondences is enhanced
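
    The Kalman-like update can be written in a few lines: each new virtual-depth observation and its variance are fused into the running estimate by inverse-variance weighting. The observation values below are placeholders for a single pixel seen in several micro-images.

        # Sketch: variance-weighted (Kalman-like) fusion of per-micro-image depth estimates.
        def fuse(depth, var, new_depth, new_var):
            """Fuse two independent Gaussian depth estimates; returns the new mean and variance."""
            k = var / (var + new_var)                  # gain for a scalar state
            return depth + k * (new_depth - depth), (1.0 - k) * var

        observations = [(4.1, 0.30), (3.9, 0.20), (4.3, 0.50), (4.0, 0.10)]   # (virtual depth, variance)
        depth, var = observations[0]
        for d, v in observations[1:]:
            depth, var = fuse(depth, var, d, v)
        print(f"fused virtual depth: {depth:.2f} (variance {var:.3f})")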

  6. Situational Awareness from a Low-Cost Camera System

    Science.gov (United States)

    Freudinger, Lawrence C.; Ward, David; Lesage, John

    2010-01-01

    A method gathers scene information from a low-cost camera system. Existing surveillance systems using sufficient cameras for continuous coverage of a large field necessarily generate enormous amounts of raw data. Digitizing and channeling that data to a central computer and processing it in real time is difficult when using low-cost, commercially available components. A newly developed system is located on a combined power and data wire to form a string-of-lights camera system. Each camera is accessible through this network interface using standard TCP/IP networking protocols. The cameras more closely resemble cell-phone cameras than traditional security camera systems. Processing capabilities are built directly onto the camera backplane, which helps maintain a low cost. The low power requirements of each camera allow the creation of a single imaging system comprising over 100 cameras. Each camera has built-in processing capabilities to detect events and cooperatively share this information with neighboring cameras. The location of the event is reported to the host computer in Cartesian coordinates computed from data correlation across multiple cameras. In this way, events in the field of view can present low-bandwidth information to the host rather than high-bandwidth bitmap data constantly being generated by the cameras. This approach offers greater flexibility than conventional systems, without compromising performance through using many small, low-cost cameras with overlapping fields of view. This means significant increased viewing without ignoring surveillance areas, which can occur when pan, tilt, and zoom cameras look away. Additionally, due to the sharing of a single cable for power and data, the installation costs are lower. The technology is targeted toward 3D scene extraction and automatic target tracking for military and commercial applications. Security systems and environmental/ vehicular monitoring systems are also potential applications.

  7. Pile volume measurement by range imaging camera in indoor environment

    OpenAIRE

    C. Altuntas

    2014-01-01

    Range imaging (RIM) cameras are a recent technology for 3D location measurement. New study areas have emerged in measurement and data processing together with the RIM camera. It offers a low-cost and fast measurement technique compared to current measurement techniques. However, its measurement accuracy varies according to effects resulting from the device and the environment; direct sunlight, in particular, affects the measurement accuracy of the camera. Thus, the RIM camera should be used for indoor ...

  8. Manually operated piston-driven shock tube

    OpenAIRE

    Reddy, KPJ; Sharath, N

    2013-01-01

    A simple hand-operated shock tube capable of producing Mach 2 shock waves is described. The performance of this miniature shock tube is presented, using high pressure air compressed by a manually operated piston in the driver section as the driver gas and air at 1 atm pressure in the driven tube as the test gas. The performance of the shock tube is found to match well with the values estimated theoretically using normal shock relations. Applications of this shock tube named ...

  9. 21 CFR 876.5980 - Gastrointestinal tube and accessories.

    Science.gov (United States)

    2010-04-01

    ... intubation, feeding tube, gastroenterostomy tube, Levine tube, nasogastric tube, single lumen tube with... § 876.9. (2) Class I (general controls) for the dissolvable nasogastric feed tube guide for the nasogastric tube. The class I device is exempt from the premarket notification procedures in subpart E of...

  10. Camera Augmented Mobile C-arm

    Science.gov (United States)

    Wang, Lejing; Weidert, Simon; Traub, Joerg; Heining, Sandro Michael; Riquarts, Christian; Euler, Ekkehard; Navab, Nassir

    The Camera Augmented Mobile C-arm (CamC) system that extends a regular mobile C-arm by a video camera provides an X-ray and video image overlay. Thanks to the mirror construction and one time calibration of the device, the acquired X-ray images are co-registered with the video images without any calibration or registration during the intervention. It is very important to quantify and qualify the system before its introduction into the OR. In this communication, we extended the previously performed overlay accuracy analysis of the CamC system by another clinically important parameter, the applied radiation dose for the patient. Since the mirror of the CamC system will absorb and scatter radiation, we introduce a method for estimating the correct applied dose by using an independent dose measurement device. The results show that the mirror absorbs and scatters 39% of X-ray radiation.

  11. First polarised light with the NIKA camera

    CERN Document Server

    Ritacco, A; Adane, A; Ade, P; André, P; Beelen, A; Belier, B; Benoît, A; Bideaud, A; Billot, N; Bourrion, O; Calvo, M; Catalano, A; Coiffard, G; Comis, B; D'Addabbo, A; Désert, F -X; Doyle, S; Goupy, J; Kramer, C; Leclercq, S; Macías-Pérez, J F; Martino, J; Mauskopf, P; Maury, A; Mayet, F; Monfardini, A; Pajot, F; Pascale, E; Perotto, L; Pisano, G; Ponthieu, N; Rebolo-Iglesias, M; Réveret, V; Rodriguez, L; Savini, G; Schuster, K; Sievers, A; Thum, C; Triqueneaux, S; Tucker, C; Zylka, R

    2015-01-01

    NIKA is a dual-band camera operating with 315 frequency multiplexed LEKIDs cooled at 100 mK. NIKA is designed to observe the sky in intensity and polarisation at 150 and 260 GHz from the IRAM 30-m telescope. It is a test-bench for the final NIKA2 camera. The incoming linear polarisation is modulated at four times the mechanical rotation frequency by a warm rotating multi-layer half-wave plate. Then, the signal is analysed by a wire grid and finally absorbed by the LEKIDs. The small time constant (< 1 ms) of the LEKID detectors combined with the modulation of the HWP enables the quasi-simultaneous measurement of the three Stokes parameters I, Q, U, representing linear polarisation. In this paper we present results of recent observational campaigns demonstrating the good performance of NIKA in detecting polarisation at mm wavelength.

  12. SLAM using camera and IMU sensors.

    Energy Technology Data Exchange (ETDEWEB)

    Rothganger, Fredrick H.; Muguira, Maritza M.

    2007-01-01

    Visual simultaneous localization and mapping (VSLAM) is the problem of using video input to reconstruct the 3D world and the path of the camera in an 'on-line' manner. Since the data is processed in real time, one does not have access to all of the data at once. (Contrast this with structure from motion (SFM), which is usually formulated as an 'off-line' process on all the data seen, and is not time dependent.) A VSLAM solution is useful for mobile robot navigation or as an assistant for humans exploring an unknown environment. This report documents the design and implementation of a VSLAM system that consists of a small inertial measurement unit (IMU) and camera. The approach is based on a modified Extended Kalman Filter. This research was performed under a Laboratory Directed Research and Development (LDRD) effort.
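
    A minimal Extended Kalman Filter skeleton of the kind such a camera-plus-IMU system builds on is sketched below: the IMU drives the prediction step and a camera-derived measurement drives the update step. The linear toy model, noise values and measurement are placeholders, not the report's actual state or measurement models.

        # Sketch: scalar-position EKF loop with IMU-driven prediction and camera-driven updates.
        import numpy as np

        x = np.zeros(2)                       # toy state: [position, velocity]
        P = np.eye(2)                         # state covariance
        Q = np.diag([1e-3, 1e-2])             # process noise (IMU integration uncertainty)
        R = np.array([[0.05]])                # measurement noise (camera-derived position)
        dt = 0.01

        def predict(x, P, accel):
            F = np.array([[1.0, dt], [0.0, 1.0]])              # constant-velocity transition
            x = F @ x + np.array([0.5 * dt ** 2, dt]) * accel  # integrate the IMU acceleration
            return x, F @ P @ F.T + Q

        def update(x, P, z):
            H = np.array([[1.0, 0.0]])                         # camera observes position only
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)                     # Kalman gain
            x = x + K @ (z - H @ x)
            return x, (np.eye(2) - K @ H) @ P

        for step in range(100):
            x, P = predict(x, P, accel=0.2)                    # every IMU sample
            if step % 10 == 0:
                x, P = update(x, P, z=np.array([x[0] + 0.01])) # occasional camera fix (placeholder)
        print("state:", x, "covariance diagonal:", np.diag(P))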

  13. The large APEX bolometer camera LABOCA

    Science.gov (United States)

    Siringo, Giorgio; Kreysa, Ernst; Kovacs, Attila; Schuller, Frederic; Weiß, Axel; Esch, Walter; Gemünd, Hans-Peter; Jethava, Nikhil; Lundershausen, Gundula; Güsten, Rolf; Menten, Karl M.; Beelen, Alexandre; Bertoldi, Frank; Beeman, Jeffrey W.; Haller, Eugene E.; Colin, Angel

    2008-07-01

    A new facility instrument, the Large APEX Bolometer Camera (LABOCA), developed by the Max-Planck-Institut für Radioastronomie (MPIfR, Bonn, Germany), has been commissioned in May 2007 for operation on the Atacama Pathfinder Experiment telescope (APEX), a 12 m submillimeter radio telescope located at 5100 m altitude on Llano de Chajnantor in northern Chile. For mapping, this 295-bolometer camera for the 870 micron atmospheric window operates in total power mode without wobbling the secondary mirror. One LABOCA beam is 19 arcsec FWHM and the field of view of the complete array covers 100 square arcmin. Combined with the high efficiency of APEX and the excellent atmospheric transmission at the site, LABOCA offers unprecedented capability in large scale mapping of submillimeter continuum emission. Details of design and operation are presented.

  14. First Polarised Light with the NIKA Camera

    Science.gov (United States)

    Ritacco, A.; Adam, R.; Adane, A.; Ade, P.; André, P.; Beelen, A.; Belier, B.; Benoît, A.; Bideaud, A.; Billot, N.; Bourrion, O.; Calvo, M.; Catalano, A.; Coiffard, G.; Comis, B.; D'Addabbo, A.; Désert, F.-X.; Doyle, S.; Goupy, J.; Kramer, C.; Leclercq, S.; Macías-Pérez, J. F.; Martino, J.; Mauskopf, P.; Maury, A.; Mayet, F.; Monfardini, A.; Pajot, F.; Pascale, E.; Perotto, L.; Pisano, G.; Ponthieu, N.; Rebolo-Iglesias, M.; Revéret, V.; Rodriguez, L.; Savini, G.; Schuster, K.; Sievers, A.; Thum, C.; Triqueneaux, S.; Tucker, C.; Zylka, R.

    2016-08-01

    NIKA is a dual-band camera operating with 315 frequency multiplexed LEKIDs cooled at 100 mK. NIKA is designed to observe the sky in intensity and polarisation at 150 and 260 GHz from the IRAM 30-m telescope. It is a test-bench for the final NIKA2 camera. The incoming linear polarisation is modulated at four times the mechanical rotation frequency by a warm rotating multi-layer half-wave plate. Then, the signal is analyzed by a wire grid and finally absorbed by the lumped element kinetic inductance detectors (LEKIDs). The small time constant (< 1 ms) of the LEKIDs combined with the modulation of the HWP enables the quasi-simultaneous measurement of the three Stokes parameters I, Q, U, representing linear polarisation. In this paper, we present the results of recent observational campaigns demonstrating the good performance of NIKA in detecting polarisation at millimeter wavelength.

  15. Cervical SPECT Camera for Parathyroid Imaging

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2012-08-31

    Primary hyperparathyroidism characterized by one or more enlarged parathyroid glands has become one of the most common endocrine diseases in the world affecting about 1 per 1000 in the United States. Standard treatment is highly invasive exploratory neck surgery called Parathyroidectomy. The surgery has a notable mortality rate because of the close proximity to vital structures. The move to minimally invasive parathyroidectomy is hampered by the lack of high resolution pre-surgical imaging techniques that can accurately localize the parathyroid with respect to surrounding structures. We propose to develop a dedicated ultra-high resolution (~ 1 mm) and high sensitivity (10x conventional camera) cervical scintigraphic imaging device. It will be based on a multiple pinhole-camera SPECT system comprising a novel solid state CZT detector that offers the required performance. The overall system will be configured to fit around the neck and comfortably image a patient.

  16. AUTOMATIC THEFT SECURITY SYSTEM (SMART SURVEILLANCE CAMERA)

    Directory of Open Access Journals (Sweden)

    Veena G.S

    2013-12-01

    The proposed work aims to create a smart application camera, with the intention of eliminating the need for a human presence to detect any unwanted sinister activities, such as theft in this case. Spread across the campus, at arbitrary locations, are certain valuable biometric identification systems. The application monitors these systems (hereafter referred to as "object") using our smart camera system based on an OpenCV platform. Using OpenCV Haar training, which employs the Viola-Jones algorithm implementation in OpenCV, we teach the machine to identify the object under varying environmental conditions. An added face recognition feature is based on Principal Component Analysis (PCA) to generate eigenfaces; test images are verified against the eigenfaces using a distance-based algorithm such as the Euclidean or Mahalanobis distance. If the object is misplaced, or an unauthorized user is in the extreme vicinity of the object, an alarm signal is raised.
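    As a rough illustration of the two-stage approach described above (Haar-cascade detection followed by eigenface verification with a distance test), the sketch below uses OpenCV for detection and plain NumPy for the PCA step. The cascade file, enrolled images, image size and distance threshold are placeholders, not the authors' actual configuration.

        import cv2
        import numpy as np

        SIZE = (100, 100)                        # assumed common face image size

        # 1) Haar-cascade detection of faces near the monitored object.
        cascade = cv2.CascadeClassifier("object_cascade.xml")  # from OpenCV Haar training
        frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
        detections = cascade.detectMultiScale(frame, scaleFactor=1.1, minNeighbors=5)

        # 2) Eigenfaces: PCA basis built from enrolled (authorized) user images.
        enrolled_files = ["user1.png", "user2.png"]             # placeholder images
        data = np.array([cv2.resize(cv2.imread(f, cv2.IMREAD_GRAYSCALE), SIZE).flatten()
                         for f in enrolled_files], dtype=np.float64)
        mean = data.mean(axis=0)
        _, _, vt = np.linalg.svd(data - mean, full_matrices=False)
        eigenfaces = vt[:8]                                      # top principal components
        enrolled_proj = (data - mean) @ eigenfaces.T

        THRESHOLD = 2500.0                       # placeholder Euclidean distance threshold
        for (x, y, w, h) in detections:
            face = cv2.resize(frame[y:y + h, x:x + w], SIZE).flatten().astype(np.float64)
            proj = (face - mean) @ eigenfaces.T
            if np.linalg.norm(enrolled_proj - proj, axis=1).min() > THRESHOLD:
                print("Unauthorized person near the object - raise alarm")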

  17. Advanced EVA Suit Camera System Development Project

    Science.gov (United States)

    Mock, Kyla

    2016-01-01

    The National Aeronautics and Space Administration (NASA) at the Johnson Space Center (JSC) is developing a new extra-vehicular activity (EVA) suit known as the Advanced EVA Z2 Suit. All of the improvements to the EVA Suit provide the opportunity to update the technology of the video imagery. My summer internship project involved improving the video streaming capabilities of the cameras that will be used on the Z2 Suit for data acquisition. To accomplish this, I familiarized myself with the architecture of the camera that is currently being tested to be able to make improvements on the design. Because there is a lot of benefit to saving space, power, and weight on the EVA suit, my job was to use Altium Designer to start designing a much smaller and simplified interface board for the camera's microprocessor and external components. This involved checking datasheets of various components and checking signal connections to ensure that this architecture could be used for both the Z2 suit and potentially other future projects. The Orion spacecraft is a specific project that may benefit from this condensed camera interface design. The camera's physical placement on the suit also needed to be determined and tested so that image resolution can be maximized. Many of the options of the camera placement may be tested along with other future suit testing. There are multiple teams that work on different parts of the suit, so the camera's placement could directly affect their research or design. For this reason, a big part of my project was initiating contact with other branches and setting up multiple meetings to learn more about the pros and cons of the potential camera placements we are analyzing. Collaboration with the multiple teams working on the Advanced EVA Z2 Suit is absolutely necessary and these comparisons will be used as further progress is made for the overall suit design. This prototype will not be finished in time for the scheduled Z2 Suit testing, so my time was

  18. Continuous Graph Partitioning for Camera Network Surveillance

    Science.gov (United States)

    2012-07-23

  19. Task-based automatic camera placement

    OpenAIRE

    Kabak, Mustafa

    2010-01-01

    Ankara : The Department of Computer Engineering and the Institute of Engineering and Science of Bilkent Univ, 2010. Thesis (Master's) -- Bilkent University, 2010. Includes bibliographical references 56-57. Placing cameras to view an animation that takes place in a virtual 3D environment is a difficult task. Correctly placing an object in space and orienting it, and furthermore, animating it to follow the action in the scene is an activity that requires considerable expertise. ...

  20. Using a portable holographic camera in cosmetology

    Science.gov (United States)

    Bakanas, R.; Gudaitis, G. A.; Zacharovas, S. J.; Ratcliffe, D. B.; Hirsch, S.; Frey, S.; Thelen, A.; Ladrière, N.; Hering, P.

    2006-07-01

    The HSF-MINI portable holographic camera is used to record holograms of the human face. The recorded holograms are analyzed using a unique three-dimensional measurement system that provides topometric data of the face with resolution less than or equal to 0.5 mm. The main advantages of this method over other, more traditional methods (such as laser triangulation and phase-measurement triangulation) are discussed.

  1. VIRUS-P: camera design and performance

    Science.gov (United States)

    Tufts, Joseph R.; MacQueen, Phillip J.; Smith, Michael P.; Segura, Pedro R.; Hill, Gary J.; Edmonston, Robert D.

    2008-07-01

    We present the design and performance of the prototype Visible Integral-field Replicable Unit Spectrograph (VIRUS-P) camera. Commissioned in 2007, VIRUS-P is the prototype for 150+ identical fiber-fed integral field spectrographs for the Hobby-Eberly Telescope Dark Energy Experiment. With minimal complexity, the gimbal mounted, double-Schmidt design achieves high on-sky throughput, image quality, contrast, and stability with novel optics, coatings, baffling, and minimization of obscuration. The system corrector working for both the collimator and f/1.33 vacuum Schmidt camera serves as the cryostat window while a 49 mm square aspheric field flattener sets the central obscuration. The mount, electronics, and cooling of the 2k × 2k, Fairchild Imaging CCD3041-BI fit in the field-flattener footprint. Ultra-black knife edge baffles at the corrector, spider, and adjustable mirror, and a detector mask, match the optical footprints at each location and help maximize the 94% contrast between 245 spectra. An optimally stiff and light symmetric four vane stainless steel spider supports the CCD which is thermally isolated with an equally stiff Ultem-1000 structure. The detector/field flattener spacing is maintained to 1 μm for all camera orientations and repeatably reassembled to 12 μm. Invar rods in tension hold the camera focus to ±4 μm over a -5 to 25 °C temperature range. Delivering a read noise of 4.2 e⁻ RMS, sCTE of 1 − 10⁻⁵ and pCTE of 1 − 10⁻⁶ at 100 kpix/s, the McDonald V2 controller also helps to achieve a 38 hr hold time with 3 L of LN2 while maintaining the detector temperature setpoint to 150 μK (5σ RMS).

  2. Camera Development for the Cherenkov Telescope Array

    Science.gov (United States)

    Moncada, Roberto Jose

    2017-01-01

    With the Cherenkov Telescope Array (CTA), the very-high-energy gamma-ray universe, between 30 GeV and 300 TeV, will be probed at an unprecedented resolution, allowing deeper studies of known gamma-ray emitters and the possible discovery of new ones. This exciting project could also confirm the particle nature of dark matter by looking for the gamma rays produced by self-annihilating weakly interacting massive particles (WIMPs). The telescopes will use the imaging atmospheric Cherenkov technique (IACT) to record Cherenkov photons that are produced by the gamma-ray induced extensive air shower. One telescope design features dual-mirror Schwarzschild-Couder (SC) optics that allows the light to be finely focused on the high-resolution silicon photomultipliers of the camera modules starting from a 9.5-meter primary mirror. Each camera module will consist of a focal plane module and front-end electronics, and will have four TeV Array Readout with GSa/s Sampling and Event Trigger (TARGET) chips, giving them 64 parallel input channels. The TARGET chip has a self-trigger functionality for readout that can be used in higher logic across camera modules as well as across individual telescopes, which will each have 177 camera modules. There will be two sites, one in the northern and the other in the southern hemisphere, for full sky coverage, each spanning at least one square kilometer. A prototype SC telescope is currently under construction at the Fred Lawrence Whipple Observatory in Arizona. This work was supported by the National Science Foundation's REU program through NSF award AST-1560016.

  3. Rank-based camera spectral sensitivity estimation.

    Science.gov (United States)

    Finlayson, Graham; Darrodi, Maryam Mohammadzadeh; Mackiewicz, Michal

    2016-04-01

    In order to accurately predict a digital camera response to spectral stimuli, the spectral sensitivity functions of its sensor need to be known. These functions can be determined by direct measurement in the lab (a difficult and lengthy procedure) or through simple statistical inference. Statistical inference methods are based on the observation that when a camera responds linearly to spectral stimuli, the device spectral sensitivities are linearly related to the camera RGB response values, and so can be found through regression. However, for rendered images, such as the JPEG images taken by a mobile phone, this assumption of linearity is violated. Even small departures from linearity can negatively impact the accuracy of the recovered spectral sensitivities when a regression method is used. In our work, we develop a novel camera spectral sensitivity estimation technique that can recover the linear device spectral sensitivities from linear images and the effective linear sensitivities from rendered images. According to our method, the rank order of a pair of responses imposes a constraint on the shape of the underlying spectral sensitivity curve (of the sensor). Technically, each rank-pair splits the space where the underlying sensor might lie into two parts (a feasible region and an infeasible region). By intersecting the feasible regions from all the ranked pairs, we can find a feasible region of sensor space. Experiments demonstrate that using rank orders delivers estimation accuracy equal to that of the prior art. However, the rank-based method delivers a step-change in estimation performance when the data is not linear and, for the first time, allows for the estimation of the effective sensitivities of devices that may not even have "raw mode." Experiments validate our method.
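    The rank-order constraint can be made concrete with a toy feasibility check: every pair of responses with a known ordering contributes one linear inequality on the sampled sensitivity vector, and any sensor inside the intersection of those half-spaces is consistent with the observed ranks. The sketch below only illustrates this idea with synthetic data; it is not the authors' estimator.

        import numpy as np
        from scipy.optimize import linprog

        rng = np.random.default_rng(0)
        n_wave, n_patch = 31, 40                  # e.g. 400-700 nm in 10 nm steps (assumed)
        C = rng.random((n_patch, n_wave))         # synthetic colour signals (illuminant x reflectance)
        s_true = np.exp(-0.5 * ((np.arange(n_wave) - 15) / 5.0) ** 2)   # synthetic sensor
        r = C @ s_true                            # noiseless camera responses

        # If patch lo is ranked below patch hi (r_lo < r_hi), then (c_lo - c_hi) . s <= 0.
        order = np.argsort(r)
        A = np.array([C[lo] - C[hi] for lo, hi in zip(order[:-1], order[1:])])

        # Feasibility: find any non-negative, bounded s satisfying all rank constraints.
        res = linprog(c=np.zeros(n_wave), A_ub=A, b_ub=np.zeros(len(A)),
                      bounds=[(0.0, 1.0)] * n_wave, method="highs")
        print("feasible sensor found:", res.success)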

  4. Tracking Using Peer-to-Peer Smart Infrared Cameras

    Science.gov (United States)

    2008-11-05

    calibration and gesture recognition from multi-spectral camera setups, including infrared and visible cameras. Result: We developed new object models for ... work on single-camera gesture recognition. We partnered with Yokogawa Electric to develop new architectures for embedded computer vision. We developed

  5. Speed cameras : how they work and what effect they have.

    NARCIS (Netherlands)

    2011-01-01

    Much research has been carried out into the effects of speed cameras, and the research shows consistently positive results. International review studies report that speed cameras produce a reduction of approximately 20% in personal injury crashes on road sections where cameras are used. In the Nethe

  6. CCD characterization for a range of color cameras

    NARCIS (Netherlands)

    Withagen, P.J.; Groen, F.C.A.; Schutte, K.

    2005-01-01

    CCD cameras are widely used for remote sensing and image processing applications. However, most cameras are produced to create nice images, not to do accurate measurements. Post processing operations such as gamma adjustment and automatic gain control are incorporated in the camera. When a (CCD) cam

  7. 16 CFR 3.45 - In camera orders.

    Science.gov (United States)

    2010-01-01

    ... 16 Commercial Practices 1 2010-01-01 2010-01-01 false In camera orders. 3.45 Section 3.45... PRACTICE FOR ADJUDICATIVE PROCEEDINGS Hearings § 3.45 In camera orders. (a) Definition. Except as hereinafter provided, material made subject to an in camera order will be kept confidential and not placed...

  8. 21 CFR 892.1100 - Scintillation (gamma) camera.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Scintillation (gamma) camera. 892.1100 Section 892...) MEDICAL DEVICES RADIOLOGY DEVICES Diagnostic Devices § 892.1100 Scintillation (gamma) camera. (a) Identification. A scintillation (gamma) camera is a device intended to image the distribution of radionuclides...

  9. Weed detection by UAV with camera guided landing sequence

    DEFF Research Database (Denmark)

    Dyrmann, Mads

    the built-in GPS, allows for the UAV to be navigated within the field of view of a camera, which is mounted on the landing platform. The camera on the platform determines the UAV's position and orientation from markers printed on the UAV, whereby it can be guided in its landing. The UAV has a camera mounted...

  10. 39 CFR 3001.31a - In camera orders.

    Science.gov (United States)

    2010-07-01

    ... 39 Postal Service 1 2010-07-01 2010-07-01 false In camera orders. 3001.31a Section 3001.31a Postal... Applicability § 3001.31a In camera orders. (a) Definition. Except as hereinafter provided, documents and testimony made subject to in camera orders are not made a part of the public record, but are...

  11. 15 CFR 743.3 - Thermal imaging camera reporting.

    Science.gov (United States)

    2010-01-01

    ... 15 Commerce and Foreign Trade 2 2010-01-01 2010-01-01 false Thermal imaging camera reporting. 743... REPORTING § 743.3 Thermal imaging camera reporting. (a) General requirement. Exports of thermal imaging cameras must be reported to BIS as provided in this section. (b) Transactions to be reported. Exports...

  12. 21 CFR 878.4160 - Surgical camera and accessories.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Surgical camera and accessories. 878.4160 Section... (CONTINUED) MEDICAL DEVICES GENERAL AND PLASTIC SURGERY DEVICES Surgical Devices § 878.4160 Surgical camera and accessories. (a) Identification. A surgical camera and accessories is a device intended to be...

  13. Single eye or camera with depth perception

    Science.gov (United States)

    Kornreich, Philipp; Farell, Bart

    2012-10-01

    An imager that can measure the distance from each pixel to the point on the object that is in focus at the pixel is described. This is accomplished by a short photoconducting lossy lightguide section at each pixel. The eye or camera lens selects the object point whose range is to be determined at the pixel. Light arriving at an image point through a convex lens adds constructively only if it comes from the object point that is in focus at this pixel. Light waves from all other object points cancel. Thus the lightguide at this pixel receives light from one object point only. This light signal has a phase component proportional to the range. The light intensity modes, and thus the photocurrent in the lightguides, shift in response to the phase of the incoming light. Contacts along the length of the lightguide collect the photocurrent signal containing the range information. Applications of this camera include autonomous vehicle navigation and robotic vision. An interesting application is as part of a crude teleportation system consisting of this camera and a three-dimensional printer at a remote location.

  14. Auto convergence for stereoscopic 3D cameras

    Science.gov (United States)

    Zhang, Buyue; Kothandaraman, Sreenivas; Batur, Aziz Umit

    2012-03-01

    Viewing comfort is an important concern for 3-D capable consumer electronics such as 3-D cameras and TVs. Consumer-generated content is typically viewed at a close distance, which makes the vergence-accommodation conflict particularly pronounced, causing discomfort and eye fatigue. In this paper, we present a Stereo Auto Convergence (SAC) algorithm for consumer 3-D cameras that reduces the vergence-accommodation conflict on the 3-D display by adjusting the depth of the scene automatically. Our algorithm processes stereo video in real time and shifts each stereo frame horizontally by an appropriate amount to converge on the chosen object in that frame. The algorithm starts by estimating disparities between the left and right image pairs using correlations of the vertical projections of the image data. The estimated disparities are then analyzed by the algorithm to select a point of convergence. The current and target disparities of the chosen convergence point determine how much horizontal shift is needed. A disparity safety check is then performed to determine whether or not the maximum and minimum disparity limits would be exceeded after auto convergence. If the limits would be exceeded, further adjustments are made to satisfy the safety limits. Finally, desired convergence is achieved by shifting the left and the right frames accordingly. Our algorithm runs in real time at 30 fps on a TI OMAP4 processor. It is tested using an OMAP4 embedded prototype stereo 3-D camera. It significantly improves 3-D viewing comfort.
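    A minimal sketch of the projection-correlation disparity estimate described above is given below; the exact windowing, convergence-point selection and disparity safety checks of the TI implementation are not reproduced, and the function and parameter names are placeholders.

        import numpy as np

        def estimate_disparity(left, right, max_shift=64):
            """Horizontal shift that best aligns the vertical (column) projections
            of two grayscale stereo frames, found by normalized correlation."""
            pl = left.astype(np.float64).sum(axis=0)
            pr = right.astype(np.float64).sum(axis=0)
            pl = (pl - pl.mean()) / (pl.std() + 1e-9)
            pr = (pr - pr.mean()) / (pr.std() + 1e-9)
            best_shift, best_score = 0, -np.inf
            for d in range(-max_shift, max_shift + 1):
                if d >= 0:
                    score = np.dot(pl[d:], pr[:pr.size - d])
                else:
                    score = np.dot(pl[:d], pr[-d:])
                if score > best_score:
                    best_shift, best_score = d, score
            return best_shift

        def converge(left, right, target_disparity=0):
            """Shift each frame by half of the required amount so that the chosen
            convergence point reaches the target disparity."""
            shift = (estimate_disparity(left, right) - target_disparity) // 2
            return np.roll(left, -shift, axis=1), np.roll(right, shift, axis=1)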

  15. Stereo cameras on the International Space Station

    Science.gov (United States)

    Sabbatini, Massimo; Visentin, Gianfranco; Collon, Max; Ranebo, Hans; Sunderland, David; Fortezza, Raimondo

    2007-02-01

    Three-dimensional media is a unique and efficient means to virtually visit or observe objects that cannot be easily reached otherwise, like the International Space Station. The advent of auto-stereoscopic displays and stereo projection systems is making stereo media available to larger audiences than the traditional communities of scientists and design engineers. It is foreseen that a major demand for 3D content shall come from the entertainment area. Taking advantage of the six-month stay of European astronaut Thomas Reiter on the International Space Station, the Erasmus Centre uploaded to the ISS a newly developed, fully digital stereo camera, the Erasmus Recording Binocular. Testing the camera and its human interfaces in weightlessness, as well as accurately mapping the interior of the ISS, are the main objectives of the experiment that has just been completed at the time of writing. The intent of this paper is to share with the readers the design challenges tackled in the development and operation of the ERB camera and to highlight some of the future plans the Erasmus Centre team has in the pipeline.

  16. Infrared stereo camera for human machine interface

    Science.gov (United States)

    Edmondson, Richard; Vaden, Justin; Chenault, David

    2012-06-01

    Improved situational awareness results not only from improved performance of imaging hardware, but also when the operator and human factors are considered. Situational awareness for IR imaging systems frequently depends on the contrast available. A significant improvement in effective contrast for the operator can result when depth perception is added to the display of IR scenes. Depth perception through flat panel 3D displays is now possible due to the number of 3D displays entering the consumer market. Such displays require appropriate and human-friendly stereo IR video input in order to be effective in the dynamic military environment. We report on a stereo IR camera that has been developed for integration onto an unmanned ground vehicle (UGV). The camera has auto-convergence capability that significantly reduces ill effects due to image doubling, minimizes focus-convergence mismatch, and eliminates the need for the operator to manually adjust camera properties. Discussion of the size, weight, and power requirements as well as integration onto the robot platform will be given, along with a description of stand-alone operation.

  17. Imaging characteristics of photogrammetric camera systems

    Science.gov (United States)

    Welch, R.; Halliday, J.

    1973-01-01

    In view of the current interest in high-altitude and space photographic systems for photogrammetric mapping, the United States Geological Survey (U.S.G.S.) undertook a comprehensive research project designed to explore the practical aspects of applying the latest image quality evaluation techniques to the analysis of such systems. The project had two direct objectives: (1) to evaluate the imaging characteristics of current U.S.G.S. photogrammetric camera systems; and (2) to develop methodologies for predicting the imaging capabilities of photogrammetric camera systems, comparing conventional systems with new or different types of systems, and analyzing the image quality of photographs. Image quality was judged in terms of a number of evaluation factors including response functions, resolving power, and the detectability and measurability of small detail. The limiting capabilities of the U.S.G.S. 6-inch and 12-inch focal length camera systems were established by analyzing laboratory and aerial photographs in terms of these evaluation factors. In the process, the contributing effects of relevant parameters such as lens aberrations, lens aperture, shutter function, image motion, film type, and target contrast were examined, along with procedures for analyzing image quality and predicting and comparing performance capabilities. © 1973.

  18. Remote hardware-reconfigurable robotic camera

    Science.gov (United States)

    Arias-Estrada, Miguel; Torres-Huitzil, Cesar; Maya-Rueda, Selene E.

    2001-10-01

    In this work, a camera with integrated image processing capabilities is discussed. The camera is based on an imager coupled to an FPGA device (Field Programmable Gate Array) which contains an architecture for real-time computer vision low-level processing. The architecture can be reprogrammed remotely for application specific purposes. The system is intended for rapid modification and adaptation for inspection and recognition applications, with the flexibility of hardware and software reprogrammability. FPGA reconfiguration allows the same ease of upgrade in hardware as a software upgrade process. The camera is composed of a digital imager coupled to an FPGA device, two memory banks, and a microcontroller. The microcontroller is used for communication tasks and FPGA programming. The system implements a software architecture to handle multiple FPGA architectures in the device, and the possibility to download a software/hardware object from the host computer into its internal context memory. System advantages are: small size, low power consumption, and a library of hardware/software functionalities that can be exchanged during run time. The system has been validated with an edge detection and a motion processing architecture, which will be presented in the paper. Applications targeted are in robotics, mobile robotics, and vision based quality control.

  19. Color camera pyrometry for high explosive detonations

    Science.gov (United States)

    Densmore, John; Biss, Matthew; Homan, Barrie; McNesby, Kevin

    2011-06-01

    Temperature measurements of high-explosive and combustion processes are difficult because of the speed and environment of the events. We have characterized and calibrated a digital high-speed color camera that may be used as an optical pyrometer to overcome these challenges. The camera provides both high temporal and spatial resolution. The color filter array of the sensor uses three color filters to measure the spectral distribution of the imaged light. A two-color ratio method is used to calculate a temperature using the color filter array raw image data and a gray-body assumption. If the raw image data is not available, temperatures may be calculated from processed images or movies, depending on proper analysis of the digital color imaging pipeline. We analyze three transformations within the pipeline (demosaicing, white balance, and gamma-correction) to determine their effect on the calculated temperature. Using this technique with a Vision Research Phantom color camera, we have measured the temperature of exploded C-4 charges. The surface temperature of the resulting fireball rapidly increases after detonation and then decays to a constant value of approximately 1980 K. Processed images indicate that the temperature remains constant until the light intensity decreases below the background value.
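    The two-color ratio method with a gray-body assumption can be written down compactly under the Wien approximation: the emissivity cancels in the ratio of two channels, leaving the temperature as a function of the measured ratio, the two effective wavelengths and a calibration constant. The sketch below is illustrative only; the effective wavelengths and the constant K are assumed values, not the calibration of the camera used in this work.

        import numpy as np

        C2 = 1.4388e-2                 # second radiation constant [m K]
        LAM_R, LAM_G = 600e-9, 530e-9  # effective red/green channel wavelengths [m] (assumed)

        def ratio_temperature(red_counts, green_counts, K=1.0):
            """Temperature [K] from the red/green ratio, Wien approximation:
            S_i ~ eps * lam_i**-5 * exp(-C2 / (lam_i * T)) * resp_i; for a gray body
            eps cancels and K absorbs the lam**-5 factors and channel responsivities."""
            ratio = np.asarray(red_counts, float) / np.asarray(green_counts, float)
            return C2 * (1.0 / LAM_G - 1.0 / LAM_R) / np.log(ratio / K)

        print(ratio_temperature(3.5, 1.0))   # about 2.5e3 K for these assumed numbers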

  20. Refocusing distance of a standard plenoptic camera.

    Science.gov (United States)

    Hahne, Christopher; Aggoun, Amar; Velisavljevic, Vladan; Fiebig, Susanne; Pesch, Matthias

    2016-09-19

    Recent developments in computational photography enabled variation of the optical focus of a plenoptic camera after image exposure, also known as refocusing. Existing ray models in the field simplify the camera's complexity for the purpose of image and depth map enhancement, but fail to satisfyingly predict the distance to which a photograph is refocused. By treating a pair of light rays as a system of linear functions, it will be shown in this paper that its solution yields an intersection indicating the distance to a refocused object plane. Experimental work is conducted with different lenses and focus settings while comparing distance estimates with a stack of refocused photographs for which a blur metric has been devised. Quantitative assessments over a 24 m distance range suggest that predictions deviate by less than 0.35 % in comparison to optical design software. The proposed refocusing estimator assists in predicting object distances, for example in the prototyping stage of plenoptic cameras, and will be an essential feature in applications demanding high precision in synthetic focus or where depth map recovery is done by analyzing a stack of refocused photographs.
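    The core idea, treating a pair of rays as a system of linear functions whose intersection marks the refocused object plane, reduces to solving two equations in two unknowns. In the sketch below the slopes and intercepts are arbitrary placeholders; in the paper they follow from the plenoptic camera's ray model (micro-lens pitch, main-lens focal length, refocusing shift).

        import numpy as np

        def ray_intersection(m1, c1, m2, c2):
            """Intersection (z, y) of the rays y = m1*z + c1 and y = m2*z + c2."""
            A = np.array([[m1, -1.0], [m2, -1.0]])
            b = np.array([-c1, -c2])
            return np.linalg.solve(A, b)

        # Two example rays leaving the main lens towards object space (assumed values).
        z_refocus, height = ray_intersection(0.02, 0.010, -0.01, 0.055)
        print(f"rays intersect at z = {z_refocus:.2f} m")   # 1.50 m for these numbers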

  1. Theory and applications of smart cameras

    CERN Document Server

    2016-01-01

    This book presents an overview of smart camera systems, considering practical applications but also reviewing fundamental aspects of the underlying technology. It introduces in a tutorial style the principles of sensing and signal processing, and also describes topics such as wireless connection to the Internet of Things (IoT), which is expected to be the biggest market for smart cameras. It is an excellent guide to the fundamentals of smart camera technology, and the chapters complement each other well, as the authors have worked as a team under the auspices of the GFP (Global Frontier Project), the largest-scale funded research in Korea. This is the third of three books based on the Integrated Smart Sensors research project, which describe the development of innovative devices, circuits, and system-level enabling technologies. The aim of the project was to develop common platforms on which various devices and sensors can be loaded, and to create systems offering significant improvements in information processi...

  2. Terrain mapping camera for Chandrayaan-1

    Indian Academy of Sciences (India)

    A S Kiran Kumar; A Roy Chowdhury

    2005-12-01

    The Terrain Mapping Camera (TMC) on India's first satellite for lunar exploration, Chandrayaan-1, is for generating high-resolution 3-dimensional maps of the Moon. With this instrument, a complete topographic map of the Moon with 5 m spatial resolution and 10-bit quantization will be available for scientific studies. The TMC will image within the panchromatic spectral band of 0.4 to 0.9 μm with a stereo view in the fore, nadir and aft directions of the spacecraft movement, and have a B/H ratio of 1. The swath coverage will be 20 km. The camera is configured for imaging in the push-broom mode with three linear detectors in the image plane. The camera will have four gain settings to cover the varying illumination conditions of the Moon. Additionally, a provision for imaging with reduced resolution, for improving the Signal-to-Noise Ratio (SNR) in polar regions, which have poor illumination conditions throughout, has been made. An SNR of better than 100 is expected in the ±60° latitude region for mature mare soil, which is one of the darkest regions on the lunar surface. This paper presents a brief description of the TMC instrument.

  3. OurTube / David Talbot

    Index Scriptorium Estoniae

    Talbot, David

    2009-01-01

    On the video remixes of speeches delivered in the US Congress, initiated in 2005 by Abram Stern and Michael Dale, staff members of the University of California, and on the website Metavid.org that they created. Also on the YouTube environment and on Wikipedia's experiments with making its environment more multimedia-oriented.

  4. Working session 1: Tubing degradation

    Energy Technology Data Exchange (ETDEWEB)

    Kharshafdjian, G. [Atomic Energy of Canada, Mississauga, Ontario (Canada); Turluer, G. [IPSN, Fontenay-aux-Roses (France)

    1997-02-01

    A general introductory overview of the purpose of the group and the general subject area of SG tubing degradation was given by the facilitator. The purpose of the session was described as to "develop conclusions and proposals on regulatory and technical needs required to deal with the issues of SG tubing degradation." Types, locations and characteristics of tubing degradation in steam generators were briefly reviewed. The well-known synergistic effects of materials, environment, and stress and strain/strain rate, subsequently referred to by the acronym "MESS" by some of the group members, were noted. The element of time (i.e., evolution of these variables with time) was emphasized. It was also suggested that the group might want to consider the related topics of inspection capabilities, operational variables, degradation remedies, and validity of test data, and some background information in these areas was provided. The presentation given by Peter Millet during the Plenary Session was reviewed; specifically, the chemical aspects and the degradation from the secondary side of the steam generator were noted. The main issues discussed during the October 1995 EPRI meeting on secondary side corrosion were reported, and a listing of the potential SG tube degradations was provided and discussed.

  5. Kundt's Tube Experiment Using Smartphones

    Science.gov (United States)

    Parolin, Sara Orsola; Pezzi, Giovanni

    2015-01-01

    This article deals with a modern version of Kundt's tube experiment. Using economic instruments and a couple of smartphones, it is possible to "see" nodes and antinodes of standing acoustic waves in a column of vibrating air and to measure the speed of sound.

  6. Welding the CNGS decay tube

    CERN Multimedia

    Maximilien Brice

    2004-01-01

    3.6 km of welds were required for the 1 km long CERN Neutrinos to Gran Sasso (CNGS) decay tube, in which particles produced in collisions of protons with a graphite target will decay into muons and muon neutrinos. Four highly skilled welders performed this delicate task.

  7. Thermal inhomogeneities in vortex tubes

    Science.gov (United States)

    Lemesh, N. I.; Senchuk, L. A.

    An experimental study of the effect of the temperature of the inlet gas on the temperature difference between the hot and cold streams discharged from a Ranque-Hilsch vortex tube is described. The experimental results are presented in graphical form. It is shown that the temperature difference increases with the temperature of the entering gas.

  8. A malfunctioning nasogastric feeding tube

    Directory of Open Access Journals (Sweden)

    Emanuele Cereda

    2013-02-01

    A critical point of nasogastric feeding tube placement, potentially resulting in an unsafe and/or ineffective operation of the device, is the confirmation of its proper placement into the stomach. A properly obtained and interpreted radiograph is currently recommended to confirm placement. We report the case of a 68-year-old demented woman referred for complicated dysphagia. A nasogastric tube was blindly inserted and its placement was confirmed by the radiologist. Enteral nutrition was initiated, but the patient began to vomit immediately. On reviewing the radiograph, it became clear that a gastric loop in the tube, with its tip pointing upwards, did not allow safe infusion of the feeding formula. It is not enough for the radiologist to report that a nasogastric feeding tube is placed in the stomach; the inclusion in the report of specific warnings about any potential cause of malfunctioning of the device should be considered. The presence of a gastric loop should be taken into account as a cause of potential malfunctioning.

  9. Fin-tube solar collectors

    Science.gov (United States)

    1980-01-01

    Report presents test procedures and results of thermal-performance evaluation of seven commercial fin-tube (liquid) solar collector-absorber plates. Tests were conducted indoors in the Marshall Space Flight Center solar simulator. Results are graphically shown along with supporting test data and summary, indicating efficiency as a function of collector inlet temperature.

  10. Holography and the Future Tube

    CERN Document Server

    Gibbons, G W

    2000-01-01

    The Future Tube $T^+_n$ of n-dimensional Minkowski spacetime may be identified with the reduced phase space or "space of motions" of a particle moving in (n+1)-dimensional Anti-de-Sitter spacetime. Both are isomorphic to a bounded homogeneous domain in ${\bf C}^{n}$ whose Shilov boundary may be identified with $n$-dimensional conformally compactified Minkowski spacetime.

  11. Observed inter-camera variability of clinically relevant performance characteristics for Siemens Symbia gamma cameras.

    Science.gov (United States)

    Kappadath, S Cheenu; Erwin, William D; Wendt, Richard E

    2006-11-28

    We conducted an evaluation of the intercamera (i.e., between cameras) variability in clinically relevant performance characteristics for Symbia gamma cameras (Siemens Medical Solutions, Malvern, PA) based on measurements made using nine separate systems. The significance of the observed intercamera variability was determined by comparing it to the intracamera (i.e., within a single camera) variability. Measurements of performance characteristics were based on the standards of the National Electrical Manufacturers Association and reports 6, 9, 22, and 52 from the American Association of Physicists in Medicine. All measurements were performed using 99mTc (except 57Co used for extrinsic resolution) and low-energy, high-resolution collimation. Of the nine cameras, four have crystals 3/8 in. thick and five have crystals 5/8 in. thick. We evaluated intrinsic energy resolution, intrinsic and extrinsic spatial resolution, intrinsic integral and differential flood uniformity over the useful field-of-view, count rate at 20% count loss, planar sensitivity, single-photon emission computed tomography (SPECT) resolution, and SPECT integral uniformity. The intracamera variability was estimated by repeated measurements of the performance characteristics on a single system. The significance of the observed intercamera variability was evaluated using the two-tailed F distribution. The planar sensitivity of the gamma cameras tested was found to be variable at the 99.8% confidence level for both the 3/8-in. and 5/8-in. crystal systems. The integral uniformity and energy resolution were found to be variable only for the 5/8-in. crystal systems at the 98% and 90% confidence level, respectively. All other performance characteristics tested exhibited no significant variability between camera systems. The measured variability reported here could perhaps be used to define nominal performance values of Symbia gamma cameras for planar and SPECT imaging.
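    The variance-ratio (F) test used to decide whether the spread between cameras exceeds single-camera repeatability can be sketched as follows; the numbers are made up for illustration and are not the sensitivities reported in the study.

        import numpy as np
        from scipy import stats

        between = np.array([168.0, 172.5, 175.1, 169.8, 171.2])   # one value per camera
        within = np.array([170.3, 170.9, 170.1, 170.6, 170.4])    # repeats on a single camera

        f_stat = between.var(ddof=1) / within.var(ddof=1)
        dfn, dfd = between.size - 1, within.size - 1
        p = 2.0 * min(stats.f.cdf(f_stat, dfn, dfd), stats.f.sf(f_stat, dfn, dfd))
        print(f"F = {f_stat:.2f}, two-tailed p = {p:.4f}")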

  12. The accessory fallopian tube: A rare anomaly

    Directory of Open Access Journals (Sweden)

    Kusum R Gandhi

    2012-01-01

    This paper presents a rare anatomical variation in the form of an accessory fallopian tube on the right side. The duplication of the fallopian tube was observed in a 34-year-old female during routine undergraduate dissection in our department. The fallopian tube is the part of the uterus that carries the ovum from the ovary to the uterus. An accessory fallopian tube is a congenital anomaly attached to the ampullary part of the main tube. This accessory tube is a common site of pyosalpinx, hydrosalpinx, cystic swelling and torsion. The ovum released by the ovary may also be captured by the blind accessory tube, leading to infertility or ectopic pregnancy. Hence, all patients with infertility or pelvic inflammatory disease should be screened to rule out the presence of an accessory fallopian tube, which should be removed if encountered.

  13. Structure and growth thermodynamics of carbon tubes

    Institute of Scientific and Technical Information of China (English)

    李文治; 钱露茜; 钱生法; 周维亚; 王刚; 付春生; 赵日安; 解思深

    1996-01-01

    Carbon tubes were prepared by Ni (or Ti) catalytic pyrolysis of acetylene. The catalytic effect of nanometer nickel powders is related to the reduction temperature in an H2 atmosphere. Nanometer nickel powders reduced at high temperature have a pronounced catalytic effect, and the yield of carbon tubes is relatively high; for nickel powders reduced at low temperature, however, the yield of carbon tubes is low, or no tubes can be formed at all. Carbon tubes can only be grown along the edges or on the tips of Ni (or Ti) sheets reduced at about 770 °C. But if the Ni (or Ti) sheets are etched in acid, a lot of carbon tubes with various forms can be formed on their surface. The structure and morphology of the carbon tubes are studied, and the growth thermodynamics of straight, curved and helical carbon tubes are systematically investigated for the first time.

  14. Dynamics Calculation of a Traveling Wave Tube

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    When calculating the dynamics of a traveling wave tube, we must obtain the field map in the tube. The field map is affected not only by the beam loading, but also by the attenuation coefficient. The calculation of the attenuation coefficient

  15. Opioid Use and Neural Tube Defects

    Science.gov (United States)

    Key Findings: Opioid Use and Neural Tube Defects ... relationship to having a pregnancy affected by a neural tube defect (NTD). Researchers from Boston University and CDC ...

  16. When to change a tracheostomy tube.

    Science.gov (United States)

    White, Alexander C; Kher, Sucharita; O'Connor, Heidi H

    2010-08-01

    Knowing when to change a tracheostomy tube is important for optimal management of all patients with tracheostomy tubes. The first tracheostomy tube change, performed 1-2 weeks after placement, carries some risk and should be performed by a skilled operator in a safe environment. The risk associated with changing the tracheostomy tube then usually diminishes over time as the tracheo-cutaneous tract matures. A malpositioned tube can be a source of patient distress and patient-ventilator asynchrony, and is important to recognize and correct. Airway endoscopy can be helpful to ensure optimal positioning of a replacement tracheostomy tube. Some of the specialized tracheostomy tubes available on the market are discussed. There are few data available to guide the timing of routine tracheostomy tube changes. Some guidelines are suggested.

  17. Cryogenic system for the ArTeMiS large sub millimeter camera

    Science.gov (United States)

    Ercolani, E.; Relland, J.; Clerc, L.; Duband, L.; Jourdan, T.; Talvard, M.; Le Pennec, J.; Martignac, J.; Visticot, F.

    2014-07-01

    A new photonic camera has been developed in the framework of the ArTéMiS project (bolometer architecture for large field of view ground-based telescopes in the sub-millimeter). This camera scans the sky in the sub-millimeter range simultaneously at three wavelengths, namely 200 μm, 350 μm and 450 μm, and is installed inside the APEX telescope located at 5100 m above sea level in Chile. Bolometric detectors cooled to 300 mK are used in the camera, which is integrated in an original cryostat developed at the low temperature laboratory (SBT) of the INAC institute. This cryostat contains filters, optics, mirrors and detectors which have to be implemented according to mass, size and stiffness requirements. As a result the cryostat exhibits an unusual geometry. The inner structure of the cryostat is a 40 K plate which acts as an optical bench and is bound to the external vessel through two hexapods, one fixed and the other one mobile thanks to a ball bearing. Once the cryostat is cold, this characteristic enables all the different elements to be aligned with the optical axis. The cryogenic chain is built around a pulse tube cooler (40 K and 4 K) coupled to a double-stage helium sorption cooler (300 mK). The cryogenic and vacuum processes are managed by a Siemens PLC and all the data are displayed and stored on a CEA SCADA system. This paper describes the mechanical and thermal design of the cryostat, its command control, and the first thermal laboratory tests. This work was carried out in collaboration with the astrophysics laboratory (SAp) of the IRFU institute. SAp and SBT installed the camera in July 2013 inside the Cassegrain cabin of APEX.

  18. Evaluation of a scientific CMOS camera for astronomical observations

    Institute of Scientific and Technical Information of China (English)

    Peng Qiu; Yong-Na Mao; Xiao-Meng Lu; E Xiang; Xiao-Jun Jiang

    2013-01-01

    We evaluate the performance of the first generation scientific CMOS (sCMOS) camera used for astronomical observations. The sCMOS camera was attached to a 25 cm telescope at Xinglong Observatory, in order to estimate its photometric capabilities. We further compared the capabilities of the sCMOS camera with that of full-frame and electron multiplying CCD cameras in laboratory tests and observations. The results indicate the sCMOS camera is capable of performing photometry of bright sources, especially when high spatial resolution or temporal resolution is desired.

  19. [A tube retractor for cardiac surgery].

    Science.gov (United States)

    Ohkado, A; Shiikawa, A; Ishitoya, H; Murata, A

    2001-03-01

    A retractor designed exclusively to retract the tubes used in cardiac surgery requiring cardiopulmonary bypass was developed. The half-cylinder-shaped end, the lightly curved handle and the flat, triangular grip enable an easy and effective grasp of the tubes. This new instrument facilitates operative procedures by effectively retracting the tubes that persistently obstruct the operative field, for example during placement of a retrograde cardioplegia tube via the right atrium.

  20. High-dimensional camera shake removal with given depth map.

    Science.gov (United States)

    Yue, Tao; Suo, Jinli; Dai, Qionghai

    2014-06-01

    Camera motion blur is drastically nonuniform for large depth-range scenes, and the nonuniformity caused by camera translation is depth dependent, which is not the case for camera rotations. To restore the blurry images of large-depth-range scenes deteriorated by arbitrary camera motion, we build an image blur model considering the 6 degrees of freedom (DoF) of camera motion with a given scene depth map. To make this 6D depth-aware model tractable, we propose a novel parametrization strategy to reduce the number of variables and an effective method to estimate high-dimensional camera motion as well. The number of variables is reduced by a temporal sampling motion function, which describes the 6-DoF camera motion by sampling the camera trajectory uniformly in the time domain. To effectively estimate the high-dimensional camera motion parameters, we construct the probabilistic motion density function (PMDF) to describe the probability distribution of camera poses during exposure, and apply it as a unified constraint to guide the convergence of the iterative deblurring algorithm. Specifically, PMDF is computed through a back projection from 2D local blur kernels to 6D camera motion parameter space and robust voting. We conduct a series of experiments on both synthetic and real captured data, and validate that our method achieves better performance than existing uniform and nonuniform methods on large-depth-range scenes.

  1. Presence capture cameras - a new challenge to the image quality

    Science.gov (United States)

    Peltoketo, Veli-Tapani

    2016-04-01

    Commercial presence capture cameras are coming to the market and a new era of visual entertainment is starting to take shape. Since true presence capture is still a very new technology, the actual technical solutions have only just passed the prototyping phase and they vary a lot. Presence capture cameras still have the same quality issues to tackle as previous phases of digital imaging, but also numerous new ones. This work concentrates on the quality challenges of presence capture cameras. A camera system which can record 3D audio-visual reality as it is has to have several camera modules, several microphones and, especially, technology which can synchronize the output of several sources into a seamless and smooth virtual reality experience. Several traditional quality features are still valid in presence capture cameras. Features like color fidelity, noise removal, resolution and dynamic range create the basis of virtual reality stream quality. However, the co-operation of several cameras brings a new dimension to these quality factors. New quality features can also be validated; for example, how should the camera streams be stitched together into a 3D experience without noticeable errors, and how should the stitching be validated? The work describes quality factors which are still valid in presence capture cameras and defines their importance. Moreover, new challenges of presence capture cameras are investigated from an image and video quality point of view. The work also considers how well current measurement methods can be used with presence capture cameras.

  2. Roll-forming tubes to header plates

    Science.gov (United States)

    Kramer, K.

    1976-01-01

    Technique has been developed for attaching and sealing tubes to header plates using a unique roll-forming tool. Technique is useful for attaching small tubes which are difficult to roll into conventional grooves in header plate tube holes, and for attaching when welding, brazing, or soldering is not desirable.

  3. Feasibility of Upper Port Plug tube handling

    NARCIS (Netherlands)

    Koning, J.F.; Elzendoorn, B.S.Q.; Ronden, D.M.S.; Klinkhamer, J.F.F.; Biel, W.; Krasikov, Y.; Walker, C.I.

    2011-01-01

    Central, retractable tubes are proposed in several Upper Port Plug (UPP) designs for ITER, to enable fast exchange of specific components of diagnostics housed in these UPPs. This paper investigates possible designs to enable the efficient handling of tubes. The feasibility of tube handling i

  4. Qualification Tests of Micro-camera Modules for Space Applications

    Science.gov (United States)

    Kimura, Shinichi; Miyasaka, Akira

    Visual capability is very important for space-based activities, for which small, low-cost space cameras are desired. Although cameras for terrestrial applications are continually being improved, little progress has been made on cameras used in space, which must be extremely robust to withstand harsh environments. This study focuses on commercial off-the-shelf (COTS) CMOS digital cameras because they are very small and are based on an established mass-market technology. Radiation and ultrahigh-vacuum tests were conducted on a small COTS camera that weighs less than 100 mg (including optics). This paper presents the results of the qualification tests for COTS cameras and for a small, low-cost COTS-based space camera.

  5. The GCT camera for the Cherenkov Telescope Array

    CERN Document Server

    Brown, Anthony M; Allan, D; Amans, J P; Armstrong, T P; Balzer, A; Berge, D; Boisson, C; Bousquet, J -J; Bryan, M; Buchholtz, G; Chadwick, P M; Costantini, H; Cotter, G; Daniel, M K; De Franco, A; De Frondat, F; Dournaux, J -L; Dumas, D; Fasola, G; Funk, S; Gironnet, J; Graham, J A; Greenshaw, T; Hervet, O; Hidaka, N; Hinton, J A; Huet, J -M; Jegouzo, I; Jogler, T; Kraus, M; Lapington, J S; Laporte, P; Lefaucheur, J; Markoff, S; Melse, T; Mohrmann, L; Molyneux, P; Nolan, S J; Okumura, A; Osborne, J P; Parsons, R D; Rosen, S; Ross, D; Rowell, G; Sato, Y; Sayede, F; Schmoll, J; Schoorlemmer, H; Servillat, M; Sol, H; Stamatescu, V; Stephan, M; Stuik, R; Sykes, J; Tajima, H; Thornhill, J; Tibaldo, L; Trichard, C; Vink, J; Watson, J J; White, R; Yamane, N; Zech, A; Zink, A; Zorn, J

    2016-01-01

    The Gamma-ray Cherenkov Telescope (GCT) is proposed for the Small-Sized Telescope component of the Cherenkov Telescope Array (CTA). GCT's dual-mirror Schwarzschild-Couder (SC) optical system allows the use of a compact camera with small form-factor photosensors. The GCT camera is ~0.4 m in diameter and has 2048 pixels; each pixel has a ~0.2 degree angular size, resulting in a wide field-of-view. The GCT camera is designed for high performance at low cost, housing 32 front-end electronics modules that provide full waveform information for all of its 2048 pixels. The first GCT camera prototype, CHEC-M, was commissioned during 2015, culminating in the first Cherenkov images recorded by an SC telescope and the first light of a CTA prototype. In this contribution we give a detailed description of the GCT camera and present preliminary results from CHEC-M's commissioning.

  6. Simple method for calibrating omnidirectional stereo with multiple cameras

    Science.gov (United States)

    Ha, Jong-Eun; Choi, I.-Sak

    2011-04-01

    Cameras can give useful information for the autonomous navigation of a mobile robot. Typically, one or two cameras are used for this task. Recently, omnidirectional stereo vision systems that can cover the whole surrounding environment of a mobile robot have been adopted. They usually adopt a mirror that cannot offer uniform spatial resolution. In this paper, we deal with an omnidirectional stereo system which consists of eight cameras, where each pair of vertically arranged cameras constitutes one stereo unit. Camera calibration is the first necessary step to obtain 3D information. Calibration using a planar pattern requires many images acquired under different poses, so calibrating all eight cameras is a tedious step. In this paper, we present a simple calibration procedure using a cubic-type calibration structure that surrounds the omnidirectional stereo system. We can calibrate all the cameras of an omnidirectional stereo system in just one shot.

  7. Determining Vision Graphs for Distributed Camera Networks Using Feature Digests

    Directory of Open Access Journals (Sweden)

    Cheng Zhaolin

    2007-01-01

    We propose a decentralized method for obtaining the vision graph for a distributed, ad-hoc camera network, in which each edge of the graph represents two cameras that image a sufficiently large part of the same environment. Each camera encodes a spatially well-distributed set of distinctive, approximately viewpoint-invariant feature points into a fixed-length "feature digest" that is broadcast throughout the network. Each receiver camera robustly matches its own features with the decompressed digest and decides whether sufficient evidence exists to form a vision graph edge. We also show how a camera calibration algorithm that passes messages only along vision graph edges can recover accurate 3D structure and camera positions in a distributed manner. We analyze the performance of different message formation schemes, and show that high detection rates can be achieved while maintaining low false alarm rates, using a simulated 60-node outdoor camera network.

  8. Inclusive spectra of hadrons created by color tube fission; 1, Probability of tube fission

    CERN Document Server

    Gedalin, E V

    1997-01-01

    The probability of color tube fission, including corrections for small oscillations of the tube surface, is obtained to pre-exponential factor accuracy on the basis of the previously constructed color tube model. Using these expressions, the probability of tube fission at $n$ points is obtained, which is the basis for the calculation of the inclusive spectra of the produced hadrons.

  9. Design and characterization of a low profile NaI(Tl) gamma camera for dedicated molecular breast tomosynthesis

    Science.gov (United States)

    Polemi, Andrew M.; Niestroy, Justin; Stolin, Alexander; Jaliparthi, Gangadhar; Wojcik, Randy; Majewski, Stan; Williams, Mark B.

    2016-10-01

    A new low profile gamma camera is being developed for use in a dual modality (x-ray transmission and gamma-ray emission) tomosynthesis system. Compared to the system's current gamma camera, the new camera has a larger field of view (20 x 25 cm) to better match the system's x-ray detector (23 x 29 cm), and is thinner (7.3 cm instead of 10.3 cm), permitting easier camera positioning near the top surface of the breast. It contains a pixelated NaI(Tl) array with a crystal pitch of 2.2 mm, which is optically coupled to a 4x5 array of Hamamatsu H8500C position sensitive photomultiplier tubes (PSPMTs). The manufacturer-provided connector board of each PSPMT was replaced with a custom designed board that a) reduces the 64 channel readout of the 8x8 electrode anode of the H8500C to 16 channels (8X and 8Y), b) performs gain non-uniformity correction, and c) reduces the height of the PSPMT-base assembly from 37.7 mm to 27.87 mm. The X and Y outputs of each module are connected in a lattice framework, and at two edges of this lattice, the X and Y outputs (32Y by 40X) are coupled to an amplifier/output board whose signals are fed via shielded ribbon cables to external ADCs. The camera uses parallel hole collimation. We describe the measured camera imaging performance, including intrinsic and extrinsic spatial resolution, detection sensitivity, uniformity of response, energy resolution for 140 keV gamma rays, and geometric linearity.

  10. Investigation of high resolution compact gamma camera module based on a continuous scintillation crystal using a novel charge division readout method

    Science.gov (United States)

    Dai, Qiu-Sheng; Zhao, Cui-Lan; Zhang, Hua-Lin; Qi, Yu-Jin

    2010-08-01

    The objective of this study is to investigate a high performance and lower cost compact gamma camera module for a multi-head small animal SPECT system. A compact camera module was developed using a thin Lutetium Oxyorthosilicate (LSO) scintillation crystal slice coupled to a Hamamatsu H8500 position sensitive photomultiplier tube (PSPMT). A two-stage charge division readout board based on a novel subtractive resistive readout with a truncated center-of-gravity (TCOG) positioning method was developed for the camera. The performance of the camera was evaluated using a flood 99mTc source with a four-quadrant bar-mask phantom. The preliminary experimental results show that the image shrinkage problem associated with the conventional resistive readout can be effectively overcome by the novel subtractive resistive readout with an appropriate fraction subtraction factor. The response output area (ROA) of the camera shown in the flood image was improved by up to 34%, and an intrinsic detector spatial resolution of better than 2 mm was achieved. In conclusion, the utilization of a continuous scintillation crystal and a flat-panel PSPMT equipped with a novel subtractive resistive readout is a feasible approach for developing a high performance and lower cost compact gamma camera.
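    The truncated centre-of-gravity idea can be illustrated with a simple digital analogue: subtract a fixed fraction of the peak from the row/column sums before taking the centroid, which counteracts the edge compression (image shrinkage) of a plain centre of gravity. The 8-element column sums and the truncation fraction below are assumptions for illustration; the readout described above realises a similar truncation in analogue electronics.

        import numpy as np

        def tcog(signals, fraction=0.15):
            """Centroid of a 1-D signal after subtracting a fraction of its maximum
            and clipping negative values (the 'truncation')."""
            s = np.asarray(signals, dtype=float)
            s = np.clip(s - fraction * s.max(), 0.0, None)
            return float((np.arange(s.size) * s).sum() / s.sum())

        cols = [0.2, 0.3, 0.6, 1.5, 4.0, 9.0, 6.5, 2.0]   # event near the detector edge
        print(tcog(cols))        # truncated centroid, closer to the true edge position
        print(tcog(cols, 0.0))   # plain centre of gravity, pulled towards the centre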

  11. X-ray imaging with ePix100a: a high-speed, high-resolution, low-noise camera

    Science.gov (United States)

    Blaj, G.; Caragiulo, P.; Dragone, A.; Haller, G.; Hasi, J.; Kenney, C. J.; Kwiatkowski, M.; Markovic, B.; Segal, J.; Tomada, A.

    2016-09-01

    The ePix100A camera is a 0.5 megapixel (704 x 768 pixels) camera for low noise x-ray detection applications requiring high spatial and spectral resolution. The camera is built around a hybrid pixel detector consisting of 4 ePix100a ASICs flip-chip bonded to one sensor. The pixels are 50 μm x 50 μm (active sensor size 35.4 mm x 38.6 mm), with a noise of 180 eV rms, a range of 100 photons of 8 keV, and a current frame rate of 240 Hz (with an upgrade path towards 10 kHz). This performance leads to a camera combining a high dynamic range, high signal to noise ratio, high speed and excellent linearity and spectroscopic performance. While the ePix100A ASIC has been developed for pulsed source applications (e.g., free-electron lasers), it performs well with more common sources (e.g., x-ray tubes, synchrotron radiation). Several cameras have been produced and characterized and the results are reported here, along with x-ray imaging applications demonstrating the camera performance.

  12. Microstructural degradation in compound tubes

    Energy Technology Data Exchange (ETDEWEB)

    Salonen, J.; Auerkari, P. [VTT Manufacturing Technology, Espoo (Finland)

    1996-12-31

    In order to quantify microstructural degradation at high temperatures, samples of SA 210 / AISI 304 L compound tube material were annealed in the temperature range 540-720 deg C for 1 to 1000 hours. The hardness of the annealed material was measured and the microstructure of the samples was investigated with optical and scanning electron microscopy. Microstructural degradation was characterised by the carbide structure in the ferritic-pearlitic base material and by the depth of decarburised and carburised zones of the compound tube interface. The observed changes were quantified in terms of their time and temperature dependence and the diffusion coefficients of the process. The results can be used in estimating the extent of thermal exposure of high-temperature components after long-term service or after incidences of overheating. (orig.) (4 refs.)

  13. Capillary imbibition in parallel tubes

    Science.gov (United States)

    McRae, Oliver; Ramakrishnan, T. S.; Bird, James

    2016-11-01

    In modeling porous media, two distinct approaches can be employed: the sample can be examined holistically, using global variables such as porosity, or it can be treated as a network of capillaries connected in series to various intermediate reservoirs. In forced imbibition this series-based description is sufficient to characterize the flow, due to the presence of an externally maintained pressure difference. However, in spontaneous imbibition, flow is driven by an internal capillary pressure, making it unclear whether a series-based model is appropriate. In this talk, we use numerical simulations to show the dynamics of spontaneous imbibition in concentrically arranged capillary tubes. This geometry allows both tubes access to a semi-infinite reservoir but with inlets in close enough proximity to allow for interference. We compare and contrast the results of our simulations with theory and previous experiments. Schlumberger-Doll Research.

  14. Mechanical Instabilities of Biological Tubes

    CERN Document Server

    Hannezo, Edouard; Prost, Jacques; 10.1103/PhysRevLett.109.018101

    2012-01-01

    We study theoretically the shapes of biological tubes affected by various pathologies. When epithelial cells grow at an uncontrolled rate, the negative tension produced by their division provokes a buckling instability. Several shapes are investigated: varicose, enlarged, sinusoidal or sausage-like, all of which are found in pathologies of tracheal and renal tubes or arteries. The final shape depends crucially on the mechanical parameters of the tissues: Young's modulus, wall-to-lumen ratio, homeostatic pressure. We argue that since tissues must be in quasistatic mechanical equilibrium, abnormal shapes convey information as to what causes the pathology. We calculate a phase diagram of tubular instabilities which could be a helpful guide for investigating the underlying genetic regulation.

  15. Flux tubes at Finite Temperature

    CERN Document Server

    Bicudo, Pedro; Cardoso, Marco

    2016-01-01

    We show the flux tubes produced by static quark-antiquark, quark-quark and quark-gluon charges at finite temperature. The sources are placed in the lattice with fundamental and adjoint Polyakov loops. We compute the square densities of the chromomagnetic and chromoelectric fields above and below the phase transition. Our results are gauge invariant and produced in pure gauge SU(3). The codes are written in CUDA and the computations are performed with GPUs.

  16. Film holder for radiographing tubing

    Science.gov (United States)

    Davis, Earl V.; Foster, Billy E.

    1976-01-01

    A film cassette is provided which may be easily placed about tubing or piping and readily held in place while radiographic inspection is performed. A pair of precurved light-impervious semi-rigid plastic sheets, hinged at one edge, enclose sheet film together with any metallic foils or screens. Other edges are made light-tight with removable caps, and the entire unit is held securely about the object to be radiographed with a releasable fastener such as a strip of Velcro.

  17. Gated SIT Vidicon Streak Tube

    Science.gov (United States)

    Dunbar, D. L.; Yates, G. J.; Black, J. P.

    1986-01-01

    A recently developed prototype streak tube designed to produce high gain and resolution by incorporating the streak and readout functions in one envelope, thereby minimizing photon-to-charge transformations and eliminating external coupling losses, is presented. The tube is based upon a grid-gated Silicon-Intensified-Target Vidicon (SITV) with integral Focus Projection Scan (FPS) TV readout. Demagnifying electron optics (m=0.63) in the image section map the 40-mm-diameter photocathode image onto a 25-mm-diameter silicon target where gains >= 10^3 are achieved with only 10 kV accelerating voltage. This is compared with much lower gains (~50) at much higher voltages (~30 kV) reported for streak tubes using phosphor screens. Because SIT technology is a well established means for electron imaging in vacuum, such fundamental problems as "backside thinning" required for electron imaging onto CCDs do not exist. The high spatial resolution (~30 lp/mm), variable scan formats, and high speed electrostatic deflection (250 mm^2 areas are routinely rastered with 256 scan lines in 1.6 ms) available from FPS readout add versatility not available in CCD devices. Theoretical gain and spatial resolution for this design (developed jointly by Los Alamos National Laboratory and General Electric Co.) are compared with similar calculations and measured data obtained for RCA 73435 streak tubes fiber optically coupled to (1) 25-mm-diameter SIT FPS vidicons and (2) 40-mm-diameter MCPTs (proximity-focused microchannel plate image intensifier tubes) fiber optically coupled to 18-mm-diameter Sb2S3 FPS vidicons. Sweep sensitivity, shutter ratio, and record lengths for nanosecond duration (20 to 200 ns) streak applications are discussed.

  18. Waterproof camera case for intraoperative photographs.

    Science.gov (United States)

    Raigosa, Mauricio; Benito-Ruiz, Jesús; Fontdevila, Joan; Ballesteros, José R

    2008-03-01

    Accurate photographic documentation has become essential in reconstructive and cosmetic surgery for both clinical and scientific purposes. Intraoperative photographs are important not only for record purposes, but also for teaching, publications, and presentations. Communication using images proves to be the superior way to persuade audiences. This article presents a simple and easy method for taking intraoperative photographs that uses a presterilized waterproof camera case. This method allows the user to take very good quality pictures with the photographic angle matching the surgeon's view, minimal interruption of the operative procedure, and minimal risk of contaminating the operative field.

  19. Thermal imaging cameras characteristics and performance

    CERN Document Server

    Williams, Thomas

    2009-01-01

    The ability to see through smoke and mist and the ability to use the variances in temperature to differentiate between targets and their backgrounds are invaluable in military applications and have become major motivators for the further development of thermal imagers. As the potential of thermal imaging is more clearly understood and the cost decreases, the number of industrial and civil applications being exploited is growing quickly. In order to evaluate the suitability of particular thermal imaging cameras for particular applications, it is important to have the means to specify and measure their performance.

  20. Aerodynamic drag from two tubes in side-by-side arrangement for different tube shapes

    Directory of Open Access Journals (Sweden)

    Олександр Михайлович Терех

    2016-06-01

    Full Text Available Experimental investigations of aerodynamic drag from two tubes in side-by-side arrangement for different tube shapes in the range of Reynolds numbers from 4000 to 16000 are performed. A comparison of the experimental data is carried out. It was found that tubes of drop-shaped form have lower aerodynamic drag, while tubes of flat-oval and dumbbell forms have higher drag, compared to circular tubes.

  1. Useful Scaling Parameters for the Pulse Tube

    Science.gov (United States)

    Lee, J. M.; Kittel, P.; Timmerhaus, K. D.; Radebaugh, R.; Cheng, Pearl L. (Technical Monitor)

    1995-01-01

    A set of eight non-dimensional scaling parameters for use in evaluating the performance of Pulse Tube Refrigerators is presented. The parameters result after scaling the mass, momentum and energy conservation equations for an axisymmetric, two-dimensional system. The physical interpretation of the parameters is described, and their usefulness is outlined for the enthalpy flow tube (the open tube of the pulse tube). The scaling parameters allow the experimentalist to characterize three types of transport: enthalpy flow, mass streaming and heat transfer between the gas and the tube. Also reported are the results from a flow visualization experiment in which steady mass streaming in compressible oscillating flow is observed.

  2. Preparation of chitosan nanofiber tube by electrospinning.

    Science.gov (United States)

    Matsuda, Atsushi; Kagata, Go; Kino, Rikako; Tanaka, Junzo

    2007-03-01

    Water-insoluble chitosan nanofiber sheets and tubes coated with chitosan-cast film were prepared by electrospinning. When as-spun chitosan nanofiber sheets and tubes were immersed in 28% ammonium aqueous solution, they became insoluble in water and showed nanofiber structures confirmed by SEM micrography. Mechanical properties of chitosan nanofiber sheets and tubes were improved by coating with chitosan-cast film, which gave them a compressive strength higher than that of crab-tendon chitosan, demonstrating that chitosan nanofiber tubes coated with chitosan-cast film are usable as nerve-regenerative guide tubes.

  3. Freezing in a vertical tube

    Energy Technology Data Exchange (ETDEWEB)

    Sparrow, E.M.; Broadbent, J.A.

    1983-05-01

    Fundamental heat transfer experiments were performed for freezing of an initially superheated or nonsuperheated liquid in a cooled vertical tube. Measurements were made which yielded information about the freezing front and the frozen mass, about the various energy components extracted from the tube, and about the decay of the initial liquid superheat. Four component energies were identified and evaluated from the experimental data, including the latent energy released by the phase change and the sensible energies released from the subcooled frozen solid and the superheated liquid. Initial superheating of the liquid tended to moderately diminish the frozen mass and latent energy extraction at short freezing times but had little effect on these quantities at longer times. The extracted sensible energies associated with the superheating more than compensated for the aforementioned decrease in the latent energy. Although the latent energy is the largest contributor to the total extracted energy, the aggregate sensible energies can make a significant contribution, especially at large tube wall subcooling, large initial liquid superheating, and short freezing time. Natural convection effects in the superheated liquid were modest and were confined to short freezing times.

  4. Development boiling to sprinkled tube bundle

    Directory of Open Access Journals (Sweden)

    Kracík Petr

    2016-01-01

    Full Text Available This paper presents results of a studied heat transfer coefficient at the surface of a sprinkled tube bundle where boiling occurs. Research in the area of sprinkled exchangers can be divided into two major parts. The first part is research on heat transfer and determination of the heat transfer coefficient at sprinkled tube bundles for various liquids, whether boiling or not. The second part is testing of sprinkle modes for various tube diameters, tube pitches and tube materials and determination of individual modes' interfaces. All results published so far for water as the falling film liquid apply to one to three tubes, for which the relations studied are determined under rigid laboratory conditions defined strictly in advance. The sprinkled tubes were not viewed from the operational perspective, where there are more tubes and various modes may occur in different parts with various heat transfer values. The article focuses on these processes. The tube bundle is located in a low-pressure chamber where a vacuum is generated using an exhauster via an ejector. The bundle consists of smooth copper tubes of 12 mm diameter placed horizontally one above another.

  5. Development boiling to sprinkled tube bundle

    Science.gov (United States)

    Kracík, Petr; Pospíšil, Jiří

    2016-03-01

    This paper presents results of a studied heat transfer coefficient at the surface of a sprinkled tube bundle where boiling occurs. Research in the area of sprinkled exchangers can be divided into two major parts. The first part is research on heat transfer and determination of the heat transfer coefficient at sprinkled tube bundles for various liquids, whether boiling or not. The second part is testing of sprinkle modes for various tube diameters, tube pitches and tube materials and determination of individual modes' interfaces. All results published so far for water as the falling film liquid apply to one to three tubes, for which the relations studied are determined under rigid laboratory conditions defined strictly in advance. The sprinkled tubes were not viewed from the operational perspective, where there are more tubes and various modes may occur in different parts with various heat transfer values. The article focuses on these processes. The tube bundle is located in a low-pressure chamber where a vacuum is generated using an exhauster via an ejector. The bundle consists of smooth copper tubes of 12 mm diameter placed horizontally one above another.

  6. Gastroenteric tube feeding: Techniques, problems and solutions

    Science.gov (United States)

    Blumenstein, Irina; Shastri, Yogesh M; Stein, Jürgen

    2014-01-01

    Gastroenteric tube feeding plays a major role in the management of patients with poor voluntary intake, chronic neurological or mechanical dysphagia or gut dysfunction, and patients who are critically ill. However, despite the benefits and widespread use of enteral tube feeding, some patients experience complications. This review aims to discuss and compare current knowledge regarding the clinical application of enteral tube feeding, together with associated complications and special aspects. We conducted an extensive literature search on PubMed, Embase and Medline using index terms relating to enteral access, enteral feeding/nutrition, tube feeding, percutaneous endoscopic gastrostomy/jejunostomy, endoscopic nasoenteric tube, nasogastric tube, and refeeding syndrome. The literature showed common routes of enteral access to include nasoenteral tube, gastrostomy and jejunostomy, while complications fall into four major categories: mechanical, e.g., tube blockage or removal; gastrointestinal, e.g., diarrhea; infectious e.g., aspiration pneumonia, tube site infection; and metabolic, e.g., refeeding syndrome, hyperglycemia. Although the type and frequency of complications arising from tube feeding vary considerably according to the chosen access route, gastrointestinal complications are without doubt the most common. Complications associated with enteral tube feeding can be reduced by careful observance of guidelines, including those related to food composition, administration rate, portion size, food temperature and patient supervision. PMID:25024606

  7. Gastroenteric tube feeding: techniques, problems and solutions.

    Science.gov (United States)

    Blumenstein, Irina; Shastri, Yogesh M; Stein, Jürgen

    2014-07-14

    Gastroenteric tube feeding plays a major role in the management of patients with poor voluntary intake, chronic neurological or mechanical dysphagia or gut dysfunction, and patients who are critically ill. However, despite the benefits and widespread use of enteral tube feeding, some patients experience complications. This review aims to discuss and compare current knowledge regarding the clinical application of enteral tube feeding, together with associated complications and special aspects. We conducted an extensive literature search on PubMed, Embase and Medline using index terms relating to enteral access, enteral feeding/nutrition, tube feeding, percutaneous endoscopic gastrostomy/jejunostomy, endoscopic nasoenteric tube, nasogastric tube, and refeeding syndrome. The literature showed common routes of enteral access to include nasoenteral tube, gastrostomy and jejunostomy, while complications fall into four major categories: mechanical, e.g., tube blockage or removal; gastrointestinal, e.g., diarrhea; infectious e.g., aspiration pneumonia, tube site infection; and metabolic, e.g., refeeding syndrome, hyperglycemia. Although the type and frequency of complications arising from tube feeding vary considerably according to the chosen access route, gastrointestinal complications are without doubt the most common. Complications associated with enteral tube feeding can be reduced by careful observance of guidelines, including those related to food composition, administration rate, portion size, food temperature and patient supervision.

  8. Tube erosion in bubbling fluidized beds

    Energy Technology Data Exchange (ETDEWEB)

    Levy, E.K. [Lehigh Univ., Bethlehem, PA (United States). Energy Research Center; Stallings, J.W. [Electric Power Research Inst., Palo Alto, CA (United States)

    1991-12-31

    This paper reports on experimental and theoretical studies that were performed of the interaction between bubbles and tubes and of tube erosion in fluidized beds. The results are applicable to the erosion of horizontal tubes in the bottom row of a tube bundle in a bubbling bed. Cold model experimental data show that erosion is caused by the impact of bubble wakes on the tubes, with the rate of erosion increasing with the velocity of wake impact and with the particle size. Wake impacts resulting from the vertical coalescence of pairs of bubbles directly beneath the tube result in particularly high rates of erosion damage. Theoretical results from a computer simulation of bubbling and erosion show very strong effects of the bed geometry and bubbling conditions on computed rates of erosion. These results show, for example, that the rate of erosion can be very sensitive to the vertical location of the bottom row of tubes with respect to the distributor.

  9. Online camera-gyroscope autocalibration for cell phones.

    Science.gov (United States)

    Jia, Chao; Evans, Brian L

    2014-12-01

    The gyroscope is playing a key role in helping estimate 3D camera rotation for various vision applications on cell phones, including video stabilization and feature tracking. Successful fusion of gyroscope and camera data requires that the camera, the gyroscope, and their relative pose be calibrated. In addition, the timestamps of gyroscope readings and video frames are usually not well synchronized. Previous work performed camera-gyroscope calibration and synchronization offline, after the entire video sequence had been captured, with restrictions on the camera motion; this is unnecessarily restrictive for everyday users running apps that directly use the gyroscope. In this paper, we propose an online method that estimates all the necessary parameters while a user is capturing video. Our contributions are: 1) simultaneous online camera self-calibration and camera-gyroscope calibration based on an implicit extended Kalman filter and 2) generalization of the multiple-view coplanarity constraint on camera rotation in a rolling shutter camera model for cell phones. The proposed method is able to estimate the needed calibration and synchronization parameters online with all kinds of camera motion and can be embedded in gyro-aided applications, such as video stabilization and feature tracking. Both Monte Carlo simulation and cell phone experiments show that the proposed online calibration and synchronization method converges fast to the ground truth values.

  10. Flow Vaporization of CO{sub 2} in Microchannel Tubes

    Energy Technology Data Exchange (ETDEWEB)

    Pettersen, Jostein

    2002-07-01

    transfer coefficient. A data reduction scheme was developed to find the mean vaporization heat transfer coefficient h as well as the mean vapour fraction x in each test. Measured pressure drop between the inlet and outlet manifolds of the test section was corrected to find the net frictional pressure drop in the heated part of the tube. A special rig was built in order to observe two-phase flow patterns. A horizontal quartz glass tube with ID 0.98 mm was coated by transparent resistive coating of indium tin oxide (ITO), and connected to an open fluid circuit where liquid CO{sub 2} was taken from a heated storage cylinder, its pressure reduced and mass flow adjusted by valves before a preheater section that gave the desired x into the observation tube. Heat flux was obtained by applying DC power to the ITO film, and flow patterns were recorded at 4000-8000 frames per second by a digital video camera. Compared to ''standard'' test arrangements, this enabled recordings in the heated zone, and not in an adiabatic observation tube installed after or between heated tubes. In heat transfer dominated by nucleate boiling this makes quite a difference.

  11. Mars Cameras Make Panoramic Photography a Snap

    Science.gov (United States)

    2008-01-01

    If you wish to explore a Martian landscape without leaving your armchair, a few simple clicks around the NASA Web site will lead you to panoramic photographs taken from the Mars Exploration Rovers, Spirit and Opportunity. Many of the technologies that enable this spectacular Mars photography have also inspired advancements in photography here on Earth, including the panoramic camera (Pancam) and its housing assembly, designed by the Jet Propulsion Laboratory and Cornell University for the Mars missions. Mounted atop each rover, the Pancam mast assembly (PMA) can tilt a full 180 degrees and swivel 360 degrees, allowing for a complete, highly detailed view of the Martian landscape. The rover Pancams take small, 1 megapixel (1 million pixel) digital photographs, which are stitched together into large panoramas that sometimes measure 4 by 24 megapixels. The Pancam software performs some image correction and stitching after the photographs are transmitted back to Earth. Different lens filters and a spectrometer also assist scientists in their analyses of infrared radiation from the objects in the photographs. These photographs from Mars spurred developers to begin thinking in terms of larger and higher quality images: super-sized digital pictures, or gigapixels, which are images composed of 1 billion or more pixels. Gigapixel images are more than 200 times the size captured by today's standard 4 megapixel digital camera. Although originally created for the Mars missions, the detail provided by these large photographs allows for many purposes, not all of which are limited to extraterrestrial photography.

  12. Multi-band infrared camera systems

    Science.gov (United States)

    Davis, Tim; Lang, Frank; Sinneger, Joe; Stabile, Paul; Tower, John

    1994-12-01

    The program resulted in an IR camera system that utilizes a unique MOS addressable focal plane array (FPA) with full TV resolution, electronic control capability, and windowing capability. Two systems were delivered, each with two different camera heads: a Stirling-cooled 3-5 micron band head and a liquid nitrogen-cooled, filter-wheel-based, 1.5-5 micron band head. Signal processing features include averaging up to 16 frames, flexible compensation modes, gain and offset control, and real-time dither. The primary digital interface is a Hewlett-Packard standard GPIB (IEEE-488) port that is used to upload and download data. The FPA employs an X-Y addressed PtSi photodiode array, CMOS horizontal and vertical scan registers, horizontal signal line (HSL) buffers followed by a high-gain preamplifier and a depletion NMOS output amplifier. The 640 x 480 MOS X-Y addressed FPA has a high degree of flexibility in operational modes. By changing the digital data pattern applied to the vertical scan register, the FPA can be operated in either an interlaced or noninterlaced format. The thermal sensitivity performance of the second system's Stirling-cooled head was the best of the systems produced.

  13. Women's Creation of Camera Phone Culture

    Directory of Open Access Journals (Sweden)

    Dong-Hoo Lee

    2005-01-01

    Full Text Available A major aspect of the relationship between women and the media is the extent to which the new media environment is shaping how women live and perceive the world. It is necessary to understand, in a concrete way, how the new media environment is articulated to our gendered culture, how the symbolic or physical forms of the new media condition women’s experiences, and the degree to which a ‘post-gendered re-codification’ can be realized within a new media environment. This paper intends to provide an ethnographic case study of women’s experiences with camera phones, examining the extent to which these experiences recreate or reconstruct women’s subjectivity or identity. By taking a close look at the ways in which women utilize and appropriate the camera phone in their daily lives, it focuses not only on women’s cultural practices in making meanings but also on their possible effect in the deconstruction of gendered techno-culture.

  14. Process simulation in digital camera system

    Science.gov (United States)

    Toadere, Florin

    2012-06-01

    The goal of this paper is to simulate the functionality of a digital camera system. The simulations cover the conversion from light to numerical signal and the color processing and rendering. We consider the image acquisition system to be linear shift invariant and axial. The light propagation is orthogonal to the system. We use a spectral image processing algorithm in order to simulate the radiometric properties of a digital camera. In the algorithm we take into consideration the transmittances of the light source, lenses, and filters, and the quantum efficiency of a CMOS (complementary metal oxide semiconductor) sensor. The optical part is characterized by a multiple convolution between the different point spread functions of the optical components. We use a Cooke triplet, the aperture, the light fall-off and the optical part of the CMOS sensor. The electrical part consists of Bayer sampling, interpolation, signal to noise ratio, dynamic range, analog to digital conversion and JPG compression. We reconstruct the noisy blurred image by blending differently exposed images in order to reduce the photon shot noise; we also filter the fixed-pattern noise and sharpen the image. Then we have the color processing blocks: white balancing, color correction, gamma correction, and conversion from XYZ color space to RGB color space. For the reproduction of color we use an OLED (organic light emitting diode) monitor. The analysis can be useful to assist students and engineers in image quality evaluation and imaging system design. Many other configurations of blocks can be used in our analysis.
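
    As a rough illustration of the colour-processing stages mentioned above (white balance, colour correction, gamma correction and XYZ-to-RGB conversion), the following sketch applies them to a numpy image. The gains, correction matrix and gamma value are placeholders, not parameters from the simulation described in the paper.

```python
import numpy as np

# Standard XYZ -> linear sRGB matrix (D65 white point)
XYZ_TO_SRGB = np.array([[ 3.2406, -1.5372, -0.4986],
                        [-0.9689,  1.8758,  0.0415],
                        [ 0.0557, -0.2040,  1.0570]])

def color_pipeline(raw_rgb, wb_gains=(1.8, 1.0, 1.5), ccm=np.eye(3), gamma=2.2):
    """White balance -> colour correction -> gamma encoding for an HxWx3
    image scaled to [0, 1]. All parameters are illustrative placeholders."""
    img = raw_rgb * np.asarray(wb_gains)     # per-channel white balancing
    img = img @ ccm.T                        # 3x3 colour-correction matrix
    img = np.clip(img, 0.0, 1.0)
    return img ** (1.0 / gamma)              # simple gamma encoding for display

def xyz_to_srgb(xyz):
    """Convert XYZ tristimulus values to gamma-encoded sRGB (simplified 2.2 gamma)."""
    rgb = np.clip(xyz @ XYZ_TO_SRGB.T, 0.0, 1.0)
    return rgb ** (1.0 / 2.2)
```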

  15. Infrared Camera Analysis of Laser Hardening

    Directory of Open Access Journals (Sweden)

    J. Tesar

    2012-01-01

    Full Text Available The improvement of surface properties by processes such as laser hardening is becoming very important in present-day manufacturing. The resulting laser hardening depth and surface hardness can be affected by changes in the optical properties of the material surface, that is, by the absorptivity that gives the ratio between absorbed energy and incident laser energy. The surface changes on a tested sample of a steel block were made by an engraving laser with different scanning velocities and repetition frequencies. During laser hardening the process was observed by an infrared (IR) camera system that measures infrared radiation from the heated sample and depicts it in the form of a temperature field. The images from the IR camera of the sample are shown, and the maximal temperatures of all engraved areas are evaluated and compared. The surface hardness was measured, and the hardening depth was estimated from the measured hardness profile in the sample cross-section. The correlation between the reached temperature, surface hardness, and hardening depth is shown. The highest and lowest temperatures correspond to the lowest and highest hardness, respectively, and to the highest and lowest hardening depth.

  16. FIDO Rover Retracted Arm and Camera

    Science.gov (United States)

    1999-01-01

    The Field Integrated Design and Operations (FIDO) rover extends the large mast that carries its panoramic camera. The FIDO is being used in ongoing NASA field tests to simulate driving conditions on Mars. FIDO is controlled from the mission control room at JPL's Planetary Robotics Laboratory in Pasadena. FIDO uses a robot arm to manipulate science instruments and it has a new mini-corer or drill to extract and cache rock samples. Several camera systems onboard allow the rover to collect science and navigation images by remote-control. The rover is about the size of a coffee table and weighs as much as a St. Bernard, about 70 kilograms (150 pounds). It is approximately 85 centimeters (about 33 inches) wide, 105 centimeters (41 inches) long, and 55 centimeters (22 inches) high. The rover moves up to 300 meters an hour (less than a mile per hour) over smooth terrain, using its onboard stereo vision systems to detect and avoid obstacles as it travels 'on-the-fly.' During these tests, FIDO is powered by both solar panels that cover the top of the rover and by replaceable, rechargeable batteries.

  17. Oil spill detection using hyperspectral infrared camera

    Science.gov (United States)

    Yu, Hui; Wang, Qun; Zhang, Zhen; Zhang, Zhi-jie; Tang, Wei; Tang, Xin; Yue, Song; Wang, Chen-sheng

    2016-11-01

    Oil spill pollution is a severe environmental problem that persists in the marine environment and in inland water systems around the world. Remote sensing is an important part of oil spill response. Hyperspectral images can provide not only spatial information but also spectral information. Pixels of interest generally incorporate information from disparate components, which requires quantitative decomposition of these pixels to extract the desired information. Oil spill detection can be implemented by applying a hyperspectral camera to collect the hyperspectral data of the oil. By extracting the desired spectral signature from hundreds of bands of information, one can detect and identify oil spill areas in vast geographical regions. There are now numerous hyperspectral image processing algorithms developed for target detection. In this paper, we investigate several of the most widely used target detection algorithms for the identification of surface oil spills in an ocean environment. In the experiments, we applied a hyperspectral camera to collect real-life oil spill data. The experimental results show the feasibility of oil spill detection using hyperspectral imaging, and the performance of the hyperspectral image processing algorithms was also validated.

  18. The NectarCAM camera project

    CERN Document Server

    Glicenstein, J-F; Barrio, J-A; Blanch, O; Boix, J; Bolmont, J; Boutonnet, C; Cazaux, S; Chabanne, E; Champion, C; Chateau, F; Colonges, S; Corona, P; Couturier, S; Courty, B; Delagnes, E; Delgado, C; Ernenwein, J-P; Fegan, S; Ferreira, O; Fesquet, M; Fontaine, G; Fouque, N; Henault, F; Gascón, D; Herranz, D; Hermel, R; Hoffmann, D; Houles, J; Karkar, S; Khelifi, B; Knödlseder, J; Martinez, G; Lacombe, K; Lamanna, G; LeFlour, T; Lopez-Coto, R; Louis, F; Mathieu, A; Moulin, E; Nayman, P; Nunio, F; Olive, J-F; Panazol, J-L; Petrucci, P-O; Punch, M; Prast, J; Ramon, P; Riallot, M; Ribó, M; Rosier-Lees, S; Sanuy, A; Siero, J; Tavernet, J-P; Tejedor, L A; Toussenel, F; Vasileiadis, G; Voisin, V; Waegebert, V; Zurbach, C

    2013-01-01

    In the framework of the next generation of Cherenkov telescopes, the Cherenkov Telescope Array (CTA), NectarCAM is a camera designed for the medium size telescopes covering the central energy range of 100 GeV to 30 TeV. NectarCAM will be finely pixelated (~ 1800 pixels for a 8 degree field of view, FoV) in order to image atmospheric Cherenkov showers by measuring the charge deposited within a few nanoseconds time-window. It will have additional features like the capacity to record the full waveform with GHz sampling for every pixel and to measure event times with nanosecond accuracy. An array of a few tens of medium size telescopes, equipped with NectarCAMs, will achieve up to a factor of ten improvement in sensitivity over existing instruments in the energy range of 100 GeV to 10 TeV. The camera is made of roughly 250 independent read-out modules, each composed of seven photo-multipliers, with their associated high voltage base and control, a read-out board and a multi-service backplane board. The read-out b...

  19. Foreground extraction for moving RGBD cameras

    Science.gov (United States)

    Junejo, Imran N.; Ahmed, Naveed

    2017-02-01

    In this paper, we propose a simple method to perform foreground extraction for a moving RGBD camera. These cameras have now been available for quite some time. Their popularity is primarily due to their low cost and ease of availability. Although the field of foreground extraction or background subtraction has been explored by computer vision researchers for a long time, depth-based subtraction is relatively new and has not yet been extensively addressed. Most current methods make heavy use of geometric reconstruction, making the solutions quite restrictive. In this paper, we make novel use of RGB and RGBD data: from the RGB frame, we extract corner features (FAST) and then represent these features with the histogram of oriented gradients (HoG) descriptor. We train a non-linear SVM on these descriptors. During the test phase, we make use of the fact that the foreground object has a distinct depth ordering with respect to the rest of the scene. That is, we use the positively classified FAST features on the test frame to initiate a region growing that obtains an accurate segmentation of the foreground object from just the RGBD data. We demonstrate the proposed method on synthetic datasets and report encouraging quantitative and qualitative results.
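
    A minimal sketch of the feature stage described above, assuming OpenCV, scikit-image and scikit-learn are available; the patch size, HoG parameters and SVM kernel are illustrative choices, and the depth-based region growing is only indicated in comments.

```python
import cv2
import numpy as np
from skimage.feature import hog
from sklearn.svm import SVC

def fast_hog_descriptors(gray, patch=32):
    """Detect FAST corners and describe a square patch around each corner
    with a HoG descriptor, as in the training stage sketched above."""
    fast = cv2.FastFeatureDetector_create()
    keypoints = fast.detect(gray, None)
    half = patch // 2
    feats, pts = [], []
    for kp in keypoints:
        x, y = int(kp.pt[0]), int(kp.pt[1])
        if half <= x < gray.shape[1] - half and half <= y < gray.shape[0] - half:
            roi = gray[y - half:y + half, x - half:x + half]
            feats.append(hog(roi, orientations=9, pixels_per_cell=(8, 8),
                             cells_per_block=(2, 2)))
            pts.append((x, y))
    return np.array(feats), pts

# Training (labels assumed given): clf = SVC(kernel='rbf').fit(train_feats, train_labels)
# Testing: the positively classified corner locations would then seed a
# depth-based region growing on the RGBD frame to segment the foreground.
```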

  20. From the Pinhole Camera to the Shape of a Lens: The Camera-Obscura Reloaded

    Science.gov (United States)

    Ziegler, Max; Priemer, Burkhard

    2015-01-01

    We demonstrate how the form of a plano-convex lens and a derivation of the thin lens equation can be understood through simple physical considerations. The basic principle is the extension of the pinhole camera using additional holes. The resulting images are brought into coincidence through the deflection of light with an arrangement of prisms.…

  1. Lights, Camera, AG-Tion: Promoting Agricultural and Environmental Education on Camera

    Science.gov (United States)

    Fuhrman, Nicholas E.

    2016-01-01

    Viewing of online videos and television segments has become a popular and efficient way for Extension audiences to acquire information. This article describes a unique approach to teaching on camera that may help Extension educators communicate their messages with comfort and personality. The S.A.L.A.D. approach emphasizes using relevant teaching…

  2. Development of event reconstruction algorithm for full-body gamma-camera based on SiPMs

    Science.gov (United States)

    Philippov, D. E.; Belyaev, V. N.; Buzhan, P. Zh; Ilyin, A. L.; Popova, E. V.; Stifutkin, A. A.

    2016-02-01

    The gamma camera is a detector for nuclear medical imaging in which the photomultiplier tubes (PMTs) can be replaced by silicon photomultipliers (SiPMs). Common systems have an energy resolution of about 10% and an intrinsic spatial resolution of about 3 mm (FWHM). In order to achieve the required energy and spatial resolution, the classical Anger logic should be modified. In the case of a standard monolithic thallium-activated sodium iodide scintillator (500x400x10 mm3) with SiPM readout, this can be done by identifying clusters. We show that this approach gives good results with simulated data.
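
    A minimal sketch of an Anger-type centroid restricted to a cluster of SiPM channels around the largest signal, which is one simple way to realize the cluster identification mentioned above; the channel coordinates, cluster radius and input format are illustrative assumptions, not details from the paper.

```python
import numpy as np

def cluster_anger_position(signals, sipm_x, sipm_y, radius=30.0):
    """Centroid (Anger logic) computed only over the SiPMs within `radius`
    (mm) of the channel with the largest signal, instead of the full array."""
    signals = np.asarray(signals, dtype=float)
    sipm_x = np.asarray(sipm_x, dtype=float)
    sipm_y = np.asarray(sipm_y, dtype=float)
    k = int(np.argmax(signals))
    in_cluster = np.hypot(sipm_x - sipm_x[k], sipm_y - sipm_y[k]) <= radius
    w = signals[in_cluster]
    energy = w.sum()                               # cluster energy estimate
    x = np.dot(w, sipm_x[in_cluster]) / energy
    y = np.dot(w, sipm_y[in_cluster]) / energy
    return x, y, energy
```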

  3. Calibration of the Lunar Reconnaissance Orbiter Camera

    Science.gov (United States)

    Tschimmel, M.; Robinson, M. S.; Humm, D. C.; Denevi, B. W.; Lawrence, S. J.; Brylow, S.; Ravine, M.; Ghaemi, T.

    2008-12-01

    The Lunar Reconnaissance Orbiter Camera (LROC) onboard the NASA Lunar Reconnaissance Orbiter (LRO) spacecraft consists of three cameras: the Wide-Angle Camera (WAC) and two identical Narrow Angle Cameras (NAC-L, NAC-R). The WAC is a push-frame imager with 5 visible wavelength filters (415 to 680 nm) at a spatial resolution of 100 m/pixel and 2 UV filters (315 and 360 nm) with a resolution of 400 m/pixel. In addition to the multicolor imaging the WAC can operate in monochrome mode to provide a global large-incidence-angle basemap and a time-lapse movie of the illumination conditions at both poles. The WAC has a highly linear response, a read noise of 72 e- and a full well capacity of 47,200 e-. The signal-to-noise ratio in each band is 140 in the worst case. There are no out-of-band leaks and the spectral response of each filter is well characterized. Each NAC is a monochrome pushbroom scanner, providing images with a resolution of 50 cm/pixel from a 50-km orbit. A single NAC image has a swath width of 2.5 km and a length of up to 26 km. The NACs are mounted to acquire side-by-side imaging for a combined swath width of 5 km. The NAC is designed to fully characterize future human and robotic landing sites in terms of topography and hazard risks. The North and South poles will be mapped on a 1-meter-scale poleward of 85.5° latitude. Stereo coverage can be provided by pointing the NACs off-nadir. The NACs are also highly linear. Read noise is 71 e- for NAC-L and 74 e- for NAC-R and the full well capacity is 248,500 e- for NAC-L and 262,500 e- for NAC-R. The focal lengths are 699.6 mm for NAC-L and 701.6 mm for NAC-R; the system MTF is 28% for NAC-L and 26% for NAC-R. The signal-to-noise ratio is at least 46 (terminator scene) and can be higher than 200 (high sun scene). Both NACs exhibit a straylight feature, which is caused by out-of-field sources and is of a magnitude of 1-3%. However, as this feature is well understood it can be greatly reduced during ground

  4. Effect of tube-support interaction on the dynamic responses of heat exchanger tubes. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Y.S.; Jendrzejczyk, J.A.; Wambsganss, M.W.

    1977-01-01

    Operating heat exchangers have experienced tube damage due to excessive flow-induced vibration. The relatively small inherent tube-to-baffle hole clearances associated with manufacturing tolerances in heat exchangers affect the tube vibrational characteristics. In attempting a theoretical analysis, questions arise as to the effects of tube-baffle impacting on dynamic responses. Experiments were performed to determine the effects of tube-baffle impacting in vertical/horizontal tube orientation, and in air/water medium, on the vibrational characteristics (resonant frequencies, mode shapes, and damping) and displacement response amplitudes of a seven-span tube model. The tube and support conditions were prototypic, and the overall length approximately one-third that of a straight tube segment of the steam generator designed for the CRBR. The test results were compared with the analytical results based on a multispan beam with ''knife-edge'' supports.

  5. Pulse tube cooler having 1/4 wavelength resonator tube instead of reservoir

    Science.gov (United States)

    Gedeon, David R. (Inventor)

    2008-01-01

    An improved pulse tube cooler having a resonator tube connected in place of a compliance volume or reservoir. The resonator tube has a length substantially equal to an integer multiple of 1/4 wavelength of an acoustic wave in the working gas within the resonator tube at its operating frequency, temperature and pressure. Preferably, the resonator tube is formed integrally with the inertance tube as a single, integral tube with a length approximately 1/2 of that wavelength. Also preferably, the integral tube is spaced outwardly from and coiled around the connection of the regenerator to the pulse tube at a cold region of the cooler and the turns of the coil are thermally bonded together to improve heat conduction through the coil.

  6. Camera systems in human motion analysis for biomedical applications

    Science.gov (United States)

    Chin, Lim Chee; Basah, Shafriza Nisha; Yaacob, Sazali; Juan, Yeap Ewe; Kadir, Aida Khairunnisaa Ab.

    2015-05-01

    The Human Motion Analysis (HMA) system has been one of the major interests among researchers in the fields of computer vision, artificial intelligence and biomedical engineering and sciences. This is due to its wide and promising biomedical applications, namely, bio-instrumentation for human computer interfacing and surveillance systems for monitoring human behaviour, as well as analysis of biomedical signal and image processing for diagnosis and rehabilitation applications. This paper provides an extensive review of the camera system of HMA, its taxonomy, including camera types, camera calibration and camera configuration. The review focuses on evaluating the camera system considerations of the HMA system specifically for biomedical applications. This review is important as it provides guidelines and recommendations for researchers and practitioners in selecting a camera system of the HMA system for biomedical applications.

  7. A wide-angle camera module for disposable endoscopy

    Science.gov (United States)

    Shim, Dongha; Yeon, Jesun; Yi, Jason; Park, Jongwon; Park, Soo Nam; Lee, Nanhee

    2016-08-01

    A wide-angle miniaturized camera module for disposable endoscope is demonstrated in this paper. A lens module with 150° angle of view (AOV) is designed and manufactured. All plastic injection-molded lenses and a commercial CMOS image sensor are employed to reduce the manufacturing cost. The image sensor and LED illumination unit are assembled with a lens module. The camera module does not include a camera processor to further reduce its size and cost. The size of the camera module is 5.5 × 5.5 × 22.3 mm3. The diagonal field of view (FOV) of the camera module is measured to be 110°. A prototype of a disposable endoscope is implemented to perform a pre-clinical animal testing. The esophagus of an adult beagle dog is observed. These results demonstrate the feasibility of a cost-effective and high-performance camera module for disposable endoscopy.

  8. Disposition of camera parameters in vehicle navigation system

    Science.gov (United States)

    Yu, Houyun; Zhang, Weigong

    2010-10-01

    To address the calibration of the onboard camera in a machine-vision-based vehicle navigation system, a method that treats the intrinsic and extrinsic parameters of the camera separately is presented. Since the intrinsic parameters are essentially invariant while the car is moving, they can first be calibrated with a planar pattern as soon as the camera is installed. The installation position of the onboard camera can then be adjusted in real time according to the slope and vanishing point of the lane lines in the picture, until such extrinsic parameters as direction angle, incline angle and lateral translation are adjusted to zero. This separate treatment of the camera parameters is applied to lane departure detection on structured roads, with which camera calibration is simplified and the measuring error due to extrinsic parameters is decreased. The correctness and feasibility of the method are proved by theoretical calculation and practical experiment.
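
    The vanishing-point check of the extrinsic alignment mentioned above can be sketched with simple homogeneous-coordinate geometry; the lane-line endpoints below are hypothetical detections, not values from the paper.

```python
import numpy as np

def line_through(p, q):
    """Homogeneous line (a, b, c) with ax + by + c = 0 through two image points."""
    return np.cross([p[0], p[1], 1.0], [q[0], q[1], 1.0])

def vanishing_point(line1, line2):
    """Intersection of two image lines; for straight, parallel lane markings
    this is the vanishing point used to check the camera's yaw/pitch alignment."""
    vp = np.cross(line1, line2)
    return vp[:2] / vp[2]

# Hypothetical lane-line endpoints detected in a 640x480 frame
left = line_through((100, 470), (300, 250))
right = line_through((540, 470), (340, 250))
print(vanishing_point(left, right))  # near the image centre if the camera is aligned
```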

  9. Performance Evaluation of a Dedicated Camera Suitable for Dynamic Radiopharmaceuticals Evaluation in Small Animals

    Energy Technology Data Exchange (ETDEWEB)

    Loudos, George; Majewski, Stanislaw; Wojcik, Randolph; Weisenberger, Andrew; Sakelios, Nikolas; Nikita, Konstantina; Uzunoglu, Nikolaos; Bouziotis, Penelope; Xanthopoulos, Stavros; Varvarigou, Alexandra

    2007-06-01

    As the result of a collaboration between the Detector and Imaging Group of Thomas Jefferson National Accelerator Facility (US), the Institute of Radioisotopes and Radiodiagnostic Products (IRRP) of N.C.S.R. "Demokritos" and the Biomedical Simulations and Imaging Applications Laboratory (BIOSIM) of National Technical University of Athens (Greece), a mouse-sized camera optimized for 99mTc imaging was developed. The detector was built in Jefferson Lab and transferred to Greece, where it was evaluated with phantoms and small animals. The system will be used initially for planar dynamic studies in small animals, in order to assess the performance of new radiolabeled biomolecules for oncological studies. The active area of the detector is approximately 48 mm x 96 mm. It is based on two flat-panel Hamamatsu H8500 position sensitive photomultiplier tubes (PSPMT), a pixelated NaI(Tl) scintillator and a high resolution lead parallel-hole collimator. The system was developed to optim

  10. Development of a DSP-based real-time position calculation circuit for a beta camera

    CERN Document Server

    Yamamoto, S; Kanno, I

    2000-01-01

    A digital signal processor (DSP)-based position calculation circuit was developed and tested for a beta camera. The previous position calculation circuit, which employed flash analog-to-digital (A-D) converters for A-D conversion and ratio calculation, produced significant line artifacts in the image due to the differential non-linearity of the A-D converters. The new position calculation circuit uses four A-D converters for A-D conversion of the analog signals from the position sensitive photomultiplier tube (PSPMT). The DSP reads the A-D signals and calculates the ratios Xa/(Xa+Xb) and Ya/(Ya+Yb) on an event-by-event basis. The DSP also magnifies the image to fit the useful field of view (FOV) and rejects events outside the FOV. The line artifacts in the image were almost eliminated.
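
    The event-by-event logic described above (charge-division ratios, FOV rejection, magnification onto the image matrix) can be sketched as follows; the FOV ratio bounds and image matrix size are illustrative, not the values used in the actual circuit.

```python
def position_from_charges(xa, xb, ya, yb, fov=(0.1, 0.9), matrix_size=256):
    """Event-by-event position calculation mirroring the described DSP logic:
    charge-division ratios, rejection of events outside the useful FOV,
    and magnification of the remaining range onto the image matrix."""
    sx, sy = xa + xb, ya + yb
    if sx <= 0 or sy <= 0:
        return None
    rx, ry = xa / sx, ya / sy
    lo, hi = fov
    if not (lo <= rx <= hi and lo <= ry <= hi):
        return None                       # event falls outside the useful field of view
    # stretch the accepted ratio range onto the image matrix
    col = int((rx - lo) / (hi - lo) * (matrix_size - 1))
    row = int((ry - lo) / (hi - lo) * (matrix_size - 1))
    return row, col

print(position_from_charges(3.2, 2.8, 1.1, 4.0))
```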

  11. A correction method of the spatial distortion in planar images from γ-Camera systems

    Science.gov (United States)

    Thanasas, D.; Georgiou, E.; Giokaris, N.; Karabarbounis, A.; Maintas, D.; Papanicolas, C. N.; Polychronopoulou, A.; Stiliaris, E.

    2009-06-01

    A methodology for correcting spatial distortions in planar images for small Field Of View (FOV) γ-Camera systems based on Position Sensitive Photomultiplier Tubes (PSPMT) and pixelated scintillation crystals is described. The process utilizes a correction matrix whose elements are derived from a prototyped planar image obtained through irradiation of the scintillation crystal by a 60Co point source and without a collimator. The method was applied to several planar images of a SPECT experiment with a simple phantom construction at different detection angles. The tomographic images are obtained using the Maximum-Likelihood Expectation-Maximization (MLEM) reconstruction technique. Corrected and uncorrected images are compared and the applied correction methodology is discussed.
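
    The abstract does not spell out the form of the correction matrix, so the sketch below only shows one generic way such a flood-derived correction is commonly applied: a pre-computed lookup table remaps each measured event position to a corrected position before MLEM reconstruction. The table contents and binning are assumptions, not the authors' exact scheme.

```python
import numpy as np

def apply_distortion_correction(event_x, event_y, lut_x, lut_y):
    """Remap measured event coordinates through a lookup table (one entry per
    image bin). In this generic scheme the table would be built once from the
    collimator-less flood acquisition and then applied to every event."""
    event_x = np.asarray(event_x)
    event_y = np.asarray(event_y)
    xi = np.clip(event_x.astype(int), 0, lut_x.shape[1] - 1)
    yi = np.clip(event_y.astype(int), 0, lut_x.shape[0] - 1)
    return lut_x[yi, xi], lut_y[yi, xi]
```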

  12. Two-Phase Algorithm for Optimal Camera Placement

    OpenAIRE

    Jun-Woo Ahn; Tai-Woo Chang; Sung-Hee Lee; Yong Won Seo

    2016-01-01

    As markers for visual sensor networks have become larger, interest in the optimal camera placement problem has continued to increase. The most featured solution for the optimal camera placement problem is based on binary integer programming (BIP). Due to the NP-hard characteristic of the optimal camera placement problem, however, it is difficult to find a solution for a complex, real-world problem using BIP. Many approximation algorithms have been developed to solve this problem. In this pape...

  13. Integrating Scene Parallelism in Camera Auto-Calibration

    Institute of Scientific and Technical Information of China (English)

    LIU Yong (刘勇); WU ChengKe (吴成柯); Hung-Tat Tsui

    2003-01-01

    This paper presents an approach for camera auto-calibration from uncalibrated video sequences taken by a hand-held camera. The novelty of this approach lies in that the line parallelism is transformed to the constraints on the absolute quadric during camera autocalibration. This makes some critical cases solvable and the reconstruction more Euclidean. The approach is implemented and validated using simulated data and real image data. The experimental results show the effectiveness of the approach.

  14. IR Camera Report for the 7 Day Production Test

    Energy Technology Data Exchange (ETDEWEB)

    Holloway, Michael Andrew [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-02-22

    The following report gives a summary of the IR camera performance results and data for the 7 day production run that occurred from 10 Sep 2015 through 16 Sep 2015. During this production run our goal was to see how well the camera performed its task of monitoring the target window temperature with our improved alignment procedure and emissivity measurements. We also wanted to see if the increased shielding would be effective in protecting the camera from damage and failure.

  15. A compact gamma camera for biological imaging

    Energy Technology Data Exchange (ETDEWEB)

    Bradley, E L; Cella, J; Majewski, S; Popov, V; Qian, Jianguo; Saha, M S; Smith, M F; Weisenberger, A G; Welsh, R E

    2006-02-01

    A compact detector, sized particularly for imaging a mouse, is described. The active area of the detector is approximately 46 mm x 96 mm. Two flat-panel Hamamatsu H8500 position-sensitive photomultiplier tubes (PSPMTs) are coupled to a pixellated NaI(Tl) scintillator which views the animal through a copper-beryllium (CuBe) parallel-hole collimator specially designed for 125I. Although the PSPMTs have insensitive areas at their edges and there is a physical gap, corrections for scintillation light collection at the junction between the two tubes result in a uniform response across the entire rectangular area of the detector. The system described has been developed to optimize both sensitivity and resolution for in-vivo imaging of small animals injected with iodinated compounds. We demonstrate an in-vivo application of this detector, particularly to SPECT, by imaging mice injected with approximately 10-15 μCi of 125I.

  16. Design and Field Test of a Galvanometer Deflected Streak Camera

    Energy Technology Data Exchange (ETDEWEB)

    Lai, C C; Goosman, D R; Wade, J T; Avara, R

    2002-11-08

    We have developed a compact fieldable optically-deflected streak camera, first reported at the 20th HSPP Congress. Using a triggerable galvanometer that scans the optical signal, the imaging and streaking function is an all-optical process without incurring any photon-electron-photon conversion or photoelectronic deflection. As such, the achievable imaging quality is limited mainly by optical design, rather than by multiple conversions of the signal carrier and high-voltage electron-optics effects. All core elements of the camera are packaged into a 12 inch x 24 inch footprint box, a size similar to that of a conventional electronic streak camera. At LLNL's Site-300 Test Site, we have conducted a Fabry-Perot interferometer measurement of fast object velocity using this all-optical camera side-by-side with an intensified electronic streak camera. These two cameras are configured as two independent instruments for recording synchronously each branch of the 50/50 splits from one incoming signal. Given the same signal characteristics, the test result has undisputedly demonstrated superior imaging performance for the all-optical streak camera. It produces higher signal sensitivity, wider linear dynamic range, better spatial contrast, finer temporal resolution, and larger data capacity as compared with the electronic counterpart. The camera also demonstrated its structural robustness and functional consistency, well compatible with the field environment. This paper presents the camera design and the test results in both pictorial records and post-processed graphic summaries.

  17. Calibration of a Stereo Radiation Detection Camera Using Planar Homography

    Directory of Open Access Journals (Sweden)

    Seung-Hae Baek

    2016-01-01

    Full Text Available This paper proposes a calibration technique for a stereo gamma detection camera. Calibration of the internal and external parameters of a stereo vision camera is a well-known research problem in the computer vision society. However, little or no stereo calibration has been investigated in radiation measurement research. Since no visual information can be obtained from a stereo radiation camera, it is impossible to use a general stereo calibration algorithm directly. In this paper, we develop a hybrid-type stereo system which is equipped with both radiation and vision cameras. To calibrate the stereo radiation cameras, stereo images of a calibration pattern captured by the vision cameras are transformed into the view of the radiation cameras. The homography transformation is calibrated based on the geometric relationship between the visual and radiation camera coordinates. The accuracy of the stereo parameters of the radiation camera is analyzed by distance measurements to both visible light and gamma sources. The experimental results show that the measurement error is about 3%.
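
    A minimal sketch of the homography step, assuming OpenCV; the point correspondences below are hypothetical, and obtaining them from the actual visual/radiation camera geometry is the calibration step the paper describes.

```python
import cv2
import numpy as np

# Hypothetical corresponding points: calibration-pattern corners seen by the
# visual camera (pixels) and the same physical points expressed in the
# radiation camera's image coordinates.
vis_pts = np.array([[120, 80], [520, 85], [515, 410], [125, 405]], dtype=np.float32)
rad_pts = np.array([[10, 8], [54, 9], [53, 42], [11, 41]], dtype=np.float32)

# Planar homography mapping the vision-camera view into the radiation-camera view
H, _ = cv2.findHomography(vis_pts, rad_pts)

# Map further pattern points detected in the visual image into the radiation view
more_vis = np.array([[[320, 240]], [[200, 300]]], dtype=np.float32)
mapped = cv2.perspectiveTransform(more_vis, H)
print(mapped.reshape(-1, 2))
```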

  18. Do speed cameras reduce speeding in urban areas?

    Science.gov (United States)

    Oliveira, Daniele Falci de; Friche, Amélia Augusta de Lima; Costa, Dário Alves da Silva; Mingoti, Sueli Aparecida; Caiaffa, Waleska Teixeira

    2015-11-01

    This observational study aimed to estimate the prevalence of speeding on urban roadways and to analyze associated factors. The sample consisted of 8,565 vehicles circulating in areas with and without fixed speed cameras in operation. We found that 40% of vehicles 200 meters after the fixed cameras and 33.6% of vehicles observed on roadways without speed cameras were moving over the speed limit (p cameras, more women drivers were talking on their cell phones and wearing seatbelts when compared to men (p < 0.05 for both comparisons), independently of speed limits. The results suggest that compliance with speed limits requires more than structural interventions.

  19. Heterogeneous treatment effects of speed cameras on road safety.

    Science.gov (United States)

    Li, Haojie; Graham, Daniel J

    2016-12-01

    This paper analyses how the effects of fixed speed cameras on road casualties vary across sites with different characteristics and evaluates the criteria for selecting camera sites. A total of 771 camera sites and 4787 potential control sites are observed for a period of 9 years across England. Site characteristics such as road class, crash history and site length are combined into a single index, referred to as a propensity score. We first estimate the average effect at each camera site using propensity score matching. The effects are then estimated as a function of propensity scores using local polynomial regression. The results show that the reduction in personal injury collisions ranges from 10% to 40% whilst the average effect is 25.9%, indicating that the effects of speed cameras are not uniform across camera sites and are dependent on site characteristics, as measured by propensity scores. We further evaluate the criteria for selecting camera sites in the UK by comparing the effects at camera sites meeting and not meeting the criteria. The results show that camera sites which meet the criteria perform better in reducing casualties, implying the current site selection criteria are rational.
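
    A minimal sketch of propensity-score matching in the spirit described above, assuming scikit-learn; it uses a logistic-regression score and 1-nearest-neighbour matching to estimate the average effect at treated (camera) sites, and omits the local polynomial regression of effects on the score used in the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def propensity_score_matching(X, treated, outcome):
    """Nearest-neighbour matching on the propensity score: the probability of
    being a camera site given site characteristics X. Returns the average
    difference in outcomes between camera sites and their matched controls."""
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    t_idx = np.where(treated == 1)[0]
    c_idx = np.where(treated == 0)[0]
    nn = NearestNeighbors(n_neighbors=1).fit(ps[c_idx].reshape(-1, 1))
    _, match = nn.kneighbors(ps[t_idx].reshape(-1, 1))
    matched_controls = c_idx[match.ravel()]
    return np.mean(outcome[t_idx] - outcome[matched_controls])
```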

  20. Calibration Techniques for Accurate Measurements by Underwater Camera Systems

    Directory of Open Access Journals (Sweden)

    Mark Shortis

    2015-12-01

    Full Text Available Calibration of a camera system is essential to ensure that image measurements result in accurate estimates of locations and dimensions within the object space. In the underwater environment, the calibration must implicitly or explicitly model and compensate for the refractive effects of waterproof housings and the water medium. This paper reviews the different approaches to the calibration of underwater camera systems in theoretical and practical terms. The accuracy, reliability, validation and stability of underwater camera system calibration are also discussed. Samples of results from published reports are provided to demonstrate the range of possible accuracies for the measurements produced by underwater camera systems.

  1. Camera traps can be heard and seen by animals.

    Science.gov (United States)

    Meek, Paul D; Ballard, Guy-Anthony; Fleming, Peter J S; Schaefer, Michael; Williams, Warwick; Falzon, Greg

    2014-01-01

    Camera traps are electrical instruments that emit sounds and light. In recent decades they have become a tool of choice in wildlife research and monitoring. The variability between camera trap models and the methods used is considerable, and little is known about how animals respond to camera trap emissions. It has been reported that some animals show a response to camera traps; in research this is often undesirable, so it is important to understand why the animals are disturbed. We conducted laboratory-based investigations to test the audio and infrared optical outputs of 12 camera trap models. Camera traps were measured for audio outputs in an anechoic chamber; we also measured ultrasonic (n = 5) and infrared illumination outputs (n = 7) of a subset of the camera trap models. We then compared the perceptive hearing range (n = 21) and assessed the vision ranges (n = 3) of mammal species (where data existed) to determine whether animals can see and hear camera traps. We report that camera traps produce sounds that are well within the perceptive range of most mammals' hearing and produce illumination that can be seen by many species.

  2. Mid-IR image acquisition using a standard CCD camera

    DEFF Research Database (Denmark)

    Dam, Jeppe Seidelin; Sørensen, Knud Palmelund; Pedersen, Christian

    2010-01-01

    Direct image acquisition in the 3-5 µm range is realized using a standard CCD camera and a wavelength up-converter unit. The converter unit transfers the image information to the NIR range, where state-of-the-art cameras exist.

  3. 360 deg Camera Head for Unmanned Sea Surface Vehicles

    Science.gov (United States)

    Townsend, Julie A.; Kulczycki, Eric A.; Willson, Reginald G.; Huntsberger, Terrance L.; Garrett, Michael S.; Trebi-Ollennu, Ashitey; Bergh, Charles F.

    2012-01-01

    The 360 camera head consists of a set of six color cameras arranged in a circular pattern such that their overlapping fields of view give a full 360 view of the immediate surroundings. The cameras are enclosed in a watertight container along with support electronics and a power distribution system. Each camera views the world through a watertight porthole. To prevent overheating or condensation in extreme weather conditions, the watertight container is also equipped with an electrical cooling unit and a pair of internal fans for circulation.

  4. Calibration Techniques for Accurate Measurements by Underwater Camera Systems.

    Science.gov (United States)

    Shortis, Mark

    2015-12-07

    Calibration of a camera system is essential to ensure that image measurements result in accurate estimates of locations and dimensions within the object space. In the underwater environment, the calibration must implicitly or explicitly model and compensate for the refractive effects of waterproof housings and the water medium. This paper reviews the different approaches to the calibration of underwater camera systems in theoretical and practical terms. The accuracy, reliability, validation and stability of underwater camera system calibration are also discussed. Samples of results from published reports are provided to demonstrate the range of possible accuracies for the measurements produced by underwater camera systems.

  5. Counter-driver shock tube

    Science.gov (United States)

    Tamba, T.; Nguyen, T. M.; Takeya, K.; Harasaki, T.; Iwakawa, A.; Sasoh, A.

    2015-11-01

    A "counter-driver" shock tube was developed. In this device, two counter drivers are actuated with an appropriate delay time to generate the interaction between a shock wave and a flow in the opposite direction which is induced by another shock wave. The conditions for the counter drivers can be set independently. Each driver is activated by a separate electrically controlled diaphragm rupture device, in which a pneumatic piston drives a rupture needle with a temporal jitter of better than 1.1 ms. Operation demonstrations were conducted to evaluate the practical performance.

  6. Improved Traveling-Wave Tube

    Science.gov (United States)

    Rousseau, Art; Tammaru, Ivo; Vaszari, John

    1988-01-01

    New space traveling-wave tube (TWT) provides coherent source of 75 watts of continuous-wave power output over bandwidth of 5 GHz at frequency of 65 GHz. Coupled-cavity TWT provides 50 dB of saturated gain. Includes thermionic emitter, M-type dispenser cathode providing high-power electron beam. Beam focused by permanent magnets through center of radio-frequency cavity structure. Designed for reliable operation for 10 years, and overall efficiency of 35 percent minimizes prime power input and dissipation of heat.

  7. Closed End Launch Tube (CELT)

    Science.gov (United States)

    Lueck, Dale E.; Immer, Christopher D.

    2004-02-01

    A small-scale test apparatus has been built and tested for the CELT pneumatic launch assist concept presented at STAIF 2001. The 7.5 cm (3 in) diameter × 305 m (1,000 ft) long system accelerates and pneumatically brakes a 6.35 cm diameter projectile with variable weight (1.5-5 kg). The acceleration and braking tube has been instrumented with optical sensors and pressure transducers at 14 stations to take data throughout the runs. Velocity and pressure profiles for runs with various accelerator pressures and projectile weights are given. This test apparatus can serve as an important experimental tool for verifying this concept.

  8. Smart Cameras for Remote Science Survey

    Science.gov (United States)

    Thompson, David R.; Abbey, William; Allwood, Abigail; Bekker, Dmitriy; Bornstein, Benjamin; Cabrol, Nathalie A.; Castano, Rebecca; Estlin, Tara; Fuchs, Thomas; Wagstaff, Kiri L.

    2012-01-01

    Communication with remote exploration spacecraft is often intermittent and bandwidth is highly constrained. Future missions could use onboard science data understanding to prioritize downlink of critical features [1], draft summary maps of visited terrain [2], or identify targets of opportunity for follow-up measurements [3]. We describe a generic approach to classify geologic surfaces for autonomous science operations, suitable for parallelized implementations in FPGA hardware. We map these surfaces with texture channels - distinctive numerical signatures that differentiate properties such as roughness, pavement coatings, regolith characteristics, sedimentary fabrics and differential outcrop weathering. This work describes our basic image analysis approach and reports an initial performance evaluation using surface images from the Mars Exploration Rovers. Future work will incorporate these methods into camera hardware for real-time processing.
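
    As a hedged illustration of what per-pixel "texture channels" can look like in practice, the sketch below computes a few simple local descriptors (gradient magnitude and local standard deviation) and feeds them to an off-the-shelf classifier. The filters, window sizes, toy labels and random-forest classifier are assumptions for illustration, not the instrument's actual feature set or FPGA implementation.

```python
# Hedged sketch of texture channels: per-pixel numerical signatures computed
# from local filters, then classified with a standard learner.
import numpy as np
from scipy import ndimage
from sklearn.ensemble import RandomForestClassifier

def texture_channels(image):
    """Stack a few simple per-pixel texture descriptors."""
    gx = ndimage.sobel(image, axis=0, mode="reflect")
    gy = ndimage.sobel(image, axis=1, mode="reflect")
    grad_mag = np.hypot(gx, gy)                                   # edge strength ~ roughness
    local_mean = ndimage.uniform_filter(image, size=9)
    local_sq = ndimage.uniform_filter(image**2, size=9)
    local_std = np.sqrt(np.maximum(local_sq - local_mean**2, 0))  # local contrast
    return np.stack([image, grad_mag, local_std], axis=-1)

# Train on labelled pixels from one image, predict surface class on another.
rng = np.random.default_rng(1)
train_img = rng.random((64, 64))
labels = (ndimage.uniform_filter(train_img, 9) > 0.5).astype(int)  # toy labels only

X = texture_channels(train_img).reshape(-1, 3)
clf = RandomForestClassifier(n_estimators=50).fit(X, labels.ravel())

test_img = rng.random((64, 64))
pred = clf.predict(texture_channels(test_img).reshape(-1, 3)).reshape(64, 64)
print(pred.shape)
```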

  9. Relevance of ellipse eccentricity for camera calibration

    Science.gov (United States)

    Mordwinzew, W.; Tietz, B.; Boochs, F.; Paulus, D.

    2015-05-01

    Plane circular targets are widely used in calibrations of optical sensors through photogrammetric set-ups. Due to this popularity, their advantages and disadvantages are also well studied in the scientific community. One main disadvantage occurs when the projected target is not parallel to the image plane. In this geometric constellation, the target has an elliptic geometry with an offset between its geometric and its projected center. This difference is referred to as ellipse eccentricity and is a systematic error which, if not treated accordingly, has a negative impact on the overall achievable accuracy. The magnitude and direction of eccentricity errors depend on various factors. The most important one is the target size: the bigger an ellipse in the image is, the bigger the error will be. Although correction models dealing with eccentricity have been available for decades, it is mostly treated as a planning task in which the aim is to choose a target size small enough that the resulting eccentricity error remains negligible. Besides the fact that advanced mathematical models are available and that the influence of this error on camera calibration results is still not completely investigated, there are various additional reasons why bigger targets cannot or should not be avoided. One of them is the growing image resolution as a by-product of advancements in sensor development. Here, smaller pixels have a lower S/N ratio, necessitating more pixels to assure geometric quality. Another scenario might need bigger targets due to larger scale differences, whereas distant targets should still contain enough information in the image. In general, bigger ellipses contain more contour pixels and therefore more information. This helps target-detection algorithms perform better even under non-optimal conditions, such as data from sensors with a high noise level. In contrast to rather simple measuring situations in a stereo or multi-image mode, the impact
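
    To make the eccentricity effect concrete, the following hedged numeric sketch projects a tilted circular target through a simple pinhole model, fits a conic to the projected rim, and compares the fitted ellipse centre with the projection of the circle's true centre. The geometry, focal length and target parameters are invented for illustration and are not taken from the paper.

```python
# Hedged numeric sketch: under perspective projection, the image of a circle's
# centre is not the centre of the resulting ellipse (the eccentricity error).
import numpy as np

def project(points_3d, f=1000.0):
    """Simple pinhole projection onto the image plane (pixel units)."""
    pts = np.asarray(points_3d, dtype=float)
    return f * pts[:, :2] / pts[:, 2:3]

def ellipse_centre_from_points(xy):
    """Least-squares conic fit a*x^2 + b*xy + c*y^2 + d*x + e*y + f = 0; return its centre."""
    x, y = xy[:, 0], xy[:, 1]
    A = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    _, _, vt = np.linalg.svd(A, full_matrices=False)   # smallest singular vector = conic coeffs
    a, b, c, d, e, f = vt[-1]
    return np.linalg.solve([[2 * a, b], [b, 2 * c]], [-d, -e])

# A circular target of radius r, tilted about the x-axis and offset from the
# optical axis, sampled along its rim.
r, tilt, centre = 0.05, np.deg2rad(60.0), np.array([0.3, 0.2, 2.0])
t = np.linspace(0, 2 * np.pi, 3600, endpoint=False)
rim = np.stack([r * np.cos(t),
                r * np.sin(t) * np.cos(tilt),
                r * np.sin(t) * np.sin(tilt)], axis=1) + centre

ellipse_centre = ellipse_centre_from_points(project(rim))
true_centre_image = project(centre[None, :])[0]
print("eccentricity offset (pixels):", ellipse_centre - true_centre_image)
```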

  10. CHAMP (Camera, Handlens, and Microscope Probe)

    Science.gov (United States)

    Mungas, Greg S.; Boynton, John E.; Balzer, Mark A.; Beegle, Luther; Sobel, Harold R.; Fisher, Ted; Klein, Dan; Deans, Matthew; Lee, Pascal; Sepulveda, Cesar A.

    2005-01-01

    CHAMP (Camera, Handlens And Microscope Probe)is a novel field microscope capable of color imaging with continuously variable spatial resolution from infinity imaging down to diffraction-limited microscopy (3 micron/pixel). As a robotic arm-mounted imager, CHAMP supports stereo imaging with variable baselines, can continuously image targets at an increasing magnification during an arm approach, can provide precision rangefinding estimates to targets, and can accommodate microscopic imaging of rough surfaces through a image filtering process called z-stacking. CHAMP was originally developed through the Mars Instrument Development Program (MIDP) in support of robotic field investigations, but may also find application in new areas such as robotic in-orbit servicing and maintenance operations associated with spacecraft and human operations. We overview CHAMP'S instrument performance and basic design considerations below.

  11. Evryscope Robotilter automated camera / ccd alignment system

    Science.gov (United States)

    Ratzloff, Jeff K.; Law, Nicholas M.; Fors, Octavi; Ser, Daniel d.; Corbett, Henry T.

    2016-08-01

    We have deployed a new class of telescope, the Evryscope, which opens a new parameter space in optical astronomy - the ability to detect short time scale events across the entire sky simultaneously. The system is a gigapixel-scale array camera with an 8000 sq. deg. field of view, 13 arcsec per pixel sampling, and the ability to detect objects brighter than g = 16 in each 2-minute exposure. The Evryscope is designed to find transiting exoplanets around exotic stars, as well as detect nearby supernovae and provide continuous records of distant relativistic explosions like gamma-ray-bursts. The Evryscope uses commercially available CCDs and optics; the machine and assembly tolerances inherent in the mass production of these parts introduce problematic variations in the lens / CCD alignment which degrades image quality. We have built an automated alignment system (Robotilters) to solve this challenge. In this paper we describe the Robotilter system, mechanical and software design, image quality improvement, and current status.

  12. Retinal oximetry with a multiaperture camera

    Science.gov (United States)

    Lemaillet, Paul; Lompado, Art; Ibrahim, Mohamed; Nguyen, Quan Dong; Ramella-Roman, Jessica C.

    2010-02-01

    Oxygen saturation measurement in the retina is essential for monitoring the eye health of diabetic patients. In this paper, preliminary results of oxygen saturation measurements for a healthy patient's retina are presented. The retinal oximeter used is based on a regular fundus camera to which an optimized optical train was added, designed to perform aperture division, while a filter array helps select the requested wavelengths. Hence, nine equivalent wavelength-dependent sub-images are taken in a single snapshot, which helps minimize the effects of eye movements. The setup is calibrated using a set of reflectance calibration phantoms, and a lookup table (LUT) is computed. An inverse model based on the LUT is presented to extract the optical properties of the patient's fundus and further estimate the oxygen saturation in a retinal vessel.
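
    The lookup-table inversion mentioned above can be sketched in a few lines: precompute model (or phantom-calibrated) reflectances over a grid of saturation values, then return the grid entry closest to a new measurement. The wavelengths, absorption coefficients and Beer-Lambert-style forward model below are deliberately crude placeholders, not the authors' calibrated phantom LUT.

```python
# Hedged sketch of a LUT-based inversion from multi-wavelength reflectance to
# oxygen saturation. The optical model is a toy placeholder.
import numpy as np

wavelengths_nm = np.array([560, 576, 600])     # assumed filter set (illustrative)
eps_hbo2 = np.array([3.2, 3.9, 0.8])           # placeholder oxy-haemoglobin absorption
eps_hb = np.array([3.6, 3.1, 1.5])             # placeholder deoxy-haemoglobin absorption

def forward_model(so2, path=1.0):
    """Toy Beer-Lambert reflectance for a given oxygen saturation."""
    mu = so2 * eps_hbo2 + (1.0 - so2) * eps_hb
    return np.exp(-mu * path)

# Build the LUT over a grid of saturation values.
so2_grid = np.linspace(0.0, 1.0, 101)
lut = np.array([forward_model(s) for s in so2_grid])

def invert(measured_reflectance):
    """Return the saturation whose LUT entry best matches the measurement."""
    err = np.linalg.norm(lut - measured_reflectance, axis=1)
    return so2_grid[np.argmin(err)]

print(invert(forward_model(0.97)))   # should recover approximately 0.97
```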

  13. 3D Capturing with Monoscopic Camera

    Directory of Open Access Journals (Sweden)

    M. Galabov

    2014-12-01

    Full Text Available This article presents a new concept of using the auto-focus function of a monoscopic camera sensor to estimate depth map information, which avoids not only the use of auxiliary equipment or human interaction, but also the computational complexity introduced by SfM or depth analysis. The system architecture that supports both stereo image and video data capturing, processing and display is discussed. A novel stereo image pair generation algorithm using Z-buffer-based 3D surface recovery is proposed. Based on the depth map, we are able to calculate the disparity map (the distance in pixels between corresponding image points in the two views). The presented algorithm uses a single image with depth information (e.g. a z-buffer) as input and produces two images, for the left and right eye.
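
    A hedged sketch of the depth-to-disparity step and a very simple depth-image-based rendering of the left/right pair is shown below. The baseline, focal length, toy depth map and nearest-pixel warp (with no hole filling) are illustrative simplifications, not the paper's Z-buffer-based surface recovery algorithm.

```python
# Hedged sketch: depth map -> disparity map -> synthesized left/right views.
import numpy as np

def depth_to_disparity(depth, baseline_px=20.0, focal_px=500.0):
    """Disparity in pixels; nearer pixels (small depth) get larger disparity."""
    return baseline_px * focal_px / np.maximum(depth, 1e-6)

def render_view(image, disparity, sign):
    """Shift each pixel horizontally by +/- disparity/2 (nearest-pixel forward warp)."""
    h, w = image.shape[:2]
    out = np.zeros_like(image)
    xs = np.arange(w)
    for y in range(h):
        new_x = np.clip((xs + sign * disparity[y] / 2).astype(int), 0, w - 1)
        out[y, new_x] = image[y, xs]   # holes and occlusions are ignored here
    return out

rng = np.random.default_rng(2)
image = rng.random((120, 160))
depth = np.linspace(1.0, 5.0, 160)[None, :].repeat(120, axis=0)  # toy z-buffer

disp = depth_to_disparity(depth)
left = render_view(image, disp, sign=-1)
right = render_view(image, disp, sign=+1)
print(left.shape, right.shape)
```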

  14. Robust multi-camera view face recognition

    CERN Document Server

    Kisku, Dakshina Ranjan; Gupta, Phalguni; Sing, Jamuna Kanta

    2010-01-01

    This paper presents multi-appearance fusion of Principal Component Analysis (PCA) and a generalization of Linear Discriminant Analysis (LDA) for a multi-camera view offline face recognition (verification) system. The generalization of LDA has been extended to establish correlations between the face classes in the transformed representation; this is called the canonical covariate. The proposed system uses Gabor filter banks to characterize facial features by spatial frequency, spatial locality and orientation, in order to compensate for the variations among face instances caused by illumination, pose and facial expression changes. Convolution of the Gabor filter bank with face images produces Gabor face representations with high-dimensional feature vectors. PCA and the canonical covariate are then applied to the Gabor face representations to reduce the high-dimensional feature spaces into low-dimensional Gabor eigenfaces and Gabor canonical faces. Reduced eigenface vector and canonical face vector are fused together usi...
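
    The Gabor-then-PCA part of the pipeline can be sketched as follows. The filter-bank parameters, image sizes and number of retained components are illustrative assumptions; the canonical covariate (generalized LDA) step and the fusion stage are omitted from this sketch.

```python
# Hedged sketch: Gabor filter bank responses concatenated into a long feature
# vector, then reduced with PCA ("Gabor eigenfaces").
import numpy as np
from scipy.signal import fftconvolve
from sklearn.decomposition import PCA

def gabor_kernel(theta, wavelength=8.0, sigma=4.0, size=21):
    """Real-valued Gabor kernel at orientation theta."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / wavelength)

def gabor_features(image, n_orientations=4):
    """Concatenate filter-response magnitudes into one high-dimensional vector."""
    responses = [np.abs(fftconvolve(image, gabor_kernel(np.pi * k / n_orientations), mode="same"))
                 for k in range(n_orientations)]
    return np.concatenate([r.ravel() for r in responses])

rng = np.random.default_rng(3)
faces = rng.random((40, 32, 32))                   # stand-in for registered face crops
X = np.array([gabor_features(f) for f in faces])   # high-dimensional Gabor representation

pca = PCA(n_components=20)
gabor_eigenfaces = pca.fit_transform(X)            # low-dimensional projection
print(gabor_eigenfaces.shape)                      # (40, 20)
```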

  15. Dark energy camera installation at CTIO: overview

    Science.gov (United States)

    Abbott, Timothy M.; Muñoz, Freddy; Walker, Alistair R.; Smith, Chris; Montane, Andrés.; Gregory, Brooke; Tighe, Roberto; Schurter, Patricio; van der Bliek, Nicole S.; Schumacher, German

    2012-09-01

    The Dark Energy Camera (DECam) has been installed on the V. M. Blanco telescope at Cerro Tololo Inter-American Observatory in Chile. This major upgrade to the facility has required numerous modifications to the telescope and improvements in observatory infrastructure. The telescope prime focus assembly has been entirely replaced, and the f/8 secondary change procedure has been radically revised. The heavier instrument means that telescope balance has been significantly modified. The telescope control system has been upgraded. NOAO has established a data transport system to efficiently move DECam's output to the NCSA for processing. The observatory has integrated the DECam high-pressure, two-phase cryogenic cooling system into its operations and converted the Coudé room into an environmentally controlled instrument handling facility incorporating a high-quality cleanroom. New procedures to ensure the safety of personnel and equipment have been introduced.

  16. Shock tube measurements of the optical absorption of triatomic carbon, C3

    Science.gov (United States)

    Jones, J. J.

    1977-01-01

    The spectral absorption of C3 has been measured in a shock tube using a test gas mixture of acetylene diluted with argon. The absorption of a pulsed xenon light source was measured by means of eight photomultiplier channels to a spectrograph and an accompanying drum camera. The postshock test gas temperature and pressure were varied over the range 3300-4300 K and 0.36 to 2.13 atmospheres, respectively. The results showed appreciable absorption from C3 for the wavelength range 300 to 540 nanometers. The computed electronic oscillator strength varied from 0.12 to 0.06 as a function of temperature.

  17. Evaluation of Photo Multiplier Tube candidates for the Cherenkov Telescope Array

    Energy Technology Data Exchange (ETDEWEB)

    Mirzoyan, R. [Max-Planck-Institute for Physics, Föhringer Ring 6, 80805 Munich (Germany); Müller, D., E-mail: dmueller@mpp.mpg.de [Max-Planck-Institute for Physics, Föhringer Ring 6, 80805 Munich (Germany); Hanabata, Y. [Institute for Cosmic Ray Research, The University of Tokyo, Kashiwa, Chiba 277-8582 (Japan); Hose, J.; Menzel, U. [Max-Planck-Institute for Physics, Föhringer Ring 6, 80805 Munich (Germany); Nakajima, D.; Takahashi, M. [Institute for Cosmic Ray Research, The University of Tokyo, Kashiwa, Chiba 277-8582 (Japan); Teshima, M. [Max-Planck-Institute for Physics, Föhringer Ring 6, 80805 Munich (Germany); Institute for Cosmic Ray Research, The University of Tokyo, Kashiwa, Chiba 277-8582 (Japan); Toyama, T. [Max-Planck-Institute for Physics, Föhringer Ring 6, 80805 Munich (Germany); Yamamoto, T. [Department of Physics, Konan University, Okamoto 8-9-1, Higashinada-ku, Kobe, Hyogo 658-0072 (Japan)

    2016-07-11

    Photo Multiplier Tubes (PMTs) are the most widespread detectors for fast, faint light signals. Six years ago, an improvement program for the PMT candidates for the Cherenkov Telescope Array (CTA) project was started with the companies Hamamatsu Photonics K.K. and Electron Tubes Enterprises Ltd. (ETE). To maximize the performance of the CTA imaging cameras we need PMTs with outstanding quantum efficiency, high photoelectron collection efficiency, short pulse width, and very low afterpulse probability and transit time spread. We report on measurements of the PMT R-12992-100 from Hamamatsu as their final product and the PMT D573KFLSA as one of the latest test versions from ETE as candidate PMTs for the CTA project.

  18. Successful tubes treatment of esophageal fistula

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Aim: To discuss the merits of "tubes treatment" for esophageal fistula (EF). Methods: A 66-year-old female who suffered from a bronchoesophageal and esophagothoracic fistula underwent successful "three tubes treatment" (closed chest drainage, negative pressure suction at the leak, and a nasojejunal feeding tube), combined with antibiotics, antacid drugs and nutritional support. Another 55-year-old male patient developed an esophagopleural fistula (EPF) after an esophageal carcinoma operation. He too was treated conservatively with the three-tubes strategy mentioned above, with a favorable outcome. Results: The two patients recovered with the tubes treatment, felt well, and became able to eat and drink, presenting no complaints. Conclusion: Tubes treatment is an effective basic approach for EF and may be an alternative treatment option.

  19. CFD Simulation on Ethylene Furnace Reactor Tubes

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Different mathematical models for ethylene furnace reactor tubes were reviewed. On the basis of these models, a new mathematical simulation approach for reactor tubes based on the computational fluid dynamics (CFD) technique was presented. This approach took the flow, heat transfer, mass transfer and thermal cracking reactions in the reactor tubes into consideration. The coupled reactor model was solved with the SIMPLE algorithm. Detailed information about the flow field, temperature field and concentration distribution in the reactor tubes was obtained, revealing the basic characteristics of the hydrodynamic phenomena and reaction behavior in the reactor tubes. The CFD approach provides the necessary information for conclusive decisions regarding production optimization, the design and improvement of reactor tubes, and the implementation of new techniques.

  20. Narratives From YouTube

    Directory of Open Access Journals (Sweden)

    Mikael Quennerstedt

    2013-10-01

    Full Text Available The aim of this paper is to explore what is performed in students’ and teachers’ actions in physical education practice in terms of “didactic irritations,” through an analysis of YouTube clips from 285 PE lessons from 27 different countries. Didactic irritations are occurrences that Rønholt describes as those demanding “didactic, pedagogical reflections and discussions, which in turn could lead to alternative thinking and understanding about teaching and learning.” Drawing on Barad’s ideas of performativity to challenge our habitual anthropocentric analytical gaze when looking at educational visual data, and using narrative construction, we also aim to give meaning to actions, relations, and experiences of the participants in the YouTube clips. To do this, we present juxtaposing narratives from teachers and students in terms of three “didactic irritations”: (a) stories from a track, (b) stories from a game, and (c) stories from a bench. The stories re-present events-of-moving in the data, offering insights into embodied experiences in PE practice, making students’ as well as teachers’ actions in PE practice understandable.